WO2014020966A1 - Fundus analysis device, fundus analysis program, and fundus analysis method - Google Patents


Info

Publication number
WO2014020966A1
WO2014020966A1 (PCT/JP2013/063628)
Authority
WO
WIPO (PCT)
Prior art keywords
region
fundus
layer
unit
layer region
Prior art date
Application number
PCT/JP2013/063628
Other languages
French (fr)
Japanese (ja)
Inventor
雅博 渋谷 (Masahiro Shibuya)
Original Assignee
株式会社トプコン (Kabushiki Kaisha Topcon)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社トプコン (Kabushiki Kaisha Topcon)
Publication of WO2014020966A1 publication Critical patent/WO2014020966A1/en

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 3/00 Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B 3/10 Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
    • A61B 3/102 Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions, for optical coherence tomography [OCT]
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/0002 Inspection of images, e.g. flaw detection
    • G06T 7/0012 Biomedical image inspection
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/10 Segmentation; Edge detection
    • G06T 7/12 Edge-based segmentation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 Image acquisition modality
    • G06T 2207/10072 Tomographic images
    • G06T 2207/10101 Optical tomography; Optical coherence tomography [OCT]
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/30 Subject of image; Context of image processing
    • G06T 2207/30004 Biomedical image processing
    • G06T 2207/30041 Eye; Retina; Ophthalmic

Definitions

  • the present invention relates to a technique for analyzing a fundus image formed by using optical coherence tomography (OCT).
  • OCT, which forms images representing the surface and internal morphology of a measured object using a light beam from a laser light source or the like, has attracted attention in recent years. Unlike X-ray CT, OCT is non-invasive to the human body, so it is expected to find applications particularly in the medical and biological fields. In the field of ophthalmology, for example, apparatuses for imaging the fundus, cornea, and so on have reached the practical stage.
  • Patent Document 1 discloses an apparatus to which OCT is applied. In this apparatus, the measuring arm scans the object with a rotary turning mirror (galvano mirror), a reference mirror is installed on the reference arm, and an interferometer is provided in which the intensity of the interference light of the light beams from the measuring arm and the reference arm is spectrally dispersed and analyzed by an instrument at the output. Further, the reference arm is configured to change the phase of the reference light beam stepwise by discontinuous values.
  • Patent Document 1 uses a so-called Fourier-domain OCT technique.
  • That is, a low-coherence light beam is irradiated onto the object to be measured, the reflected light and the reference light are superimposed to generate interference light, and the spectral intensity distribution of the interference light is acquired and Fourier-transformed, thereby imaging the form of the object in the depth direction (z direction). This type of technique is also called spectral-domain OCT.
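The Fourier-domain principle described above can be sketched numerically (an illustrative example, not code from the patent; all values are arbitrary assumptions): a reflector at depth z imprints a cosine fringe on the spectrum sampled over wavenumber k, and a Fourier transform over k recovers the depth profile.

```python
import numpy as np

# Sketch of the spectral-domain OCT principle (illustrative values only):
# a single reflector at depth z adds a cos(2*k*z) fringe to the spectrum
# sampled over wavenumber k; an FFT over k localizes the reflector in depth.
n = 1024
k = np.linspace(6.0, 8.0, n)                     # wavenumber samples (a.u.)
z_true = 50.0                                    # assumed reflector depth (a.u.)
spectrum = 1.0 + 0.5 * np.cos(2.0 * k * z_true)  # source term + interference fringe

a_scan = np.abs(np.fft.rfft(spectrum - spectrum.mean()))  # suppress DC, transform
freqs = np.fft.rfftfreq(n, d=k[1] - k[0])        # cycles per unit wavenumber
depth_axis = np.pi * freqs                       # cos(2*k*z) has frequency z/pi
z_est = depth_axis[np.argmax(a_scan)]            # A-scan peak = estimated depth
```

The peak of the A-scan magnitude lands within one depth bin of the assumed reflector depth, which is the sense in which the Fourier transform "images the form of the object in the depth direction."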
  • The apparatus described in Patent Document 1 includes a galvanometer mirror that scans the light beam (signal light), thereby forming an image of a desired measurement region of the object. Since this apparatus scans the light beam only in one direction (x direction) orthogonal to the z direction, the image it forms is a two-dimensional tomogram in the plane spanned by the beam scanning direction (x direction) and the depth direction (z direction).
  • A technique is also disclosed in which a plurality of two-dimensional tomograms are formed by scanning the signal light in the horizontal (x) and vertical (y) directions, and three-dimensional tomographic information of the measurement range is acquired and imaged on the basis of these tomograms. Examples of such three-dimensional imaging include displaying a plurality of tomograms arranged in the vertical direction (so-called stack data) and rendering a plurality of tomograms to form a three-dimensional volume image.
  • Patent Documents 3 and 4 disclose other types of OCT apparatuses.
  • Patent Document 3 describes an OCT apparatus that sweeps the wavelength of the light applied to the object to be measured, acquires a spectral intensity distribution based on the interference light obtained by superimposing the reflected light of each wavelength on the reference light, and images the form of the object by Fourier-transforming that distribution.
  • Such an OCT apparatus is called the swept-source type, which is a kind of Fourier-domain OCT.
  • Patent Document 4 describes an OCT apparatus that irradiates the object to be measured with light having a predetermined beam diameter and analyzes the components of the interference light obtained by superimposing the reflected light on the reference light, thereby forming an image of the object in a cross section orthogonal to the traveling direction of the light. Such an OCT apparatus is called the full-field type or en-face type.
  • Patent Document 5 discloses a configuration in which OCT is applied to the ophthalmic field.
  • the apparatus described in this document has a function of photographing the fundus and forming a fundus image, and a function of measuring a fundus using OCT and forming a tomographic image. Further, this apparatus analyzes a tomographic image and specifies an image region corresponding to a layer tissue constituting the fundus.
  • Target layer tissues include the inner boundary membrane, nerve fiber layer, ganglion cell layer, inner plexiform layer, inner nuclear layer, outer plexiform layer, outer nuclear layer, outer boundary membrane, photoreceptor layer, retinal pigment epithelium layer, and so on.
  • obtaining an image area corresponding to the layer tissue is synonymous with obtaining an image area corresponding to a boundary position between adjacent layer tissues.
  • Before OCT came into use, fundus cameras were widely used as apparatuses for observing the fundus (see, for example, Patent Document 6).
  • An apparatus using OCT has the advantage over a fundus camera or the like that it can acquire high-definition images and, moreover, tomographic images.
  • Age-related macular degeneration is a disease in which the function of the macula of the retina declines due to aging, causing symptoms such as the center of the visual field becoming difficult to see.
  • Age-related macular degeneration is thought to develop by the following mechanism.
  • Normal retinal cells repeat metabolism, and the waste products generated by metabolism are normally digested in the retinal pigment epithelium and disappear. However, when the function of the retinal pigment epithelium declines due to aging, undigested waste products accumulate between Bruch's membrane and the retinal pigment epithelium layer. When the fundus is photographed in this state, the waste is recognized as white masses called drusen. As waste accumulates, a weak inflammatory reaction occurs, and specific chemical substances (chemical mediators) are produced that promote the healing of the inflammation. However, chemical mediators also contain factors that promote the formation of blood vessels, so new blood vessels (neovascularization) grow from the choroid. When these new blood vessels break through Bruch's membrane, invade below or above the retinal pigment epithelium layer, and proliferate, blood and blood components exude so severely that macular function declines.
  • An object of the present invention is to provide a technique capable of effectively detecting drusen based on a tomographic image of the fundus.
  • The invention according to claim 1 is a fundus analysis apparatus comprising: a storage unit that stores a tomographic image depicting the layer structure of the fundus; a layer region specifying unit that specifies, based on the pixel values of the pixels of the tomographic image, the layer region in the tomographic image corresponding to the retinal pigment epithelium layer; an approximate curve calculation unit that obtains an approximate curve based on the shape of the layer region; a protruding region specifying unit that specifies a protruding region in which the approximate curve is located deeper in the fundus (in the depth direction) than the layer region and the distance in the depth direction between the approximate curve and the layer region is equal to or greater than a predetermined threshold; a connected region specifying unit that specifies a connected region that includes the protruding region and is surrounded by the approximate curve and the layer region; and a form information generating unit that generates form information representing the form of the connected region.
  • The invention according to claim 2 is the fundus analysis apparatus according to claim 1, wherein the approximate curve calculation unit includes a characteristic part specifying unit that specifies a plurality of characteristic parts in the layer region based on the shape of the layer region, and obtains the approximate curve based on the plurality of characteristic parts.
  • The invention according to claim 3 is the fundus analysis apparatus according to claim 2, wherein the characteristic part specifying unit specifies, based on the shape of the layer region, the part of the layer region deepest in the depth direction as a characteristic part, obtains a straight line that passes through the deepest part and is tangent to the layer region, and specifies the point of contact between the layer region and the straight line as a further characteristic part.
  • The invention according to claim 4 is the fundus analysis apparatus according to claim 3, wherein the characteristic part specifying unit sequentially specifies the contact points by rotating, about the deepest part, a straight line passing through the deepest part.
  • The invention according to claim 5 is the fundus analysis apparatus according to claim 3, wherein the characteristic part specifying unit specifies a contact point by rotating, about the deepest part, a straight line passing through the deepest part, and then specifies a further contact point by rotating, about the specified contact point, a straight line passing through that contact point.
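The sequential tangent rotation of claims 3 to 5 amounts to tracing the convex envelope of the layer profile on its deep side: each rotation of the line about the current pivot stops at the next point of contact, which becomes the next pivot. A minimal sketch of that idea (not the patent's implementation), using the standard monotone-chain hull construction with x as horizontal position and z as depth (larger z = deeper):

```python
def characteristic_points(profile):
    """Deep-side convex envelope of a layer profile given as (x, z) pairs
    sorted by x, with z increasing toward greater depth. Points popped from
    the chain are exactly those 'bridged over' by a rotating tangent line."""
    hull = []
    for x, z in profile:
        # Pop while the previous point lies on the shallow side of the
        # chord from hull[-2] to the new point (non-convex toward depth).
        while len(hull) >= 2:
            (x1, z1), (x2, z2) = hull[-2], hull[-1]
            if (x2 - x1) * (z - z1) - (z2 - z1) * (x - x1) >= 0:
                hull.pop()
            else:
                break
        hull.append((x, z))
    return hull

# A flat RPE profile at depth 100 with a drusen-like bump rising to depth 90:
zs = [100.0] * 11
zs[4] = zs[5] = zs[6] = 90.0     # bump toward the vitreous (shallower = smaller z)
pts = characteristic_points(list(enumerate(zs)))
```

For this assumed profile the bump is skipped, and only the flat, deepest parts at the two ends survive as characteristic parts, which is the behavior the claims describe for drusen elevations.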
  • the invention according to claim 6 is the fundus analysis apparatus according to any one of claims 2 to 5, wherein the approximate curve calculation unit is a distance between two adjacent characteristic parts.
  • The invention according to claim 7 is the fundus analysis apparatus according to any one of claims 2 to 6, wherein the approximate curve calculation unit obtains a free curve based on the plurality of characteristic parts, and obtains the approximate curve based on the free curve.
  • The invention according to claim 8 is the fundus analysis apparatus according to claim 7, wherein the approximate curve calculation unit obtains a spline curve as the free curve.
  • The invention according to claim 9 is the fundus analysis apparatus according to claim 8, wherein the approximate curve calculation unit obtains a cubic spline curve as the spline curve.
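Assuming characteristic parts like those of claims 2 to 5 are available, the cubic-spline baseline of claims 8 and 9 could be fitted as follows. This is a sketch with made-up coordinates; `scipy.interpolate.CubicSpline` is one standard implementation, not necessarily what the patent uses.

```python
import numpy as np
from scipy.interpolate import CubicSpline

# Hypothetical characteristic parts (horizontal position x, depth z):
x_pts = np.array([0.0, 2.0, 5.0, 7.0, 10.0])
z_pts = np.array([100.0, 101.5, 102.0, 101.0, 100.5])

spline = CubicSpline(x_pts, z_pts)   # piecewise cubic passing through every knot
x_dense = np.linspace(0.0, 10.0, 101)
baseline = spline(x_dense)           # the approximate curve, sampled densely
```

The spline interpolates the characteristic parts exactly and varies smoothly between them, which is what makes it a plausible estimate of the undisturbed layer position.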
  • The invention according to claim 10 is the fundus analysis apparatus according to any one of claims 7 to 9, wherein the approximate curve calculation unit includes a straight part specifying unit that specifies a substantially straight part of the layer region, and a first deformation unit that, when the free curve is located deeper (in the depth direction) than the straight part, deforms the free curve so that the corresponding part of the free curve matches the position of the straight part; the approximate curve is obtained based on the deformation result of the first deformation unit.
  • The invention according to claim 11 is the fundus analysis apparatus according to any one of claims 7 to 10, wherein the approximate curve calculation unit includes a curved part specifying unit that specifies a part of the free curve located on the side opposite to the depth direction relative to the layer region, and a second deformation unit that deforms the free curve so as to match the specified part with the position of the layer region; the approximate curve is obtained based on the deformation result of the second deformation unit.
  • The invention according to claim 12 is the fundus analysis apparatus according to any one of claims 7 to 11, wherein the approximate curve calculation unit includes a protrusion determining unit that determines whether the layer region protrudes in the direction opposite to the depth direction in the vicinity of an end of the frame of the tomographic image, and a third deformation unit that, when it is determined that the layer region protrudes, deforms the free curve by replacing the portion of the free curve corresponding to the protruding portion with an extension of the free curve from the frame-center side; the approximate curve is obtained based on the deformation result of the third deformation unit.
  • The invention according to claim 13 is the fundus analysis apparatus according to any one of claims 1 to 12, wherein the protruding region specifying unit calculates, for each point on the approximate curve, the distance in the depth direction between that point and the layer region, determines whether the calculated distance is equal to or greater than the predetermined threshold, and specifies as the protruding region the image region sandwiched between the layer region and the set of points on the approximate curve for which the distance is determined to be equal to or greater than the predetermined threshold.
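The per-point thresholding of claim 13 can be sketched as follows (illustrative code, not from the patent): with the layer profile and the baseline sampled at the same x positions, mark the points where the baseline lies deeper than the layer by at least the threshold, then group them into contiguous runs, each run being one candidate protruding region.

```python
import numpy as np

def protruding_regions(layer_z, baseline_z, threshold):
    """Return (start, end) index pairs of contiguous runs where the baseline
    lies deeper than the layer region by at least `threshold` (z = depth)."""
    deep = (np.asarray(baseline_z) - np.asarray(layer_z)) >= threshold
    runs, start = [], None
    for i, flag in enumerate(deep):
        if flag and start is None:
            start = i                      # a run opens
        elif not flag and start is not None:
            runs.append((start, i - 1))    # the run closes
            start = None
    if start is not None:
        runs.append((start, len(deep) - 1))
    return runs

# Flat baseline at depth 100; the layer bulges to depth 95 over indices 5..8:
layer = np.full(20, 100.0)
layer[5:9] = 95.0
regions = protruding_regions(layer, np.full(20, 100.0), threshold=3.0)
```

Each returned run, together with the curve segments bounding it, corresponds to one connected region in the sense of claim 1.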
  • The invention according to claim 14 is the fundus analysis apparatus according to any one of claims 1 to 13, wherein the morphological information generation unit includes a nipple region determination unit that analyzes the tomographic image to determine whether a nipple region corresponding to the optic disc of the fundus is included in the tomographic image, and, when it is determined that the nipple region is included, generates the form information with connected regions located in the vicinity of the nipple region excluded from the connected regions specified by the connected region specifying unit.
  • The invention according to claim 15 is the fundus analysis apparatus according to claim 14, wherein the papillary region determination unit analyzes the tomographic image to specify a feature layer region corresponding to a predetermined feature layer of the fundus.
  • The invention according to claim 16 is the fundus analysis apparatus according to claim 15, wherein the predetermined feature layer is the inner boundary membrane, which forms the boundary with the vitreous body in the fundus, and the papillary region determination unit determines whether the feature layer region has a portion depressed in the depth direction, and determines that the nipple region is included when it is determined that such a depressed portion exists.
  • The invention according to claim 17 is the fundus analysis apparatus according to any one of claims 14 to 16, wherein, when the eye to be examined is the left eye, the papillary region determination unit makes the determination by analyzing at least the vicinity of the left end of the frame of the tomographic image, and when the eye to be examined is the right eye, it makes the determination by analyzing at least the vicinity of the right end of the frame.
  • The invention according to claim 18 is the fundus analysis apparatus according to any one of claims 1 to 17, further comprising an optical system that divides light from a light source into signal light and reference light, irradiates the fundus of the eye to be examined with the signal light, and detects the interference light of the signal light returning from the fundus and the reference light, and an image forming unit that forms a tomographic image based on the detection result; the storage unit stores the tomographic image formed by the image forming unit.
  • The invention according to claim 19 is a fundus analysis apparatus comprising: a storage unit that stores a three-dimensional tomographic image depicting the layer structure of the fundus; a layer region specifying unit that specifies, based on the pixel values of the pixels of the three-dimensional tomographic image, the layer region corresponding to the retinal pigment epithelium layer; an approximate curved surface calculation unit that obtains an approximate curved surface based on the shape of the layer region; a protruding region specifying unit that specifies a protruding region in which the approximate curved surface is located deeper in the fundus (in the depth direction) than the layer region and the distance in the depth direction between the approximate curved surface and the layer region is equal to or greater than a predetermined threshold; a connected region specifying unit that specifies a connected region that includes the protruding region and is surrounded by the approximate curved surface and the layer region; and a morphological information generating unit that generates morphological information representing the form of the connected region.
  • The invention according to claim 20 is a fundus analysis program that causes a computer having a storage unit storing a tomogram depicting the layer structure of the fundus to function as: a layer region specifying unit that specifies, based on the pixel values of the pixels of the tomogram, the layer region corresponding to the retinal pigment epithelium layer; an approximate curve calculation unit that obtains an approximate curve based on the shape of the layer region; a protruding region specifying unit that specifies a protruding region in which the approximate curve is located deeper in the fundus (in the depth direction) than the layer region and the distance in the depth direction between the approximate curve and the layer region is equal to or greater than a predetermined threshold; a connected region specifying unit that specifies a connected region that includes the protruding region and is surrounded by the approximate curve and the layer region; and a form information generating unit that generates form information representing the form of the connected region.
  • The invention according to claim 21 is a fundus analysis program that causes a computer having a storage unit storing a three-dimensional tomogram depicting the layer structure of the fundus to function as: a layer region specifying unit that specifies, based on the pixel values of the pixels of the three-dimensional tomogram, the layer region corresponding to the retinal pigment epithelium layer; an approximate curved surface calculation unit that obtains an approximate curved surface based on the shape of the layer region; a protruding region specifying unit that specifies a protruding region in which the approximate curved surface is located deeper in the fundus (in the depth direction) than the layer region and the distance in the depth direction between the approximate curved surface and the layer region is equal to or greater than a predetermined threshold; a connected region specifying unit that specifies a connected region that includes the protruding region and is surrounded by the approximate curved surface and the layer region; and a form information generating unit that generates form information representing the form of the connected region.
  • The invention according to claim 22 is a fundus analysis method for analyzing a tomographic image depicting the layer structure of the fundus, comprising: specifying, based on the pixel values of the pixels of the tomographic image, the layer region in the tomographic image corresponding to the retinal pigment epithelium layer; obtaining an approximate curve based on the shape of the layer region; specifying a protruding region in which the approximate curve is located deeper in the fundus (in the depth direction) than the layer region and the distance in the depth direction between the approximate curve and the layer region is equal to or greater than a predetermined threshold; specifying a connected region that includes the protruding region and is surrounded by the approximate curve and the layer region; and generating form information representing the form of the connected region.
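For the final step, generating form information, a minimal sketch follows. The measures shown (width, maximum height, and cross-sectional area of one connected region between the layer and the approximate curve) are assumed choices for illustration; the patent text does not fix them here.

```python
import numpy as np

def form_info(layer_z, baseline_z, x_step, region):
    """Simple form measures for one connected region given as (start, end)
    sample indices: width along x, peak height of the layer above the
    baseline, and cross-sectional area by the rectangle rule."""
    s, e = region
    gap = np.asarray(baseline_z[s:e + 1]) - np.asarray(layer_z[s:e + 1])
    return {
        "width": (e - s + 1) * x_step,
        "height": float(gap.max()),
        "area": float(gap.sum() * x_step),
    }

# Flat baseline at depth 100; a drusen-like elevation of the layer to depth 95:
layer = np.full(20, 100.0)
layer[5:9] = 95.0
info = form_info(layer, np.full(20, 100.0), x_step=1.0, region=(5, 8))
```

Such per-region measures are the kind of "form information representing the form of the connected region" that the method steps conclude with.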
  • The invention according to claim 23 is a fundus analysis method for analyzing a three-dimensional tomographic image depicting the layer structure of the fundus, comprising: specifying, based on the pixel values of the pixels of the three-dimensional tomographic image, the layer region corresponding to the retinal pigment epithelium layer; obtaining an approximate curved surface based on the shape of the layer region; specifying a protruding region in which the approximate curved surface is located deeper in the fundus (in the depth direction) than the layer region and the distance in the depth direction between the approximate curved surface and the layer region is equal to or greater than a predetermined threshold; specifying a connected region that includes the protruding region and is surrounded by the approximate curved surface and the layer region; and generating form information representing the form of the connected region.
  • According to the present invention, drusen can be detected effectively based on a tomographic image of the fundus.
  • The fundus analysis apparatus may be a computer that analyzes tomographic images (two-dimensional or three-dimensional) of the fundus, or an OCT apparatus capable of forming tomographic images of the fundus using optical coherence tomography. Since the latter OCT apparatus includes the former computer, the latter will be described in detail below.
  • The fundus analyzer as an OCT apparatus may be of any type capable of forming a tomographic image of the fundus; here, the spectral-domain type will be described in detail. Since the feature of the present invention centers on the processing that analyzes a tomographic image of the fundus, other types of OCT apparatus, such as the swept-source type and the en-face type, can be configured similarly.
  • a measurement operation for forming a tomographic image using OCT may be referred to as OCT measurement.
  • It is also possible to combine the OCT configuration according to this embodiment with a fundus imaging apparatus other than a fundus camera, such as an SLO, a slit lamp, or an ophthalmic surgical microscope. The configuration according to this embodiment can also be incorporated into a single OCT apparatus.
  • An SLO (Scanning Laser Ophthalmoscope) is an apparatus that images the fundus surface by scanning the fundus with laser light and detecting the reflected light with a highly sensitive element such as a photomultiplier tube.
  • the fundus analysis apparatus 1 includes a fundus camera unit 2, an OCT unit 100, and an arithmetic control unit 200.
  • The fundus camera unit 2 has almost the same optical system as a conventional fundus camera.
  • the OCT unit 100 is provided with an optical system for acquiring a fundus tomographic image.
  • the arithmetic control unit 200 includes a computer that executes various arithmetic processes and control processes.
  • the fundus camera unit 2 shown in FIG. 1 is provided with an optical system for obtaining a two-dimensional image (fundus image) representing the surface form of the fundus oculi Ef of the eye E to be examined.
  • the fundus image includes an observation image and a captured image.
  • the observation image is, for example, a monochrome moving image formed at a predetermined frame rate using near infrared light.
  • the captured image is, for example, a color image obtained by flashing visible light, or a monochrome still image using near infrared light or visible light as illumination light.
  • the fundus camera unit 2 may be configured to be able to acquire images other than these, such as a fluorescein fluorescent image, an indocyanine green fluorescent image, a spontaneous fluorescent image, and the like.
  • The fundus camera unit 2 is provided with a chin rest and a forehead rest for supporting the subject's face. Further, the fundus camera unit 2 is provided with an illumination optical system 10 and a photographing optical system 30.
  • the illumination optical system 10 irradiates the fundus oculi Ef with illumination light.
  • The photographing optical system 30 guides the fundus reflection light of the illumination light to imaging devices (CCD image sensors 35 and 38; a CCD image sensor is sometimes referred to simply as a CCD).
  • Further, the photographing optical system 30 guides the signal light from the OCT unit 100 to the fundus oculi Ef and guides the signal light returning from the fundus oculi Ef to the OCT unit 100.
  • the observation light source 11 of the illumination optical system 10 is composed of, for example, a halogen lamp.
  • The light (observation illumination light) output from the observation light source 11 is reflected by the reflection mirror 12, which has a curved reflection surface, passes through the condensing lens 13, and passes through the visible cut filter 14 to become near-infrared light. Further, the observation illumination light is once converged in the vicinity of the photographing light source 15, reflected by the mirror 16, and passes through the relay lenses 17 and 18, the diaphragm 19, and the relay lens 20. Then, the observation illumination light is reflected at the peripheral portion (the region around the hole) of the aperture mirror 21, passes through the dichroic mirror 46, and is refracted by the objective lens 22 to illuminate the fundus oculi Ef.
  • An LED (Light Emitting Diode) can also be used as the observation light source.
  • The fundus reflection light of the observation illumination light is refracted by the objective lens 22, passes through the dichroic mirror 46, passes through the hole formed in the central region of the aperture mirror 21, passes through the dichroic mirror 55, travels through the focusing lens 31, and is reflected by the mirror 32. Further, the fundus reflection light passes through the half mirror 39A, is reflected by the dichroic mirror 33, and forms an image on the light receiving surface of the CCD image sensor 35 by the condenser lens.
  • The CCD image sensor 35 detects the fundus reflection light at a predetermined frame rate, for example. An image (observation image) based on the fundus reflection light detected by the CCD image sensor 35 is displayed on the display device 3.
  • the photographing light source 15 is constituted by, for example, a xenon lamp.
  • the light (imaging illumination light) output from the imaging light source 15 is applied to the fundus oculi Ef through the same path as the observation illumination light.
  • The fundus reflection light of the imaging illumination light is guided to the dichroic mirror 33 through the same path as the observation illumination light, passes through the dichroic mirror 33, is reflected by the mirror 36, and forms an image on the light receiving surface of the CCD image sensor 38 via the condenser lens 37.
  • An image (captured image) based on the fundus reflection light detected by the CCD image sensor 38 is displayed on the display device 3.
  • the display device 3 that displays the observation image and the display device 3 that displays the captured image may be the same or different.
  • When near-infrared light is used as the imaging illumination light, an infrared captured image is displayed. It is also possible to use an LED as the photographing light source.
  • The LCD (Liquid Crystal Display) 39 displays a fixation target and a target for visual acuity measurement.
  • the fixation target is an index for fixing the eye E to be examined, and is used at the time of fundus photographing or OCT measurement.
  • A part of the light output from the LCD 39 is reflected by the half mirror 39A, reflected by the mirror 32, passes through the focusing lens 31 and the dichroic mirror 55, passes through the hole of the aperture mirror 21, passes through the dichroic mirror 46, is refracted by the objective lens 22, and is projected onto the fundus oculi Ef.
  • the fixation position of the eye E can be changed by changing the display position of the fixation target on the screen of the LCD 39.
  • Examples of the fixation position of the eye E include, as with a conventional fundus camera, a position for acquiring an image centered on the macula of the fundus oculi Ef, a position for acquiring an image centered on the optic disc, and a position for acquiring an image centered on the fundus center between the macula and the optic disc. The display position of the fixation target can also be changed arbitrarily.
  • the fundus camera unit 2 is provided with an alignment optical system 50 and a focus optical system 60 as in the conventional fundus camera.
  • The alignment optical system 50 generates an index (alignment index) for aligning the apparatus optical system with respect to the eye E.
  • the focus optical system 60 generates an index (split index) for focusing on the fundus oculi Ef.
  • the light (alignment light) output from the LED 51 of the alignment optical system 50 is reflected by the dichroic mirror 55 via the apertures 52 and 53 and the relay lens 54, passes through the hole of the aperture mirror 21, and reaches the dichroic mirror 46. And is projected onto the cornea of the eye E by the objective lens 22.
  • The cornea-reflected light of the alignment light passes through the objective lens 22, the dichroic mirror 46, and the hole of the aperture mirror 21; part of it passes through the dichroic mirror 55, travels through the focusing lens 31, is reflected by the mirror 32, passes through the half mirror 39A, is reflected by the dichroic mirror 33, and is projected onto the light receiving surface of the CCD image sensor 35 by the condenser lens.
  • the light reception image (alignment index) by the CCD image sensor 35 is displayed on the display device 3 together with the observation image.
  • the user performs alignment by performing the same operation as that of a conventional fundus camera. Further, the arithmetic control unit 200 may perform alignment by analyzing the position of the alignment index and moving the optical system (auto-alignment function).
  • The reflecting surface of the reflecting rod 67 is obliquely disposed in the optical path of the illumination optical system 10.
  • The light (focus light) output from the LED 61 of the focus optical system 60 passes through the relay lens 62, is separated into two light beams by the split indicator plate 63, passes through the two-hole aperture 64, is reflected by the mirror 65, is focused on the reflecting surface of the reflecting rod 67 by the condenser lens 66, and is reflected. Further, the focus light passes through the relay lens 20, is reflected by the aperture mirror 21, passes through the dichroic mirror 46, is refracted by the objective lens 22, and is projected onto the fundus oculi Ef.
  • the fundus reflection light of the focus light is detected by the CCD image sensor 35 through the same path as the corneal reflection light of the alignment light.
  • a light reception image (split index) by the CCD image sensor 35 is displayed on the display device 3 together with the observation image.
  • the arithmetic control unit 200 analyzes the position of the split index and moves the focusing lens 31 and the focus optical system 60 to perform focusing as in the conventional case (autofocus function). Alternatively, focusing may be performed manually while visually checking the split indicator.
  • the dichroic mirror 46 branches the optical path for OCT measurement from the optical path for fundus imaging.
  • the dichroic mirror 46 reflects light in a wavelength band used for OCT measurement and transmits light for fundus photographing.
  • On the optical path for OCT measurement, a collimator lens unit 40, an optical path length changing unit 41, a galvano scanner 42, a focusing lens 43, a mirror 44, and a relay lens 45 are provided in this order from the OCT unit 100 side.
  • the optical path length changing unit 41 is movable in the direction of the arrow shown in FIG. 1, and changes the optical path length of the optical path for OCT measurement. This change in the optical path length is used for correcting the optical path length according to the axial length of the eye E or adjusting the interference state.
  • the optical path length changing unit 41 includes, for example, a corner cube and a mechanism for moving the corner cube.
  • the galvano scanner 42 changes the traveling direction of light (signal light LS) passing through the optical path for OCT measurement. Thereby, the fundus oculi Ef can be scanned with the signal light LS.
  • the galvano scanner 42 includes, for example, a galvano mirror that scans the signal light LS in the x direction, a galvano mirror that scans in the y direction, and a mechanism that drives these independently. Thereby, the signal light LS can be scanned in an arbitrary direction on the xy plane.
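The two-mirror x/y scanning described above amounts to generating a grid of (x, y) targets. A minimal sketch in Python, assuming normalized scan coordinates rather than real galvo drive signals (the function name and parameters are illustrative):

```python
import numpy as np

def raster_scan_coords(n_lines=5, n_points=8, width=1.0, height=1.0):
    """Generate (x, y) galvo targets for a simple raster scan.

    Sketch only: real scanner control involves drive voltages, timing, and
    flyback handling, none of which are modeled here.
    """
    xs = np.linspace(-width / 2, width / 2, n_points)
    ys = np.linspace(-height / 2, height / 2, n_lines)
    # One B-scan per y position, each sampling n_points along x.
    return [(float(x), float(y)) for y in ys for x in xs]

coords = raster_scan_coords(n_lines=3, n_points=4)
```

Because the x and y mirrors are driven independently, any scan pattern on the xy plane (line, cross, radial, raster) can be produced by changing how this coordinate list is generated.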
  • the OCT unit 100 is provided with an optical system for acquiring a tomographic image of the fundus oculi Ef.
  • This optical system has the same configuration as a conventional spectral domain type OCT apparatus. That is, this optical system divides low-coherence light into reference light and signal light, generates interference light by causing the signal light that has passed through the fundus oculi Ef to interfere with the reference light that has passed through the reference optical path, and is configured to detect the spectral components of this interference light. The detection result (detection signal) is sent to the arithmetic control unit 200.
  • In the case of a swept source type apparatus, a wavelength-swept light source is provided instead of a light source that outputs low-coherence light, and an optical member that spectrally decomposes the interference light is not provided.
  • In general, for the configuration of the OCT unit 100, a known technique according to the type of optical coherence tomography can be applied as appropriate.
  • The light source unit 101 outputs broadband low-coherence light L0.
  • the low coherence light L0 includes, for example, a near-infrared wavelength band (about 800 nm to 900 nm) and has a temporal coherence length of about several tens of micrometers. Note that near-infrared light having a wavelength band invisible to the human eye, for example, a center wavelength of about 1040 to 1060 nm, may be used as the low-coherence light L0.
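The relation between spectral bandwidth and the temporal coherence length quoted above can be illustrated with the standard Gaussian-spectrum formula l_c = (2 ln 2 / π) · λ0² / Δλ. A small sketch, using an assumed 850 nm center wavelength and 10 nm bandwidth (illustrative values, not taken from this description):

```python
import math

def coherence_length_um(center_nm, bandwidth_nm):
    """Temporal coherence length for a Gaussian spectrum, in micrometers.

    l_c = (2 ln 2 / pi) * lambda0^2 / delta_lambda.  The center wavelength
    and bandwidth passed in below are illustrative example values.
    """
    l_c_nm = (2 * math.log(2) / math.pi) * center_nm ** 2 / bandwidth_nm
    return l_c_nm / 1000.0

lc = coherence_length_um(850.0, 10.0)  # roughly "several tens of micrometers"
```

A narrower bandwidth lengthens the coherence length (coarser depth resolution); broadband sources are used precisely to keep l_c short.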
  • The light source unit 101 includes a light output device such as a super luminescent diode (SLD), an LED, or a semiconductor optical amplifier (SOA).
  • the low coherence light L0 output from the light source unit 101 is guided to the fiber coupler 103 by the optical fiber 102, and is divided into the signal light LS and the reference light LR.
  • the reference light LR is guided by the optical fiber 104 and reaches an optical attenuator (attenuator) 105.
  • the optical attenuator 105 automatically adjusts the amount of the reference light LR guided to the optical fiber 104 under the control of the arithmetic control unit 200 using a known technique.
  • the reference light LR whose light amount has been adjusted by the optical attenuator 105 is guided by the optical fiber 104 and reaches the polarization adjuster (polarization controller) 106.
  • the polarization adjuster 106 is, for example, a device that adjusts the polarization state of the reference light LR guided in the optical fiber 104 by applying a stress from the outside to the optical fiber 104 in a loop shape.
  • the configuration of the polarization adjuster 106 is not limited to this, and any known technique can be used.
  • the reference light LR whose polarization state is adjusted by the polarization adjuster 106 reaches the fiber coupler 109.
  • the signal light LS generated by the fiber coupler 103 is guided by the optical fiber 107 and converted into a parallel light beam by the collimator lens unit 40. Further, the signal light LS reaches the dichroic mirror 46 via the optical path length changing unit 41, the galvano scanner 42, the focusing lens 43, the mirror 44, and the relay lens 45. The signal light LS is reflected by the dichroic mirror 46, is refracted by the objective lens 22, and is applied to the fundus oculi Ef. The signal light LS is scattered (including reflection) at various depth positions of the fundus oculi Ef. The backscattered light of the signal light LS from the fundus oculi Ef travels in the same direction as the forward path in the reverse direction, is guided to the fiber coupler 103, and reaches the fiber coupler 109 via the optical fiber 108.
  • the fiber coupler 109 causes the backscattered light of the signal light LS and the reference light LR that has passed through the optical fiber 104 to interfere with each other.
  • the interference light LC generated thereby is guided by the optical fiber 110 and emitted from the emission end 111. Further, the interference light LC is converted into a parallel light beam by the collimator lens 112, dispersed (spectral decomposition) by the diffraction grating 113, condensed by the condenser lens 114, and projected onto the light receiving surface of the CCD image sensor 115.
  • Although the diffraction grating 113 shown in FIG. 2 is of a transmission type, other types of spectroscopic elements, such as a reflection type diffraction grating, may also be used.
  • the CCD image sensor 115 is a line sensor, for example, and detects each spectral component of the split interference light LC and converts it into electric charges.
  • the CCD image sensor 115 accumulates this electric charge, generates a detection signal, and sends it to the arithmetic control unit 200.
  • In this embodiment, a Michelson type interferometer is used, but any type of interferometer, such as a Mach-Zehnder type, can be used as appropriate.
  • Instead of the CCD image sensor, another form of image sensor, for example a CMOS (Complementary Metal Oxide Semiconductor) image sensor, can be used.
  • the configuration of the arithmetic control unit 200 will be described.
  • the arithmetic control unit 200 analyzes the detection signal input from the CCD image sensor 115 and forms a tomographic image of the fundus oculi Ef.
  • the arithmetic processing for this is the same as that of a conventional spectral domain type OCT apparatus.
  • the arithmetic control unit 200 controls each part of the fundus camera unit 2, the display device 3, and the OCT unit 100. For example, the arithmetic control unit 200 displays a tomographic image of the fundus oculi Ef on the display device 3.
  • The arithmetic control unit 200 performs operation control of the observation light source 11, the imaging light source 15, and the LEDs 51 and 61, operation control of the LCD 39, movement control of the focusing lenses 31 and 43, movement control of the reflecting rod 67, movement control of the focus optical system 60, movement control of the optical path length changing unit 41, operation control of the galvano scanner 42, and the like.
  • the arithmetic control unit 200 performs operation control of the light source unit 101, operation control of the optical attenuator 105, operation control of the polarization adjuster 106, operation control of the CCD image sensor 115, and the like.
  • the arithmetic control unit 200 includes, for example, a microprocessor, a RAM, a ROM, a hard disk drive, a communication interface, and the like, as in a conventional computer.
  • a computer program for controlling the fundus analyzer 1 is stored in a storage device such as a hard disk drive.
  • the arithmetic control unit 200 may include various circuit boards, for example, a circuit board for forming a tomographic image.
  • the arithmetic control unit 200 may include an operation device (input device) such as a keyboard and a mouse, and a display device such as an LCD.
  • The fundus camera unit 2, the display device 3, the OCT unit 100, and the arithmetic control unit 200 may be configured integrally (that is, within a single housing) or separately in two or more housings.
  • Control system: The configuration of the control system of the fundus analysis apparatus 1 will be described with reference to FIGS. 3 and 4.
  • the control system of the fundus analysis apparatus 1 is configured around the control unit 210.
  • the control unit 210 includes, for example, the aforementioned microprocessor, RAM, ROM, hard disk drive, communication interface, and the like.
  • the control unit 210 is provided with a main control unit 211 and a storage unit 212.
  • the main control unit 211 performs the various controls described above.
  • The main control unit 211 controls the focusing drive unit 31A, the optical path length changing unit 41, and the galvano scanner 42 of the fundus camera unit 2, as well as the light source unit 101, the optical attenuator 105, and the polarization adjuster 106 of the OCT unit 100.
  • the focusing drive unit 31A moves the focusing lens 31 in the optical axis direction. Thereby, the focus position of the photographic optical system 30 is changed.
  • The main control unit 211 can also move the optical system provided in the fundus camera unit 2 three-dimensionally by controlling an optical system drive unit (not shown). This control is used in alignment and tracking. Tracking is the operation of moving the apparatus optical system in accordance with the eye movement of the eye E. When tracking is performed, alignment and focusing are performed in advance. Tracking is a function that maintains this suitable positional relationship, in which alignment and focus are achieved, by causing the position of the apparatus optical system to follow the eye movement.
  • the main control unit 211 performs a process of writing data to the storage unit 212 and a process of reading data from the storage unit 212.
  • the storage unit 212 stores various data. Examples of data stored in the storage unit 212 include tomographic image data, fundus image data, and eye information to be examined.
  • the eye information includes information about the subject such as patient ID and name, and information about the eye such as left and right eye information indicating whether the eye is the left eye or the right eye.
  • the storage unit 212 stores various programs and data for operating the fundus analysis apparatus 1.
  • the image forming unit 220 forms image data of a two-dimensional tomographic image depicting the layer structure of the fundus oculi Ef based on the detection signal from the CCD image sensor 115.
  • This process includes processes such as noise removal (noise reduction), filter processing, dispersion compensation, and FFT (Fast Fourier Transform) as in the conventional spectral domain type optical coherence tomography.
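The core of the spectral-domain processing chain above can be sketched as follows. This is a simplified illustration (DC removal, windowing, FFT); wavelength-to-wavenumber resampling and dispersion compensation are omitted, and the signal parameters are invented for the example:

```python
import numpy as np

def form_a_scan(spectrum):
    """Form one depth profile (A-scan) from a detected spectrum.

    Minimal sketch of the spectral-domain pipeline: DC removal, windowing,
    then FFT magnitude.  Real processing also needs wavelength-to-wavenumber
    resampling and dispersion compensation, which are omitted here.
    """
    s = spectrum - np.mean(spectrum)          # noise/DC reduction
    s = s * np.hanning(len(s))                # filter (window) processing
    depth_profile = np.abs(np.fft.rfft(s))    # FFT -> reflectivity vs depth
    return depth_profile

# Synthetic spectrum: a single reflector produces a cosine fringe whose
# frequency encodes the reflector's depth.
k = np.arange(1024)
a_scan = form_a_scan(1.0 + 0.5 * np.cos(2 * np.pi * 40 * k / 1024))
```

Repeating this for each scan position of the signal light yields the columns of the two-dimensional tomographic image.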
  • When another type of optical coherence tomography is used, the image forming unit 220 executes known processing corresponding to that type.
  • The main control unit 211 causes the storage unit 212 to store the image data of the two-dimensional tomographic image formed by the image forming unit 220.
  • The image forming unit 220 includes, for example, the circuit board described above. In this specification, "image data" and the "image" based on it may be treated as the same.
  • the image processing unit 230 performs various types of image processing and analysis processing on the tomographic image of the fundus oculi Ef. For example, the image processing unit 230 executes various correction processes such as image brightness correction. The image processing unit 230 performs various types of image processing and analysis processing on the image (fundus image, anterior eye image, etc.) obtained by the fundus camera unit 2.
  • the image processing unit 230 performs known image processing such as interpolation processing for interpolating pixels between two-dimensional tomographic images to form a three-dimensional tomographic image of the fundus oculi Ef.
  • Image data of a three-dimensional tomographic image means image data in which pixel positions are defined by a three-dimensional coordinate system.
  • An example of such image data is image data composed of voxels arranged three-dimensionally; this image data is called volume data or voxel data.
  • When displaying an image based on volume data, the image processing unit 230 performs a rendering process (such as volume rendering or MIP (Maximum Intensity Projection)) on the volume data to form a pseudo three-dimensional tomographic image viewed from a specific line-of-sight direction. This pseudo three-dimensional tomographic image is displayed on a display device such as the display unit 240A.
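The MIP rendering mentioned above can be sketched in a few lines; this is a minimal stand-in for the actual rendering process, using a synthetic volume:

```python
import numpy as np

def mip(volume, axis=0):
    """Maximum Intensity Projection of a 3-D volume along one axis.

    A minimal stand-in for the rendering step: each output pixel keeps the
    brightest voxel along the chosen line-of-sight axis.
    """
    return np.max(volume, axis=axis)

vol = np.zeros((4, 5, 6))
vol[2, 3, 1] = 7.0           # a single bright voxel
proj = mip(vol, axis=0)      # view along the first axis
```

Full volume rendering additionally accumulates opacity and color along each ray, but the line-of-sight reduction shown here is the common core.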
  • Stack data of a plurality of two-dimensional tomographic images is another example of image data of a three-dimensional tomographic image.
  • Stack data is image data obtained by three-dimensionally arranging a plurality of two-dimensional tomographic images obtained along a plurality of scanning lines, based on the positional relationship of the scanning lines. That is, stack data is image data obtained by expressing a plurality of tomographic images, originally defined by individual two-dimensional coordinate systems, in one three-dimensional coordinate system (that is, by embedding them in one three-dimensional space).
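Under the assumption of parallel, evenly spaced scan lines, embedding the two-dimensional tomographic images in one three-dimensional coordinate system reduces to stacking arrays (the array shapes below are illustrative):

```python
import numpy as np

def stack_b_scans(b_scans):
    """Embed 2-D tomographic images (B-scans) in one 3-D coordinate system.

    Sketch under the assumption of parallel, evenly spaced scan lines, so
    the scan-line index itself becomes one coordinate of the stack.
    """
    return np.stack(b_scans, axis=0)   # shape: (n_lines, depth, n_points)

scans = [np.full((8, 16), i, dtype=float) for i in range(5)]
volume = stack_b_scans(scans)
```

For non-parallel scan patterns, each image would instead be placed using the actual scan-line geometry, with interpolation filling the gaps as described above.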
  • The main control unit 211 causes the storage unit 212 to store the three-dimensional tomographic images (volume data, pseudo three-dimensional tomographic images, stack data, etc.) formed as described above.
  • The image processing unit 230 performs analysis processing for obtaining the state of drusen by analyzing a tomographic image depicting the macula of the fundus oculi Ef and its surrounding region.
  • the image processing unit 230 includes a layer region specifying unit 231, an approximate curve calculating unit 232, a protruding region specifying unit 233, a connected region specifying unit 234, and a form information generating unit 235.
  • the layer region specifying unit 231 specifies an image region corresponding to the retinal pigment epithelium layer in the tomographic image based on the pixel value of the pixel of the tomographic image of the fundus oculi Ef. This image area is called a layer area.
  • the layer region specifying unit 231 specifies the layer region by executing, for example, the same process as in Patent Document 5. This process will be briefly described.
  • The layer region specifying unit 231 first applies preprocessing, such as tone conversion processing, image enhancement processing, threshold processing, contrast conversion processing, binarization processing, edge detection processing, image averaging processing, image smoothing processing, and filter processing, to the tomographic image to clarify the layer structure depicted in the tomographic image.
  • Next, the layer region specifying unit 231 analyzes the pixel values (for example, luminance values) of the pixels constituting the preprocessed tomographic image, column by column along the depth direction (z-axis direction) of the fundus oculi Ef, and specifies pixels corresponding to the boundary positions between adjacent layers.
  • a pixel corresponding to the boundary position of the layer can be specified using a filter (for example, a differential filter) having a spread only in the depth direction.
  • pixel edge detection may be performed using an area filter that extends in two directions, the depth direction and the direction orthogonal thereto.
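A minimal sketch of boundary detection with a filter having spread only in the depth direction; the [-1, 1] differential kernel and the synthetic two-layer image below are illustrative, not the patent's actual filter:

```python
import numpy as np

def boundary_depths(tomogram):
    """Locate one layer boundary per A-scan with a depth-direction filter.

    tomogram: 2-D array indexed as [depth (z), lateral position].  A simple
    [-1, 1] differential filter is applied along z only, and the depth of
    the strongest dark-to-bright transition is returned for each column.
    """
    depths = []
    for col in tomogram.T:                       # one A-scan at a time
        grad = np.diff(col)                      # differential along z only
        depths.append(int(np.argmax(grad)) + 1)  # first pixel of the lower layer
    return depths

# Synthetic B-scan: dark above row 12, bright from row 12 downward.
img = np.zeros((32, 6))
img[12:, :] = 1.0
edges = boundary_depths(img)
```

An area filter, as mentioned above, would replace `np.diff` with a two-dimensional kernel, trading some depth localization for robustness to speckle noise.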
  • the layer region specifying unit 231 specifies image regions corresponding to several layers of the fundus oculi Ef through such processing. Furthermore, the layer area specifying unit 231 specifies an image corresponding to the retinal pigment epithelium layer from among the specified image areas. An example of this processing will be described.
  • It is known, from many clinically acquired tomographic images of the fundus, which of the brightly depicted layers, counted from the retinal surface side, corresponds to the retinal pigment epithelium layer. Therefore, for the tomographic image to be analyzed, the retinal surface is first identified, the bright layers are counted from the retinal surface side, and the layer corresponding to the predetermined count value is taken as the target layer region.
  • Alternatively, the layer region in the tomographic image to be analyzed may be specified based on the standard distance in the depth direction from the retinal surface to the retinal pigment epithelium layer.
  • When there are differences in brightness between the depicted layers, the layer region can also be specified in consideration of these differences. For example, when the retinal pigment epithelium layer is known to be depicted as the Nth brightest among the brightly depicted layers, the image region corresponding to the Nth brightest layer identified in the tomographic image to be analyzed can be taken as the layer region. Note that the method of specifying the layer region is not limited to those exemplified here, and any method may be used as long as the target layer region can be specified.
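The Nth-brightest selection described above can be sketched as follows; the layer names and brightness values are invented for the example:

```python
def nth_brightest_layer(layers, n):
    """Pick the image region depicted n-th brightest (1-based ranking).

    `layers` is a list of (name, mean_brightness) pairs standing in for the
    image regions identified in the tomogram; the names and values used
    below are illustrative only.
    """
    ranked = sorted(layers, key=lambda item: item[1], reverse=True)
    return ranked[n - 1][0]

layers = [("NFL", 180.0), ("IPL", 120.0), ("RPE", 200.0), ("ONL", 60.0)]
target = nth_brightest_layer(layers, 1)   # here the RPE happens to rank 1st
```

In practice the rank N is fixed in advance from clinical knowledge, and the ranking is computed from the mean brightness of each candidate region.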
  • the tomographic image to be analyzed is a two-dimensional or three-dimensional tomographic image.
  • the layer region is specified as a substantially curved image region (ignoring the thickness of the layer region).
  • The layer region specifying unit 231 generates information on the specified layer region, for example, position information (coordinate values) of the layer region in the tomographic image. Note that the layer region may instead be extracted from the tomographic image.
  • the identification result of the layer region (curved image region) in the tomographic image is input to the approximate curve calculation unit 232. Based on the shape of this layer region, the approximate curve calculation unit 232 obtains a curve that approximates this shape. This curve is called an approximate curve.
  • the approximate curve represents the estimated shape of the retinal pigment epithelium layer when it is assumed that drusen does not exist in the cross section indicated by the tomographic image.
  • In general, the specified layer region is depicted in the tomographic image as a curve that is, at least globally, convex in the depth direction.
  • When drusen are present, irregularities corresponding to the drusen are depicted in the specified layer region.
  • the approximate curve represents the global shape of the layer region ignoring such irregularities.
  • the approximate curve calculation unit 232 is provided with a feature site specifying unit 232a, a feature site interpolation unit 232b, a free curve calculation unit 232c, and a correction unit 232d.
  • the correction unit 232d includes a set of the straight part specifying unit 232e and the first deforming unit 232f, a set of the curved part specifying unit 232g and the second deforming unit 232h, and a set of the protrusion determining unit 232i and the third deforming unit 232j. And are provided.
  • the feature part specifying unit 232a specifies a plurality of feature parts based on the shape of the layer region based on the pixel values of the pixels in the specified layer region. An example of this processing will be described with reference to FIGS. 5A to 5C.
  • First, the characteristic part specifying unit 232a specifies the deepest part P0 of the layer region 300 in the depth direction (+z direction) based on the shape of the layer region 300.
  • This process can be executed, for example, by referring to the coordinate value of each pixel in the layer region 300, specifying the pixel having the maximum z coordinate value, and setting it to the deepest part P0.
  • Alternatively, a straight line orthogonal to the depth direction may be moved from the +z direction toward the −z direction, and the position in the layer region that first contacts the straight line may be set as the deepest part.
  • the deepest part P0 specified in this way is a characteristic part of the layer region 300.
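Specifying the deepest part by the maximum z coordinate, as described above, can be sketched as (the coordinate values are illustrative):

```python
def deepest_part(layer_points):
    """Return the point of the layer region deepest in the +z direction.

    layer_points: sequence of (x, z) pixel coordinates of the layer region;
    +z points toward greater depth, as in this description's coordinates.
    """
    return max(layer_points, key=lambda p: p[1])

layer = [(0, 5.0), (1, 6.5), (2, 9.0), (3, 7.2), (4, 5.5)]
p0 = deepest_part(layer)
```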
  • the characteristic part specifying unit 232a obtains a straight line passing through the deepest part P0 and in contact with the layer region 300, and a contact point between the straight line and the layer region 300 is set as a characteristic part.
  • a specific example of this process will be described.
  • the straight line L passing through the deepest part P0 is rotated around the deepest part P0.
  • FIG. 5B shows a case where the straight line L is rotated clockwise.
  • When the straight line L is rotated in this way, it contacts the layer region 300 at a certain stage, as shown in FIG. 5C.
  • the straight line L at this time corresponds to the above-mentioned “straight line passing through the deepest part P0 and in contact with the layer region 300”.
  • The contact point P1 is a characteristic part of the layer region 300. Since all the other characteristic parts are located on the −z side with respect to the deepest part P0, it is sufficient, for example, to rotate the straight line L from a position that passes through the deepest part P0 and is orthogonal to the z coordinate axis. In addition, by rotating the straight line L in the reverse direction (counterclockwise), it is possible to specify characteristic parts located on the opposite side of the characteristic part P1 with respect to the deepest part P0.
  • Alternatively, a straight line L that always passes through the deepest part P0 can be considered, and the contact points between the straight line L and the layer region 300 can be specified sequentially.
  • In this case, the first contact point, the second contact point, the third contact point, and so on are sequentially specified as contact points (characteristic parts) according to the rotation of the straight line L.
  • the rotation center of the straight line L can be changed sequentially. That is, first, the first contact point is specified by rotating the straight line L around the deepest part P0. Next, the second contact point is specified by rotating the straight line L similarly around the first contact point. Subsequently, the straight line L is similarly rotated around the second contact point to specify the third contact point. In this way, a plurality of contact points (characteristic portions) are specified sequentially.
  • the number of characteristic parts specified as described above is arbitrary, but the accuracy of the following processing improves as the number increases. On the other hand, the resource required for processing increases as the specific number of feature parts increases.
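The rotating-tangent procedure with a sequentially moving rotation center is, in effect, a walk along the convex side of the layer region: from each center, the first point the rotating line touches is the one whose direction is closest to horizontal. A sketch under that reading, for the contact points to the right of P0 (the coordinates are illustrative):

```python
def contact_points_right(layer_points, p0):
    """Sequentially find tangent-contact points to the right of p0.

    Implements the sequential scheme of the text: rotate a line about the
    current center, take the first point of the layer region it touches,
    then move the rotation center to that point.  Equivalent to walking
    the convex hull on the -z side of the layer region.
    """
    contacts = []
    center = p0
    while True:
        right = [p for p in layer_points if p[0] > center[0]]
        if not right:
            break
        # First point touched = direction closest to horizontal (maximum
        # slope; slopes are <= 0 because the points lie on the -z side).
        nxt = max(right, key=lambda p: (p[1] - center[1]) / (p[0] - center[0]))
        contacts.append(nxt)
        center = nxt
    return contacts

layer = [(0, 4.0), (1, 6.0), (2, 9.0), (3, 6.0), (4, 5.5), (5, 2.0)]
p0 = (2, 9.0)
hull = contact_points_right(layer, p0)
```

Note how the point (3, 6.0) is skipped: it lies on the −z side of the line from (2, 9.0) to (4, 5.5), so the rotating line never touches it. Rotating in the opposite direction yields the contacts on the left of P0 symmetrically.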
  • the feature part interpolation unit 232b performs a process of adding a feature part of the layer region. This process is not always executed, for example, but is executed as necessary. Hereinafter, two types of interpolation processing will be described.
  • The first interpolation process follows from the requirements of the free curve calculation performed in the subsequent stage. That is, to obtain a free curve based on characteristic parts, there is a minimum number of characteristic parts necessary for solving the defining equations. This interpolation process is executed when the number of characteristic parts specified by the characteristic part specifying unit 232a is less than this minimum number. Note that the minimum number of characteristic parts necessary for the calculation of the free curve is set in advance according to the type of free curve to be obtained. As a specific example, when the free curve is a cubic spline curve, the minimum number is set to 4. With the above preparation, the first interpolation process is performed, for example, as follows.
  • the feature part interpolation unit 232b determines whether the number of feature parts specified by the feature part specifying unit 232a is less than a predetermined minimum number. When the specified number is equal to or greater than the minimum number, the first interpolation process is not executed.
  • the feature part interpolation unit 232b adds new feature parts by the number obtained as a difference between the minimum number and the specified number.
  • As the new characteristic part, for example, a point closest to the layer region (closest in the depth direction, that is, the z direction) is selected on the polygonal line formed by connecting the characteristic parts obtained up to that time. That is, the first characteristic part added is the point closest to the layer region on the polygonal line formed by connecting the characteristic parts specified by the characteristic part specifying unit 232a.
  • Similarly, the i-th (i ≥ 2) added characteristic part is the point closest to the layer region on the polygonal line formed by connecting the characteristic parts specified by the characteristic part specifying unit 232a and the characteristic parts added up to the (i−1)-th addition.
  • Alternatively, the feature part interpolation unit 232b can add an arbitrary point on the layer region, for example an intersection of the edge of the frame (drawing frame) of the tomographic image and the layer region, as a new feature part. This process can also be applied when the layer region protrudes at the end of the frame.
  • the above is an example of the first interpolation processing.
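The first interpolation process can be sketched as follows, under the simplifying assumption that the layer region is given as a z value per integer x position (all names and data below are illustrative):

```python
def interpolate_features(features, layer_z, minimum=4):
    """First interpolation process: add features until `minimum` exist.

    features: list of (x, z) control points sorted by x.
    layer_z:  dict mapping integer x to the layer region's z value.
    Each added feature is the point on the current polyline that is closest,
    in the z direction, to the layer region (a simplified reading of the text).
    """
    feats = sorted(features)
    while len(feats) < minimum:
        best = None
        for (x0, z0), (x1, z1) in zip(feats, feats[1:]):
            for x in range(x0 + 1, x1):
                zline = z0 + (z1 - z0) * (x - x0) / (x1 - x0)  # polyline z at x
                gap = abs(zline - layer_z[x])
                if best is None or gap < best[0]:
                    best = (gap, (x, zline))
        if best is None:
            break  # no room left between features
        feats.append(best[1])
        feats.sort()
    return feats

layer_z = {0: 5.0, 1: 5.2, 2: 5.1, 3: 5.3, 4: 5.0, 5: 4.8, 6: 5.0}
feats = interpolate_features([(0, 5.0), (6, 5.0)], layer_z, minimum=4)
```

With only two specified features and a cubic-spline minimum of 4, two features are added, each at the polyline point nearest the layer region at that step.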
  • the second interpolation process will be described.
  • This interpolation process follows from the required accuracy of the approximation by the free curve. Specifically, if the interval between two adjacent characteristic parts is too wide, the error between the layer region and the approximate curve between those characteristic parts becomes large.
  • The second interpolation process therefore adds a new characteristic part between two characteristic parts when the interval between them is wide. An example of this process is described below. If only one characteristic part has been specified by the characteristic part specifying unit 232a, an arbitrary point on the layer region (for example, the intersection of the frame end and the layer region) is first added as a new characteristic part, as in the first interpolation process.
  • the feature part interpolation unit 232b obtains the distance between two adjacent feature parts for the feature part specified by the feature part specifying unit 232a.
  • This distance is, for example, a distance in a cross-sectional direction (that is, the scanning direction of the signal light LS) orthogonal to the depth direction (z direction). Note that it may be a spatial distance between two characteristic parts in the cross section indicated by the tomographic image.
  • the characteristic part interpolation unit 232b determines whether the obtained distance is equal to or greater than a predetermined value. If the distance is less than the predetermined value, the second interpolation process for the pair of feature parts is not executed, and the process proceeds to a process for another pair as necessary.
  • When the distance is equal to or greater than the predetermined value, the feature part interpolation unit 232b adds a new feature part between the two feature parts.
  • the position where the new feature part is added is, for example, an intermediate position between the two feature parts. Further, the additional position may be determined in consideration of factors such as the relative positions of the two characteristic parts in the z direction and the positions of other characteristic parts. In addition, when the distance between two feature parts is twice or more the predetermined value, two or more new feature parts may be added between these feature parts.
  • the feature part interpolation unit 232b repeatedly performs the above-described interpolation processing until the distance between all adjacent feature part pairs becomes less than a predetermined value. The above is an example of the second interpolation processing.
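The second interpolation process can be sketched as repeated midpoint insertion; the distance is measured in the cross-sectional (x) direction, as in the text, and the threshold value is illustrative:

```python
def densify_features(features, max_gap):
    """Second interpolation process: split wide intervals between features.

    Repeatedly inserts a new feature at the intermediate position of any
    pair of adjacent features whose lateral (x) distance is max_gap or
    more, until every adjacent pair is closer than max_gap.
    """
    feats = sorted(features)
    changed = True
    while changed:
        changed = False
        out = [feats[0]]
        for (x0, z0), (x1, z1) in zip(feats, feats[1:]):
            if x1 - x0 >= max_gap:
                out.append(((x0 + x1) / 2.0, (z0 + z1) / 2.0))  # midpoint
                changed = True
            out.append((x1, z1))
        feats = out
    return feats

dense = densify_features([(0.0, 5.0), (8.0, 7.0)], max_gap=3.0)
```

Starting from one pair 8 units apart, two insertion rounds produce five features with all gaps below the threshold.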
  • the free curve calculation unit 232c obtains a free curve based on a plurality of feature parts.
  • the plurality of feature parts are a feature part specified by the feature part specifying unit 232a and a feature part added by the feature part interpolation unit 232b as necessary.
  • a free curve is generally a smooth curve defined to pass through several points on a plane in a predetermined order.
  • Examples of free curves include spline curves and Bezier curves.
  • In this embodiment, a spline curve, in particular a cubic spline curve, is used as the free curve.
  • A spline curve is generally a smooth curve that passes through a plurality of given control points, obtained by using an individual polynomial for each section (segment) whose two ends are adjacent control points.
  • the nth order spline curve uses an nth order polynomial.
  • An n-th order Bezier curve is generally an (n−1)-th order curve obtained from n control points. In this embodiment, the calculation of the free curve is performed using each characteristic part as a control point.
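A cubic spline through the characteristic parts can be sketched without external libraries by solving for the natural-spline second derivatives; this is a generic natural cubic spline, not necessarily the exact variant used by the apparatus, and the control points are illustrative:

```python
import numpy as np

def natural_cubic_spline(xs, zs):
    """Return a callable natural cubic spline through the control points.

    Minimal sketch: solves the tridiagonal system for the second
    derivatives M_i with natural boundary conditions (M_0 = M_n = 0),
    then evaluates the piecewise cubic of the containing segment.
    """
    xs, zs = np.asarray(xs, float), np.asarray(zs, float)
    n = len(xs)
    h = np.diff(xs)
    A = np.zeros((n, n))
    b = np.zeros(n)
    A[0, 0] = A[-1, -1] = 1.0                       # natural end conditions
    for i in range(1, n - 1):
        A[i, i - 1], A[i, i], A[i, i + 1] = h[i - 1], 2 * (h[i - 1] + h[i]), h[i]
        b[i] = 6 * ((zs[i + 1] - zs[i]) / h[i] - (zs[i] - zs[i - 1]) / h[i - 1])
    M = np.linalg.solve(A, b)                       # second derivatives

    def spline(x):
        i = min(max(int(np.searchsorted(xs, x) - 1), 0), n - 2)
        t, hi = x - xs[i], h[i]
        return (M[i] * (xs[i + 1] - x) ** 3 + M[i + 1] * t ** 3) / (6 * hi) \
            + (zs[i] / hi - M[i] * hi / 6) * (xs[i + 1] - x) \
            + (zs[i + 1] / hi - M[i + 1] * hi / 6) * t

    return spline

# Four control points: the minimum number stated above for a cubic spline.
s = natural_cubic_spline([0.0, 1.0, 2.0, 3.0], [5.0, 6.0, 5.5, 5.0])
```

The spline passes exactly through every control point while staying smooth across segment boundaries, which is why it serves as the drusen-free estimate of the layer shape.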
  • the correction unit 232d corrects the free curve obtained by the free curve calculation unit 232c.
  • the first correction process is performed by the straight part specifying unit 232e and the first deforming unit 232f.
  • the second correction process is performed by the curved part specifying unit 232g and the second deforming unit 232h.
  • the third correction process is performed by the protrusion determination unit 232i and the third deformation unit 232j.
  • the approximate curve represents the estimated shape of the retinal pigment epithelium layer.
  • First correction process (straight part specifying unit, first deforming unit)
  • the first correction process will be described.
  • In the first correction process, when a part of the layer region is (substantially) linear, the corresponding portion of the free curve is corrected. Due to the characteristics of the free curve calculation, in a region where the layer region is linear, the free curve may be located in the depth direction (+z direction) relative to the layer region and be convex in the depth direction (+z direction). When such a free curve is adopted as the approximate curve, there is a risk that this linear portion will be erroneously detected as drusen.
  • the first correction process is for avoiding such a situation.
  • an example of this correction processing will be described.
  • the linear part specifying unit 232e specifies a substantially linear part in the layer region specified by the layer region specifying unit 231. This process can be performed, for example, by calculating a slope (differential value) at each point in the layer region and searching for a section where the slope is substantially constant. Note that “the slope is substantially constant” includes not only a section where the slope is completely constant but also a case where the change in the slope is less than a predetermined threshold. The part of the layer region specified in this way is called a straight part.
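Searching for sections of substantially constant slope can be sketched as follows; the tolerance and minimum section length are illustrative parameters, and the layer profile is synthetic:

```python
def straight_parts(zs, slope_tol=0.05, min_len=3):
    """Find substantially linear sections of a layer profile.

    zs: z values of the layer region at consecutive x positions.  A section
    counts as 'substantially linear' when changes between consecutive
    slopes stay below slope_tol; both parameters are illustrative.
    """
    slopes = [zs[i + 1] - zs[i] for i in range(len(zs) - 1)]
    parts, start = [], 0
    for i in range(1, len(slopes) + 1):
        if i == len(slopes) or abs(slopes[i] - slopes[i - 1]) >= slope_tol:
            if i - start + 1 >= min_len:          # section covers zs[start..i]
                parts.append((start, i))
            start = i
    return parts

# Constant slope 0.5 over the first five points, then a bend.
zs = [5.0, 5.5, 6.0, 6.5, 7.0, 7.8, 9.0, 9.1]
sections = straight_parts(zs)
```

Returning index ranges rather than points lets the later deformation step address the whole straight part at once.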
  • the first deformation unit 232f determines whether or not the free curve is located in the depth direction (+ z direction) from the straight line portion. This process is performed, for example, by comparing the z-coordinate value of the straight line portion with the z-coordinate value of the free curve portion corresponding to the straight line portion. When it is determined that the free curve is located in the reverse direction ( ⁇ z direction) of the depth direction with respect to the straight line portion, the first correction process is not executed.
  • the first deforming unit 232f deforms the free curve so that the corresponding part of the free curve matches the position of the straight line part.
  • for example, the corresponding part of the free curve and its vicinity can be modified so that the corresponding part is replaced with the straight part.
  • alternatively, new control points are set at both end positions of the straight part, and a new free curve on one or both sides of the corresponding part is calculated taking these new control points and the straight part into account.
  • one or more control points are newly set on the straight line portion, and a new free curve is obtained by taking these control points into consideration.
  • the straight part is approximated by a free curve.
  • the number of newly added control points can be arbitrarily set based on, for example, the length of the straight line part.
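The straight-part search described above (a slope computed at each point, and sections where the change in slope stays below a threshold) can be sketched as follows. This is an illustration only, assuming the layer region is given as per-column depth values; `find_straight_parts` and its tolerance and minimum-length parameters are hypothetical, not taken from the embodiment:

```python
def find_straight_parts(z, slope_tol=0.05, min_len=4):
    """Return (start, end) x-index pairs where the layer region is
    substantially linear, i.e. the slope change stays within slope_tol."""
    # Discrete slope (differential value) at each point of the layer region.
    slopes = [z[i + 1] - z[i] for i in range(len(z) - 1)]
    parts, start = [], 0
    for i in range(1, len(slopes)):
        if abs(slopes[i] - slopes[i - 1]) > slope_tol:
            # Slope changed too much: close the current candidate section.
            if i - start >= min_len:
                parts.append((start, i))
            start = i
    if len(slopes) - start >= min_len:
        parts.append((start, len(slopes)))
    return parts
```

A perfectly linear profile yields one straight part covering the whole range, while a kinked profile yields only its linear prefix.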
  • the curved part specifying unit 232g specifies a part located in the reverse direction ( ⁇ z direction) of the depth direction with respect to the layer region in the free curve. This process can be easily performed, for example, by comparing the z coordinate value of the free curve and the z coordinate value of the layer region.
  • the second deforming unit 232h deforms the free curve so that the part of the free curve specified by the curved part specifying unit 232g matches the position of the layer region. As in the first correction process, this deformation may be a replacement with the target (the layer region) or a free-curve calculation that takes new control points based on the target into account.
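Assuming, for illustration, that both the free curve and the layer region are sampled as per-column z values, the second correction's replacement with the target reduces to snapping the curve back wherever it lies on the -z side of the layer region (the function name is hypothetical):

```python
def clamp_curve_to_layer(curve_z, layer_z):
    """Second correction sketch: wherever the free curve lies in the -z
    direction relative to the layer region (smaller z), replace that part
    with the layer region's position."""
    return [max(c, l) for c, l in zip(curve_z, layer_z)]
```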
  • FIG. 6A shows an example where the layer region protrudes at the end of the frame.
  • the layer region 300 protrudes in the direction opposite to the depth direction ( ⁇ z direction) at the right end of the frame F of the tomographic image.
  • points 310 and 320 are obtained as characteristic portions of the layer region 300.
  • the characteristic part 320 is specified by the characteristic part specifying unit 232a.
  • the feature part 310 that is the intersection of the layer region 300 and the end of the frame F is added by the feature part interpolation unit 232b.
  • FIG. 6B shows a free curve based on these characteristic portions 310 and 320 and the like. Since the free curve 400 passes through the two characteristic portions 310 and 320, as shown in FIG. 6B, the free curve 400 is inclined toward the protruding direction (−z direction) of the protruding portion. That is, the feature part 310 set (for convenience) at the end of the frame F affects the shape of the free curve 400, and as a result, the protrusion amount of the protruding portion cannot be measured accurately. The third correction process is executed to cope with such a situation. An example of the third correction process is described below.
  • the protrusion determining unit 232i determines whether or not the layer region protrudes in the reverse direction ( ⁇ z direction) of the depth direction in the vicinity of the end of the tomographic frame. This processing can be executed, for example, by calculating the slope (differential value) at each point of the layer region or the free curve (at least in the vicinity of the end of the frame) and obtaining its shape.
  • for example, when the new feature part (310) is located further in the direction opposite to the depth direction (−z direction) than the feature part (320) adjacent thereto, it is determined that the layer region protrudes in the direction opposite to the depth direction (−z direction) in the vicinity of the frame end of the tomographic image.
  • when it is determined that the layer region protrudes, the third deforming unit 232j deforms the free curve by replacing the portion of the free curve corresponding to the protruding portion with an extension line of the free curve from the center side of the frame. An example of this processing is described below.
  • the third deforming unit 232j sets a control point (reference control point) 410 at a predetermined position on the center side of the frame relative to the feature portion 320, which is adjacent to the feature portion 310 set at the intersection of the layer region and the frame end.
  • as the reference control point 410, for example, the midpoint of the frame in the horizontal direction is used.
  • the third deforming unit 232j sets a predetermined number of control points between the reference control point 410 and the feature part 320 so that the interval between them is equally divided. A set of points is thereby obtained that includes the reference control point 410, these control points, and the characteristic part 320.
  • the number of points included in this set is equal to or greater than the minimum number of characteristic parts necessary for the calculation of the free curve. Further, the interval between adjacent points can be set relatively wide. In the example illustrated in FIG. 6C, a set including three control points 410, 420, and 430 and a characteristic portion 320 is set.
  • the third deformation unit 232j obtains a free curve having the plurality of points included in the obtained set as control points.
  • for example, a cubic spline curve that passes through the four control points 320, 410, 420, and 430 shown in FIG. 6C and reaches the end of the frame is obtained.
  • in this way, an estimated curve 500 shown in FIG. 6D is obtained.
  • the estimated curve 500 is a free curve defined at and near the protruding portion of the layer region 300, and indicates the estimated position of the baseline for measuring the protruding amount of the protruding portion.
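The extrapolation step can be illustrated with the unique cubic passing through four control points; note that this Lagrange-form cubic is an illustrative stand-in for the cubic spline named above, and both function names are hypothetical:

```python
def lagrange_cubic(points, x):
    """Evaluate the unique cubic through four (x, z) control points at x."""
    total = 0.0
    for i, (xi, zi) in enumerate(points):
        term = zi
        for j, (xj, _) in enumerate(points):
            if i != j:
                term *= (x - xj) / (xi - xj)
        total += term
    return total

def extrapolate_baseline(points, x_end, n=20):
    """Sample the cubic from the last control point out to the frame end
    x_end, giving an estimated baseline over the protruding portion."""
    x0 = points[-1][0]
    xs = [x0 + (x_end - x0) * k / (n - 1) for k in range(n)]
    return [(x, lagrange_cubic(points, x)) for x in xs]
```

For collinear control points the extrapolated baseline simply continues the line, which matches the intuition that the baseline should ignore the protruding portion near the frame end.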
  • the third deforming unit 232j connects the portion (interpolation curve) 400a of the free curve 400 on the frame center side of the feature portion 320 and the portion (extrapolation curve) 500a of the estimated curve 500 on the frame end side of the feature portion 320 at the position of the feature portion 320 (see FIG. 6E). The curve thus obtained is used as the deformation result of the free curve 400.
  • as another example of the third correction process, the third deforming unit 232j calculates the inclination of the layer region at a position (reference position) a predetermined distance toward the frame center from the feature portion (320) adjacent to the feature portion (310) set at the intersection of the layer region and the frame end (for example, a position very close to that adjacent feature portion).
  • the third deforming unit 232j obtains a line segment that connects the reference position and the frame end and has the calculated inclination.
  • the third deforming unit 232j then substitutes this line segment for the portion of the free curve corresponding to the protruding portion.
  • the correction line segment 510 shown in FIG. 6F corresponds to this.
  • the protruding area specifying unit 233 specifies the protruding area in the tomographic image based on the approximate curve obtained by the approximate curve calculating unit 232 and the layer area.
  • the protruding region is an image region in which the approximate curve is located in the depth direction (+z direction) of the fundus oculi Ef with respect to the layer region, and in which the distance between the approximate curve and the layer region in the depth direction is equal to or greater than a predetermined threshold.
  • the protruding region indicates a region where the layer region is greatly protruded in the ⁇ z direction with respect to the approximate curve.
  • the entire image region protruding in the ⁇ z direction with respect to the approximate curve can be specified as the protruding region.
  • this narrowing by the threshold is performed in order to avoid the natural unevenness of the retinal pigment epithelium layer and noise.
  • the protruding area specifying unit 233 is provided with a distance calculating unit 233a, a distance determining unit 233b, and an image area specifying unit 233c.
  • the distance calculation unit 233a calculates the distance in the depth direction between each point on the approximate curve and the layer region. This process need not be performed for all points (pixels) on the approximate curve, and may be performed at predetermined pixel intervals (for example, every 5 pixels). The distance calculation can be performed, for example, by counting the number of pixels between a point (pixel) on the approximate curve and the corresponding point (pixel) on the layer region, with the interval between adjacent pixels taken as the unit distance. Alternatively, the distance may be obtained based on the measurement magnification of the image and the in-image distance between the pixels concerned.
  • the distance calculated by the distance calculation unit 233a may be a distance in the image (a distance defined in the xyz coordinate system or a pixel interval) converted into a distance in real space, or the distance in the image may be used as it is.
  • the distance determination unit 233b determines whether each distance calculated by the distance calculation unit 233a is equal to or greater than a predetermined threshold.
  • This threshold is set in advance based on, for example, many clinical cases. Further, the threshold value can be set in consideration of the measurement accuracy of the apparatus.
  • the distance determination unit 233b gives identification information (for example, a flag or a tag) to a point (pixel) on the approximate curve where the distance is determined to be greater than or equal to a predetermined threshold.
  • the image region specifying unit 233c specifies the image region sandwiched between the layer region and the set of pixels on the approximate curve to which identification information has been given by the distance determination unit 233b, that is, the set of points on the approximate curve whose distance has been determined to be equal to or greater than the predetermined threshold. The specified image region is the target protruding region.
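A one-dimensional sketch of the processing by the distance calculation unit 233a and the distance determination unit 233b, assuming (for illustration) that the approximate curve and the layer region are given as per-column z values; the function name and parameters are hypothetical:

```python
def protruding_points(layer_z, curve_z, threshold, step=1):
    """Flag the x-indices where the approximate curve lies in the depth
    direction (+z, i.e. larger z) relative to the layer region by at least
    `threshold`. `step` mimics sampling at predetermined pixel intervals."""
    flagged = []
    for i in range(0, len(curve_z), step):
        distance = curve_z[i] - layer_z[i]  # +z distance; > 0 when curve is deeper
        if distance >= threshold:
            flagged.append(i)               # identification info (a "flag")
    return flagged
```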
  • FIG. 7 shows an example of the protruding area specified in this way.
  • the protruding region 600 is an image region sandwiched between the layer region 300 and the approximate curve 400, and is an image region in which the distance between the layer region 300 and the approximate curve 400 is equal to or greater than a predetermined threshold value.
  • at the base portions close to the feature parts 330 and 340, which are points common to the layer region and the approximate curve, the distance is less than the predetermined threshold, so these portions are not determined to be part of the protruding region. In the range shown in FIG. 7 there is only one peak in the layer region, but when there are two or more peaks, portions other than the vicinity of the feature parts may also fail to be determined as protruding regions.
  • connection area specifying unit 234 specifies the connection area including the protruding area specified by the protruding area specifying part 233 and surrounded by the approximate curve and the layer area. This process is executed by, for example, a labeling process for pixels. This labeling process is, for example, a labeling process of 4 connections (near 4) or 8 connections (near 8).
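The 4-connection labeling mentioned above can be sketched with a breadth-first search over a binary mask. This is a generic labeling routine given for illustration, not the embodiment's implementation:

```python
from collections import deque

def label_4connected(mask):
    """Label the 4-connected components of a binary 2-D mask.
    Returns (labels, count): a label image and the number of components."""
    h, w = len(mask), len(mask[0])
    labels = [[0] * w for _ in range(h)]
    count = 0
    for r in range(h):
        for c in range(w):
            if mask[r][c] and not labels[r][c]:
                count += 1                      # new connected region found
                q = deque([(r, c)])
                labels[r][c] = count
                while q:
                    y, x = q.popleft()
                    # 4-neighbourhood: up, down, left, right.
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < h and 0 <= nx < w
                                and mask[ny][nx] and not labels[ny][nx]):
                            labels[ny][nx] = count
                            q.append((ny, nx))
    return labels, count
```

Because the labels are assigned sequentially, the returned count doubles as the number of connected regions, which matches the remark below that counting can be done in parallel with labeling.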
  • FIG. 8 shows a connected area specified when the above processing is applied to the example of FIG.
  • the protruding region 600 shown in FIG. 7 is the image region obtained by excluding, from the image region sandwiched between the layer region 300 and the approximate curve 400, the portions where the distance between the layer region 300 and the approximate curve 400 is less than the predetermined threshold.
  • the connection area specifying unit 234 specifies the connection area including the protruding area 600 and surrounded by the approximate curve 400 and the layer area 300, that is, the connection area indicated by reference numeral 700 in FIG.
  • in the step shown in FIG. 7, the protruding region 600 excluding the skirt portions close to the feature parts 330 and 340 is specified, whereas in the step shown in FIG. 8, the connected region 700 including the skirt portions is specified. By performing such stepwise processing, the entire image region considered to correspond to drusen is suitably detected.
  • when a plurality of protruding regions are specified, a connected region including each of the protruding regions is specified.
  • as a modification, it is also possible to first specify each connected region sandwiched between the layer region and the approximate curve, and then determine, for each connected region, whether there exists a portion with a large protrusion amount, that is, a portion where the distance between the layer region and the approximate curve in the depth direction is equal to or greater than the predetermined threshold, thereby specifying the target connected regions.
  • Such modifications are also included in the gist of the present invention.
  • the form information generation unit 235 generates form information representing the form of the connected area specified by the connected area specifying unit 234.
  • the form information includes, for example, the number, size, distribution state, and the like of the identified connected areas.
  • the form information generation unit 235 includes a nipple region determination unit 235a, a distribution image formation unit 235b, a count unit 235c, and a size calculation unit 235d.
  • the nipple region determination unit 235a analyzes the tomographic image and determines whether the tomographic image includes a nipple region (optic papilla region) corresponding to the optic nerve head of the fundus oculi Ef.
  • the optic disc is depicted in a tomographic image as a dent in the depth direction (+ z direction) in the fundus oculi Ef.
  • the nipple region determination unit 235a determines whether or not such a shape is depicted in a tomographic image.
  • the nipple region determination unit 235a analyzes the tomographic image, identifies a feature layer region corresponding to a predetermined feature layer of the fundus oculi Ef, and determines whether the nipple region is included based on the shape of the feature layer region.
  • the characteristic layer to be identified is an arbitrary layer tissue constituting the fundus oculi Ef, that is, an arbitrary layer tissue constituting the retina, a choroid, or a sclera.
  • the layer tissues constituting the retina include the inner boundary membrane, nerve fiber layer, ganglion cell layer, inner plexiform layer, inner nuclear layer, outer plexiform layer, outer nuclear layer, outer boundary membrane, photoreceptor layer, and retinal pigment epithelium layer.
  • FIG. 9 shows an example of a tomographic image in which a part of the optic nerve head is depicted.
  • in FIG. 9, the inner boundary membrane region 800 is shown together with the layer region 300 and the approximate curve 400. It is assumed that the depressed portion 810 in the +z direction at the right end portion of the inner boundary membrane region 800 is caused by the depressed shape of the optic nerve head.
  • in such a case, this portion may be erroneously detected as a connected region 700.
  • the nipple region determination unit 235a determines whether or not there is a portion depressed in the depth direction (+z direction) in the inner boundary membrane region, and determines that a nipple region is included when such a depressed portion exists. As an example of this processing, the nipple region determination unit 235a obtains the amount of depression in the +z direction of the inner boundary membrane region 800 at the frame end of the tomographic image G. This amount of depression is calculated, for example, as the difference between the point located furthest in the −z direction and the point located furthest in the +z direction in the inner boundary membrane region 800.
  • when this amount of depression is equal to or greater than a predetermined value, the nipple region determination unit 235a determines that the tomographic image G includes a nipple region. As another process, it is also possible to determine whether or not the inner boundary membrane region 800 has a curved portion (for example, a portion curved with a curvature equal to or greater than a predetermined value) and to determine that a nipple region is included when such a curved portion exists.
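The depression-amount criterion can be sketched as follows, assuming (for illustration) that the inner boundary membrane region is given as per-column z values; the function name and threshold are hypothetical:

```python
def includes_nipple_region(ilm_z, depression_threshold):
    """Depression amount: difference between the point located furthest in
    the +z direction (max) and furthest in the -z direction (min) of the
    inner boundary membrane region; a large value suggests the optic
    nerve head is depicted in the frame."""
    return (max(ilm_z) - min(ilm_z)) >= depression_threshold
```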
  • generally, the measurement target in OCT measurement of the fundus oculi Ef is the optic nerve head, or the macula and its vicinity. For a tomographic image in which the optic nerve head itself is the measurement target, it is not particularly necessary to determine the nipple region.
  • when the eye E is the left eye, the optic nerve head may be depicted near the left end of the frame. When the eye E is the right eye, the optic nerve head may be depicted near the right end of the frame.
  • the storage unit 212 stores left and right eye information indicating whether the eye E is the left eye or the right eye.
  • the left and right eye information may be manually input by the user, or may be automatically acquired from an electronic medical record or the like. Alternatively, when OCT measurement is performed, whether the eye E is the left eye or the right eye can be automatically determined based on the positional relationship between the device optical system and the chin rest (forehead rest), and the determination result (left and right eye information) can be stored in association with the tomographic image.
  • the nipple region determination unit 235a recognizes whether the eye E is the left eye or the right eye based on the left and right eye information.
  • when the eye E is the left eye, the nipple region determination unit 235a determines whether or not a nipple region is included by analyzing at least the vicinity of the left end of the frame of the tomographic image.
  • when the eye E is the right eye, the nipple region determination unit 235a determines whether or not a nipple region is included by analyzing at least the vicinity of the right end of the frame of the tomographic image.
  • according to such processing, the amount of data subjected to the processing can be reduced, so that shorter processing time and lower consumption of processing resources can be achieved.
  • the nipple region determination unit 235a can also determine whether the scanning line corresponding to the tomographic image passes through at least a part of the optic nerve head, that is, whether at least a part of the optic nerve head is depicted in the tomographic image. By such determination processing, it is possible to determine whether a nipple region is included in the tomographic image.
  • when it is determined that a nipple region is included, the form information generation unit 235 excludes the connected regions located in the vicinity of the nipple region from the connected regions specified by the connected region specifying unit 234, and then generates the form information.
  • the excluded connected region is, for example, the connected region located closest to the nipple region. It is also possible to exclude the connected regions located within a predetermined distance from the nipple region.
  • the distribution image forming unit 235b forms a distribution image representing the distribution state of the connected area specified by the connected area specifying unit 234. For example, a distribution image representing the distribution of connected regions in a single tomographic image can be formed. In this process, for example, the connected area is presented in a mode (display color, display brightness, etc.) different from other image areas.
  • a distribution image representing the distribution state of the connected region in the xy plane orthogonal to the depth direction can be formed.
  • processing for forming a distribution image in the xy plane will be described.
  • a plurality of tomographic images are obtained, for example, by executing a three-dimensional scan described later.
  • the three-dimensional scan is a scan form in which the irradiation position of the signal light LS is scanned along a plurality of linear scanning lines that extend, for example, in the x direction and are arranged in the y direction.
  • a plurality of tomographic images in a cross section along each scanning line are obtained by the three-dimensional scanning.
  • the distribution image forming unit 235b forms a distribution image of the connected area in the xy plane based on the connected area specified for each of these tomographic images.
  • each connected region is an image region that extends in the x direction (scan line direction) and the z direction.
  • the plurality of tomographic images are arranged in the y direction.
  • the connected regions in each tomographic image are combined to obtain a two-dimensional distribution (distribution in the xy plane) of the connected regions.
  • the pixels between these connected regions may be set as the connected regions. This processing is particularly effective when the interval between adjacent tomographic images (scan line interval) is sufficiently narrow.
  • the distribution image forming unit 235b forms a distribution image by expressing the pixel values of the pixels corresponding to the connected region and the other pixels, for example, differently.
  • a binary image can be formed by distinguishing and expressing connected regions and other regions by binary values, which can be used as a distribution image.
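A minimal sketch of forming such a binary distribution image, assuming (purely for illustration) that the connected regions of each tomographic image are given as x-index spans, one list of spans per B-scan row:

```python
def enface_distribution(per_scan_regions, width):
    """Build a binary x-y (en-face) map: per_scan_regions[y] lists the
    (x_start, x_end) spans of connected regions in the B-scan at row y;
    pixels inside a span are set to 1, all others to 0."""
    image = []
    for spans in per_scan_regions:
        row = [0] * width
        for x0, x1 in spans:
            # Spans are assumed to lie within [0, width - 1].
            for x in range(x0, x1 + 1):
                row[x] = 1
        image.append(row)
    return image
```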
  • the distribution image 900 represents the distribution state of the connected region T when the fundus oculi Ef is viewed from the incident direction ( ⁇ z direction) of the signal light LS.
  • the counting unit 235c counts the number of connected areas specified by the connected area specifying unit 234. In this process, for example, the numbers 1, 2, and so on are assigned sequentially to the connected areas, and the largest number assigned is used as the number of connected areas. The counting of connected areas may also be performed in parallel with the labeling process executed when specifying the connected areas.
  • the size calculating unit 235d calculates the size of each connected area specified by the connected area specifying unit 234.
  • Examples of the index representing the size of the connected region include area, diameter (diameter, radius, etc.), volume, and the like.
  • an example of the size calculation process of the connected area will be described.
  • Each connected region is a set of a plurality of pixels determined to be connected.
  • An area (unit area) is set in advance for each pixel.
  • This unit area may be arbitrarily set for the distribution image or tomographic image. For example, in consideration of the measurement magnification, the area in the real space corresponding to one pixel can be set as the unit area.
  • the size calculation unit 235d calculates the product of the number of pixels included in each connected region and the unit area to obtain the area of the connected region.
  • the size calculator 235d first calculates the area as described above.
  • the size calculation unit 235d sets the diameter (or radius) of the circle having this area as the diameter of the connection region. It is also possible to search for the longest line segment included in the connection area and adopt the length of this line segment as the diameter of the connection area. In addition to these, any distance that can characterize the connection region can be adopted as the diameter.
  • the size calculator 235d calculates the volume of each connected region, for example, by integrating the distance between the layer region and the approximate curve over the connected region.
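The size indices described above (area as pixel count times unit area, equivalent-circle diameter, and volume as an integral of the depth-direction distance) can be sketched as follows; the function names and the per-pixel data layout are illustrative assumptions:

```python
import math

def region_area(pixel_count, unit_area):
    """Area = number of pixels in the connected region x per-pixel unit area."""
    return pixel_count * unit_area

def equivalent_diameter(area):
    """Diameter of the circle whose area equals the region's area."""
    return 2.0 * math.sqrt(area / math.pi)

def region_volume(depth_distances, unit_area):
    """Volume: integrate (sum) the layer-to-curve depth distance over the
    region's pixels, each pixel contributing unit_area of base area."""
    return sum(depth_distances) * unit_area
```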
  • the size calculation method is not limited to the above. Further, the index (dimension) indicating the size is not limited to the above.
  • the image processing unit 230 that functions as described above includes, for example, the aforementioned microprocessor, RAM, ROM, hard disk drive, circuit board, and the like.
  • in a storage device such as the hard disk drive, a computer program for causing the microprocessor to execute the above functions is stored in advance.
  • the user interface 240 includes a display unit 240A and an operation unit 240B.
  • the display unit 240A includes the display device of the arithmetic control unit 200 and the display device 3 described above.
  • the operation unit 240B includes the operation device of the arithmetic control unit 200 described above.
  • the operation unit 240B may include various buttons and keys provided on the housing of the fundus analysis apparatus 1 or outside.
  • for example, when the fundus camera unit 2 has a housing similar to that of a conventional fundus camera, the operation unit 240B may include a joystick, an operation panel, or the like provided on the housing.
  • the display unit 240A may include various display devices such as a touch panel provided on the housing of the fundus camera unit 2.
  • the display unit 240A and the operation unit 240B do not need to be configured as individual devices.
  • for example, a device in which a display function and an operation function are integrated, such as a touch panel, can be used. In that case, the operation unit 240B includes the touch panel and a computer program.
  • the operation content for the operation unit 240B is input to the control unit 210 as an electrical signal. Further, operations and information input may be performed using a graphical user interface (GUI) displayed on the display unit 240A and the operation unit 240B.
  • Examples of the scanning modes of the signal light LS by the fundus analysis apparatus 1 include a horizontal scan, a vertical scan, a cross scan, a radial scan, a circle scan, a concentric scan, and a spiral (vortex) scan. These scanning modes are selectively used as appropriate in consideration of the observation site of the fundus, the analysis target (such as retinal thickness), the time required for scanning, the precision of scanning, and the like.
  • the horizontal scan is to scan the signal light LS in the horizontal direction (x direction).
  • the horizontal scan also includes an aspect in which the signal light LS is scanned along a plurality of horizontal scanning lines arranged in the vertical direction (y direction). In this aspect, it is possible to arbitrarily set the scanning line interval.
  • the aforementioned three-dimensional tomographic image can be formed by sufficiently narrowing the interval between adjacent scanning lines (three-dimensional scanning). The same applies to the vertical scan.
  • the cross scan scans the signal light LS along a cross-shaped trajectory composed of two linear trajectories (straight trajectories) orthogonal to each other.
  • in the radial scan, the signal light LS is scanned along a radial trajectory composed of a plurality of linear trajectories arranged at predetermined angles.
  • the cross scan is an example of a radial scan.
  • the circle scan scans the signal light LS along a circular locus.
  • in the concentric scan, the signal light LS is scanned along a plurality of circular trajectories arranged concentrically around a predetermined center position.
  • a circle scan is an example of a concentric scan.
  • in the spiral scan, the signal light LS is scanned along a spiral (vortex) locus while the rotation radius is gradually reduced (or increased).
  • since the galvano scanner 42 is configured to scan the signal light LS in directions orthogonal to each other, the signal light LS can be scanned independently in the x direction and in the y direction. Further, by simultaneously controlling the directions of the two galvanometer mirrors included in the galvano scanner 42, the signal light LS can be scanned along an arbitrary locus in the xy plane. Various scanning modes as described above can thereby be realized.
  • by scanning the signal light LS in this way, a tomographic image in the plane spanned by the direction along the scanning line (scanning locus) and the fundus depth direction (z direction) can be acquired.
  • the above-described three-dimensional tomographic image can be acquired.
  • the region on the fundus oculi Ef to be scanned with the signal light LS as described above, that is, the region on the fundus oculi Ef to be subjected to OCT measurement is called a scanning region.
  • the scanning area in the three-dimensional scan is a rectangular area in which a plurality of horizontal scans are arranged.
  • the scanning area in the concentric scan is a disk-shaped area surrounded by the locus of the circular scan with the maximum diameter.
  • the scanning area in the radial scan is a disk-shaped (or polygonal) area connecting both end positions of each scan line.
  • FIG. 11 illustrates an example of the operation of the fundus analysis apparatus 1.
  • (Step 1) a tomographic image of the fundus oculi Ef is obtained by performing OCT measurement of the eye E.
  • (Step 2) the layer region specifying unit 231 specifies an image region (layer region) corresponding to the retinal pigment epithelium layer in the tomographic image based on the pixel values of the pixels of the tomographic image of the fundus oculi Ef. The information obtained by the layer region specifying unit 231 is sent to the approximate curve calculation unit 232.
  • (Step 3) the characteristic part specifying unit 232a specifies a plurality of characteristic parts based on the shape of the layer region.
  • (Step 4) the feature part interpolation unit 232b adds feature parts of the layer region as necessary.
  • (Step 5) the free curve calculation unit 232c obtains a free curve based on the feature parts specified in Step 3 and the feature parts added in Step 4.
  • (Step 6) the correction unit 232d corrects the free curve obtained in Step 5 as necessary. As a result, an approximate curve is obtained that represents the estimated shape of the retinal pigment epithelium layer on the assumption that no drusen exists in the cross section represented by the tomographic image.
  • (Step 7) the distance calculation unit 233a calculates the distance in the depth direction between each point on the approximate curve obtained in Step 6 (Step 5 when correction is not performed) and the layer region.
  • (Step 8) the distance determination unit 233b determines whether each distance calculated in Step 7 is equal to or greater than a predetermined threshold.
  • (Step 9) the image area specifying unit 233c specifies the image area sandwiched between the layer region and the set of points on the approximate curve whose distance was determined in Step 8 to be equal to or greater than the predetermined threshold. This image area is the protruding region.
  • (Step 10) the connected area specifying unit 234 specifies the connected area that includes the protruding area specified in Step 9 and is surrounded by the approximate curve and the layer region.
  • (Step 11) the nipple region determination unit 235a analyzes the tomographic image and determines whether it includes a nipple region corresponding to the optic nerve head of the fundus oculi Ef. When it is determined that a nipple region is included, the form information generation unit 235 excludes the connected regions located in the vicinity of the nipple region from the connected regions specified in Step 10.
  • (Step 12) the form information generation unit 235 generates form information representing the form of the connected areas specified in Step 10 (excluding any regions excluded in Step 11).
  • (Step 13) the image processing unit 230 generates analysis result presentation information including the form information generated in Step 12.
  • (Step 14) the main control unit 211 causes the display unit 240A to display the analysis result based on the analysis result presentation information.
  • This analysis result includes information indicating the presence / absence of drusen in the fundus oculi Ef and information indicating the size and distribution of drusen present in the fundus oculi Ef.
  • An analysis report can be printed out based on the analysis result presentation information.
  • analysis result presentation information, information obtained by the above processing, and information on a patient and an eye to be examined can be transmitted to an external apparatus or recorded on a recording medium.
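The core of the flow in steps 7 to 10 (distance calculation, threshold determination, and grouping into regions) can be sketched one-dimensionally as follows; the function, its inputs, and the per-column data layout are illustrative assumptions, not the claimed implementation:

```python
def analyze_bscan(layer_z, curve_z, threshold):
    """1-D sketch of steps 7-10: per-column depth distances, thresholding,
    and grouping flagged columns into connected runs (candidate drusen)."""
    distances = [c - l for l, c in zip(layer_z, curve_z)]   # step 7
    flagged = [d >= threshold for d in distances]           # step 8
    regions, start = [], None                               # steps 9-10
    for i, f in enumerate(flagged + [False]):               # sentinel closes a trailing run
        if f and start is None:
            start = i
        elif not f and start is not None:
            regions.append((start, i - 1))
            start = None
    return regions
```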
  • the fundus analysis apparatus 1 includes a storage unit 212, a layer region specifying unit 231, an approximate curve calculating unit 232, a protruding region specifying unit 233, a connected region specifying unit 234, and a form information generating unit 235.
  • the storage unit 212 stores a tomographic image depicting the layer structure of the fundus oculi Ef.
  • the layer region specifying unit 231 specifies a layer region in the tomographic image corresponding to the retinal pigment epithelium layer based on the pixel value of the pixel of the tomographic image.
  • the approximate curve calculation unit 232 obtains an approximate curve based on the shape of the layer region.
  • the protruding region specifying unit 233 specifies a protruding region in which the approximate curve is located in the depth direction of the fundus oculi Ef with respect to the layer region, and the distance between the approximate curve and the layer region in the depth direction is equal to or greater than a predetermined threshold.
  • the connection area specifying unit 234 specifies the connection area including the protruding area and surrounded by the approximate curve and the layer area.
  • the form information generation unit 235 generates form information representing the form of the connected area.
  • According to such a fundus analysis apparatus 1, it is possible to effectively detect an image region considered to correspond to drusen. More specifically, by specifying only regions that protrude greatly, an image region considered to be drusen can be suitably detected while small irregularities caused by noise or the like are removed. Further, by detecting the whole connected region including such a protruding region, an image region considered to correspond to drusen can be detected without omission. Therefore, according to this embodiment, it is possible to effectively detect drusen based on the tomographic image of the fundus oculi Ef.
  • the approximate curve calculation unit 232 may include a feature part specifying unit 232a that specifies a plurality of feature parts in the layer region based on the shape of the layer region. In that case, the approximate curve calculation unit 232 obtains an approximate curve based on the plurality of identified characteristic parts. By considering a plurality of characteristic parts, an approximate curve can be obtained with higher precision and accuracy.
  • the characteristic part specifying unit 232a can be configured to specify, as a characteristic part, the deepest part of the layer area in the depth direction based on the shape of the layer area, to obtain a straight line that passes through the deepest part and touches the layer area, and to take the contact point between the layer area and the straight line as a further characteristic part. The feature part specifying unit 232a can also be configured to sequentially specify contact points by rotating a straight line passing through the deepest part around the deepest part. Alternatively, it can be configured to specify a contact point by rotating a straight line passing through the deepest part around the deepest part, and then to specify a further contact point by rotating a straight line passing through the specified contact point around that contact point.
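The rotating-tangent construction described above can be sketched as follows. This is a hypothetical illustration, assuming the layer region has already been reduced to one depth value z[i] per A-scan column (larger z = deeper); finding successive tangent contacts outward from the deepest part is then equivalent to walking the deep-side convex hull of the profile:

```python
def _tangent_contacts(z, start):
    # Walk rightward from `start`: the next tangent contact is the column
    # whose connecting line has the maximum slope toward deeper z, so that
    # every other column lies on the shallow side of that line.
    contacts, i, n = [], start, len(z)
    while i < n - 1:
        j = max(range(i + 1, n), key=lambda k: (z[k] - z[i]) / (k - i))
        contacts.append(j)
        i = j
    return contacts

def feature_parts(z):
    """Deepest part of the layer profile plus the tangent contacts found by
    rotating a line around it (and around each new contact) to both sides."""
    n = len(z)
    deepest = max(range(n), key=lambda i: z[i])   # depth increases with z
    right = _tangent_contacts(z, deepest)
    # the leftward sweep is the same walk on the mirrored profile
    left = [n - 1 - j for j in _tangent_contacts(z[::-1], n - 1 - deepest)]
    return sorted({deepest, *right, *left})
```

For example, `feature_parts([0, 1, 0, 3, 0, 1, 0])` returns `[0, 1, 3, 5, 6]`: the deepest column plus the contacts on either side, skipping the shallow columns that a druse-like bump would lift away from the baseline.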
  • the approximate curve calculation unit 232 may include a feature part interpolation unit 232b.
  • the feature part interpolation unit 232b obtains the distance between two adjacent feature parts among the feature parts specified by the feature part specifying unit 232a. When the distance is equal to or greater than a predetermined value, the feature part interpolation unit 232b adds a new feature part between the two feature parts. By performing such interpolation processing, an approximate curve can be obtained with higher precision and accuracy.
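The interpolation step can be illustrated with a short sketch. The helper below is hypothetical: the column positions of the already-specified characteristic parts are assumed to be sorted, and placing the added part at the midpoint of the gap is an assumption (the embodiment only requires that a new part be added between the two):

```python
def interpolate_feature_parts(columns, max_gap):
    """Where two adjacent characteristic parts are `max_gap` columns apart
    or more, add a new characteristic part between them (midpoint here)."""
    out = []
    for a, b in zip(columns, columns[1:]):
        out.append(a)
        if b - a >= max_gap:
            out.append((a + b) // 2)  # assumed placement: midpoint of the gap
    out.append(columns[-1])
    return out
```

For example, `interpolate_feature_parts([0, 10, 12], 5)` gives `[0, 5, 10, 12]`: only the over-wide gap receives an extra part.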
  • the approximate curve calculation unit 232 can obtain a free curve based on a plurality of characteristic parts, and can obtain an approximate curve based on the free curve.
  • the free curve may be a spline curve, particularly a cubic spline curve.
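As one concrete possibility for such a free curve, a natural cubic spline through the characteristic parts can be computed with nothing beyond a tridiagonal solve. The sketch below is illustrative, not the embodiment's implementation; in practice a library routine such as `scipy.interpolate.CubicSpline` would typically be used instead:

```python
def natural_cubic_spline(x, y):
    """Return a callable evaluating the natural cubic spline through the
    knots (x[i], y[i]); x must be strictly increasing."""
    n = len(x) - 1
    h = [x[i + 1] - x[i] for i in range(n)]
    # Tridiagonal system for the second derivatives M, natural ends M0 = Mn = 0.
    sub = [0.0] * (n + 1); diag = [1.0] * (n + 1); sup = [0.0] * (n + 1)
    rhs = [0.0] * (n + 1)
    for i in range(1, n):
        sub[i] = h[i - 1]
        diag[i] = 2.0 * (h[i - 1] + h[i])
        sup[i] = h[i]
        rhs[i] = 6.0 * ((y[i + 1] - y[i]) / h[i] - (y[i] - y[i - 1]) / h[i - 1])
    for i in range(1, n + 1):           # forward elimination (Thomas algorithm)
        w = sub[i] / diag[i - 1]
        diag[i] -= w * sup[i - 1]
        rhs[i] -= w * rhs[i - 1]
    M = [0.0] * (n + 1)
    for i in range(n - 1, 0, -1):       # back substitution; M[0] = M[n] = 0
        M[i] = (rhs[i] - sup[i] * M[i + 1]) / diag[i]

    def s(t):
        # locate the interval containing t (clamped to the last interval)
        i = n - 1
        for j in range(n):
            if t <= x[j + 1]:
                i = j
                break
        hi = h[i]
        a, b = (x[i + 1] - t) / hi, (t - x[i]) / hi
        return (M[i] * (x[i + 1] - t) ** 3 / (6 * hi)
                + M[i + 1] * (t - x[i]) ** 3 / (6 * hi)
                + (y[i] - M[i] * hi * hi / 6) * a
                + (y[i + 1] - M[i + 1] * hi * hi / 6) * b)
    return s
```

The spline interpolates every knot exactly and reduces to straight-line interpolation when the knots are collinear, which is the behavior wanted for a baseline through the characteristic parts.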
  • the approximate curve calculation unit 232 is provided with a straight line part specifying unit 232e and a first deformation unit 232f.
  • the straight part specifying unit 232e specifies a substantially straight part in the layer region.
  • the first deforming unit 232f deforms the free curve so that the part of the free curve matches the position of the straight line part.
  • the approximate curve calculation unit 232 obtains an approximate curve based on the deformation result by the first deformation unit 232f.
  • the approximate curve calculation unit 232 is provided with a curve part specifying unit 232g and a second deformation unit 232h.
  • the curved part specifying unit 232g specifies a part located in the reverse direction of the depth direction with respect to the layer region in the free curve.
  • the second deforming unit 232h deforms the free curve so that the specific part is matched with the position of the layer region.
  • the approximate curve calculation unit 232 obtains an approximate curve based on the deformation result by the second deformation unit 232h.
  • the approximate curve calculation unit 232 includes a protrusion determination unit 232i and a third deformation unit 232j.
  • the protrusion determination unit 232i determines whether the layer region protrudes in the direction opposite to the depth direction in the vicinity of the end of the frame of the tomographic image.
  • when it is determined that the layer region protrudes, the third deforming unit 232j deforms the free curve by replacing the portion of the free curve corresponding to the protruding portion with an extension of the free curve from the center side of the frame.
  • the approximate curve calculation unit 232 obtains an approximate curve based on the deformation result by the third deformation unit 232j.
  • the protruding area specifying unit 233 can specify the protruding area by performing the following processing, for example. First, the protruding area specifying unit 233 calculates the distance between each point on the approximate curve in the depth direction of the fundus oculi Ef and the layer area, and determines whether the calculated distance is equal to or greater than a predetermined threshold. Then, the protruding area specifying unit 233 specifies an image area sandwiched between a set of points on the approximate curve determined to have a distance greater than or equal to a predetermined threshold and the layer area, and sets this image area as the protruding area. By performing such processing, it is possible to suitably specify the protruding region.
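The two-stage specification just described (distance thresholding, then taking the enclosed region containing each thresholded run) can be sketched as follows, again under the simplifying assumption of one depth value per column for both the layer region and the approximate curve (larger z = deeper):

```python
def protruding_runs(layer_z, curve_z, threshold):
    """Maximal column runs where the approximate curve lies deeper than the
    layer region by at least `threshold` (run ends are inclusive)."""
    runs, start = [], None
    for i, (lz, cz) in enumerate(zip(layer_z, curve_z)):
        if cz - lz >= threshold:
            if start is None:
                start = i
        elif start is not None:
            runs.append((start, i - 1))
            start = None
    if start is not None:
        runs.append((start, len(layer_z) - 1))
    return runs

def connected_region(layer_z, curve_z, run):
    """Expand a protruding run to the whole region enclosed between the curve
    and the layer: all contiguous columns with positive separation."""
    d = [c - l for l, c in zip(layer_z, curve_z)]
    s, e = run
    while s > 0 and d[s - 1] > 0:
        s -= 1
    while e < len(d) - 1 and d[e + 1] > 0:
        e += 1
    return s, e
```

With `layer_z = [5, 4, 2, 1, 2, 4, 5]` and a flat curve at depth 5, a threshold of 2 yields the protruding run `(2, 4)`, which expands to the connected region `(1, 5)` — the small-separation shoulders are kept once a sufficiently tall protrusion anchors the region.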
  • the form information generation unit 235 may include a nipple region determination unit 235a.
  • the nipple region determination unit 235a analyzes the tomographic image and determines whether or not the nipple region corresponding to the optic nerve head of the fundus oculi Ef is included in the tomographic image.
  • when it is determined that the nipple region is included, the form information generation unit 235 generates the form information by excluding the connected regions located in the vicinity of the nipple region from among the connected regions specified by the connected region specifying unit 234. This process can prevent erroneous detection caused by the curved shape of the retinal pigment epithelium layer in the vicinity of the optic nerve head, so that an image region considered to correspond to drusen can be detected more effectively.
  • the nipple region determination unit 235a can be configured to analyze the tomographic image to specify a feature layer region corresponding to a predetermined feature layer of the fundus oculi Ef, and to determine whether the nipple region is included based on the shape of the feature layer region.
  • the predetermined feature layer may be the inner limiting membrane, which forms the boundary with the vitreous body in the fundus oculi Ef.
  • the nipple region determination unit 235a determines whether there is a portion of the feature layer region (inner limiting membrane region) that is depressed in the depth direction of the fundus oculi Ef, and, when it is determined that such a depressed portion exists, determines that the nipple region is included. By this process, the nipple region can be detected effectively.
  • the nipple region determination unit 235a can execute different processes upon receiving input of left and right eye information indicating whether the eye E is the left eye or the right eye. That is, when the eye E is the left eye, the nipple region determination unit 235a performs the above determination by analyzing at least the vicinity of the left end of the frame of the tomographic image. On the other hand, when the eye E is the right eye, the nipple region determination unit 235a performs the above determination by analyzing at least the vicinity of the right end of the frame of the tomographic image. By this process, it is possible to shorten the time required for the analysis process and reduce resources.
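A minimal sketch of this determination, under two hypothetical conventions not fixed by the source: the disc depression is taken as a region at least `depth_margin` deeper than the median ILM depth, and the disc is assumed to appear near the left frame edge for a left eye and the right edge for a right eye (the actual edge depends on scan orientation). The ILM is again assumed reduced to one depth value per column:

```python
def has_nipple_region(ilm_z, depth_margin, eye=None):
    """Return True when the inner-limiting-membrane profile contains a
    depression deep enough to be taken as the optic nerve head region."""
    baseline = sorted(ilm_z)[len(ilm_z) // 2]   # median ILM depth
    k = max(1, len(ilm_z) // 3)                 # width of the edge window
    if eye == 'L':
        search = ilm_z[:k]        # left eye: look near the left frame edge
    elif eye == 'R':
        search = ilm_z[-k:]       # right eye: look near the right frame edge
    else:
        search = ilm_z            # no laterality given: scan the whole frame
    return max(search) - baseline >= depth_margin
```

Restricting the search window to one frame edge, as the left/right-eye branch above does, is what shortens the analysis and reduces resource use.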
  • the fundus analyzer 1 may include an optical system that divides the light from the light source unit 101 into the signal light LS and the reference light LR, superimposes the signal light LS passing through the fundus oculi Ef of the eye E on the reference light LR passing through the reference optical path to generate the interference light LC, and detects the interference light LC; and an image forming unit (image forming unit 220, image processing unit 230) that forms a tomographic image of the fundus oculi Ef based on the detection result of the interference light LC.
  • the storage unit 212 stores the tomographic image formed by the image forming unit.
  • the tomographic image formed by the image forming unit 220 is a two-dimensional tomographic image.
  • the tomographic image formed by the image processing unit 230 is a three-dimensional tomographic image.
  • "line" in the processing for the two-dimensional tomographic image is read as "surface" for the three-dimensional tomographic image:
  • a curved line is read as a curved surface, and
  • a straight line is read as a flat surface. Since the difference in processing arising from the difference in the dimensions of the tomographic images is merely a matter of such rewording, the processing for the two kinds of tomographic images can be said to be substantially the same in concept.
  • the fundus analysis apparatus 1 includes a storage unit 212, a layer region specification unit 231, an approximate curve calculation unit 232, a protruding region specification unit 233, a connection region specification unit 234, and a form information generation unit 235.
  • the storage unit 212 stores a three-dimensional tomogram depicting the layer structure of the fundus oculi Ef.
  • the layer region specifying unit 231 specifies a layer region in the tomographic image corresponding to the retinal pigment epithelium layer based on the pixel value of the pixel of the three-dimensional tomographic image. This layer region is generally a curved surface (which may include singular points).
  • the approximate curve calculation unit 232 obtains an approximate curved surface based on the shape of the layer region.
  • the protruding region specifying unit 233 specifies a protruding region in which the approximate curved surface is located in the depth direction of the fundus oculi Ef with respect to the layer region, and the distance between the approximate curved surface and the layer region in the depth direction is equal to or greater than a predetermined threshold.
  • the protruding region is generally a three-dimensional region.
  • the connection area specifying unit 234 specifies the connection area including the protruding area and surrounded by the approximate curved surface and the layer area.
  • the connected region is generally a three-dimensional region.
  • the form information generation unit 235 generates form information representing the form of the connected area. According to such a fundus analysis apparatus 1, it is possible to effectively detect drusen based on a three-dimensional tomographic image of the fundus oculi Ef.
  • In the above embodiment, the image area considered to be drusen is detected based on the distance between the layer area corresponding to the retinal pigment epithelium layer and the approximate curve (approximate curved surface). However, if the sensitivity of OCT measurement is improved, an image region corresponding to the Bruch's membrane (referred to as a membrane region) may be detected, and drusen may be detected based on the distance between the layer region and the membrane region. Since drusen occurs between the Bruch's membrane and the retinal pigment epithelium layer, the drusen morphology can be grasped with higher precision and accuracy by executing the processing according to this modification.
  • when the membrane region can be specified, the connected region is specified based on the distance between the membrane region and the layer region, and the form information is generated.
  • when the membrane region cannot be specified, an approximate curve (approximate curved surface) is obtained as in the above embodiment, and the connected region is specified based on the layer region and the approximate curve (approximate curved surface) to obtain the form information.
  • thus, when the membrane region is specified, the drusen form can be grasped with high precision and accuracy, and even when the membrane region is not specified, the drusen form can still be grasped by using the approximate curve (or approximate curved surface).
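The fallback logic of this modification amounts to a one-line selection. The sketch below assumes a hypothetical upstream step that returns the segmented Bruch's membrane profile, or `None` when it could not be segmented:

```python
def drusen_heights(layer_z, approx_z, membrane_z=None):
    """Per-column separation used for drusen detection: measured against the
    segmented Bruch's membrane when available, otherwise against the
    approximate curve as in the main embodiment."""
    baseline = membrane_z if membrane_z is not None else approx_z
    return [b - l for b, l in zip(baseline, layer_z)]
```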
  • In the above embodiment, the position of the reference mirror 114 is changed to change the optical path length difference between the optical path of the signal light LS and the optical path of the reference light LR, but the method of changing the optical path length difference is not limited to this.
  • the optical path length difference can be changed by moving the fundus camera unit 2 or the OCT unit 100 with respect to the eye E to change the optical path length of the signal light LS. It is also effective to change the optical path length difference by moving the measurement object in the depth direction (z-axis direction), especially when the measurement object is not a living body part.
  • the fundus analysis program causes a computer having a storage unit that stores a tomographic image depicting the layer structure of the fundus to execute an operation described later.
  • An example of this computer is the arithmetic control unit of the above embodiment.
  • This fundus analysis program may be stored in the computer itself, or may be stored in a server or the like connected to be communicable with the computer.
  • the computer specifies a layer region in the tomographic image corresponding to the retinal pigment epithelium layer based on the pixel value of the pixel of the tomographic image.
  • the computer obtains an approximate curve based on the shape of the layer region.
  • the computer (protruding region specifying unit) specifies a protruding region in which the approximate curve is located deeper in the depth direction of the fundus than the layer region and the distance in the depth direction between the approximate curve and the layer region is equal to or greater than a predetermined threshold.
  • the computer (connected area specifying unit) specifies the connected area including the protruding area and surrounded by the approximate curve and the layer area.
  • the computer (form information generation unit) generates form information representing the form of the connected area.
  • the computer can display, print, and transmit the generated form information.
  • drusen can be detected effectively based on a tomographic image of the fundus as in the above embodiment.
  • the fundus analysis program according to this embodiment may cause a computer to execute any function that the fundus analysis apparatus 1 of the above embodiment has.
  • This fundus analysis program causes a computer having a storage unit that stores a three-dimensional tomographic image that describes the layer structure of the fundus to perform an operation described later.
  • An example of this computer is the arithmetic control unit of the above embodiment.
  • the computer specifies a layer region in the tomographic image corresponding to the retinal pigment epithelial layer based on the pixel values of the pixels of the three-dimensional tomographic image.
  • the computer obtains an approximate curved surface based on the shape of the layer region.
  • the computer (protruding region specifying unit) specifies a protruding region in which the approximate curved surface is located deeper in the depth direction of the fundus than the layer region and the distance between the approximate curved surface and the layer region in the depth direction is equal to or greater than a predetermined threshold.
  • the computer (connected area specifying unit) specifies the connected area including the protruding area and surrounded by the approximate curved surface and the layer area.
  • the computer (form information generation unit) generates form information representing the form of the connected area.
  • the computer can display, print, and transmit the generated form information.
  • drusen can be detected effectively based on a three-dimensional tomographic image of the fundus as in the above embodiment.
  • the fundus analysis program according to this embodiment may cause a computer to execute any function that the fundus analysis apparatus 1 of the above embodiment has.
  • the fundus analysis program according to the embodiment can be stored in any recording medium readable by a computer.
  • As this recording medium, for example, an optical disk, a magneto-optical disk (CD-ROM / DVD-RAM / DVD-ROM / MO, etc.), or a magnetic storage medium (hard disk / floppy (registered trademark) disk / ZIP, etc.) can be used. The program can also be stored in a storage device such as a hard disk drive or a memory.
  • This fundus analysis method analyzes a tomographic image describing the layer structure of the fundus and includes the following steps.
  • This fundus analysis method can also be grasped as an apparatus control method for causing the fundus analysis apparatus (OCT apparatus or the like) to execute the following steps.
  • the fundus analysis method (or device control method) according to this embodiment may include any process that can be executed by the fundus analysis device 1 of the above embodiment.
  • This fundus analysis method analyzes a three-dimensional tomographic image that describes the layer structure of the fundus, and includes the following steps.
  • a layer region in the tomographic image corresponding to the retinal pigment epithelium layer is specified based on the pixel value of the pixel of the three-dimensional tomographic image.
  • An approximate curved surface based on the shape of the layer region is obtained.
  • a protruding region in which the approximate curved surface is located deeper in the depth direction of the fundus than the layer region and the distance in the depth direction between the approximate curved surface and the layer region is greater than or equal to a predetermined threshold is specified.
  • a connected area including the protruding area and surrounded by the approximate curved surface and the layer area is specified.
  • Form information representing the form of the connected area is generated.
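The steps above can be strung together on a single two-dimensional profile. This is a minimal sketch, not the embodiment's full three-dimensional processing: the layer region is assumed reduced to one depth value per column, the approximate curve is taken as the deep-side convex hull of that profile with linear interpolation between hull points, and the form information is just span, height, and area per connected region:

```python
def analyze_profile(layer_z, threshold):
    """End-to-end sketch: layer profile -> baseline -> protruding columns ->
    connected regions -> form information (depth z grows downward)."""
    n = len(layer_z)
    # Deep-side hull of the profile via the upper-hull half of the
    # monotone-chain algorithm (z plays the role of the y coordinate).
    hull = []
    for i in range(n):
        while len(hull) >= 2:
            (x1, z1), (x2, z2) = hull[-2], hull[-1]
            # drop the middle point if it lies on the shallow side of the chord
            if (x2 - x1) * (layer_z[i] - z1) - (z2 - z1) * (i - x1) >= 0:
                hull.pop()
            else:
                break
        hull.append((i, layer_z[i]))
    # Linear interpolation between hull points gives the approximate curve.
    curve = [0.0] * n
    for (x1, z1), (x2, z2) in zip(hull, hull[1:]):
        for i in range(x1, x2 + 1):
            curve[i] = z1 + (z2 - z1) * (i - x1) / (x2 - x1)
    d = [curve[i] - layer_z[i] for i in range(n)]
    # Threshold, expand each hit to the enclosed connected region, summarize.
    regions, i = [], 0
    while i < n:
        if d[i] >= threshold:
            s = e = i
            while s > 0 and d[s - 1] > 0:
                s -= 1
            while e < n - 1 and d[e + 1] > 0:
                e += 1
            regions.append({'span': (s, e),
                            'height': max(d[s:e + 1]),
                            'area': sum(d[s:e + 1])})
            i = e + 1
        else:
            i += 1
    return regions
```

With `layer_z = [5, 5, 3, 1, 3, 5, 5]` and a threshold of 3, the baseline is the flat depth-5 hull and a single connected region spanning columns 2 to 4 is reported, with height 4 and area 8.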
  • the fundus analysis method (or device control method) according to this embodiment may include any process that can be executed by the fundus analysis device 1 of the above embodiment.


Abstract

Provided is a technology that can effectively detect drusen on the basis of a fundus tomogram. This fundus analysis device has a storage unit, a layer region identification unit, an approximate curve computation unit, a protruding region identification unit, a connected region identification unit, and a form information generation unit. The storage unit stores a tomogram depicting the layered structure of the fundus. On the basis of the pixel values of the pixels of the tomogram, the layer region identification unit identifies a layer region within the tomogram corresponding to the retinal pigment epithelium layer. The approximate curve computation unit determines an approximate curve on the basis of the shape of the layer region. The protruding region identification unit identifies a protruding region at which the approximate curve is positioned in the direction of depth of the fundus with respect to the layer region and the distance between the layer region and the approximate curve in the direction of depth is at least a predetermined threshold. The connected region identification unit identifies a connected region that includes the protruding region and is encircled by the layer region and the approximate curve. The form information generation unit generates form information representing the form of the connected region.

Description

Fundus analysis apparatus, fundus analysis program, and fundus analysis method
The present invention relates to a technique for analyzing a fundus image formed by using optical coherence tomography (OCT).
In recent years, OCT, which forms an image representing the surface form and internal form of an object to be measured using a light beam from a laser light source or the like, has attracted attention. Since OCT is not invasive to the human body, unlike X-ray CT, its application is expected to expand particularly in the medical and biological fields. For example, in the field of ophthalmology, apparatuses for forming images of the fundus, cornea, and the like have reached the practical stage.
Patent Document 1 discloses an apparatus to which OCT is applied. In this apparatus, the measuring arm scans an object with a rotary turning mirror (galvano mirror), a reference mirror is installed on the reference arm, and at the exit an interferometer is provided that analyzes, with a spectrometer, the intensity of the interference light of the light beams from the measuring arm and the reference arm. Further, the reference arm is configured to change the phase of the reference light beam stepwise in discontinuous values.
The apparatus of Patent Document 1 uses the so-called Fourier domain OCT technique. That is, a beam of low-coherence light is irradiated onto the object to be measured, the reflected light and the reference light are superimposed to generate interference light, the spectral intensity distribution of the interference light is acquired, and a Fourier transform is applied to it, thereby imaging the form of the object in the depth direction (z direction). This type of technique is also called spectral domain OCT.
Furthermore, the apparatus described in Patent Document 1 includes a galvano mirror that scans the light beam (signal light), thereby forming an image of a desired measurement target region of the object. Since this apparatus scans the light beam only in one direction (x direction) orthogonal to the z direction, the image it forms is a two-dimensional tomographic image in the depth direction (z direction) along the scanning direction (x direction) of the light beam.
Patent Document 2 discloses a technique in which the signal light is scanned in the horizontal direction (x direction) and the vertical direction (y direction) to form a plurality of two-dimensional tomographic images in the horizontal direction, and three-dimensional tomographic information of the measurement range is acquired and imaged based on these tomographic images. Examples of this three-dimensional imaging include a method of displaying a plurality of tomographic images arranged in the vertical direction (called stack data or the like) and a method of applying rendering processing to a plurality of tomographic images to form a three-dimensional tomographic image.
Patent Documents 3 and 4 disclose other types of OCT apparatuses. Patent Document 3 describes an OCT apparatus that scans the wavelength of the light irradiated onto the object to be measured, acquires a spectral intensity distribution based on the interference light obtained by superimposing the reflected light of each wavelength on the reference light, and applies a Fourier transform to it to image the form of the object. Such an OCT apparatus is called a swept source type. The swept source type is a kind of Fourier domain type.
Patent Document 4 describes an OCT apparatus that irradiates the object to be measured with light having a predetermined beam diameter and analyzes the components of the interference light obtained by superimposing the reflected light on the reference light, thereby forming an image of the object in a cross-section orthogonal to the traveling direction of the light. Such an OCT apparatus is called a full-field type or an en-face type.
Patent Document 5 discloses a configuration in which OCT is applied to the ophthalmic field. The apparatus described in this document has a function of photographing the fundus to form a fundus image and a function of measuring the fundus using OCT to form a tomographic image. Furthermore, this apparatus analyzes the tomographic image to specify image regions corresponding to the layer tissues constituting the fundus. The layer tissues to be specified include the inner limiting membrane, nerve fiber layer, ganglion cell layer, inner plexiform layer, inner nuclear layer, outer plexiform layer, outer nuclear layer, external limiting membrane, photoreceptor layer, and retinal pigment epithelium layer. Since the fundus is composed of a plurality of overlapping layer tissues, obtaining an image region corresponding to a layer tissue is synonymous with obtaining an image region corresponding to the boundary position between adjacent layer tissues. Note that, even before OCT came to be applied, the fundus camera was widely used as an apparatus for observing the fundus (see, for example, Patent Document 6).
An apparatus using OCT has advantages over a fundus camera and the like in that it can acquire high-definition images and, moreover, tomographic images.
In recent years, an eye disease called age-related macular degeneration has attracted attention. Age-related macular degeneration is a disease caused by deterioration of the function of the macular portion of the retina due to aging, and causes symptoms such as distorted vision, reduced visual acuity, partial loss of the visual field, and inability to see the object one is looking at even though the surroundings appear normal.
Age-related macular degeneration (exudative type) is thought to develop by the following mechanism. Normal retinal cells repeat metabolism. Waste products generated by metabolism are normally digested within the retinal pigment epithelium and disappear. However, when the function of the retinal pigment epithelium declines due to aging, undigested waste products accumulate between the Bruch's membrane and the retinal pigment epithelium layer. When the fundus in this state is photographed, the waste products are recognized as white masses called drusen. As the waste accumulates, a weak inflammatory reaction occurs. Specific chemical substances (chemical mediators) are then produced to promote the healing of the inflammation. However, chemical mediators include factors that promote the generation of blood vessels, whereby new blood vessels (neovascularization) grow from the choroid. When the new blood vessels break through the Bruch's membrane, invade below or above the retinal pigment epithelium layer, and proliferate, the exudation of blood and blood components becomes severe, and the functional deterioration of the macula becomes pronounced.
The presence and distribution of drusen are important in the diagnosis of age-related macular degeneration. Conventionally, a fundus image (an image photographed by a fundus camera) has mainly been used to grasp the state of drusen (see, for example, Patent Document 7). As a method using tomographic images, there is Patent Document 8 by the present applicant.
Patent Document 1: JP H11-325849 A
Patent Document 2: JP 2002-139421 A
Patent Document 3: JP 2007-24677 A
Patent Document 4: JP 2006-153838 A
Patent Document 5: JP 2008-73099 A
Patent Document 6: JP H9-276232 A
Patent Document 7: JP 2008-295804 A
Patent Document 8: JP 2011-30626 A
An object of the present invention is to provide a technique capable of effectively detecting drusen based on a tomographic image of the fundus.
In order to achieve the above object, the invention according to claim 1 is a fundus analysis apparatus comprising: a storage unit that stores a tomographic image depicting the layer structure of a fundus; a layer region specifying unit that specifies, based on the pixel values of the pixels of the tomographic image, a layer region in the tomographic image corresponding to the retinal pigment epithelium layer; an approximate curve calculating unit that obtains an approximate curve based on the shape of the layer region; a protruding region specifying unit that specifies a protruding region in which the approximate curve is located deeper in the depth direction of the fundus than the layer region and the distance between the approximate curve and the layer region in the depth direction is equal to or greater than a predetermined threshold; a connected region specifying unit that specifies a connected region that includes the protruding region and is surrounded by the approximate curve and the layer region; and a form information generating unit that generates form information representing the form of the connected region.
The invention according to claim 2 is the fundus analysis apparatus according to claim 1, wherein the approximate curve calculating unit includes a characteristic part specifying unit that specifies a plurality of characteristic parts in the layer region based on the shape of the layer region, and obtains the approximate curve based on the plurality of characteristic parts.
 また、請求項3に記載の発明は、請求項2に記載の眼底解析装置であって、前記特徴部位特定部は、前記層領域の形状に基づいて前記奥行方向における前記層領域中の最深部位を特定して特徴部位とし、前記最深部位を通過しかつ前記層領域に接する直線を求め、前記層領域と前記直線との接点を更なる特徴部位とすることを特徴とする。
 また、請求項4に記載の発明は、請求項3に記載の眼底解析装置であって、前記特徴部位特定部は、前記最深部位を通過する直線を前記最深部位を中心として回転させていくことにより接点を順次に特定することを特徴とする。
 また、請求項5に記載の発明は、請求項3に記載の眼底解析装置であって、前記特徴部位特定部は、前記最深部位を通過する直線を前記最深部位を中心として回転させて接点を特定し、特定された接点を通過する直線を当該接点を中心として回転させて更なる接点を特定することを特徴とする。
 また、請求項6に記載の発明は、請求項2~請求項5のいずれか一項に記載の眼底解析装置であって、前記近似曲線演算部は、隣接する2つの特徴部位の間の距離を求め、当該距離が所定値以上である場合に当該2つの特徴部位の間に新たな特徴部位を追加する特徴部位補間部を含むことを特徴とする。
 また、請求項7に記載の発明は、請求項2~請求項6のいずれか一項に記載の眼底解析装置であって、前記近似曲線演算部は、前記複数の特徴部位に基づく自由曲線を求め、前記自由曲線に基づいて前記近似曲線を求めることを特徴とする。
 また、請求項8に記載の発明は、請求項7に記載の眼底解析装置であって、前記近似曲線演算部は、前記自由曲線としてスプライン曲線を求めることを特徴とする。
 また、請求項9に記載の発明は、請求項8に記載の眼底解析装置であって、前記近似曲線演算部は、前記スプライン曲線として3次スプライン曲線を求めることを特徴とする。
 また、請求項10に記載の発明は、請求項7~請求項9のいずれか一項に記載の眼底解析装置であって、前記近似曲線演算部は、前記層領域において実質的に直線的な部位を特定する直線部位特定部と、前記自由曲線が当該直線部位よりも前記奥行方向に位置する場合に、前記自由曲線の当該部位を当該直線部位の位置に合わせるように前記自由曲線を変形する第1変形部とを含み、前記第1変形部による変形結果に基づいて前記近似曲線を求めることを特徴とする。
 また、請求項11に記載の発明は、請求項7~請求項10のいずれか一項に記載の眼底解析装置であって、前記近似曲線演算部は、前記自由曲線において前記層領域よりも前記奥行方向の逆方向に位置する部位を特定する曲線部位特定部と、当該特定部位を前記層領域の位置に合わせるように前記自由曲線を変形する第2変形部とを含み、前記第2変形部による変形結果に基づいて前記近似曲線を求めることを特徴とする。
 また、請求項12に記載の発明は、請求項7~請求項11のいずれか一項に記載の眼底解析装置であって、前記近似曲線演算部は、前記断層像のフレームの端部近傍において前記層領域が前記奥行方向の逆方向に突出しているか判定する突出判定部と、前記層領域が突出していると判定された場合に、当該突出部位に対応する前記自由曲線の部位を、前記フレームの中央側からの前記自由曲線の延長線に置換することで、前記自由曲線を変形する第3変形部とを含み、前記第3変形部による変形結果に基づいて前記近似曲線を求めることを特徴とする。
 また、請求項13に記載の発明は、請求項1~請求項12のいずれか一項に記載の眼底解析装置であって、前記突出領域特定部は、前記奥行方向における前記近似曲線上の各点と前記層領域との間の距離を算出し、算出された距離が前記所定閾値以上であるか判断し、前記所定閾値以上であると判断された前記近似曲線上の点の集合と前記層領域とに挟まれた画像領域を特定して前記突出領域とすることを特徴とする。
 また、請求項14に記載の発明は、請求項1~請求項13のいずれか一項に記載の眼底解析装置であって、前記形態情報生成部は、前記断層像を解析して、前記眼底の視神経乳頭に相当する乳頭領域が当該断層像に含まれるか判定する乳頭領域判定部を含み、前記乳頭領域が含まれると判定された場合に、前記連結領域特定部により特定された前記連結領域のうち当該乳頭領域の近傍に位置する連結領域を除外して前記形態情報を生成することを特徴とする。
 また、請求項15に記載の発明は、請求項14に記載の眼底解析装置であって、前記乳頭領域判定部は、前記断層像を解析して前記眼底の所定の特徴層に相当する特徴層領域を特定し、特定された特徴層領域の形状に基づいて前記乳頭領域が含まれるか判定することを特徴とする。
 また、請求項16に記載の発明は、請求項15に記載の眼底解析装置であって、前記所定の特徴層は、眼底において硝子体との境界をなす内境界膜であり、前記乳頭領域判定部は、前記特徴層領域において前記奥行方向に陥没している部位が存在するか判定し、当該陥没部位が存在すると判定された場合に前記乳頭領域が含まれると判定することを特徴とする。
 また、請求項17に記載の発明は、請求項14~請求項16のいずれか一項に記載の眼底解析装置であって、前記乳頭領域判定部は、被検眼が左眼であるか右眼であるかを示す左右眼情報の入力を受けて、被検眼が左眼である場合、前記断層像のフレームの少なくとも左側端部近傍を解析することにより前記判定を行い、被検眼が右眼である場合、前記フレームの少なくとも右側端部近傍を解析することにより前記判定を行うことを特徴とする。
 また、請求項18に記載の発明は、請求項1~請求項17のいずれか一項に記載の眼底解析装置であって、光源からの光を信号光と参照光とに分割し、被検眼の眼底を経由した信号光と参照光路を経由した参照光とを重畳させて干渉光を生成して検出する光学系と、干渉光の検出結果に基づいて眼底の断層像を形成する画像形成部とを更に備え、前記記憶部は、前記画像形成部により形成された断層像を記憶することを特徴とする。
 また、請求項19に記載の発明は、眼底の層構造を描写する3次元断層像を記憶する記憶部と、前記3次元断層像の画素の画素値に基づいて、網膜色素上皮層に相当する当該断層像中の層領域を特定する層領域特定部と、前記層領域の形状に基づく近似曲面を求める近似曲面演算部と、前記近似曲面が前記層領域よりも前記眼底の奥行方向に位置し、かつ前記奥行方向における前記近似曲面と前記層領域との間の距離が所定閾値以上である突出領域を特定する突出領域特定部と、前記突出領域を含み、かつ前記近似曲面及び前記層領域に囲まれた連結領域を特定する連結領域特定部と、前記連結領域の形態を表す形態情報を生成する形態情報生成部とを有する眼底解析装置である。
 また、請求項20に記載の発明は、眼底の層構造を描写する断層像を記憶する記憶部を有するコンピュータを、前記断層像の画素の画素値に基づいて、網膜色素上皮層に相当する当該断層像中の層領域を特定する層領域特定部、前記層領域の形状に基づく近似曲線を求める近似曲線演算部、前記近似曲線が前記層領域よりも前記眼底の奥行方向に位置し、かつ前記近似曲線と前記層領域との間の前記奥行方向における距離が所定閾値以上である突出領域を特定する突出領域特定部、前記突出領域を含み、かつ前記近似曲線及び前記層領域に囲まれた連結領域を特定する連結領域特定部、及び、前記連結領域の形態を表す形態情報を生成する形態情報生成部として機能させる眼底解析プログラムである。
 また、請求項21に記載の発明は、眼底の層構造を描写する3次元断層像を記憶する記憶部を有するコンピュータを、前記3次元断層像の画素の画素値に基づいて、網膜色素上皮層に相当する当該断層像中の層領域を特定する層領域特定部、前記層領域の形状に基づく近似曲面を求める近似曲面演算部、前記近似曲面が前記層領域よりも前記眼底の奥行方向に位置し、かつ前記奥行方向における前記近似曲面と前記層領域との間の距離が所定閾値以上である突出領域を特定する突出領域特定部、前記突出領域を含み、かつ前記近似曲面及び前記層領域に囲まれた連結領域を特定する連結領域特定部、及び、前記連結領域の形態を表す形態情報を生成する形態情報生成部として機能させる眼底解析プログラムである。
 また、請求項22に記載の発明は、眼底の層構造を描写する断層像を解析する眼底解析方法であって、前記断層像の画素の画素値に基づいて、網膜色素上皮層に相当する当該断層像中の層領域を特定するステップと、前記層領域の形状に基づく近似曲線を求めるステップと、前記近似曲線が前記層領域よりも前記眼底の奥行方向に位置し、かつ前記近似曲線と前記層領域との間の前記奥行方向における距離が所定閾値以上である突出領域を特定するステップと、前記突出領域を含み、かつ前記近似曲線及び前記層領域に囲まれた連結領域を特定するステップと、前記連結領域の形態を表す形態情報を生成するステップとを含む。
 また、請求項23に記載の発明は、眼底の層構造を描写する3次元断層像を解析する眼底解析方法であって、前記3次元断層像の画素の画素値に基づいて、網膜色素上皮層に相当する当該断層像中の層領域を特定するステップと、前記層領域の形状に基づく近似曲面を求めるステップと、前記近似曲面が前記層領域よりも前記眼底の奥行方向に位置し、かつ前記奥行方向における前記近似曲面と前記層領域との間の距離が所定閾値以上である突出領域を特定するステップと、前記突出領域を含み、かつ前記近似曲面及び前記層領域に囲まれた連結領域を特定するステップと、前記連結領域の形態を表す形態情報を生成するステップとを含む。
In order to achieve the above object, the invention according to claim 1 is a fundus analysis apparatus comprising: a storage unit that stores a tomographic image depicting the layer structure of a fundus; a layer region specifying unit that specifies, based on the pixel values of pixels of the tomographic image, a layer region in the tomographic image corresponding to the retinal pigment epithelium layer; an approximate curve calculation unit that obtains an approximate curve based on the shape of the layer region; a protruding region specifying unit that specifies a protruding region in which the approximate curve is located deeper than the layer region in the depth direction of the fundus and the distance in the depth direction between the approximate curve and the layer region is equal to or greater than a predetermined threshold; a connected region specifying unit that specifies a connected region that includes the protruding region and is surrounded by the approximate curve and the layer region; and a morphology information generating unit that generates morphology information representing the morphology of the connected region.
The invention according to claim 2 is the fundus analysis apparatus according to claim 1, wherein the approximate curve calculation unit includes a characteristic part specifying unit that specifies a plurality of characteristic parts in the layer region based on the shape of the layer region, and obtains the approximate curve based on the plurality of characteristic parts.
The invention according to claim 3 is the fundus analysis apparatus according to claim 2, wherein the characteristic part specifying unit specifies, based on the shape of the layer region, the deepest part of the layer region in the depth direction as a characteristic part, obtains a straight line that passes through the deepest part and is tangent to the layer region, and takes the point of contact between the layer region and the straight line as a further characteristic part.
The invention according to claim 4 is the fundus analysis apparatus according to claim 3, wherein the characteristic part specifying unit sequentially specifies points of contact by rotating a straight line passing through the deepest part about the deepest part.
The invention according to claim 5 is the fundus analysis apparatus according to claim 3, wherein the characteristic part specifying unit specifies a point of contact by rotating a straight line passing through the deepest part about the deepest part, and specifies a further point of contact by rotating a straight line passing through the specified point of contact about that point of contact.
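The tangent-rotation procedure of claims 3 to 5 amounts to tracing the envelope of the RPE boundary on its deep side (a convex-hull sweep): starting from the deepest point, each rotated line first touches the neighboring envelope vertex. The following is a minimal sketch of that idea, not taken from the patent; the point-list representation, the function name, and the use of slopes to emulate the rotating line are all assumptions.

```python
def tangent_contacts(boundary):
    """boundary: list of (x, z) points on the detected RPE layer region,
    x ascending, z increasing toward the fundus depth.  Starting from the
    deepest point, repeatedly rotate a line about the current contact point
    (claim 5 style) and record the point it touches next.  The result is the
    set of characteristic parts lying on the deep-side envelope; drusen bumps,
    which rise toward the shallow side, are skipped."""
    deepest = max(boundary, key=lambda p: p[1])      # deepest part (claim 3)
    contacts = [deepest]
    # sweep rightward: the rotating line first touches the point whose slope
    # from the pivot is largest (all right-hand points are shallower)
    pivot, right = deepest, [p for p in boundary if p[0] > deepest[0]]
    while right:
        nxt = max(right, key=lambda p: (p[1] - pivot[1]) / (p[0] - pivot[0]))
        contacts.append(nxt)
        right = [p for p in right if p[0] > nxt[0]]
        pivot = nxt
    # sweep leftward symmetrically (the slope comparison flips because dx < 0)
    pivot, left = deepest, [p for p in boundary if p[0] < deepest[0]]
    while left:
        nxt = min(left, key=lambda p: (p[1] - pivot[1]) / (p[0] - pivot[0]))
        contacts.insert(0, nxt)
        left = [p for p in left if p[0] < nxt[0]]
        pivot = nxt
    return contacts
```

Feeding this a boundary with a local bump toward the shallow side returns only baseline points, which is what makes the curve fitted later a plausible estimate of the undisturbed Bruch's membrane position.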
The invention according to claim 6 is the fundus analysis apparatus according to any one of claims 2 to 5, wherein the approximate curve calculation unit includes a characteristic part interpolation unit that obtains the distance between two adjacent characteristic parts and, when the distance is equal to or greater than a predetermined value, adds a new characteristic part between the two characteristic parts.
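Claim 6 only states that a new characteristic part is added between two parts that are too far apart; how the new part is chosen is left open. A minimal sketch under the assumption that the segment midpoint is used (one level of subdivision):

```python
def interpolate_features(features, max_gap):
    """features: list of (x, z) characteristic parts, x ascending.  Insert a
    midpoint between any two adjacent parts that are at least `max_gap` apart,
    so that the subsequent curve fit has no overly long unconstrained span."""
    out = [features[0]]
    for p, q in zip(features, features[1:]):
        if ((q[0] - p[0]) ** 2 + (q[1] - p[1]) ** 2) ** 0.5 >= max_gap:
            out.append(((p[0] + q[0]) / 2, (p[1] + q[1]) / 2))
        out.append(q)
    return out
```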
The invention according to claim 7 is the fundus analysis apparatus according to any one of claims 2 to 6, wherein the approximate curve calculation unit obtains a free curve based on the plurality of characteristic parts, and obtains the approximate curve based on the free curve.
The invention according to claim 8 is the fundus analysis apparatus according to claim 7, wherein the approximate curve calculation unit obtains a spline curve as the free curve.
The invention according to claim 9 is the fundus analysis apparatus according to claim 8, wherein the approximate curve calculation unit obtains a cubic spline curve as the spline curve.
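A natural cubic spline through the characteristic parts, as in claim 9, can be computed with a small tridiagonal solve. This is the generic textbook construction, not the patent's implementation; the function name and the boundary condition (natural, i.e. zero second derivative at both ends) are assumptions.

```python
def natural_cubic_spline(xs, ys):
    """Return a function S(x) interpolating the knots (xs[i], ys[i]) with a
    natural cubic spline.  xs must be strictly increasing."""
    n = len(xs) - 1
    h = [xs[i + 1] - xs[i] for i in range(n)]
    if n < 2:
        # with only two knots the natural spline degenerates to a straight line
        return lambda x: ys[0] + (ys[1] - ys[0]) * (x - xs[0]) / h[0]
    # interior equations for the second derivatives M[1..n-1] (M[0] = M[n] = 0)
    a = [h[j] for j in range(1, n - 1)]                  # sub-diagonal
    b = [2.0 * (h[j] + h[j + 1]) for j in range(n - 1)]  # diagonal
    c = [h[j + 1] for j in range(n - 2)]                 # super-diagonal
    d = [6.0 * ((ys[j + 2] - ys[j + 1]) / h[j + 1]
                - (ys[j + 1] - ys[j]) / h[j]) for j in range(n - 1)]
    # Thomas algorithm: forward elimination, then back-substitution
    for j in range(1, n - 1):
        w = a[j - 1] / b[j - 1]
        b[j] -= w * c[j - 1]
        d[j] -= w * d[j - 1]
    m = [0.0] * (n - 1)
    m[-1] = d[-1] / b[-1]
    for j in range(n - 3, -1, -1):
        m[j] = (d[j] - c[j] * m[j + 1]) / b[j]
    M = [0.0] + m + [0.0]

    def S(x):
        # locate the interval containing x (clamped to the knot range)
        i = max(0, min(n - 1, next((k for k in range(n) if x <= xs[k + 1]), n - 1)))
        A = (xs[i + 1] - x) / h[i]
        B = (x - xs[i]) / h[i]
        return (A * ys[i] + B * ys[i + 1]
                + ((A ** 3 - A) * M[i] + (B ** 3 - B) * M[i + 1]) * h[i] ** 2 / 6.0)

    return S
```

The spline passes exactly through every characteristic part, so any deviation between it and the detected layer region comes from portions of the layer that were excluded as characteristic parts, such as drusen bumps.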
The invention according to claim 10 is the fundus analysis apparatus according to any one of claims 7 to 9, wherein the approximate curve calculation unit includes a straight part specifying unit that specifies a substantially straight part in the layer region, and a first deformation unit that, when the free curve is located deeper than the straight part in the depth direction, deforms the free curve so that the corresponding part of the free curve is aligned with the position of the straight part, and obtains the approximate curve based on the deformation result of the first deformation unit.
The invention according to claim 11 is the fundus analysis apparatus according to any one of claims 7 to 10, wherein the approximate curve calculation unit includes a curve part specifying unit that specifies a part of the free curve located on the side opposite to the depth direction relative to the layer region, and a second deformation unit that deforms the free curve so that the specified part is aligned with the position of the layer region, and obtains the approximate curve based on the deformation result of the second deformation unit.
The invention according to claim 12 is the fundus analysis apparatus according to any one of claims 7 to 11, wherein the approximate curve calculation unit includes a protrusion determination unit that determines whether the layer region protrudes in the direction opposite to the depth direction near an edge of the frame of the tomographic image, and a third deformation unit that, when the layer region is determined to protrude, deforms the free curve by replacing the part of the free curve corresponding to the protruding part with an extension of the free curve from the center side of the frame, and obtains the approximate curve based on the deformation result of the third deformation unit.
The invention according to claim 13 is the fundus analysis apparatus according to any one of claims 1 to 12, wherein the protruding region specifying unit calculates the distance in the depth direction between each point on the approximate curve and the layer region, determines whether each calculated distance is equal to or greater than the predetermined threshold, and specifies, as the protruding region, the image region sandwiched between the layer region and the set of points on the approximate curve determined to be at a distance equal to or greater than the predetermined threshold.
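In a one-dimensional sketch of claims 13 and 1, with the layer boundary and the approximate curve sampled once per image column, the protruding regions are simply the maximal runs of columns where the curve lies deeper than the layer by at least the threshold, and basic morphology information follows directly. The array layout, function names, and the particular metrics below are assumptions for illustration, not taken from the patent.

```python
def find_protrusions(rpe_z, base_z, threshold):
    """rpe_z[x]: depth of the detected RPE boundary at column x; base_z[x]:
    depth of the approximate baseline curve at the same column (larger z is
    deeper).  Returns a list of (start_x, end_x) runs where the RPE lies above
    the baseline by at least `threshold` pixels: candidate drusen regions."""
    runs, start = [], None
    for x, (r, b) in enumerate(zip(rpe_z, base_z)):
        if b - r >= threshold:       # curve deeper than layer by >= threshold
            if start is None:
                start = x
        elif start is not None:
            runs.append((start, x - 1))
            start = None
    if start is not None:
        runs.append((start, len(rpe_z) - 1))
    return runs

def morphology(run, rpe_z, base_z):
    """Simple morphology information for one connected region: width, peak
    height, and cross-sectional area, all in pixel units."""
    s, e = run
    heights = [base_z[x] - rpe_z[x] for x in range(s, e + 1)]
    return {"width": e - s + 1, "max_height": max(heights), "area": sum(heights)}
```

The 3D case of claims 19 to 23 is structurally the same with a surface in place of the curve and 2D connected components in place of runs.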
The invention according to claim 14 is the fundus analysis apparatus according to any one of claims 1 to 13, wherein the morphology information generating unit includes an optic disc region determination unit that analyzes the tomographic image and determines whether a disc region corresponding to the optic disc of the fundus is included in the tomographic image, and, when the disc region is determined to be included, generates the morphology information while excluding, from the connected regions specified by the connected region specifying unit, any connected region located in the vicinity of the disc region.
The invention according to claim 15 is the fundus analysis apparatus according to claim 14, wherein the optic disc region determination unit analyzes the tomographic image to specify a feature layer region corresponding to a predetermined feature layer of the fundus, and determines whether the disc region is included based on the shape of the specified feature layer region.
The invention according to claim 16 is the fundus analysis apparatus according to claim 15, wherein the predetermined feature layer is the inner limiting membrane, which forms the boundary with the vitreous body in the fundus, and the optic disc region determination unit determines whether the feature layer region contains a part depressed in the depth direction, and determines that the disc region is included when such a depressed part is determined to exist.
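One simple way to realize the depression test of claim 16 is to compare the deepest point of the inner limiting membrane boundary against its typical level; the median baseline and the margin parameter are assumptions for this sketch, not taken from the patent.

```python
def has_disc_depression(ilm_z, margin):
    """ilm_z[x]: depth of the inner limiting membrane boundary at column x
    (larger z is deeper).  Flags an optic disc depression when some part of
    the ILM dips deeper than its typical (median) level by at least `margin`
    pixels, as the ILM does at the disc cup."""
    level = sorted(ilm_z)[len(ilm_z) // 2]   # median as the typical ILM depth
    return max(ilm_z) - level >= margin
```

Per claim 17, this test would only need to be applied near the frame edge on the side where the disc can appear, given the left/right eye information.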
The invention according to claim 17 is the fundus analysis apparatus according to any one of claims 14 to 16, wherein the optic disc region determination unit receives input of left/right eye information indicating whether the eye being examined is the left eye or the right eye, and performs the determination by analyzing at least the vicinity of the left edge of the frame of the tomographic image when the eye is the left eye, and by analyzing at least the vicinity of the right edge of the frame when the eye is the right eye.
The invention according to claim 18 is the fundus analysis apparatus according to any one of claims 1 to 17, further comprising: an optical system that splits light from a light source into signal light and reference light, and generates and detects interference light by superimposing the signal light returning via the fundus of the eye being examined on the reference light having traveled along a reference optical path; and an image forming unit that forms a tomographic image of the fundus based on the detection results of the interference light, wherein the storage unit stores the tomographic image formed by the image forming unit.
The invention according to claim 19 is a fundus analysis apparatus comprising: a storage unit that stores a three-dimensional tomographic image depicting the layer structure of a fundus; a layer region specifying unit that specifies, based on the pixel values of pixels of the three-dimensional tomographic image, a layer region in the tomographic image corresponding to the retinal pigment epithelium layer; an approximate surface calculation unit that obtains an approximate curved surface based on the shape of the layer region; a protruding region specifying unit that specifies a protruding region in which the approximate surface is located deeper than the layer region in the depth direction of the fundus and the distance in the depth direction between the approximate surface and the layer region is equal to or greater than a predetermined threshold; a connected region specifying unit that specifies a connected region that includes the protruding region and is surrounded by the approximate surface and the layer region; and a morphology information generating unit that generates morphology information representing the morphology of the connected region.
The invention according to claim 20 is a fundus analysis program that causes a computer having a storage unit storing a tomographic image depicting the layer structure of a fundus to function as: a layer region specifying unit that specifies, based on the pixel values of pixels of the tomographic image, a layer region in the tomographic image corresponding to the retinal pigment epithelium layer; an approximate curve calculation unit that obtains an approximate curve based on the shape of the layer region; a protruding region specifying unit that specifies a protruding region in which the approximate curve is located deeper than the layer region in the depth direction of the fundus and the distance in the depth direction between the approximate curve and the layer region is equal to or greater than a predetermined threshold; a connected region specifying unit that specifies a connected region that includes the protruding region and is surrounded by the approximate curve and the layer region; and a morphology information generating unit that generates morphology information representing the morphology of the connected region.
The invention according to claim 21 is a fundus analysis program that causes a computer having a storage unit storing a three-dimensional tomographic image depicting the layer structure of a fundus to function as: a layer region specifying unit that specifies, based on the pixel values of pixels of the three-dimensional tomographic image, a layer region in the tomographic image corresponding to the retinal pigment epithelium layer; an approximate surface calculation unit that obtains an approximate curved surface based on the shape of the layer region; a protruding region specifying unit that specifies a protruding region in which the approximate surface is located deeper than the layer region in the depth direction of the fundus and the distance in the depth direction between the approximate surface and the layer region is equal to or greater than a predetermined threshold; a connected region specifying unit that specifies a connected region that includes the protruding region and is surrounded by the approximate surface and the layer region; and a morphology information generating unit that generates morphology information representing the morphology of the connected region.
The invention according to claim 22 is a fundus analysis method for analyzing a tomographic image depicting the layer structure of a fundus, the method comprising: specifying, based on the pixel values of pixels of the tomographic image, a layer region in the tomographic image corresponding to the retinal pigment epithelium layer; obtaining an approximate curve based on the shape of the layer region; specifying a protruding region in which the approximate curve is located deeper than the layer region in the depth direction of the fundus and the distance in the depth direction between the approximate curve and the layer region is equal to or greater than a predetermined threshold; specifying a connected region that includes the protruding region and is surrounded by the approximate curve and the layer region; and generating morphology information representing the morphology of the connected region.
The invention according to claim 23 is a fundus analysis method for analyzing a three-dimensional tomographic image depicting the layer structure of a fundus, the method comprising: specifying, based on the pixel values of pixels of the three-dimensional tomographic image, a layer region in the tomographic image corresponding to the retinal pigment epithelium layer; obtaining an approximate curved surface based on the shape of the layer region; specifying a protruding region in which the approximate surface is located deeper than the layer region in the depth direction of the fundus and the distance in the depth direction between the approximate surface and the layer region is equal to or greater than a predetermined threshold; specifying a connected region that includes the protruding region and is surrounded by the approximate surface and the layer region; and generating morphology information representing the morphology of the connected region.
 この発明によれば、眼底の断層像に基づいてドルーゼンを効果的に検出することが可能である。 According to the present invention, drusen can be detected effectively based on a tomographic image of the fundus.
A schematic diagram showing an example of the configuration of the fundus analysis apparatus according to the embodiment (two such diagrams). A schematic block diagram showing an example of the configuration of the fundus analysis apparatus according to the embodiment (two such diagrams). A schematic explanatory diagram for illustrating an example of processing executed by the fundus analysis apparatus according to the embodiment (thirteen such diagrams). A flowchart showing an operation example of the fundus analysis apparatus according to the embodiment.
 この発明に係る眼底解析装置、眼底解析プログラム及び眼底解析方法の実施形態の一例について、図面を参照しながら詳細に説明する。なお、この明細書に記載された文献の記載内容を、以下の実施形態の内容として適宜援用することが可能である。 An example of an embodiment of the fundus analysis apparatus, fundus analysis program, and fundus analysis method according to the present invention will be described in detail with reference to the drawings. The contents of the documents cited in this specification may be incorporated as appropriate into the following embodiment.
 この発明に係る眼底解析装置は、眼底の断層像(2次元断層像、3次元断層像)を解析するコンピュータであってもよいし、光コヒーレンストモグラフィを用いて眼底の断層像を形成することが可能なOCT装置であってもよい。後者のOCT装置には前者のコンピュータが含まれる。よって、以下においては後者のOCT装置について特に詳しく説明する。 The fundus analysis apparatus according to the present invention may be a computer that analyzes tomographic images (two-dimensional or three-dimensional) of the fundus, or an OCT apparatus capable of forming tomographic images of the fundus using optical coherence tomography. Since the latter OCT apparatus includes the former computer, the latter will be described in particular detail below.
 OCT装置としての眼底解析装置は、眼底の断層像を形成できるものであればどのようなタイプであってもよい。以下の実施形態では、スペクトラルドメインタイプについて特に詳しく説明する。なお、この発明の特徴の中心は眼底の断層像を解析する処理にあるので、スウェプトソースタイプやインファスタイプなどの他のタイプのOCT装置であっても同様に構成することが可能である。OCTを利用して断層像を形成するための計測動作をOCT計測と呼ぶことがある。 The fundus analysis apparatus serving as an OCT apparatus may be of any type capable of forming tomographic images of the fundus. In the following embodiment, the spectral-domain type is described in particular detail. Since the essence of the present invention lies in the processing for analyzing fundus tomographic images, other types of OCT apparatus, such as the swept-source type or the en-face type, can be configured in the same manner. A measurement operation for forming a tomographic image using OCT may be referred to as OCT measurement.
 以下の実施形態ではOCT装置と眼底カメラとを組み合わせた装置について説明するが、眼底カメラ以外の眼底撮影装置、たとえばSLO、スリットランプ、眼科手術用顕微鏡などに、この実施形態に係る構成を有するOCT装置を組み合わせることも可能である。また、この実施形態に係る構成を、単体のOCT装置に組み込むことも可能である。なお、SLO(Scanning Laser Ophthalmoscope)とは、レーザ光で眼底を走査し、その反射光を光電子増倍管等の高感度な素子で検出することにより眼底表面の形態を画像化する装置である。 Although the following embodiment describes an apparatus combining an OCT apparatus and a fundus camera, an OCT apparatus having the configuration according to this embodiment can also be combined with fundus imaging apparatuses other than a fundus camera, such as an SLO, a slit lamp, or an ophthalmic surgical microscope. The configuration according to this embodiment can also be incorporated into a stand-alone OCT apparatus. An SLO (Scanning Laser Ophthalmoscope) is an apparatus that images the morphology of the fundus surface by scanning the fundus with laser light and detecting the reflected light with a highly sensitive element such as a photomultiplier tube.
[構成]
 図1及び図2に示すように、眼底解析装置1は、眼底カメラユニット2、OCTユニット100及び演算制御ユニット200を含んで構成される。眼底カメラユニット2は、従来の眼底カメラとほぼ同様の光学系を有する。OCTユニット100には、眼底の断層像を取得するための光学系が設けられている。演算制御ユニット200は、各種の演算処理や制御処理等を実行するコンピュータを具備している。
[Configuration]
As shown in FIGS. 1 and 2, the fundus analysis apparatus 1 includes a fundus camera unit 2, an OCT unit 100, and an arithmetic control unit 200. The fundus camera unit 2 has an optical system substantially the same as that of a conventional fundus camera. The OCT unit 100 is provided with an optical system for acquiring tomographic images of the fundus. The arithmetic control unit 200 includes a computer that executes various arithmetic and control processes.
〔眼底カメラユニット〕
 図1に示す眼底カメラユニット2には、被検眼Eの眼底Efの表面形態を表す2次元画像(眼底像)を取得するための光学系が設けられている。眼底像には、観察画像や撮影画像などが含まれる。観察画像は、たとえば、近赤外光を用いて所定のフレームレートで形成されるモノクロの動画像である。撮影画像は、たとえば、可視光をフラッシュ発光して得られるカラー画像、又は近赤外光若しくは可視光を照明光として用いたモノクロの静止画像である。眼底カメラユニット2は、これら以外の画像、たとえばフルオレセイン蛍光画像やインドシアニングリーン蛍光画像や自発蛍光画像などを取得可能に構成されていてもよい。
[Fundus camera unit]
The fundus camera unit 2 shown in FIG. 1 is provided with an optical system for acquiring two-dimensional images (fundus images) representing the surface morphology of the fundus Ef of the eye E being examined. Fundus images include observation images and captured images. An observation image is, for example, a monochrome moving image formed at a predetermined frame rate using near-infrared light. A captured image is, for example, a color image obtained by flashing visible light, or a monochrome still image using near-infrared or visible light as illumination light. The fundus camera unit 2 may also be configured to be able to acquire other images, such as fluorescein fluorescence images, indocyanine green fluorescence images, and autofluorescence images.
 眼底カメラユニット2には、被検者の顔を支持するための顎受けや額当てが設けられている。更に、眼底カメラユニット2には、照明光学系10と撮影光学系30が設けられている。照明光学系10は眼底Efに照明光を照射する。撮影光学系30は、この照明光の眼底反射光を撮像装置(CCDイメージセンサ(単にCCDと呼ぶことがある)35、38)に導く。また、撮影光学系30は、OCTユニット100からの信号光を眼底Efに導くとともに、眼底Efを経由した信号光をOCTユニット100に導く。 The fundus camera unit 2 is provided with a chin rest and a forehead for supporting the subject's face. Further, the fundus camera unit 2 is provided with an illumination optical system 10 and a photographing optical system 30. The illumination optical system 10 irradiates the fundus oculi Ef with illumination light. The photographing optical system 30 guides the fundus reflection light of the illumination light to an imaging device (CCD image sensor (sometimes simply referred to as a CCD) 35, 38). The imaging optical system 30 guides the signal light from the OCT unit 100 to the fundus oculi Ef and guides the signal light passing through the fundus oculi Ef to the OCT unit 100.
 照明光学系10の観察光源11は、たとえばハロゲンランプにより構成される。観察光源11から出力された光(観察照明光)は、曲面状の反射面を有する反射ミラー12により反射され、集光レンズ13を経由し、可視カットフィルタ14を透過して近赤外光となる。更に、観察照明光は、撮影光源15の近傍にて一旦集束し、ミラー16により反射され、リレーレンズ17、18、絞り19及びリレーレンズ20を経由する。そして、観察照明光は、孔開きミラー21の周辺部(孔部の周囲の領域)にて反射され、ダイクロイックミラー46を透過し、対物レンズ22により屈折されて眼底Efを照明する。なお、観察光源としてLED(Light Emitting Diode)を用いることも可能である。 The observation light source 11 of the illumination optical system 10 is constituted by, for example, a halogen lamp. The light (observation illumination light) output from the observation light source 11 is reflected by the reflection mirror 12, which has a curved reflecting surface, travels via the condenser lens 13, and passes through the visible cut filter 14 to become near-infrared light. The observation illumination light is then once converged near the photographing light source 15, reflected by the mirror 16, and passes through the relay lenses 17 and 18, the diaphragm 19, and the relay lens 20. The observation illumination light is then reflected at the peripheral part of the aperture mirror 21 (the region surrounding its aperture), passes through the dichroic mirror 46, and is refracted by the objective lens 22 to illuminate the fundus Ef. An LED (Light Emitting Diode) can also be used as the observation light source.
 観察照明光の眼底反射光は、対物レンズ22により屈折され、ダイクロイックミラー46を透過し、孔開きミラー21の中心領域に形成された孔部を通過し、ダイクロイックミラー55を透過し、合焦レンズ31を経由し、ミラー32により反射される。更に、この眼底反射光は、ハーフミラー39Aを透過し、ダイクロイックミラー33により反射され、集光レンズ34によりCCDイメージセンサ35の受光面に結像される。CCDイメージセンサ35は、たとえば所定のフレームレートで眼底反射光を検出する。表示装置3には、CCDイメージセンサ35により検出された眼底反射光に基づく画像(観察画像)が表示される。なお、撮影光学系30のピントが前眼部に合わせられている場合、被検眼Eの前眼部の観察画像が表示される。 The fundus reflection light of the observation illumination light is refracted by the objective lens 22, passes through the dichroic mirror 46, passes through the aperture formed in the central region of the aperture mirror 21, passes through the dichroic mirror 55, travels via the focusing lens 31, and is reflected by the mirror 32. This fundus reflection light then passes through the half mirror 39A, is reflected by the dichroic mirror 33, and is imaged on the light-receiving surface of the CCD image sensor 35 by the condenser lens 34. The CCD image sensor 35 detects the fundus reflection light at a predetermined frame rate, for example. An image (observation image) based on the fundus reflection light detected by the CCD image sensor 35 is displayed on the display device 3. When the photographing optical system 30 is focused on the anterior segment, an observation image of the anterior segment of the eye E is displayed.
 撮影光源15は、たとえばキセノンランプにより構成される。撮影光源15から出力された光(撮影照明光)は、観察照明光と同様の経路を通って眼底Efに照射される。撮影照明光の眼底反射光は、観察照明光のそれと同様の経路を通ってダイクロイックミラー33まで導かれ、ダイクロイックミラー33を透過し、ミラー36により反射され、集光レンズ37によりCCDイメージセンサ38の受光面に結像される。表示装置3には、CCDイメージセンサ38により検出された眼底反射光に基づく画像(撮影画像)が表示される。なお、観察画像を表示する表示装置3と撮影画像を表示する表示装置3は、同一のものであってもよいし、異なるものであってもよい。また、被検眼Eを赤外光で照明して同様の撮影を行う場合には、赤外の撮影画像が表示される。また、撮影光源としてLEDを用いることも可能である。 The photographing light source 15 is constituted by, for example, a xenon lamp. The light (photographing illumination light) output from the photographing light source 15 is applied to the fundus Ef along the same path as the observation illumination light. The fundus reflection light of the photographing illumination light is guided along the same path as that of the observation illumination light to the dichroic mirror 33, passes through the dichroic mirror 33, is reflected by the mirror 36, and is imaged on the light-receiving surface of the CCD image sensor 38 by the condenser lens 37. An image (captured image) based on the fundus reflection light detected by the CCD image sensor 38 is displayed on the display device 3. The display device 3 that displays the observation image and the display device 3 that displays the captured image may be the same device or different devices. When similar photography is performed by illuminating the eye E with infrared light, an infrared captured image is displayed. An LED can also be used as the photographing light source.
 LCD(Liquid Crystal Display)39は、固視標や視力測定用指標を表示する。固視標は被検眼Eを固視させるための指標であり、眼底撮影時やOCT計測時などに使用される。 The LCD (Liquid Crystal Display) 39 displays a fixation target and an eyesight measurement index. The fixation target is an index for fixing the eye E to be examined, and is used at the time of fundus photographing or OCT measurement.
 LCD39から出力された光は、その一部がハーフミラー39Aにて反射され、ミラー32に反射され、合焦レンズ31及びダイクロイックミラー55を経由し、孔開きミラー21の孔部を通過し、ダイクロイックミラー46を透過し、対物レンズ22により屈折されて眼底Efに投影される。 Part of the light output from the LCD 39 is reflected by the half mirror 39A, reflected by the mirror 32, travels via the focusing lens 31 and the dichroic mirror 55, passes through the aperture of the aperture mirror 21, passes through the dichroic mirror 46, and is refracted by the objective lens 22 and projected onto the fundus Ef.
 LCD39の画面上における固視標の表示位置を変更することにより、被検眼Eの固視位置を変更できる。被検眼Eの固視位置としては、たとえば従来の眼底カメラと同様に、眼底Efの黄斑部を中心とする画像を取得するための位置や、視神経乳頭を中心とする画像を取得するための位置や、黄斑部と視神経乳頭との間の眼底中心を中心とする画像を取得するための位置などがある。また、固視標の表示位置を任意に変更することも可能である。 The fixation position of the eye E can be changed by changing the display position of the fixation target on the screen of the LCD 39. Examples of fixation positions of the eye E include, as with a conventional fundus camera, a position for acquiring an image centered on the macula of the fundus Ef, a position for acquiring an image centered on the optic disc, and a position for acquiring an image centered on the fundus center between the macula and the optic disc. The display position of the fixation target can also be changed arbitrarily.
 Furthermore, the fundus camera unit 2 is provided with an alignment optical system 50 and a focus optical system 60, as in conventional fundus cameras. The alignment optical system 50 generates an index (alignment index) for aligning the apparatus optical system with the eye E. The focus optical system 60 generates an index (split index) for focusing on the fundus oculi Ef.
 The light (alignment light) output from the LED 51 of the alignment optical system 50 travels through the diaphragms 52 and 53 and the relay lens 54, is reflected by the dichroic mirror 55, passes through the hole of the perforated mirror 21, passes through the dichroic mirror 46, and is projected onto the cornea of the eye E by the objective lens 22.
 The corneal reflection light of the alignment light travels through the objective lens 22, the dichroic mirror 46, and the above hole; part of it passes through the dichroic mirror 55, passes through the focusing lens 31, is reflected by the mirror 32, passes through the half mirror 39A, is reflected by the dichroic mirror 33, and is projected onto the light receiving surface of the CCD image sensor 35 by the condenser lens 34. The received image (alignment index) captured by the CCD image sensor 35 is displayed on the display device 3 together with the observation image. The user performs alignment through the same operation as with a conventional fundus camera. Alternatively, the arithmetic control unit 200 may perform alignment by analyzing the position of the alignment index and moving the optical system (auto-alignment function).
 When focus adjustment is performed, the reflecting surface of the reflection rod 67 is set at a slant on the optical path of the illumination optical system 10. The light (focus light) output from the LED 61 of the focus optical system 60 passes through the relay lens 62, is split into two light beams by the split index plate 63, passes through the two-hole diaphragm 64, is reflected by the mirror 65, and is once focused on the reflecting surface of the reflection rod 67 by the condenser lens 66 and reflected. The focus light then travels through the relay lens 20, is reflected by the perforated mirror 21, passes through the dichroic mirror 46, is refracted by the objective lens 22, and is projected onto the fundus oculi Ef.
 The fundus reflection light of the focus light is detected by the CCD image sensor 35 through the same path as the corneal reflection light of the alignment light. The received image (split index) captured by the CCD image sensor 35 is displayed on the display device 3 together with the observation image. As in conventional apparatuses, the arithmetic control unit 200 analyzes the position of the split index and moves the focusing lens 31 and the focus optical system 60 to perform focusing (autofocus function). Alternatively, focusing may be performed manually while visually checking the split index.
 The dichroic mirror 46 branches the optical path for OCT measurement from the optical path for fundus photography. The dichroic mirror 46 reflects light in the wavelength band used for OCT measurement and transmits light for fundus photography. This optical path for OCT measurement is provided with, in order from the OCT unit 100 side, a collimator lens unit 40, an optical path length changing unit 41, a galvano scanner 42, a focusing lens 43, a mirror 44, and a relay lens 45.
 The optical path length changing unit 41 is movable in the direction of the arrow shown in FIG. 1, and changes the length of the optical path for OCT measurement. This change in the optical path length is used for correcting the optical path length according to the axial length of the eye E, for adjusting the interference state, and so on. The optical path length changing unit 41 includes, for example, a corner cube and a mechanism for moving it.
 The galvano scanner 42 changes the traveling direction of the light (signal light LS) passing through the optical path for OCT measurement, whereby the fundus oculi Ef can be scanned with the signal light LS. The galvano scanner 42 includes, for example, a galvano mirror that scans the signal light LS in the x direction, a galvano mirror that scans it in the y direction, and a mechanism that drives them independently. Thus, the signal light LS can be scanned in an arbitrary direction in the xy plane.
[OCT unit]
 An example of the configuration of the OCT unit 100 will be described with reference to FIG. 2. The OCT unit 100 is provided with an optical system for acquiring tomographic images of the fundus oculi Ef. This optical system has the same configuration as a conventional spectral-domain OCT apparatus. That is, this optical system splits low-coherence light into reference light and signal light, generates interference light by superposing the signal light returning from the fundus oculi Ef and the reference light having traveled through the reference optical path, and detects the spectral components of this interference light. The detection result (detection signal) is sent to the arithmetic control unit 200.
 In the case of a swept-source OCT apparatus, a wavelength-swept light source is provided instead of a light source that outputs low-coherence light, and no optical member for spectrally decomposing the interference light is provided. In general, any known technique according to the type of optical coherence tomography can be applied to the configuration of the OCT unit 100.
 The light source unit 101 outputs broadband low-coherence light L0. The low-coherence light L0 includes, for example, a wavelength band in the near-infrared region (approximately 800 nm to 900 nm) and has a temporal coherence length on the order of several tens of micrometers. Near-infrared light in a wavelength band invisible to the human eye, for example light with a center wavelength of approximately 1040 to 1060 nm, may also be used as the low-coherence light L0.
 The light source unit 101 includes a light output device such as a super luminescent diode (SLD), an LED, or an SOA (Semiconductor Optical Amplifier).
 The low-coherence light L0 output from the light source unit 101 is guided by the optical fiber 102 to the fiber coupler 103 and split into signal light LS and reference light LR.
 The reference light LR is guided by the optical fiber 104 and reaches the optical attenuator 105. Using a known technique, the optical attenuator 105 automatically adjusts, under the control of the arithmetic control unit 200, the amount of the reference light LR guided through the optical fiber 104. The reference light LR whose amount has been adjusted by the optical attenuator 105 is guided by the optical fiber 104 and reaches the polarization adjuster (polarization controller) 106. The polarization adjuster 106 is, for example, a device that adjusts the polarization state of the reference light LR guided through the optical fiber 104 by applying external stress to the looped optical fiber 104. The configuration of the polarization adjuster 106 is not limited to this, and any known technique may be used. The reference light LR whose polarization state has been adjusted by the polarization adjuster 106 reaches the fiber coupler 109.
 The signal light LS generated by the fiber coupler 103 is guided by the optical fiber 107 and collimated into a parallel light beam by the collimator lens unit 40. The signal light LS then reaches the dichroic mirror 46 via the optical path length changing unit 41, the galvano scanner 42, the focusing lens 43, the mirror 44, and the relay lens 45. The signal light LS is reflected by the dichroic mirror 46, refracted by the objective lens 22, and applied to the fundus oculi Ef. The signal light LS is scattered (including reflection) at various depth positions of the fundus oculi Ef. The backscattered light of the signal light LS from the fundus oculi Ef travels back along the same path as the outgoing path, is guided to the fiber coupler 103, and reaches the fiber coupler 109 via the optical fiber 108.
 The fiber coupler 109 superposes the backscattered light of the signal light LS and the reference light LR that has traveled through the optical fiber 104. The interference light LC thus generated is guided by the optical fiber 110 and emitted from the exit end 111. The interference light LC is then collimated into a parallel light beam by the collimator lens 112, spectrally dispersed (decomposed) by the diffraction grating 113, focused by the condenser lens 114, and projected onto the light receiving surface of the CCD image sensor 115. Although the diffraction grating 113 shown in FIG. 2 is of the transmission type, other forms of spectroscopic element, such as a reflection diffraction grating, may also be used.
 The CCD image sensor 115 is, for example, a line sensor, and detects each spectral component of the dispersed interference light LC and converts it into electric charge. The CCD image sensor 115 accumulates this charge, generates a detection signal, and sends it to the arithmetic control unit 200.
 Although a Michelson-type interferometer is employed in this embodiment, any type of interferometer, such as a Mach-Zehnder type, may be employed as appropriate. Instead of a CCD image sensor, another form of image sensor, such as a CMOS (Complementary Metal Oxide Semiconductor) image sensor, may be used.
[Arithmetic control unit]
 The configuration of the arithmetic control unit 200 will be described. The arithmetic control unit 200 analyzes the detection signals input from the CCD image sensor 115 and forms tomographic images of the fundus oculi Ef. The arithmetic processing for this is the same as in a conventional spectral-domain OCT apparatus.
 The arithmetic control unit 200 also controls each part of the fundus camera unit 2, the display device 3, and the OCT unit 100. For example, the arithmetic control unit 200 displays tomographic images of the fundus oculi Ef on the display device 3.
 As control of the fundus camera unit 2, the arithmetic control unit 200 performs operation control of the observation light source 11, the photographing light source 15, and the LEDs 51 and 61; operation control of the LCD 39; movement control of the focusing lenses 31 and 43; movement control of the reflection rod 67; movement control of the focus optical system 60; movement control of the optical path length changing unit 41; operation control of the galvano scanner 42; and so on.
 As control of the OCT unit 100, the arithmetic control unit 200 performs operation control of the light source unit 101, the optical attenuator 105, the polarization adjuster 106, the CCD image sensor 115, and so on.
 The arithmetic control unit 200 includes, for example, a microprocessor, RAM, ROM, a hard disk drive, a communication interface, and the like, as in conventional computers. A storage device such as the hard disk drive stores a computer program for controlling the fundus analysis apparatus 1. The arithmetic control unit 200 may include various circuit boards, for example a circuit board for forming tomographic images. The arithmetic control unit 200 may also include operation devices (input devices) such as a keyboard and a mouse, and a display device such as an LCD.
 The fundus camera unit 2, the display device 3, the OCT unit 100, and the arithmetic control unit 200 may be configured integrally (that is, within a single housing) or may be distributed over two or more housings.
[Control system]
 The configuration of the control system of the fundus analysis apparatus 1 will be described with reference to FIGS. 3 and 4.
(Control unit)
 The control system of the fundus analysis apparatus 1 is configured around the control unit 210. The control unit 210 includes, for example, the aforementioned microprocessor, RAM, ROM, hard disk drive, communication interface, and the like. The control unit 210 is provided with a main control unit 211 and a storage unit 212.
(Main control unit)
 The main control unit 211 performs the various controls described above. In particular, the main control unit 211 controls the focusing drive unit 31A, the optical path length changing unit 41, and the galvano scanner 42 of the fundus camera unit 2, as well as the light source unit 101, the optical attenuator 105, and the polarization adjuster 106 of the OCT unit 100.
 The focusing drive unit 31A moves the focusing lens 31 in the optical axis direction, thereby changing the focus position of the photographing optical system 30. The main control unit 211 can also control an optical system drive unit (not shown) to move the optical system provided in the fundus camera unit 2 three-dimensionally. This control is used in alignment and tracking. Tracking means moving the apparatus optical system in accordance with the eye movement of the eye E. When tracking is performed, alignment and focusing are executed in advance. Tracking is a function of maintaining a suitable positional relationship, in which alignment and focus are achieved, by making the position of the apparatus optical system follow the eye movement.
 The main control unit 211 also performs processing for writing data to the storage unit 212 and processing for reading data from the storage unit 212.
(Storage unit)
 The storage unit 212 stores various data. Examples of data stored in the storage unit 212 include image data of tomographic images, image data of fundus images, and examined eye information. The examined eye information includes information about the subject, such as patient ID and name, and information about the examined eye, such as left/right eye information indicating whether the examined eye is the left eye or the right eye. The storage unit 212 also stores various programs and data for operating the fundus analysis apparatus 1.
(Image forming unit)
 Based on the detection signals from the CCD image sensor 115, the image forming unit 220 forms image data of two-dimensional tomographic images depicting the layer structure of the fundus oculi Ef. This processing includes noise removal (noise reduction), filtering, dispersion compensation, FFT (Fast Fourier Transform), and the like, as in conventional spectral-domain optical coherence tomography. In the case of another type of OCT apparatus, the image forming unit 220 executes known processing according to that type. The main control unit 211 stores the image data of the two-dimensional tomographic images formed by the image forming unit 220 in the storage unit 212.
 The image forming unit 220 includes, for example, the circuit board described above. In this specification, "image data" and an "image" based on it may be identified with each other.
(Image processing unit)
 The image processing unit 230 applies various kinds of image processing and analysis processing to tomographic images of the fundus oculi Ef. For example, the image processing unit 230 executes various correction processes such as brightness correction of images. The image processing unit 230 also applies various kinds of image processing and analysis processing to images obtained by the fundus camera unit 2 (fundus images, anterior segment images, etc.).
 The image processing unit 230 executes known image processing, such as interpolation processing for interpolating pixels between two-dimensional tomographic images, to form a three-dimensional tomographic image of the fundus oculi Ef. Image data of a three-dimensional tomographic image means image data in which pixel positions are defined by a three-dimensional coordinate system. One form of such image data consists of voxels arranged three-dimensionally; this image data is called volume data or voxel data. When displaying an image based on volume data, the image processing unit 230 applies rendering processing (such as volume rendering or MIP (Maximum Intensity Projection)) to the volume data to form a pseudo three-dimensional tomographic image viewed from a specific viewing direction. This pseudo three-dimensional tomographic image is displayed on a display device such as the display unit 240A.
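 As a purely illustrative sketch of the MIP rendering mentioned above (not the rendering processing of the image processing unit 230 itself), a maximum intensity projection of volume data reduces to keeping, for each ray along the viewing direction, only the brightest voxel:

```python
import numpy as np

def maximum_intensity_projection(volume, axis=0):
    """Render a 3-D volume by MIP: for each ray parallel to `axis`,
    keep only the maximum voxel value. Returns a 2-D image."""
    return volume.max(axis=axis)
```

 Rendering from an arbitrary viewing direction would additionally require resampling the volume along rotated rays; this sketch assumes an axis-aligned viewing direction.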
 It is also possible to form stack data of a plurality of two-dimensional tomographic images as the image data of a three-dimensional tomographic image. Stack data is image data obtained by arranging, three-dimensionally, a plurality of two-dimensional tomographic images acquired along a plurality of scan lines, based on the positional relationship of the scan lines. In other words, stack data is image data obtained by expressing a plurality of tomographic images, each originally defined in its own two-dimensional coordinate system, in a single three-dimensional coordinate system (that is, by embedding them in one three-dimensional space).
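 The embedding of parallel two-dimensional tomographic images into one three-dimensional coordinate system can be sketched as follows. This is an illustrative example only; the assumption that the scan lines are parallel and characterized by a single scan-line coordinate is a simplification:

```python
import numpy as np

def build_stack(b_scans, scan_positions):
    """b_scans: list of 2-D tomographic images (depth x lateral), all the same shape.
    scan_positions: coordinate of each scan line along the slow scan axis.
    Returns a 3-D array whose slices are ordered by scan-line position."""
    order = np.argsort(scan_positions)
    return np.stack([b_scans[i] for i in order], axis=0)
```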
 The main control unit 211 stores the three-dimensional tomographic images formed as described above (volume data, pseudo three-dimensional tomographic images, stack data, etc.) in the storage unit 212.
 The image processing unit 230 performs analysis processing for obtaining the state of drusen by analyzing a tomographic image depicting the macula of the fundus oculi Ef and its surrounding region. For this purpose, the image processing unit 230 includes a layer region specifying unit 231, an approximate curve calculating unit 232, a protruding region specifying unit 233, a connected region specifying unit 234, and a form information generating unit 235.
(Layer region specifying unit)
 Based on the pixel values of the pixels of a tomographic image of the fundus oculi Ef, the layer region specifying unit 231 specifies the image region corresponding to the retinal pigment epithelium layer in this tomographic image. This image region is called the layer region.
 The layer region specifying unit 231 specifies the layer region by executing, for example, the same processing as in Patent Document 5. This processing will be described briefly. First, the layer region specifying unit 231 applies preprocessing, such as gradation conversion, image enhancement, thresholding, contrast conversion, binarization, edge detection, image averaging, image smoothing, and filtering, to the tomographic image to clarify the layer structure depicted in it.
 Next, the layer region specifying unit 231 analyzes the pixel values (for example, luminance values) of the pixels constituting the preprocessed tomographic image, column by column along the depth direction (z-axis direction) of the fundus oculi Ef, and specifies the pixels corresponding to the boundary positions between adjacent layers. At this time, pixels corresponding to layer boundary positions can be specified using a filter that has an extent only in the depth direction (for example, a differential filter). Alternatively, pixel edge detection may be performed using an area filter that extends in two directions: the depth direction and the direction orthogonal to it.
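 A minimal sketch of this column-by-column boundary detection, assuming a simple depth-only differential filter and only one (strongest) boundary per column — both simplifications relative to the actual processing described here:

```python
import numpy as np

def find_layer_boundaries(tomogram, min_response=10.0):
    """tomogram: 2-D array (rows = depth z, columns = lateral position x).
    For each column, respond to dark-to-bright transitions with a depth-only
    differential filter and return the z index of the strongest edge per
    column (-1 where no edge exceeds `min_response`)."""
    kernel = np.array([1.0, 0.0, -1.0])  # finite difference along z only
    boundaries = []
    for col in tomogram.T:  # iterate over depth profiles
        response = np.convolve(col, kernel, mode="same")
        z = int(np.argmax(response))
        boundaries.append(z if response[z] >= min_response else -1)
    return boundaries
```

 A real implementation would track several boundaries per column and enforce continuity between neighboring columns; this sketch only conveys the depth-direction filtering idea.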
 Through such processing, the layer region specifying unit 231 specifies image regions corresponding to several layers of the fundus oculi Ef. Furthermore, the layer region specifying unit 231 specifies, from among the specified image regions, the one corresponding to the retinal pigment epithelium layer. An example of this processing will be described. It is known, from a large number of clinically acquired tomographic images of fundi, which bright layer, counted from the retinal surface side, corresponds to the retinal pigment epithelium layer in a tomographic image. Therefore, for the tomographic image to be analyzed, the retinal surface is first specified, the number of bright layers is counted from the retinal surface side, and the layer corresponding to a predetermined count value is taken as the target layer region.
 As another method of specifying the layer region, the layer region in the tomographic image to be analyzed may be specified based on the standard distance, in the depth direction, from the retinal surface to the retinal pigment epithelium layer. In addition, since the layers of the fundus oculi Ef differ in brightness in the tomographic image, the layer region can be specified in consideration of these differences. For example, if the retinal pigment epithelium layer is the N-th brightest among the brightly depicted layers, the N-th brightest of the image regions corresponding to the layers specified in the tomographic image to be analyzed can be specified as the layer region. The method of specifying the layer region is not limited to those exemplified here, and any method may be used as long as it can specify the target layer region.
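 Assuming the mean brightness of each specified layer region has already been computed, the selection of the N-th brightest image region can be sketched as follows (an illustrative helper, not part of the described apparatus):

```python
def select_nth_brightest(layer_regions, n):
    """layer_regions: list of (mean_brightness, region) pairs, one per
    specified layer. Returns the region ranked n-th brightest (1-based)."""
    ranked = sorted(layer_regions, key=lambda t: t[0], reverse=True)
    return ranked[n - 1][1]
```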
 The tomographic image to be analyzed is a two-dimensional or three-dimensional tomographic image. When a two-dimensional tomographic image is analyzed, the layer region is specified as a roughly curve-shaped image region (ignoring the thickness of the layer region). On the other hand, when a three-dimensional tomographic image is analyzed, the layer region is specified as a roughly curved-surface-shaped image region (ignoring the thickness of the layer region). Here, the case of analyzing a two-dimensional tomographic image will be described in detail. The case of analyzing a three-dimensional tomographic image will be described later as a modification.
 The layer region specifying unit 231 generates information on the specified layer region, for example position information (coordinate values) of the layer region in the tomographic image. The layer region may also be extracted from the tomographic image, and image information representing the shape of the specified layer region (for example, a wire model) may be generated. In any case, the layer region specifying unit 231 only needs at least to specify the layer region corresponding to the retinal pigment epithelium layer in the tomographic image.
(Approximate curve calculating unit)
 The result of specifying the layer region (curve-shaped image region) in the tomographic image is input to the approximate curve calculating unit 232. Based on the shape of this layer region, the approximate curve calculating unit 232 obtains a curve that approximates this shape. This curve is called the approximate curve. The approximate curve represents the estimated shape of the retinal pigment epithelium layer under the assumption that no drusen exists in the cross section shown by the tomographic image.
 When no drusen exists in the cross section, the specified layer region is depicted in the tomographic image as a curve that is, at least globally, convex in the depth direction. On the other hand, when drusen exists in the cross section, irregularities corresponding to the drusen appear in the specified layer region. The approximate curve represents the global shape of the layer region with such irregularities ignored.
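 The approximate curve in this apparatus is obtained by the characteristic-part procedure of the approximate curve calculating unit 232. As a simpler, purely illustrative stand-in (not the claimed method) that conveys the idea of a global shape with narrow local irregularities averaged out, a low-order polynomial least-squares fit to the layer region boundary can be sketched:

```python
import numpy as np

def fit_global_curve(xs, zs, degree=2):
    """Least-squares fit of a low-order polynomial z = p(x) to the layer
    region boundary (xs: lateral positions, zs: depths). Narrow bumps such
    as drusen-like protrusions contribute few samples, so a low-degree fit
    tracks the global convex shape rather than the local irregularities."""
    coeffs = np.polyfit(xs, zs, degree)
    return np.polyval(coeffs, xs)
```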
 To obtain such an approximate curve, the approximate curve calculating unit 232 is provided with a characteristic part specifying unit 232a, a characteristic part interpolating unit 232b, a free curve calculating unit 232c, and a correcting unit 232d. The correcting unit 232d is provided with a pair consisting of a straight part specifying unit 232e and a first deforming unit 232f, a pair consisting of a curved part specifying unit 232g and a second deforming unit 232h, and a pair consisting of a protrusion judging unit 232i and a third deforming unit 232j.
(Characteristic part specifying unit)
 Based on the pixel values of the pixels in the identified layer region, the characteristic part specifying unit 232a specifies a plurality of characteristic parts determined by the shape of the layer region. An example of this processing is described with reference to FIGS. 5A to 5C.
 First, as shown in FIG. 5A, the characteristic part specifying unit 232a specifies, based on the shape of the layer region 300, the deepest part P0 of the layer region 300 in the depth direction (+z direction). This can be done, for example, by referring to the coordinate values of the pixels in the layer region 300, finding the pixel with the largest z coordinate value, and setting it as the deepest part P0. Alternatively, in the tomographic image to be analyzed, a straight line orthogonal to the depth direction may be moved from the +z direction toward the -z direction, and the position in the layer region that the line first touches may be set as the deepest part. The deepest part P0 specified in this way is taken as a characteristic part of the layer region 300.
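By way of illustration, the first method above (finding the pixel with the largest z coordinate) can be sketched as follows. This is a minimal sketch, not the patented implementation; the layer region is assumed to be given as a list of (x, z) pixel coordinates with +z as the depth direction, and all names are illustrative.

```python
def deepest_part(layer_points):
    """Return the point with the largest z coordinate (deepest in +z)."""
    return max(layer_points, key=lambda p: p[1])

# Toy layer region: (x, z) samples along the scanning direction.
layer = [(0, 5.0), (1, 5.4), (2, 6.1), (3, 5.8), (4, 5.2)]
p0 = deepest_part(layer)   # (2, 6.1)
```

The second method (sweeping a horizontal line from +z toward -z) selects the same point; the coordinate-scan form is simply cheaper to implement.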
 Next, the characteristic part specifying unit 232a obtains a straight line that passes through the deepest part P0 and touches the layer region 300, and takes the point of contact between this line and the layer region 300 as a characteristic part. A concrete example of this processing follows. As shown in FIG. 5B, a straight line L passing through the deepest part P0 is rotated about the deepest part P0. FIG. 5B shows the case where the line L is rotated clockwise (r).
 As the line L is rotated in this way, at some stage it touches the layer region 300, as shown in FIG. 5C. The line L at that moment corresponds to the above "straight line passing through the deepest part P0 and touching the layer region 300". The point of contact P1 is taken as a characteristic part of the layer region 300. Since all the other characteristic parts lie on the -z side of the deepest part P0, it is sufficient, for example, to start rotating the line L from the orientation that passes through the deepest part P0 and is orthogonal to the z coordinate axis. Rotating the line L in the opposite direction (counterclockwise) makes it possible to specify a characteristic part located on the opposite side of the deepest part P0 from the characteristic part P1.
 By repeating such processing, a plurality of characteristic parts Pj (j = 0, 1, 2, ..., J) of the layer region 300 are specified. Two methods of repeating the above processing are described below; of course, the plurality of characteristic parts Pj may also be specified by other methods.
 In the first iterative method, the line L always passes through the deepest part P0, and the contact points between the line L and the layer region 300 are specified one after another. In that case, the first contact point, the second contact point, the third contact point, and so on are specified in sequence as the line L rotates.
 In the second iterative method, the center of rotation of the line L is changed at each step. That is, the line L is first rotated about the deepest part P0 to specify the first contact point. Next, the line L is rotated in the same way about this first contact point to specify the second contact point. The line L is then rotated about the second contact point to specify the third contact point. A plurality of contact points (characteristic parts) are specified one after another in this manner.
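The second iterative method can be sketched as below. This is a minimal, assumption-laden sketch rather than the patented procedure: the layer region is taken as (x, z) samples with +z as depth, the walk proceeds toward increasing x from the current pivot, and the first point touched by the rotating line is interpreted as the candidate with the smallest elevation angle seen from the pivot (which amounts to a convex-hull-style gift-wrapping step). All names are illustrative.

```python
def next_contact(points, pivot):
    """First point touched when a line through `pivot` is rotated
    toward the -z side: the candidate (to the right of the pivot)
    with the smallest elevation angle seen from the pivot."""
    x0, z0 = pivot
    right = [(x, z) for (x, z) in points if x > x0]
    if not right:
        return None
    # (z0 - z) / (x - x0) is the tangent of the elevation angle.
    return min(right, key=lambda p: (z0 - p[1]) / (p[0] - x0))

def contacts(points, p0):
    """Walk from the deepest part p0, re-centering the rotation at
    each new contact point (the second iterative method)."""
    found, pivot = [], p0
    while True:
        c = next_contact(points, pivot)
        if c is None:
            return found
        found.append(c)
        pivot = c
```

Applying the same walk with `x < x0` candidates would yield the characteristic parts on the other side of P0, corresponding to the counterclockwise rotation mentioned above.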
 The number of characteristic parts specified as described above is arbitrary, but the larger the number, the better the accuracy of the processing described below. On the other hand, the more characteristic parts are specified, the more resources the processing requires.
(Characteristic part interpolation unit)
 The characteristic part interpolation unit 232b performs processing for adding characteristic parts of the layer region. This processing is not always executed; it is executed only as necessary. Two types of interpolation processing are described below.
 The first interpolation processing follows from a requirement of the free curve calculation performed at a later stage. That is, to obtain a free curve from characteristic parts, a minimum number of characteristic parts is needed to solve the corresponding equations. This interpolation processing is executed when the number of characteristic parts specified by the characteristic part specifying unit 232a is less than this minimum number. The minimum number of characteristic parts required for the free curve calculation is set in advance according to the type of free curve to be obtained; as a concrete example, when the free curve is a cubic spline curve, the minimum number is set to 4. With this preparation, the first interpolation processing is performed, for example, as follows.
 First, the characteristic part interpolation unit 232b determines whether the number of characteristic parts specified by the characteristic part specifying unit 232a is less than the predetermined minimum number. If the specified number is equal to or greater than the minimum number, the first interpolation processing is not executed.
 If the specified number is less than the minimum number, the characteristic part interpolation unit 232b adds as many new characteristic parts as the difference between the minimum number and the specified number. As a new characteristic part, for example, the point closest to the layer region (the closest point in the depth direction, that is, the z direction) on the polygonal line formed by connecting the characteristic parts obtained so far is selected. That is, the first characteristic part to be added is the point closest to the layer region on the polygonal line connecting the characteristic parts specified by the characteristic part specifying unit 232a. When two or more characteristic parts are added, the i-th (i ≥ 2) characteristic part to be added is the point closest to the layer region on the polygonal line connecting the characteristic parts specified by the characteristic part specifying unit 232a and the characteristic parts added up to the (i-1)-th.
 Note that when only one characteristic part is specified by the characteristic part specifying unit 232a, a polygonal line as described above cannot be formed. In that case, the characteristic part interpolation unit 232b can add, as a new characteristic part, an arbitrary point on the layer region, for example, the intersection of the layer region and an end (edge) of the frame (drawing frame) of the tomographic image. This processing is also applicable when the layer region protrudes at the end of the frame. The above is an example of the first interpolation processing.
 The second interpolation processing is described next. This interpolation processing follows from a requirement on the accuracy of the free curve approximation. Specifically, if the interval between two adjacent characteristic parts is too wide, the error between the layer region and the approximate curve between those characteristic parts becomes large. The second interpolation processing adds a new characteristic part between two characteristic parts when the interval between them is wide. An example of this processing follows. Note that when only one characteristic part is specified by the characteristic part specifying unit 232a, an arbitrary point on the layer region (for example, the intersection of a frame end and the layer region) is first added as a new characteristic part, as in the first interpolation processing.
 First, for the characteristic parts specified by the characteristic part specifying unit 232a, the characteristic part interpolation unit 232b obtains the distance between each pair of adjacent characteristic parts. This distance is, for example, the distance in the cross-sectional direction orthogonal to the depth direction (z direction), that is, in the scanning direction of the signal light LS. It may instead be the spatial distance between the two characteristic parts in the cross section shown by the tomographic image.
 Next, the characteristic part interpolation unit 232b determines whether the obtained distance is equal to or greater than a predetermined value. If the distance is less than the predetermined value, the second interpolation processing is not executed for that pair of characteristic parts, and the processing moves on to other pairs as necessary.
 On the other hand, if the distance between the two characteristic parts is equal to or greater than the predetermined value, the characteristic part interpolation unit 232b adds a new characteristic part between them. The position at which the new characteristic part is added is, for example, the midpoint between the two characteristic parts; the position may also be determined in consideration of factors such as the relative positions of the two characteristic parts in the z direction and the positions of other characteristic parts. Furthermore, when the distance between two characteristic parts is at least twice the predetermined value, two or more new characteristic parts may be added between them. The characteristic part interpolation unit 232b repeats this interpolation processing until the interval between every pair of adjacent characteristic parts is less than the predetermined value. The above is an example of the second interpolation processing.
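The midpoint variant of the second interpolation processing can be sketched as follows. This is an illustrative sketch under simplifying assumptions (characteristic parts given as (x, z) pairs, distance measured along the scanning x direction, midpoints used for both coordinates); the names are not from the source.

```python
def interpolate_gaps(parts, max_gap):
    """Insert midpoints between adjacent characteristic parts until
    every gap along the scanning (x) direction is below `max_gap`."""
    pts = sorted(parts)
    changed = True
    while changed:
        changed = False
        out = [pts[0]]
        for a, b in zip(pts, pts[1:]):
            if b[0] - a[0] >= max_gap:
                # add the midpoint of the pair as a new characteristic part
                out.append(((a[0] + b[0]) / 2, (a[1] + b[1]) / 2))
                changed = True
            out.append(b)
        pts = out
    return pts
```

Starting from two parts 8 pixels apart with a threshold of 4, two passes insert three midpoints, after which every gap is 2 pixels and the loop stops.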
(Free curve calculation unit)
 The free curve calculation unit 232c obtains a free curve based on a plurality of characteristic parts: the characteristic parts specified by the characteristic part specifying unit 232a and, where applicable, the characteristic parts added by the characteristic part interpolation unit 232b.
 A free curve is generally a smooth curve defined so as to pass through several points on a plane in a predetermined order. Examples of free curves include spline curves and Bezier curves. In this embodiment, a spline curve, in particular a cubic spline curve, is used as the free curve. A spline curve is generally a smooth curve passing through a plurality of given control points, obtained by using a separate polynomial for each section (segment) whose endpoints are two adjacent control points; an n-th order spline curve uses n-th order polynomials. A Bezier curve defined by n control points is generally a curve of order n-1. In this embodiment, the free curve is calculated with each characteristic part as a control point.
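A minimal cubic spline through the control points can be sketched as below. The patent does not specify boundary conditions, so this sketch assumes natural end conditions (zero second derivative at both ends); it solves the standard tridiagonal system for the second derivatives M and evaluates each segment with its own cubic polynomial, matching the per-segment definition above. All names are illustrative.

```python
import numpy as np

def natural_cubic_spline(xs, ys):
    """Return an evaluator s(t) for the natural cubic spline through
    the control points (xs[i], ys[i])."""
    xs, ys = np.asarray(xs, float), np.asarray(ys, float)
    n = len(xs) - 1
    h = np.diff(xs)
    # Solve for the second derivatives M (natural end conditions).
    A = np.zeros((n + 1, n + 1))
    b = np.zeros(n + 1)
    A[0, 0] = A[n, n] = 1.0
    for i in range(1, n):
        A[i, i - 1] = h[i - 1]
        A[i, i] = 2.0 * (h[i - 1] + h[i])
        A[i, i + 1] = h[i]
        b[i] = 6.0 * ((ys[i + 1] - ys[i]) / h[i]
                      - (ys[i] - ys[i - 1]) / h[i - 1])
    M = np.linalg.solve(A, b)

    def s(t):
        # Locate the segment containing t and evaluate its cubic.
        i = min(int(np.searchsorted(xs, t, side='right')) - 1, n - 1)
        i = max(i, 0)
        hi = h[i]
        a, c = xs[i + 1] - t, t - xs[i]
        return (M[i] * a**3 / (6 * hi) + M[i + 1] * c**3 / (6 * hi)
                + (ys[i] / hi - M[i] * hi / 6) * a
                + (ys[i + 1] / hi - M[i + 1] * hi / 6) * c)
    return s
```

With at least 4 control points (the minimum noted above for a cubic spline), the evaluator reproduces every control point exactly and interpolates smoothly between them.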
(Correction unit)
 The correction unit 232d corrects the free curve obtained by the free curve calculation unit 232c. Various correction processes are possible; three types are described here. The first correction process is performed by the straight part specifying unit 232e and the first deforming unit 232f. The second correction process is performed by the curved part specifying unit 232g and the second deforming unit 232h. The third correction process is performed by the protrusion determining unit 232i and the third deforming unit 232j. These correction processes can be regarded as correcting a mathematically obtained free curve on the basis of medical knowledge.
 The free curve obtained by the free curve calculation unit 232c, or the curve obtained by applying at least one of the first to third correction processes to it as necessary, becomes the approximate curve representing the estimated shape of the retinal pigment epithelium layer under the assumption that no drusen exist in the cross section.
(First correction process: straight part specifying unit, first deforming unit)
 The first correction process is described. The first correction process corrects the corresponding portion of the free curve when a part of the layer region is (substantially) straight. Owing to the characteristics of the free curve calculation, at a location where the layer region is straight, the free curve may lie beyond the layer region in the depth direction (+z direction) and be convex in the depth direction (+z direction). If such a free curve were adopted as the approximate curve, the straight portion might be detected as drusen. The first correction process is intended to avoid such a situation. An example of this correction process follows.
 The straight part specifying unit 232e specifies a substantially straight part in the layer region specified by the layer region specifying unit 231. This processing can be performed, for example, by calculating the slope (derivative) at each point of the layer region and searching for sections in which the slope is substantially constant. Here, "substantially constant" covers not only sections in which the slope is exactly constant but also sections in which the variation of the slope is less than a predetermined threshold. A part of the layer region specified in this way is called a straight part.
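The slope-based search can be sketched as follows. This is an illustrative sketch, not the patented implementation: the layer is assumed to be given as one z value per x column, the slope is the discrete difference between adjacent columns, and a run is reported as straight when every slope in it stays within `slope_tol` of the run's first slope; the minimum run length and all names are assumptions.

```python
def straight_parts(z, slope_tol, min_len=3):
    """Return (start, end) column ranges where the slope of the layer
    profile `z` is substantially constant."""
    slopes = [z[i + 1] - z[i] for i in range(len(z) - 1)]
    runs, start = [], 0
    for i in range(1, len(slopes)):
        if abs(slopes[i] - slopes[start]) >= slope_tol:
            if i - start + 1 >= min_len:
                runs.append((start, i))   # columns start..i are straight
            start = i
    if len(slopes) - start + 1 >= min_len:
        runs.append((start, len(slopes)))
    return runs
```

For a profile whose first five columns rise with constant slope 1 and then curve away, the single straight run over those columns is reported.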
 The first deforming unit 232f determines whether the free curve lies beyond the straight part in the depth direction (+z direction). This is performed, for example, by comparing the z coordinate values of the straight part with those of the portion of the free curve corresponding to it. If the free curve is determined to lie on the side opposite to the depth direction (-z direction) relative to the straight part, the first correction process is not executed.
 If the free curve is determined to lie beyond the straight part in the depth direction, the first deforming unit 232f deforms the free curve so that the corresponding portion of the free curve matches the position of the straight part. In a first example of this processing, the corresponding portion and its neighborhood are deformed so that the corresponding portion of the free curve is replaced by the straight part. Specifically, for example, control points are newly set at the two end positions of the straight part, new free curves are calculated on one or both sides of the corresponding portion taking these new control points into account, and the straight part itself is adopted as a part of the approximate curve. This processing yields an approximate curve containing a (substantially) straight portion.
 In a second example, one or more control points are newly set on the straight part, and a new free curve is calculated taking these control points into account. In this processing, the straight part is approximated by a free curve. The number of control points to be added can be set arbitrarily based on, for example, the length of the straight part.
(Second correction process: curved part specifying unit, second deforming unit)
 The second correction process is described. The second correction process corrects the relevant portion of the free curve when that portion lies on the side opposite to the depth direction (-z direction) relative to the layer region. As described above, the approximate curve obtained from the free curve estimates the position the layer region would have if no drusen existed. Therefore, the approximate curve should preferably lie at the same position as the layer region or beyond it in the depth direction (+z direction). The second correction process deforms any portion of the free curve that does not satisfy this condition so that it does. An example of this correction process follows.
 The curved part specifying unit 232g specifies the portion of the free curve that lies on the side opposite to the depth direction (-z direction) relative to the layer region. This can easily be done, for example, by comparing the z coordinate values of the free curve with those of the layer region.
 The second deforming unit 232h deforms the free curve so that the portion specified by the curved part specifying unit 232g matches the position of the layer region. As in the first correction process, this may be a replacement with the deformation target (the layer region) or a free curve calculation taking into account new control points based on the target.
(Third correction process: protrusion determining unit, third deforming unit)
 The third correction process is described. The third correction process corrects the free curve at a protruding portion when the layer region protrudes at a frame end of the tomographic image.
 FIG. 6A shows an example in which the layer region protrudes at a frame end. The layer region 300 protrudes in the direction opposite to the depth direction (-z direction) at the right end of the frame F of the tomographic image. Suppose further that the points denoted by reference numerals 310 and 320 have been obtained as characteristic parts of the layer region 300. Here, the characteristic part 320 was specified by the characteristic part specifying unit 232a, whereas the characteristic part 310, the intersection of the layer region 300 and the end of the frame F, was added by the characteristic part interpolation unit 232b.
 Reference numeral 400 in FIG. 6B denotes a free curve based on these characteristic parts 310, 320, and others. Since the free curve 400 passes through the two characteristic parts 310 and 320, it slopes in the protruding direction (-z direction) of the protruding portion, as shown in FIG. 6B. That is, the characteristic part 310, set (for convenience) at the end of the frame F, affects the shape of the free curve 400, and as a result the amount of protrusion of the protruding portion cannot be measured accurately. The third correction process is executed to deal with this situation. An example of the third correction process follows.
 The protrusion determining unit 232i determines whether the layer region protrudes in the direction opposite to the depth direction (-z direction) near an end of the frame of the tomographic image. This can be performed, for example, by calculating the slope (derivative) at each point of the layer region or of the free curve (at least near the frame end) and thereby obtaining its shape.
 As another approach, when the characteristic part interpolation unit 232b has added a characteristic part at the intersection of the layer region and a frame end, it is also possible to determine that the layer region protrudes in the direction opposite to the depth direction (-z direction) near the frame end whenever this new characteristic part (310) is set on the -z side of the characteristic part (320) adjacent to it.
 When the layer region is determined to protrude near a frame end, the third deforming unit 232j deforms the free curve by replacing the portion of the free curve corresponding to the protruding portion with an extension of the free curve from the center side of the frame. An example of this processing follows.
 Refer to FIG. 6C. First, the third deforming unit 232j sets a control point (reference control point) 410 at a predetermined position on the frame-center side of the characteristic part 320, which is adjacent to the characteristic part 310 set at the intersection of the layer region and the frame end. As the reference control point 410, for example, the midpoint of the frame in the horizontal direction is used.
 Next, the third deforming unit 232j sets a predetermined number of control points between the reference control point 410 and the characteristic part 320 so as to divide the interval between them equally. This yields a set of points consisting of the characteristic part 320 and a plurality of control points including the reference control point 410. The number of points in this set is at least the minimum number of characteristic parts required for the free curve calculation, and the interval between adjacent points can be set relatively wide. In the example shown in FIG. 6C, a set consisting of three control points 410, 420, and 430 and the characteristic part 320 is set.
 Subsequently, the third deforming unit 232j obtains a free curve with the points in this set as control points. By obtaining a cubic spline curve that passes through the four control points 320, 410, 420, and 430 shown in FIG. 6C and extends to the frame end, the estimated curve 500 shown in FIG. 6D is obtained. The estimated curve 500 is a free curve defined at the protruding portion of the layer region 300 and its neighborhood, and indicates the estimated position of the baseline for measuring the amount of protrusion of the protruding portion.
 Next, the third deforming unit 232j connects, at the position of the characteristic part 320, the portion (interpolation curve) 400a of the free curve 400 on the frame-center side of the characteristic part 320 and the portion (extrapolation curve) 500a of the estimated curve 500 on the frame-end side of the characteristic part 320 (see FIG. 6E). The curve obtained in this way is used as the deformation result of the free curve 400.
 Another processing example is described. First, the third deforming unit 232j calculates the slope of the layer region at a position (reference position) located a predetermined distance toward the frame center (for example, in the immediate vicinity) from the characteristic part (320) adjacent to the characteristic part (310) set at the intersection of the layer region and the frame end. Next, the third deforming unit 232j obtains the line segment that connects the reference position and the frame end and has the calculated slope. The third deforming unit 232j then replaces the portion of the free curve corresponding to the protruding portion, that is, the portion between the characteristic part (310) at the intersection and the reference position, with this line segment. The correction line segment 510 shown in FIG. 6F corresponds to this.
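The line-segment variant can be sketched as follows. This is an illustrative sketch under simplifying assumptions: the curve and layer are z values per x column, the local slope at the reference column is taken as a one-column difference, and the replaced range runs from the reference column to the frame end. Names are not from the source.

```python
def extrapolate_to_frame_end(curve_z, layer_z, ref):
    """Replace the curve from column `ref` to the frame end with a
    straight line anchored at the layer's position at `ref` and
    having the layer's local slope there."""
    slope = layer_z[ref] - layer_z[ref - 1]   # local slope at `ref`
    out = list(curve_z)
    for x in range(ref, len(out)):
        out[x] = layer_z[ref] + slope * (x - ref)
    return out
```

For a layer rising toward the frame end with slope 0.25 at the reference column, the corrected curve continues that trend linearly instead of dipping toward the (conveniently placed) end-point characteristic part.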
 As yet another approach, a new characteristic part may be set at a position on the frame end, or at an arbitrary position within the protruding portion, and a new free curve may be obtained taking this new characteristic part into account.
(Protruding area specifying unit)
 The protruding area specifying unit 233 specifies a protruding area in the tomographic image based on the approximate curve obtained by the approximate curve calculation unit 232 and on the layer region. A protruding area is an image area in which the approximate curve lies beyond the layer region in the depth direction (+z direction) of the fundus oculi Ef and the distance in the depth direction between the approximate curve and the layer region is equal to or greater than a predetermined threshold. In other words, a protruding area is an area where the layer region protrudes substantially in the -z direction relative to the approximate curve.
 Although the entire image area protruding in the -z direction relative to the approximate curve could be specified as the protruding area, in this embodiment, in order to exclude the natural unevenness of the retinal pigment epithelium layer, noise, and the like, only portions protruding from the approximate curve by at least a predetermined distance are detected, thereby improving the accuracy with which protruding areas are specified. For this purpose, the protruding area specifying unit 233 is provided with a distance calculation unit 233a, a distance determination unit 233b, and an image area specifying unit 233c.
(Distance Calculation Unit)
 The distance calculation unit 233a calculates the distance in the depth direction between each point on the approximate curve and the layer region. This processing need not be executed for every point (pixel) on the approximate curve; it may be performed at predetermined pixel intervals (for example, every 5 pixels). The distance may be calculated, for example, by counting the number of pixels between a point (pixel) on the approximate curve and the corresponding point (pixel) on the layer region, and multiplying this count by the unit distance between adjacent pixels. Alternatively, the distance may be obtained from the measurement magnification of the image and the in-image distance between the pixels whose distance is to be measured. The distance computed by the distance calculation unit 233a may be an in-image distance (a distance defined in the xyz coordinate system, or a pixel interval) converted into a real-space distance, or the in-image distance may be used as it is.
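 By way of illustration only (Python is not part of the embodiment), the per-column distance computation described above could be sketched as follows. All concrete values here, including the depth profiles, the 5-pixel sampling interval, and the 10-micron unit spacing, are hypothetical.

```python
# Sketch: depth-direction distance between the approximate curve and the
# layer region, sampled every 5 pixels along x (all values hypothetical).

def depth_distances(curve_z, layer_z, unit_um=10.0, step=5):
    """For every `step`-th x position, count the pixels between the
    approximate curve and the layer region in z, then convert the count
    to a real-space distance using `unit_um` (microns per pixel).
    Returns a list of (x_index, distance_um) pairs."""
    out = []
    for x in range(0, len(curve_z), step):
        # +z is deeper; the layer region lies above (-z of) the curve
        # wherever it protrudes, so the pixel count is curve minus layer.
        pixel_count = curve_z[x] - layer_z[x]
        out.append((x, pixel_count * unit_um))
    return out

# Hypothetical profiles: the layer region protrudes in -z near the middle.
curve = [50] * 20                      # approximate curve at z = 50 for all x
layer = [50, 50, 48, 45, 40, 35, 33, 33, 35, 40,
         45, 48, 50, 50, 50, 50, 50, 50, 50, 50]
dists = depth_distances(curve, layer)
print(dists)
```

 The same function could equally return raw pixel counts by setting `unit_um` to 1, corresponding to the option of using the in-image distance as it is.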
(Distance Determination Unit)
 The distance determination unit 233b determines whether each distance calculated by the distance calculation unit 233a is equal to or greater than a predetermined threshold. This threshold is set in advance based on, for example, a large number of clinical cases. The threshold may also be set in consideration of the measurement accuracy of the apparatus.
 The distance determination unit 233b attaches identification information (for example, a flag or a tag) to each point (pixel) on the approximate curve whose distance is determined to be equal to or greater than the predetermined threshold.
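 A minimal sketch of this thresholding step, with a hypothetical threshold value and hypothetical distance samples (the embodiment would derive the threshold from clinical cases):

```python
# Sketch: attach a flag to each sampled curve point whose depth-direction
# distance to the layer region reaches a predetermined threshold
# (threshold and distances hypothetical).

THRESHOLD_UM = 60.0  # hypothetical threshold

def flag_points(distances, threshold=THRESHOLD_UM):
    """distances: list of (x_index, distance) pairs.
    Returns the set of x indices given identification information."""
    return {x for x, d in distances if d >= threshold}

distances = [(0, 0.0), (5, 150.0), (10, 50.0), (15, 75.0)]
flags = flag_points(distances)
print(sorted(flags))
```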
(Image Region Specifying Unit)
 The image region specifying unit 233c specifies, as the target protruding region, the image region sandwiched between the layer region and the set of pixels on the approximate curve to which identification information has been attached by the distance determination unit 233b, that is, the set of points on the approximate curve whose distance has been determined to be equal to or greater than the predetermined threshold.
 FIG. 7 shows an example of a protruding region specified in this way. The protruding region 600 is an image region sandwiched between the layer region 300 and the approximate curve 400, in which the distance between the layer region 300 and the approximate curve 400 is equal to or greater than the predetermined threshold. As shown in FIG. 7, at this stage, among the image region sandwiched between the layer region 300 and the approximate curve 400, the foot portions close to the feature parts 330 and 340, which are their common points (that is, the portions where the distance is less than the predetermined threshold), are not determined to be part of the protruding region. In the range shown in FIG. 7 the layer region has only one peak, but when two or more peaks exist, there may also be portions other than the vicinity of the feature parts that are not determined to be part of the protruding region.
(Connected Region Specifying Unit)
 The connected region specifying unit 234 specifies a connected region that contains the protruding region specified by the protruding region specifying unit 233 and is enclosed by the approximate curve and the layer region. This processing is executed, for example, by a labeling process on pixels, such as a 4-connected (4-neighbor) or 8-connected (8-neighbor) labeling process.
 FIG. 8 shows the connected region specified when the above processing is applied to the example of FIG. 7. The protruding region 600 shown in FIG. 7 is the image region obtained by excluding, from the image region sandwiched between the layer region 300 and the approximate curve 400, the portions where the distance between them is less than the predetermined threshold. The connected region specifying unit 234 specifies the connected region that contains this protruding region 600 and is enclosed by the approximate curve 400 and the layer region 300, that is, the connected region denoted by reference numeral 700 in FIG. 8.
 At the stage shown in FIG. 7, the protruding region 600 is specified with the foot portions close to the feature parts 330 and 340 excluded, whereas at the stage shown in FIG. 8 the connected region 700 including those foot portions is specified. By performing such stepwise processing, the entire image region considered to correspond to drusen is suitably detected.
 In the above example, protruding regions with a large protrusion amount are specified first, and then the connected region containing each protruding region is specified; however, the same result can be obtained by other processing. For example, each connected region sandwiched between the layer region and the approximate curve may be specified first, and it may then be determined, for each connected region, whether a portion with a large protrusion amount exists, that is, a portion where the distance in the depth direction between the layer region and the approximate curve is equal to or greater than the predetermined threshold, thereby specifying the target connected regions. Such modifications are also included within the scope of the present invention.
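 The 4-connected labeling process mentioned above could be sketched, purely for illustration, with a simple flood fill (Python; the binary mask, in which 1 marks a pixel lying between the approximate curve and the layer region, is hypothetical):

```python
# Sketch: 4-connected (4-neighbor) labeling of a binary mask, one way of
# realizing the connected-region specification (mask values hypothetical).
from collections import deque

def label_4connected(mask):
    """Assign labels 1, 2, ... to 4-connected regions of 1-pixels.
    Returns (label image, number of regions)."""
    h, w = len(mask), len(mask[0])
    labels = [[0] * w for _ in range(h)]
    current = 0
    for i in range(h):
        for j in range(w):
            if mask[i][j] == 1 and labels[i][j] == 0:
                current += 1                      # start a new region
                queue = deque([(i, j)])
                labels[i][j] = current
                while queue:                      # breadth-first flood fill
                    y, x = queue.popleft()
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < h and 0 <= nx < w
                                and mask[ny][nx] == 1 and labels[ny][nx] == 0):
                            labels[ny][nx] = current
                            queue.append((ny, nx))
    return labels, current

mask = [[0, 1, 1, 0, 0],
        [0, 1, 0, 0, 1],
        [0, 0, 0, 1, 1]]
labels, count = label_4connected(mask)
print(count)  # two 4-connected regions
```

 Switching the neighbor offsets to include the four diagonals would give the 8-connected (8-neighbor) variant.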
(Morphology Information Generation Unit)
 The morphology information generation unit 235 generates morphology information representing the morphology of the connected regions specified by the connected region specifying unit 234. The morphology information includes, for example, the number, size, and distribution state of the specified connected regions. The morphology information generation unit 235 includes a papilla region determination unit 235a, a distribution image forming unit 235b, a counting unit 235c, and a size calculation unit 235d.
(Papilla Region Determination Unit)
 The papilla region determination unit 235a analyzes the tomographic image and determines whether it contains a papilla region corresponding to the optic papilla (optic disc) of the fundus Ef. The optic papilla is depicted in a tomographic image as a depression in the depth direction (+z direction) of the fundus Ef, and the papilla region determination unit 235a determines whether such a shape is depicted. When analyzing a tomographic image in which deep parts of the fundus Ef (the choroid and sclera) are depicted, the unit may instead determine whether the characteristic shape of the tissue located beneath the optic papilla is depicted in the tomographic image. An example of such determination processing is described below.
 The papilla region determination unit 235a analyzes the tomographic image to specify a feature layer region corresponding to a predetermined feature layer of the fundus Ef, and determines, based on the shape of this feature layer region, whether the papilla region is contained. The feature layer to be specified may be any layer tissue constituting the fundus Ef, that is, any layer tissue constituting the retina, the choroid, or the sclera. The layer tissues constituting the retina include the inner limiting membrane, nerve fiber layer, ganglion cell layer, inner plexiform layer, inner nuclear layer, outer plexiform layer, outer nuclear layer, external limiting membrane, photoreceptor layer, and retinal pigment epithelium layer. It is desirable to adopt as the feature layer a layer tissue in which the characteristic shape of the optic papilla is clearly reflected, and one that is clearly depicted in the tomographic image. One such layer tissue is the inner limiting membrane, which forms the boundary with the vitreous body in the fundus Ef. The image region corresponding to the inner limiting membrane is called the inner limiting membrane region.
 FIG. 9 shows an example of a tomographic image in which part of the optic papilla is depicted. In the tomographic image G, an inner limiting membrane region 800 is shown together with the layer region 300 and the approximate curve 400. The portion 810 depressed in the +z direction at the right end of the inner limiting membrane region 800 is assumed to result from the depressed shape of the optic papilla. Meanwhile, as shown in FIG. 9, the retinal pigment epithelium layer curves in the -z direction in the vicinity of the optic papilla, so this portion may be detected as a connected region 700.
 The papilla region determination unit 235a determines whether a part depressed in the depth direction (+z direction) exists in the inner limiting membrane region, and determines that the papilla region is contained when such a depressed part is determined to exist. As an example of this processing, the papilla region determination unit 235a obtains the amount of depression in the +z direction of the inner limiting membrane region 800 at the frame end of the tomographic image G. This depression amount is calculated, for example, as the difference between the point located furthest in the -z direction and the point located furthest in the +z direction in the inner limiting membrane region 800. When this displacement is equal to or greater than a predetermined threshold, the papilla region determination unit 235a determines that the tomographic image G contains the papilla region. As another process, it may be determined whether a curved portion (for example, a portion curved with a curvature equal to or greater than a predetermined value) exists in the inner limiting membrane region 800, and the papilla region may be determined to be contained when such a curved portion exists.
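 The depression-amount criterion could be sketched as follows (illustrative only; the inner limiting membrane depth profiles and the threshold are hypothetical, not values from the embodiment):

```python
# Sketch: determine whether a papilla region is contained, using the
# depression amount of the inner limiting membrane region
# (profiles and threshold hypothetical).

def contains_papilla(ilm_z, threshold_px=30):
    """ilm_z: z position (pixels, +z deeper) of the inner limiting
    membrane at each x. The depression amount is the difference between
    the deepest (+z) and shallowest (-z) points; the papilla region is
    determined to be contained when it reaches the threshold."""
    depression = max(ilm_z) - min(ilm_z)
    return depression >= threshold_px

flat_profile = [20, 21, 20, 22, 21, 20]   # no marked depression
disc_profile = [20, 21, 22, 30, 55, 70]   # deep dip near the frame end
print(contains_papilla(flat_profile), contains_papilla(disc_profile))
```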
 In determining the papilla region, the processing may take into account whether the eye E is the left eye or the right eye. In general, the target of OCT measurement of the fundus Ef is the optic papilla, or the macula and its vicinity. For a tomographic image whose measurement target is the optic papilla, there is no particular need to determine the papilla region. On the other hand, for a tomographic image whose measurement target is the macula and its vicinity, the optic papilla may be depicted near the left end of the frame if the eye is the left eye, and near the right end of the frame if the eye is the right eye. In consideration of these points, the following configuration can be applied.
 The storage unit 212 is assumed to store left/right eye information indicating whether the eye E is the left eye or the right eye. The left/right eye information may be entered manually by the user, or acquired automatically from an electronic medical record or the like. Alternatively, when OCT measurement is performed, whether the eye E is the left eye or the right eye may be determined automatically based on, for example, the positional relationship between the optical system of the apparatus and the chin rest (forehead rest), and the determination result may be stored as left/right eye information in association with the tomographic image.
 The papilla region determination unit 235a recognizes whether the eye E is the left eye or the right eye based on the left/right eye information. When the eye E is the left eye, the papilla region determination unit 235a determines whether the papilla region is contained by analyzing at least the vicinity of the left end of the frame of the tomographic image. Conversely, when the eye E is the right eye, it determines whether the papilla region is contained by analyzing at least the vicinity of the right end of the frame.
 Switching the processing according to whether the eye E is the left eye or the right eye in this way reduces the amount of data subjected to processing, thereby shortening the processing time and reducing the processing resources required.
 Another example of the processing for determining whether the tomographic image contains the papilla region will be described. In OCT measurement, information (scanning position information) indicating the region scanned with the signal light LS (the scanning line) is obtained, and a fundus image is obtained in real time during the measurement. Based on the scanning position information and the fundus image, the papilla region determination unit 235a can determine whether the scanning line corresponding to the tomographic image passes through at least part of the optic papilla, that is, whether at least part of the optic papilla is depicted in the tomographic image. Such determination processing also makes it possible to determine whether the papilla region is contained in the tomographic image.
 When the papilla region determination unit 235a determines that the tomographic image contains the papilla region, the morphology information generation unit 235 generates the morphology information while excluding, from the connected regions specified by the connected region specifying unit, any connected region located in the vicinity of the papilla region. The excluded connected region is, for example, the connected region located closest to the papilla region. Alternatively, connected regions contained within a range a predetermined distance from the papilla region may be excluded. This prevents the curved portion of the layer region in the vicinity of the optic papilla from being erroneously detected as drusen.
(Distribution Image Forming Unit)
 The distribution image forming unit 235b forms a distribution image representing the distribution state of the connected regions specified by the connected region specifying unit 234. For example, a distribution image representing the distribution of the connected regions in a single tomographic image can be formed. This processing presents the connected regions, for example, in a manner (display color, display brightness, etc.) different from that of the other image regions.
 When connected regions have been specified for each of a plurality of tomographic images, a distribution image representing the distribution state of the connected regions in the xy plane orthogonal to the depth direction can also be formed. An example of processing for forming a distribution image in the xy plane is described below.
 The plurality of tomographic images are obtained, for example, by executing the three-dimensional scan described later. The three-dimensional scan is a scan mode in which the irradiation position of the signal light LS is scanned along a plurality of linear scanning lines, for example extending in the x direction and arranged in the y direction. The three-dimensional scan yields a plurality of tomographic images, one for the cross section along each scanning line.
 The distribution image forming unit 235b forms a distribution image of the connected regions in the xy plane based on the connected regions specified for each of these tomographic images. In each tomographic image, each connected region is an image region extending in the x direction (the scanning line direction) and the z direction, and the plurality of tomographic images are arranged in the y direction. When the tomographic images are arranged in the y direction, the connected regions in the individual tomographic images are combined, yielding a two-dimensional distribution (a distribution in the xy plane) of the connected regions.
 At this time, when connected regions in adjacent tomographic images are adjacent in the y direction, the pixels between these connected regions may also be set as part of the connected region. This processing is particularly effective when the interval between adjacent tomographic images (the scanning line interval) is sufficiently narrow.
 The distribution image forming unit 235b forms the distribution image, for example, by giving the pixels corresponding to the connected regions pixel values different from those of the other pixels. As one example, a binary image can be formed in which the connected regions and the other regions are distinguished by binary values, and this binary image can be used as the distribution image.
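 Forming such a binary distribution image could be sketched as follows (illustrative only; the per-slice x-intervals occupied by connected regions, and the image width, are hypothetical):

```python
# Sketch: build a binary xy distribution image from connected regions
# found in y-ordered tomographic images (hypothetical data; each slice
# lists the x-intervals [start, end) occupied by connected regions).

def distribution_image(slices, width):
    """slices: one list of (x_start, x_end) intervals per tomographic
    image, ordered in y. Returns a binary image (1 = connected region)."""
    image = []
    for intervals in slices:
        row = [0] * width
        for x0, x1 in intervals:
            for x in range(x0, x1):
                row[x] = 1
        image.append(row)
    return image

slices = [[(2, 5)],   # slice y=0: one region over x = 2..4
          [(2, 6)],   # slice y=1
          [],         # slice y=2: no connected region in this slice
          [(7, 9)]]   # slice y=3
img = distribution_image(slices, width=10)
print(img[0])
```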
 FIG. 10 shows an example of a distribution image formed in this way. The distribution image 900 represents the distribution state of the connected regions T when the fundus Ef is viewed from the incident direction (-z direction) of the signal light LS. The connected regions T consist of a plurality of connected regions Tk (k = 1 to K). The distribution image 900 is formed based on a plurality of tomographic images whose cross sections are the scanning lines Ri (i = 1 to m).
(Counting Unit)
 The counting unit 235c counts the number of connected regions specified by the connected region specifying unit 234. This processing is executed, for example, by sequentially assigning the numbers 1, 2, ... to the plurality of connected regions (sets of pixels) contained in the distribution image in a predetermined order (for example, from the upper left to the lower right of the distribution image), and taking the largest assigned number as the number of connected regions. The counting of connected regions may also be performed in parallel with the labeling process executed when specifying the connected regions.
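 Counting by sequential numbering could be sketched as follows (illustrative only; the label image is hypothetical, of the kind the labeling process described earlier would produce, with 0 for background):

```python
# Sketch: take the number of connected regions as the largest label
# assigned during sequential numbering (label image hypothetical).

def count_regions(label_image):
    """Labels 1, 2, ... were assigned in scan order (upper left to lower
    right), so the largest label equals the number of regions."""
    return max((v for row in label_image for v in row), default=0)

label_image = [[0, 1, 1, 0],
               [0, 0, 0, 2],
               [3, 0, 2, 2]]
print(count_regions(label_image))  # 3 connected regions
```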
(Size Calculation Unit)
 The size calculation unit 235d calculates the size of each connected region specified by the connected region specifying unit 234. Indices representing the size of a connected region include area, diameter (diameter, radius, etc.), and volume. Examples of the size calculation processing are described below.
 First, an example of processing for obtaining the area of a connected region will be described. Each connected region is a set of pixels determined to be connected, and an area (unit area) is set for each pixel in advance. This unit area may be set arbitrarily for the distribution image or the tomographic image; for example, taking the measurement magnification into account, the real-space area corresponding to one pixel can be set as the unit area. The size calculation unit 235d calculates the product of the number of pixels contained in each connected region and the unit area, and takes this as the area of that connected region.
 Next, an example of processing for obtaining the diameter of a connected region will be described. The size calculation unit 235d first calculates the area as described above, and then takes the diameter (or radius) of a circle having this area as the diameter of the connected region. It is also possible to search for the longest line segment contained in the connected region and adopt the length of this line segment as the diameter. Besides these, any distance capable of characterizing the connected region can be adopted as the diameter.
 Next, an example of processing for obtaining the volume of a connected region will be described. As described above, the distance in the depth direction between each point on the approximate curve and the layer region has already been calculated by the distance calculation unit 233a. The size calculation unit 235d calculates the volume of a connected region by integrating this distance over the connected region.
 The size calculation methods are not limited to those described above, nor are the indices (dimensions) representing size limited to those described above.
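 The three size indices described above could be sketched together as follows (illustrative only; the pixel count, unit area, and sampled distances are hypothetical values):

```python
# Sketch: area, equivalent diameter, and volume of a connected region
# (all input values hypothetical).
import math

def region_area(pixel_count, unit_area_um2=100.0):
    # area = number of pixels x real-space area per pixel
    return pixel_count * unit_area_um2

def region_diameter(area_um2):
    # diameter of the circle having the same area: A = pi * (d/2)^2
    return 2.0 * math.sqrt(area_um2 / math.pi)

def region_volume(distances_um, unit_area_um2=100.0):
    # integrate the curve-to-layer depth distance over the region's pixels
    return sum(distances_um) * unit_area_um2

area = region_area(400)                     # 400 pixels -> 40000 um^2
diameter = region_diameter(area)
volume = region_volume([50.0, 60.0, 55.0])  # three sampled columns
print(area, round(diameter, 1), volume)
```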
 The image processing unit 230 functioning as described above includes, for example, the aforementioned microprocessor, RAM, ROM, hard disk drive, circuit board, and the like. A computer program for causing the microprocessor to execute the above functions is stored in advance in a storage device such as the hard disk drive.
(User Interface)
 The user interface 240 includes a display unit 240A and an operation unit 240B. The display unit 240A includes the display device of the arithmetic and control unit 200 and the display apparatus 3 described above. The operation unit 240B includes the operation devices of the arithmetic and control unit 200 described above, and may also include various buttons and keys provided on the housing of the fundus analysis apparatus 1 or externally. For example, when the fundus camera unit 2 has a housing similar to that of a conventional fundus camera, the operation unit 240B may include a joystick, an operation panel, or the like provided on that housing. The display unit 240A may include various display devices, such as a touch panel, provided on the housing of the fundus camera unit 2.
 The display unit 240A and the operation unit 240B need not be configured as separate devices; for example, a device in which a display function and an operation function are integrated, such as a touch panel, may be used. In that case, the operation unit 240B includes the touch panel and a computer program. Operations on the operation unit 240B are input to the control unit 210 as electrical signals. Operations and information input may also be performed using a graphical user interface (GUI) displayed on the display unit 240A together with the operation unit 240B.
[Scanning of the Signal Light and Tomographic Images]
 Here, scanning of the signal light LS and the resulting tomographic images are described.
 Scan modes of the signal light LS by the fundus analysis apparatus 1 include, for example, horizontal scan, vertical scan, cross scan, radial scan, circle scan, concentric scan, and helical (spiral) scan. These scan modes are used selectively, as appropriate, in consideration of the fundus site to be observed, the analysis target (retinal thickness, etc.), the time required for scanning, the precision of scanning, and the like.
 The horizontal scan scans the signal light LS in the horizontal direction (x direction). The horizontal scan also includes a mode in which the signal light LS is scanned along a plurality of horizontally extending scanning lines arranged in the vertical direction (y direction). In this mode, the scanning line interval can be set arbitrarily, and by making the interval between adjacent scanning lines sufficiently narrow, the aforementioned three-dimensional tomographic image can be formed (three-dimensional scan). The same applies to the vertical scan.
 The cross scan scans the signal light LS along a cross-shaped trajectory consisting of two mutually orthogonal linear trajectories (straight-line trajectories). The radial scan scans the signal light LS along a radial trajectory consisting of a plurality of linear trajectories arranged at predetermined angles. The cross scan is an example of a radial scan.
 The circle scan scans the signal light LS along a circular trajectory. The concentric scan scans the signal light LS along a plurality of circular trajectories arranged concentrically around a predetermined center position; the circle scan is an example of a concentric scan. The helical scan scans the signal light LS along a helical (spiral) trajectory while gradually decreasing (or increasing) the radius of rotation.
 Since the galvano scanner 42 is configured to scan the signal light LS in mutually orthogonal directions, it can scan the signal light LS independently in the x direction and the y direction. Furthermore, by simultaneously controlling the orientations of the two galvano mirrors included in the galvano scanner 42, the signal light LS can be scanned along an arbitrary trajectory in the xy plane. The various scan modes described above can thereby be realized.
 By scanning the signal light LS in the manner described above, a tomographic image can be acquired in the plane spanned by the direction along the scanning line (scanning trajectory) and the fundus depth direction (z direction). In particular, when the scanning line interval is narrow, the aforementioned three-dimensional tomographic image can be acquired.
 The region on the fundus Ef subjected to scanning with the signal light LS as described above, that is, the region on the fundus Ef subjected to OCT measurement, is called a scanning region. The scanning region of a three-dimensional scan is a rectangular region in which a plurality of horizontal scans are arranged. The scanning region of a concentric scan is a disc-shaped region enclosed by the trajectory of the circle scan with the maximum diameter. The scanning region of a radial scan is a disc-shaped (or polygonal) region connecting the end positions of the scanning lines.
[動作]
 眼底解析装置1の動作について説明する。図11は、眼底解析装置1の動作の一例を表す。
[Operation]
The operation of the fundus analysis apparatus 1 will be described. FIG. 11 illustrates an example of the operation of the fundus analysis apparatus 1.
(S1:OCT計測)
 被検眼EのOCT計測を行なって眼底Efの断層像を取得する。
(S1: OCT measurement)
A tomographic image of the fundus oculi Ef is obtained by performing OCT measurement of the eye E.
(S2:層領域の特定)
 層領域特定部231は、眼底Efの断層像の画素の画素値に基づいて、この断層像において網膜色素上皮層に相当する画像領域(層領域)を特定する。層領域特定部231により得られた情報は近似曲線演算部232に送られる。
(S2: Identification of layer region)
The layer region specifying unit 231 specifies an image region (layer region) corresponding to the retinal pigment epithelium layer in the tomographic image based on the pixel value of the pixel of the tomographic image of the fundus oculi Ef. The information obtained by the layer region specifying unit 231 is sent to the approximate curve calculation unit 232.
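Although the publication does not give an implementation of this step, it can be sketched in code. In the sketch below, the function name and the brightest-pixel criterion are illustrative assumptions only: the retinal pigment epithelium is typically the most hyper-reflective layer in an OCT B-scan, so the maximum-intensity depth per A-scan column serves as a crude stand-in for the pixel-value-based layer-region specification.

```python
def identify_layer_region(b_scan):
    """b_scan: 2-D list of pixel intensities indexed as b_scan[z][x],
    where z is the depth index.  For each A-scan column x, the depth of
    maximum intensity is taken as the layer-region estimate, since the
    retinal pigment epithelium is typically the most hyper-reflective
    layer in an OCT B-scan (illustrative criterion, not the patent's)."""
    depth = len(b_scan)
    width = len(b_scan[0])
    return [max(range(depth), key=lambda z: b_scan[z][x]) for x in range(width)]
```

A practical implementation would add denoising and continuity constraints across neighboring columns before accepting the per-column maxima.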
(S3:特徴部位の特定)
 特徴部位特定部232aは、ステップ2で特定された層領域中の画素の画素値に基づいて、この層領域の形状に基づく複数の特徴部位を特定する。
(S3: Identification of characteristic part)
Based on the pixel values of the pixels in the layer area specified in step 2, the characteristic part specifying unit 232a specifies a plurality of characteristic parts based on the shape of the layer area.
(S4:特徴部位の補間)
 特徴部位補間部232bは、必要に応じ、層領域の特徴部位を追加する処理を行う。
(S4: Feature part interpolation)
The feature part interpolation unit 232b performs a process of adding a feature part of the layer region as necessary.
(S5:自由曲線の演算)
 自由曲線演算部232cは、ステップ3で特定された特徴部位及びステップ4で追加された特徴部位に基づいて、自由曲線を求める。
(S5: Free curve calculation)
The free curve calculation unit 232c obtains a free curve based on the feature part specified in step 3 and the feature part added in step 4.
(S6:自由曲線の補正)
 補正部232dは、必要に応じ、ステップ5で求められた自由曲線を補正する。それにより、断層像が示す断面にドルーゼンが存在しないと仮定した場合における網膜色素上皮層の推定形状を表す近似曲線が得られる。
(S6: Correction of free curve)
The correction unit 232d corrects the free curve obtained in step 5 as necessary. Accordingly, an approximate curve representing the estimated shape of the retinal pigment epithelium layer when it is assumed that drusen does not exist in the cross section indicated by the tomogram is obtained.
(S7:距離の算出)
 距離算出部233aは、ステップ6(補正が行われない場合にはステップ5)で得られた近似曲線上の各点と層領域との間の奥行方向における距離を算出する。
(S7: Calculation of distance)
The distance calculation unit 233a calculates the distance in the depth direction between each point on the approximate curve obtained in Step 6 (Step 5 when correction is not performed) and the layer region.
(S8:距離の判断)
 距離判断部233bは、ステップ7で算出された各距離が所定閾値以上であるか判断する。
(S8: Judgment of distance)
The distance determination unit 233b determines whether each distance calculated in step 7 is equal to or greater than a predetermined threshold.
(S9:突出領域の特定)
 画像領域特定部233cは、ステップ8において距離が所定閾値以上と判断された近似曲線上の点の集合と、層領域とに挟まれた画像領域を特定する。この画像領域が突出領域となる。
(S9: Identification of protruding region)
The image area specifying unit 233c specifies an image area sandwiched between a set of points on the approximate curve whose distance is determined to be greater than or equal to a predetermined threshold in step 8 and the layer area. This image area becomes a protruding area.
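Steps S7 to S9 can be sketched as follows, assuming (an assumption not fixed by the text, which works with image regions) that the layer region and the approximate curve have each been reduced to one depth value per A-scan column, with larger values lying deeper:

```python
def protruding_regions(layer_z, curve_z, threshold):
    """layer_z, curve_z: one depth value per column (larger = deeper);
    where drusen lift the retinal pigment epithelium, the approximate
    curve lies deeper than the layer region.  Returns the (start, end)
    column ranges where the depth gap is at least threshold."""
    regions, start = [], None
    for i, (lz, cz) in enumerate(zip(layer_z, curve_z)):
        if cz - lz >= threshold:        # S7 distance, S8 comparison
            if start is None:
                start = i
        elif start is not None:          # gap fell below threshold: close range
            regions.append((start, i - 1))
            start = None
    if start is not None:
        regions.append((start, len(layer_z) - 1))
    return regions
```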
(S10:連結領域の特定)
 連結領域特定部234は、ステップ9で特定された突出領域を含み、かつ近似曲線及び層領域に囲まれた連結領域を特定する。
(S10: Identification of connected area)
The connection area specifying unit 234 specifies the connection area including the protruding area specified in step 9 and surrounded by the approximate curve and the layer area.
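A minimal sketch of this step, again assuming one depth value per column for both the layer region and the approximate curve: a protruding column range is grown outward while the curve is still strictly deeper than the layer, which realizes "includes the protruding region and is surrounded by the approximate curve and the layer region" in this simplified representation.

```python
def connected_region(layer_z, curve_z, protruding):
    """Grow a protruding (start, end) column range outward while the
    approximate curve remains strictly deeper than the layer region,
    yielding the full connected region bounded by curve and layer."""
    lo, hi = protruding
    while lo > 0 and curve_z[lo - 1] > layer_z[lo - 1]:
        lo -= 1
    while hi < len(layer_z) - 1 and curve_z[hi + 1] > layer_z[hi + 1]:
        hi += 1
    return lo, hi
```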
(S11:乳頭領域の判定)
 乳頭領域判定部235aは、断層像を解析して、眼底Efの視神経乳頭に相当する乳頭領域がこの断層像に含まれるか判定する。乳頭領域が含まれると判定された場合、形態情報生成部235は、ステップ10で特定された連結領域のうちから、乳頭領域の近傍に位置する連結領域を除外する。
(S11: Determination of nipple area)
The nipple region determination unit 235a analyzes the tomographic image and determines whether the tomographic image includes a nipple region corresponding to the optic nerve head of the fundus oculi Ef. When it is determined that the nipple region is included, the form information generation unit 235 excludes the connected region located in the vicinity of the nipple region from the connected regions specified in Step 10.
(S12:形態情報の生成)
 形態情報生成部235は、ステップ10で特定された連結領域(ステップ11で除外されたものを除く)の形態を表す形態情報を生成する。
(S12: Generation of form information)
The form information generation unit 235 generates form information indicating the form of the connected area specified in step 10 (excluding the area excluded in step 11).
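As one hedged illustration of form information, the cross-sectional area of a connected region can be accumulated from per-column depth gaps. The metric and the pixel-pitch parameters dx, dz are illustrative assumptions; the publication states only that size and distribution are among the represented forms.

```python
def region_area(layer_z, curve_z, region, dx=1.0, dz=1.0):
    """Cross-sectional area enclosed between the approximate curve and
    the layer region over the columns of a connected region, with dx
    and dz the (hypothetical) pixel pitches in the scan and depth
    directions."""
    lo, hi = region
    return sum(curve_z[i] - layer_z[i] for i in range(lo, hi + 1)) * dx * dz
```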
(S13:解析結果の出力)
 画像処理部230は、ステップ12で生成された形態情報を含む解析結果呈示情報を生成する。主制御部211は、解析結果呈示情報に基づいて表示部240Aに解析結果を表示させる。この解析結果は、眼底Efにおけるドルーゼンの存在の有無を示す情報や、眼底Efに存在するドルーゼンのサイズや分布等を示す情報を含む。なお、解析結果呈示情報に基づいて解析レポートを印刷出力することも可能である。また、解析結果呈示情報や上記処理により得られた情報、更には患者や被検眼に関する情報などを外部装置に送信したり、記録媒体に記録させたりすることもできる。以上で、この動作例に係る処理は終了となる。
(S13: Output of analysis result)
The image processing unit 230 generates analysis result presentation information including the form information generated in step 12. The main control unit 211 causes the display unit 240A to display the analysis result based on the analysis result presentation information. This analysis result includes information indicating the presence / absence of drusen in the fundus oculi Ef and information indicating the size and distribution of drusen present in the fundus oculi Ef. An analysis report can be printed out based on the analysis result presentation information. In addition, analysis result presentation information, information obtained by the above processing, and information on a patient and an eye to be examined can be transmitted to an external apparatus or recorded on a recording medium. Thus, the process according to this operation example ends.
[効果]
 眼底解析装置1の効果について説明する。
[Effect]
The effects of the fundus analysis apparatus 1 will be described.
 眼底解析装置1は、記憶部212と、層領域特定部231と、近似曲線演算部232と、突出領域特定部233と、連結領域特定部234と、形態情報生成部235とを有する。記憶部212には、眼底Efの層構造を描写する断層像が記憶される。層領域特定部231は、断層像の画素の画素値に基づいて、網膜色素上皮層に相当する当該断層像中の層領域を特定する。近似曲線演算部232は、層領域の形状に基づく近似曲線を求める。突出領域特定部233は、近似曲線が層領域よりも眼底Efの奥行方向に位置し、かつ奥行方向における近似曲線と層領域との間の距離が所定閾値以上である突出領域を特定する。連結領域特定部234は、この突出領域を含み、かつ近似曲線及び層領域に囲まれた連結領域を特定する。形態情報生成部235は、連結領域の形態を表す形態情報を生成する。 The fundus analysis apparatus 1 includes a storage unit 212, a layer region specifying unit 231, an approximate curve calculating unit 232, a protruding region specifying unit 233, a connected region specifying unit 234, and a form information generating unit 235. The storage unit 212 stores a tomographic image depicting the layer structure of the fundus oculi Ef. The layer region specifying unit 231 specifies a layer region in the tomographic image corresponding to the retinal pigment epithelium layer based on the pixel value of the pixel of the tomographic image. The approximate curve calculation unit 232 obtains an approximate curve based on the shape of the layer region. The protruding region specifying unit 233 specifies a protruding region in which the approximate curve is located in the depth direction of the fundus oculi Ef with respect to the layer region, and the distance between the approximate curve and the layer region in the depth direction is equal to or greater than a predetermined threshold. The connection area specifying unit 234 specifies the connection area including the protruding area and surrounded by the approximate curve and the layer area. The form information generation unit 235 generates form information representing the form of the connected area.
 このような眼底解析装置1によれば、ドルーゼンに相当すると考えられる画像領域を効果的に検出することが可能である。より具体的には、大きく突出している突出領域を特定することにより、ノイズ等に起因する小さな凹凸を除外してドルーゼンと考えられる画像領域を好適に検出することができる。また、このような突出領域を含む連結領域全体を検出することにより、ドルーゼンに相当すると考えられる画像領域を漏れ無く検出することができる。したがって、この実施形態によれば、眼底Efの断層像に基づいてドルーゼンを効果的に検出することが可能である。 According to such a fundus analysis apparatus 1, it is possible to effectively detect an image region considered to correspond to drusen. More specifically, by specifying a protruding region that protrudes greatly, an image region that is considered to be drusen can be suitably detected by removing small irregularities caused by noise or the like. Further, by detecting the whole connected region including such a protruding region, an image region that is considered to correspond to drusen can be detected without omission. Therefore, according to this embodiment, it is possible to effectively detect drusen based on the tomographic image of the fundus oculi Ef.
 近似曲線演算部232は、層領域の形状に基づいて、層領域中の複数の特徴部位を特定する特徴部位特定部232aを含んでいてもよい。その場合、近似曲線演算部232は、特定された複数の特徴部位に基づいて近似曲線を求める。複数の特徴部位を考慮することにより、より高い確度や精度で近似曲線を求めることが可能となる。 The approximate curve calculation unit 232 may include a feature part specifying unit 232a that specifies a plurality of feature parts in the layer region based on the shape of the layer region. In that case, the approximate curve calculation unit 232 obtains an approximate curve based on the plurality of identified characteristic parts. By considering a plurality of characteristic parts, an approximate curve can be obtained with higher accuracy and accuracy.
 特徴部位特定部232aの構成例として次のものがある。特徴部位特定部232aが、層領域の形状に基づいて奥行方向における層領域中の最深部位を特定して特徴部位とし、この最深部位を通過しかつ層領域に接する直線を求め、層領域とこの直線との接点を更なる特徴部位とするように構成することが可能である。更に、特徴部位特定部232aが、最深部位を通過する直線を最深部位を中心として回転させていくことにより接点を順次に特定するように構成することが可能である。また、特徴部位特定部232aが、最深部位を通過する直線を最深部位を中心として回転させて接点を特定し、特定された接点を通過する直線をこの接点を中心として回転させて更なる接点を特定するように構成することが可能である。 Configuration examples of the characteristic part specifying unit 232a include the following. The characteristic part specifying unit 232a may be configured to specify, based on the shape of the layer region, the deepest part of the layer region in the depth direction as a characteristic part, obtain a straight line that passes through this deepest part and is tangent to the layer region, and take the contact point between the layer region and this straight line as a further characteristic part. Further, the characteristic part specifying unit 232a may be configured to specify contact points sequentially by rotating the straight line passing through the deepest part about the deepest part. Alternatively, the characteristic part specifying unit 232a may be configured to specify a contact point by rotating a straight line passing through the deepest part about the deepest part, and then to specify a further contact point by rotating a straight line passing through the specified contact point about that contact point.
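In one reading, this rotating-tangent procedure traces the deep-side convex hull of the layer region: each tangent contact found by rotating the line about the deepest part, and then about each new contact, is a hull vertex. The sketch below uses a monotone-chain sweep as a stand-in for the rotation loop; the function name and the point-list representation are assumptions, not the publication's.

```python
def feature_parts(layer):
    """layer: list of (x, z) samples of the layer region, with z the
    depth (larger z is deeper).  Returns the deep-side convex-hull
    vertices, i.e. the tangent contact points the rotating-line
    procedure would visit."""
    pts = sorted(layer)
    hull = []
    for p in pts:
        # Drop the previous contact while it lies on the shallow side of
        # (or exactly on) the chord from hull[-2] to p: such a point is
        # not a tangent contact of the deep-side envelope.
        while len(hull) >= 2:
            (x1, z1), (x2, z2) = hull[-2], hull[-1]
            if (x2 - x1) * (p[1] - z1) - (z2 - z1) * (p[0] - x1) >= 0:
                hull.pop()
            else:
                break
        hull.append(p)
    return hull
```

A drusen-lifted (shallower) stretch of the layer is skipped by the sweep, while genuinely deep points are kept, matching the intent of building the curve from the unlifted parts of the layer.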
 近似曲線演算部232は、特徴部位補間部232bを有していてもよい。特徴部位補間部232bは、特徴部位特定部232aにより特定された特徴部位のうち隣接する2つの特徴部位の間の距離を求める。更に、特徴部位補間部232bは、この距離が所定値以上である場合に、この2つの特徴部位の間に新たな特徴部位を追加する。このような補間処理を行うことにより、より高い確度や精度で近似曲線を求めることができる。 The approximate curve calculation unit 232 may include a feature part interpolation unit 232b. The feature part interpolation unit 232b obtains a distance between two adjacent feature parts among the feature parts specified by the feature part specifying unit 232a. Furthermore, the feature part interpolation unit 232b adds a new feature part between the two feature parts when the distance is equal to or greater than a predetermined value. By performing such an interpolation process, an approximate curve can be obtained with higher accuracy and accuracy.
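A sketch of this interpolation step, with two assumptions the text does not fix: the inter-feature distance is measured as the Euclidean distance, and the added feature is placed at the segment midpoint (an actual implementation might instead sample the layer region at that position).

```python
from math import hypot

def interpolate_features(features, min_gap):
    """Wherever two adjacent feature parts are at least min_gap apart
    (Euclidean distance, an assumption), insert one midpoint feature
    between them."""
    out = [features[0]]
    for (x1, z1), (x2, z2) in zip(features, features[1:]):
        if hypot(x2 - x1, z2 - z1) >= min_gap:
            out.append(((x1 + x2) / 2, (z1 + z2) / 2))
        out.append((x2, z2))
    return out
```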
 近似曲線演算部232は、複数の特徴部位に基づく自由曲線を求め、この自由曲線に基づいて近似曲線を求めることができる。このとき、自由曲線は、スプライン曲線、特に3次スプライン曲線であってよい。自由曲線を求めることにより、高い確度や精度で近似曲線を求めることができる。 The approximate curve calculation unit 232 can obtain a free curve based on a plurality of characteristic parts, and can obtain an approximate curve based on the free curve. At this time, the free curve may be a spline curve, particularly a cubic spline curve. By obtaining a free curve, an approximate curve can be obtained with high accuracy and accuracy.
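Where a cubic spline is chosen as the free curve, it can be sketched as a natural cubic spline through the feature parts. The solver below is a generic textbook implementation (tridiagonal system for the knot second derivatives, solved with the Thomas algorithm), not code taken from the publication.

```python
from bisect import bisect_right

def natural_cubic_spline(xs, ys, xq):
    """Evaluate a natural cubic spline through the knots (xs, ys) at
    the query positions xq.  Knot second derivatives M satisfy the
    standard tridiagonal system with natural boundary conditions
    M[0] = M[n-1] = 0."""
    n = len(xs)
    h = [xs[i + 1] - xs[i] for i in range(n - 1)]
    a = [0.0] * n
    b = [1.0] * n
    c = [0.0] * n
    d = [0.0] * n
    for i in range(1, n - 1):
        a[i] = h[i - 1]
        b[i] = 2.0 * (h[i - 1] + h[i])
        c[i] = h[i]
        d[i] = 6.0 * ((ys[i + 1] - ys[i]) / h[i] - (ys[i] - ys[i - 1]) / h[i - 1])
    for i in range(1, n):                      # forward elimination
        w = a[i] / b[i - 1]
        b[i] -= w * c[i - 1]
        d[i] -= w * d[i - 1]
    M = [0.0] * n                              # back substitution
    M[n - 1] = d[n - 1] / b[n - 1]
    for i in range(n - 2, -1, -1):
        M[i] = (d[i] - c[i] * M[i + 1]) / b[i]
    out = []
    for x in xq:
        i = min(max(bisect_right(xs, x) - 1, 0), n - 2)
        t = x - xs[i]
        slope = (ys[i + 1] - ys[i]) / h[i] - h[i] * (2 * M[i] + M[i + 1]) / 6
        out.append(ys[i] + slope * t + M[i] / 2 * t * t
                   + (M[i + 1] - M[i]) / (6 * h[i]) * t ** 3)
    return out
```

Sampling such a spline at every A-scan column then gives the per-column approximate-curve depths used by the later distance comparison.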
 自由曲線を求める場合において、以下のような処理を行なって自由曲線を補正することができる。第1の例として、近似曲線演算部232に、直線部位特定部232eと、第1変形部232fとを設ける。直線部位特定部232eは、層領域において実質的に直線的な部位を特定する。第1変形部232fは、自由曲線が直線部位よりも奥行方向に位置する場合に、自由曲線の当該部位を直線部位の位置に合わせるように自由曲線を変形する。この場合、近似曲線演算部232は、第1変形部232fによる変形結果に基づいて近似曲線を求める。 When obtaining a free curve, the following process can be performed to correct the free curve. As a first example, the approximate curve calculation unit 232 is provided with a straight line part specifying unit 232e and a first deformation unit 232f. The straight part specifying unit 232e specifies a substantially straight part in the layer region. When the free curve is located in the depth direction with respect to the straight line part, the first deforming unit 232f deforms the free curve so that the part of the free curve matches the position of the straight line part. In this case, the approximate curve calculation unit 232 obtains an approximate curve based on the deformation result by the first deformation unit 232f.
 第2の例として、近似曲線演算部232に、曲線部位特定部232gと、第2変形部232hとを設ける。曲線部位特定部232gは、自由曲線において層領域よりも奥行方向の逆方向に位置する部位を特定する。第2変形部232hは、この特定部位を層領域の位置に合わせるように自由曲線を変形する。この場合、近似曲線演算部232は、第2変形部232hによる変形結果に基づいて近似曲線を求める。 As a second example, the approximate curve calculation unit 232 is provided with a curve part specifying unit 232g and a second deformation unit 232h. The curved part specifying unit 232g specifies a part located in the reverse direction of the depth direction with respect to the layer region in the free curve. The second deforming unit 232h deforms the free curve so that the specific part is matched with the position of the layer region. In this case, the approximate curve calculation unit 232 obtains an approximate curve based on the deformation result by the second deformation unit 232h.
 第3の例として、近似曲線演算部232に、突出判定部232iと、第3変形部232jとを設ける。突出判定部232iは、断層像のフレームの端部近傍において層領域が奥行方向の逆方向に突出しているか判定する。第3変形部232jは、層領域が突出していると判定された場合に、この突出部位に対応する自由曲線の部位を、フレームの中央側からの自由曲線の延長線に置換することで、自由曲線を変形する。この場合、近似曲線演算部232は、第3変形部232jによる変形結果に基づいて近似曲線を求める。 As a third example, the approximate curve calculation unit 232 includes a protrusion determination unit 232i and a third deformation unit 232j. The protrusion determination unit 232i determines whether the layer region protrudes in the direction opposite to the depth direction in the vicinity of the end of the frame of the tomographic image. When it is determined that the layer region protrudes, the third deforming portion 232j replaces the portion of the free curve corresponding to the protruding portion with an extension of the free curve from the center side of the frame. Deform the curve. In this case, the approximate curve calculation unit 232 obtains an approximate curve based on the deformation result by the third deformation unit 232j.
 上記のような処理により自由曲線を補正することで、近似曲線の確度や精度の向上を図ることが可能である。 It is possible to improve the accuracy and accuracy of the approximate curve by correcting the free curve by the processing as described above.
 突出領域特定部233は、たとえば次の処理を行なって突出領域の特定を行うことができる。まず、突出領域特定部233は、眼底Efの奥行方向における近似曲線上の各点と層領域との間の距離を算出し、算出された距離が所定閾値以上であるか判断する。そして、突出領域特定部233は、距離が所定閾値以上であると判断された近似曲線上の点の集合と層領域とに挟まれた画像領域を特定し、この画像領域を突出領域とする。このような処理を行うことで、突出領域の特定を好適に行うことができる。 The protruding area specifying unit 233 can specify the protruding area by performing the following processing, for example. First, the protruding area specifying unit 233 calculates the distance between each point on the approximate curve in the depth direction of the fundus oculi Ef and the layer area, and determines whether the calculated distance is equal to or greater than a predetermined threshold. Then, the protruding area specifying unit 233 specifies an image area sandwiched between a set of points on the approximate curve determined to have a distance greater than or equal to a predetermined threshold and the layer area, and sets this image area as the protruding area. By performing such processing, it is possible to suitably specify the protruding region.
 形態情報生成部235は、乳頭領域判定部235aを含んでいてもよい。乳頭領域判定部235aは、断層像を解析して、眼底Efの視神経乳頭に相当する乳頭領域が当該断層像に含まれるか判定する。乳頭領域が含まれると判定された場合、形態情報生成部235は、連結領域特定部234により特定された連結領域のうち乳頭領域の近傍に位置する連結領域を除外して形態情報を生成する。この処理により、視神経乳頭の近傍における網膜色素上皮層の湾曲形状に起因する誤検出を防止できるので、ドルーゼンに相当すると考えられる画像領域をより効果的に検出することが可能である。 The form information generation unit 235 may include a nipple region determination unit 235a. The nipple region determination unit 235a analyzes the tomographic image and determines whether a nipple region corresponding to the optic nerve head of the fundus oculi Ef is included in the tomographic image. When it is determined that the nipple region is included, the form information generation unit 235 generates the form information while excluding the connected regions located in the vicinity of the nipple region from the connected regions specified by the connected region specifying unit 234. This process prevents erroneous detection due to the curved shape of the retinal pigment epithelium layer in the vicinity of the optic nerve head, so that image regions considered to correspond to drusen can be detected more effectively.
 乳頭領域判定部235aは、たとえば、断層像を解析して眼底Efの所定の特徴層に相当する特徴層領域を特定し、この特徴層領域の形状に基づいて乳頭領域が含まれるか判定するように構成される。このとき、所定の特徴層は、眼底Efにおいて硝子体との境界をなす内境界膜であってよい。その場合、乳頭領域判定部235aは、特徴層領域(内境界膜領域)において眼底Efの奥行方向に陥没している部位が存在するか判定し、陥没部位が存在すると判定された場合に乳頭領域が含まれると判定する。この処理により、乳頭領域を効果的に検出することができる。 For example, the nipple region determination unit 235a analyzes a tomographic image to identify a feature layer region corresponding to a predetermined feature layer of the fundus oculi Ef, and determines whether the nipple region is included based on the shape of the feature layer region. Configured. At this time, the predetermined feature layer may be an inner boundary film that forms a boundary with the vitreous body in the fundus oculi Ef. In that case, the nipple region determination unit 235a determines whether there is a portion that is depressed in the depth direction of the fundus oculi Ef in the feature layer region (inner boundary membrane region), and when it is determined that there is a depressed portion, the nipple region Is determined to be included. By this process, the nipple region can be detected effectively.
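A deliberately simplified stand-in for this depression test (the "nipple region" corresponds to the optic papilla/disc), assuming the inner limiting membrane has been reduced to one depth value per column; the median baseline and the single threshold are illustrative choices, not the publication's criterion.

```python
from statistics import median

def has_papilla_depression(ilm_z, depth_threshold):
    """ilm_z: one inner-limiting-membrane depth per column (larger z is
    deeper).  Flags a candidate optic-disc cross section when the
    membrane dips deeper than its median level by at least
    depth_threshold."""
    return max(ilm_z) - median(ilm_z) >= depth_threshold
```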
 乳頭領域判定部235aは、被検眼Eが左眼であるか右眼であるかを示す左右眼情報の入力を受けて、異なる処理を実行することができる。つまり、被検眼Eが左眼である場合、乳頭領域判定部235aは、断層像のフレームの少なくとも左側端部近傍を解析することにより上記判定を行う。一方、被検眼Eが右眼である場合、乳頭領域判定部235aは、断層像のフレームの少なくとも右側端部近傍を解析することにより上記判定を行う。この処理により、解析処理に掛かる時間の短縮や、リソースの低減を図ることが可能である。 The nipple region determination unit 235a can execute different processes upon receiving input of left and right eye information indicating whether the eye E is the left eye or the right eye. That is, when the eye E is the left eye, the nipple region determination unit 235a performs the above determination by analyzing at least the vicinity of the left end of the frame of the tomographic image. On the other hand, when the eye E is the right eye, the nipple region determination unit 235a performs the above determination by analyzing at least the vicinity of the right end of the frame of the tomographic image. By this process, it is possible to shorten the time required for the analysis process and reduce resources.
 眼底解析装置1は、光源ユニット101からの光を信号光LSと参照光LRとに分割し、被検眼Eの眼底Efを経由した信号光LSと参照光路を経由した参照光LRとを重畳させて干渉光LCを生成して検出する光学系と、干渉光LCの検出結果に基づいて眼底Efの断層像を形成する画像形成部(画像形成部220、画像処理部230)を有していてもよい。記憶部212は、画像形成部により形成された断層像を記憶する。 The fundus analysis apparatus 1 may include an optical system that splits the light from the light source unit 101 into the signal light LS and the reference light LR, superimposes the signal light LS having traveled via the fundus oculi Ef of the eye E and the reference light LR having traveled via the reference optical path, and generates and detects the interference light LC, and an image forming unit (image forming unit 220, image processing unit 230) that forms a tomographic image of the fundus oculi Ef based on the detection result of the interference light LC. The storage unit 212 stores the tomographic image formed by the image forming unit.
 ここで、画像形成部220により形成される断層像は2次元断層像であり、画像処理部230により形成される断層像は3次元断層像である。上記例では2次元断層像に対する処理について特に詳しく説明したが、3次元断層像に対する処理も同様にして実行することが可能である。このとき、2次元断層像に対する処理における「線」が「面」に読み替えられる。たとえば、曲線が曲面に、直線が平面にそれぞれ読み替えられる。断層像の次元の相違に基づく処理の相違はこのような便宜的なものに過ぎないので、双方の断層像に対する処理は概念的には実質的に同一のものと言える。 Here, the tomographic image formed by the image forming unit 220 is a two-dimensional tomographic image, and the tomographic image formed by the image processing unit 230 is a three-dimensional tomographic image. In the above example, the processing for the two-dimensional tomographic image has been described in detail, but the processing for the three-dimensional tomographic image can be executed in the same manner. At this time, “line” in the process for the two-dimensional tomographic image is read as “surface”. For example, a curved line is read as a curved surface, and a straight line is read as a flat surface. Since the difference in processing based on the difference in the dimensions of the tomographic images is merely such a convenience, the processing for both tomographic images can be said to be substantially the same in concept.
 上記構成を用いて3次元断層像を処理する場合の例を説明する。眼底解析装置1は、記憶部212と、層領域特定部231と、近似曲線演算部232と、突出領域特定部233と、連結領域特定部234と、形態情報生成部235とを有する。記憶部212には、眼底Efの層構造を描写する3次元断層像が記憶される。層領域特定部231は、3次元断層像の画素の画素値に基づいて、網膜色素上皮層に相当する当該断層像中の層領域を特定する。この層領域は一般に曲面である(特異点を含んでいてもよい)。近似曲線演算部232は、層領域の形状に基づく近似曲面を求める。突出領域特定部233は、近似曲面が層領域よりも眼底Efの奥行方向に位置し、かつ奥行方向における近似曲面と層領域との間の距離が所定閾値以上である突出領域を特定する。突出領域は一般に3次元領域である。連結領域特定部234は、この突出領域を含み、かつ近似曲面及び層領域に囲まれた連結領域を特定する。連結領域は一般に3次元領域である。形態情報生成部235は、連結領域の形態を表す形態情報を生成する。このような眼底解析装置1によれば、眼底Efの3次元断層像に基づいてドルーゼンを効果的に検出することが可能である。 An example of processing a three-dimensional tomographic image using the above configuration will be described. The fundus analysis apparatus 1 includes a storage unit 212, a layer region specification unit 231, an approximate curve calculation unit 232, a protruding region specification unit 233, a connection region specification unit 234, and a form information generation unit 235. The storage unit 212 stores a three-dimensional tomogram depicting the layer structure of the fundus oculi Ef. The layer region specifying unit 231 specifies a layer region in the tomographic image corresponding to the retinal pigment epithelium layer based on the pixel value of the pixel of the three-dimensional tomographic image. This layer region is generally a curved surface (which may include singular points). The approximate curve calculation unit 232 obtains an approximate curved surface based on the shape of the layer region. The protruding region specifying unit 233 specifies a protruding region in which the approximate curved surface is located in the depth direction of the fundus oculi Ef with respect to the layer region, and the distance between the approximate curved surface and the layer region in the depth direction is equal to or greater than a predetermined threshold. The protruding region is generally a three-dimensional region. 
The connection area specifying unit 234 specifies the connection area including the protruding area and surrounded by the approximate curved surface and the layer area. The connected region is generally a three-dimensional region. The form information generation unit 235 generates form information representing the form of the connected area. According to such a fundus analysis apparatus 1, it is possible to effectively detect drusen based on a three-dimensional tomographic image of the fundus oculi Ef.
[変形例]
 以上に説明した構成は、この発明を好適に実施するための一例に過ぎない。よって、この発明の要旨の範囲内における任意の変形を適宜に施すことが可能である。
[Modification]
The configuration described above is merely an example for favorably implementing the present invention. Therefore, arbitrary modifications within the scope of the present invention can be made as appropriate.
 上記の実施形態では、網膜色素上皮層に相当する層領域と近似曲線(近似曲面)との間の距離に基づいてドルーゼン(と考えられる画像領域)を検出しているが、OCT計測の感度を向上させるなどしてブルッフ膜に相当する画像領域(膜領域と呼ぶ)を検出し、層領域と膜領域との間の距離に基づいてドルーゼンを検出するように構成してもよい。ドルーゼンはブルッフ膜と網膜色素上皮層との間に発生するものであるから、この変形例に係る処理を実行することで、ドルーゼンの形態をより高精度、高確度で把握することが可能となる。 In the embodiment above, drusen (more precisely, image regions considered to correspond to drusen) are detected based on the distance between the layer region corresponding to the retinal pigment epithelium layer and the approximate curve (approximate curved surface); however, the apparatus may instead be configured to detect an image region corresponding to the Bruch's membrane (referred to as a membrane region), for example by improving the sensitivity of the OCT measurement, and to detect drusen based on the distance between the layer region and the membrane region. Since drusen develop between the Bruch's membrane and the retinal pigment epithelium layer, executing the processing according to this modification makes it possible to grasp the morphology of drusen with higher precision and accuracy.
 なお、上記実施形態は、OCT画像からの検出が困難なブルッフ膜の代わりに、ドルーゼン(突出)が無いと仮定した状態の網膜色素上皮層の形態を近似する近似曲線を用いるものである。 Note that the above embodiment uses, in place of the Bruch's membrane, which is difficult to detect in an OCT image, an approximate curve that approximates the morphology of the retinal pigment epithelium layer in a state assumed to be free of drusen (protrusions).
 更なる変形例として、OCT画像から網膜色素上皮層に相当する層領域を特定するとともに、ブルッフ膜に相当する膜領域の特定を試みることができる。膜領域が特定された場合には、膜領域と層領域との間の距離に基づいて連結領域を特定して形態情報を生成する。一方、膜領域の特定に失敗した場合には、上記実施形態のように近似曲線(近似曲面)を求め、層領域と近似曲線(近似曲面)とに基づいて連結領域を特定して形態情報を生成する。 As a further modification, it is possible to specify the layer region corresponding to the retinal pigment epithelium layer from the OCT image and also to attempt to specify a membrane region corresponding to the Bruch's membrane. When the membrane region is specified, the connected region is specified based on the distance between the membrane region and the layer region, and the form information is generated. On the other hand, when the specification of the membrane region fails, an approximate curve (approximate curved surface) is obtained as in the above embodiment, and the connected region is specified based on the layer region and the approximate curve (approximate curved surface) to generate the form information.
 この変形例によれば、膜領域が特定された場合には高精度、高確度でドルーゼンの形態を把握できるとともに、膜領域が特定されなかった場合には近似曲線(又は近似曲面)を利用してドルーゼンの形態を把握できる。 According to this modification, when the membrane region is specified, the morphology of drusen can be grasped with high precision and accuracy, and when the membrane region is not specified, the morphology of drusen can still be grasped using the approximate curve (or approximate curved surface).
 上記の実施形態においては、参照ミラー114の位置を変更して信号光LSの光路と参照光LRの光路との光路長差を変更しているが、光路長差を変更する手法はこれに限定されるものではない。たとえば、被検眼Eに対して眼底カメラユニット2やOCTユニット100を移動させて信号光LSの光路長を変更することにより光路長差を変更することができる。また、特に被測定物体が生体部位でない場合などには、被測定物体を深度方向(z軸方向)に移動させることにより光路長差を変更することも有効である。 In the above embodiment, the position of the reference mirror 114 is changed to change the optical path length difference between the optical path of the signal light LS and the optical path of the reference light LR; however, the method of changing the optical path length difference is not limited to this. For example, the optical path length difference can be changed by moving the fundus camera unit 2 or the OCT unit 100 with respect to the eye E to change the optical path length of the signal light LS. It is also effective to change the optical path length difference by moving the measured object in the depth direction (z-axis direction), particularly when the measured object is not a part of a living body.
[眼底解析プログラム]
 この発明に係る眼底解析プログラムについて説明する。この眼底解析プログラムは、眼底の層構造を描写する断層像を記憶する記憶部を有するコンピュータに後述の動作を実行させる。このコンピュータの例として、上記実施形態の演算制御ユニットがある。この眼底解析プログラムは、このコンピュータ自体に記憶されていてもよいし、このコンピュータと通信可能に接続されたサーバ等に記憶されていてもよい。
[Fundus analysis program]
A fundus analysis program according to the present invention will be described. The fundus analysis program causes a computer having a storage unit that stores a tomographic image depicting the layer structure of the fundus to execute an operation described later. An example of this computer is the arithmetic control unit of the above embodiment. This fundus analysis program may be stored in the computer itself, or may be stored in a server or the like connected to be communicable with the computer.
 この眼底解析プログラムに基づいてコンピュータが実行する動作について説明する。まず、コンピュータ(層領域特定部)は、断層像の画素の画素値に基づいて、網膜色素上皮層に相当する当該断層像中の層領域を特定する。次に、コンピュータ(近似曲線演算部)は、層領域の形状に基づく近似曲線を求める。続いて、コンピュータ(突出領域特定部)は、近似曲線が層領域よりも眼底の奥行方向に位置し、かつ近似曲線と層領域との間の奥行方向における距離が所定閾値以上である突出領域を特定する。更に、コンピュータ(連結領域特定部)は、この突出領域を含み、かつ近似曲線及び層領域に囲まれた連結領域を特定する。そして、コンピュータ(形態情報生成部)は、連結領域の形態を表す形態情報を生成する。コンピュータは、生成された形態情報を表示出力、印刷出力、送信出力することができる。 The operation executed by the computer based on this fundus analysis program will be described. First, the computer (layer region specifying unit) specifies a layer region in the tomographic image corresponding to the retinal pigment epithelium layer based on the pixel values of the pixels of the tomographic image. Next, the computer (approximate curve calculation unit) obtains an approximate curve based on the shape of the layer region. Subsequently, the computer (protruding region specifying unit) specifies a protruding region in which the approximate curve is located deeper in the fundus than the layer region and the distance in the depth direction between the approximate curve and the layer region is equal to or greater than a predetermined threshold. Further, the computer (connected region specifying unit) specifies a connected region that includes this protruding region and is surrounded by the approximate curve and the layer region. Then, the computer (form information generation unit) generates form information representing the form of the connected region. The computer can display, print, or transmit the generated form information.
 このような眼底解析プログラムによれば、上記実施形態と同様に、眼底の断層像に基づいてドルーゼンを効果的に検出することが可能である。なお、この実施形態に係る眼底解析プログラムは、上記実施形態の眼底解析装置1が有する任意の機能をコンピュータに実行させるものであってよい。 According to such a fundus analysis program, drusen can be detected effectively based on a tomographic image of the fundus as in the above embodiment. Note that the fundus analysis program according to this embodiment may cause a computer to execute any function that the fundus analysis apparatus 1 of the above embodiment has.
 この発明に係る他の眼底解析プログラムについて説明する。この眼底解析プログラムは、眼底の層構造を描写する3次元断層像を記憶する記憶部を有するコンピュータに後述の動作を実行させる。このコンピュータの例として、上記実施形態の演算制御ユニットがある。 Next, another fundus analysis program according to the present invention will be described. This fundus analysis program causes a computer having a storage unit that stores a three-dimensional tomographic image that describes the layer structure of the fundus to perform an operation described later. An example of this computer is the arithmetic control unit of the above embodiment.
 この眼底解析プログラムに基づいてコンピュータが実行する動作について説明する。まず、コンピュータ(層領域特定部)は、3次元断層像の画素の画素値に基づいて、網膜色素上皮層に相当する当該断層像中の層領域を特定する。次に、コンピュータ(近似曲面演算部)は、層領域の形状に基づく近似曲面を求める。続いて、コンピュータ(突出領域特定部)は、近似曲面が層領域よりも眼底の奥行方向に位置し、かつ奥行方向における近似曲面と層領域との間の距離が所定閾値以上である突出領域を特定する。コンピュータ(連結領域特定部)は、この突出領域を含み、かつ近似曲面及び層領域に囲まれた連結領域を特定する。コンピュータ(形態情報生成部)は、連結領域の形態を表す形態情報を生成する。コンピュータは、生成された形態情報を表示出力、印刷出力、送信出力することが可能である。 The operation executed by the computer based on this fundus analysis program will be described. First, the computer (layer region specifying unit) specifies a layer region in the three-dimensional tomographic image corresponding to the retinal pigment epithelium layer based on the pixel values of the pixels of the image. Next, the computer (approximate surface calculation unit) obtains an approximate curved surface based on the shape of the layer region. Subsequently, the computer (protruding region specifying unit) specifies a protruding region in which the approximate curved surface is located deeper in the fundus than the layer region and the distance between the approximate curved surface and the layer region in the depth direction is equal to or greater than a predetermined threshold. The computer (connected region specifying unit) specifies a connected region that includes this protruding region and is surrounded by the approximate curved surface and the layer region. The computer (form information generation unit) generates form information representing the form of the connected region. The computer can display, print, or transmit the generated form information.
 このような眼底解析プログラムによれば、上記実施形態と同様に、眼底の3次元断層像に基づいてドルーゼンを効果的に検出することが可能である。なお、この実施形態に係る眼底解析プログラムは、上記実施形態の眼底解析装置1が有する任意の機能をコンピュータに実行させるものであってよい。 According to such a fundus analysis program, drusen can be detected effectively based on a three-dimensional tomographic image of the fundus as in the above embodiment. Note that the fundus analysis program according to this embodiment may cause a computer to execute any function that the fundus analysis apparatus 1 of the above embodiment has.
 実施形態に係る眼底解析プログラムを、コンピュータによって読み取り可能な任意の記録媒体に記憶させることができる。この記録媒体としては、たとえば、光ディスク、光磁気ディスク(CD-ROM/DVD-RAM/DVD-ROM/MO等)、磁気記憶媒体(ハードディスク/フロッピー(登録商標)ディスク/ZIP等)などを用いることが可能である。また、ハードディスクドライブやメモリ等の記憶装置に記憶させることも可能である。 The fundus analysis program according to the embodiment can be stored in any computer-readable recording medium. As this recording medium, for example, an optical disk, a magneto-optical disk (CD-ROM/DVD-RAM/DVD-ROM/MO, etc.), or a magnetic storage medium (hard disk/floppy (registered trademark) disk/ZIP, etc.) can be used. The program can also be stored in a storage device such as a hard disk drive or memory.
 また、インターネットやLAN等のネットワークを通じてこのプログラムを送受信することも可能である。 It is also possible to send and receive this program through a network such as the Internet or a LAN.
[眼底解析方法]
 この発明に係る眼底解析方法について説明する。この眼底解析方法は、眼底の層構造を描写する断層像を解析するものであり、次のようなステップを含んで構成される。なお、この眼底解析方法を、次のようなステップを眼底解析装置(OCT装置等)に実行させるための装置制御方法として把握することも可能である。
[Fundus analysis method]
A fundus analysis method according to the present invention will be described. This fundus analysis method analyzes a tomographic image depicting the layer structure of the fundus and includes the following steps. This fundus analysis method can also be regarded as an apparatus control method for causing a fundus analysis apparatus (an OCT apparatus or the like) to execute the following steps.
 (第1のステップ)
 断層像の画素の画素値に基づいて、網膜色素上皮層に相当する当該断層像中の層領域を特定する。
 (第2のステップ)
 層領域の形状に基づく近似曲線を求める。
 (第3のステップ)
 近似曲線が層領域よりも眼底の奥行方向に位置し、かつ近似曲線と層領域との間の奥行方向における距離が所定閾値以上である突出領域を特定する。
 (第4のステップ)
 突出領域を含み、かつ近似曲線及び層領域に囲まれた連結領域を特定する。
 (第5のステップ)
 連結領域の形態を表す形態情報を生成する。
(First step)
Based on the pixel value of the pixel of the tomographic image, a layer region in the tomographic image corresponding to the retinal pigment epithelium layer is specified.
(Second step)
An approximate curve based on the shape of the layer region is obtained.
(Third step)
A protruding region is specified in which the approximate curve lies deeper than the layer region in the depth direction of the fundus and the distance in the depth direction between the approximate curve and the layer region is equal to or greater than a predetermined threshold.
(Fourth step)
A connected region including the protruding region and surrounded by the approximate curve and the layer region is specified.
(Fifth step)
Form information representing the form of the connected area is generated.
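The five steps above can be sketched for a single B-scan as follows. A centred moving average stands in for the embodiment's spline-based approximate curve, and the names `layer_depth`, `threshold`, and `window` are assumptions of this sketch (larger depth values mean deeper).

```python
def analyze_bscan(layer_depth, threshold, window=5):
    """Sketch of the five steps on one B-scan. `layer_depth[x]` is the
    depth of the segmented RPE layer at column x (step 1, assumed done)."""
    n = len(layer_depth)
    # Step 2: approximate curve (stand-in: centred moving average).
    half = window // 2
    approx = []
    for x in range(n):
        lo, hi = max(0, x - half), min(n, x + half + 1)
        approx.append(sum(layer_depth[lo:hi]) / (hi - lo))
    # Step 3: protruding columns, where the approximate curve lies
    # deeper than the layer by at least the threshold.
    protruding = [approx[x] - layer_depth[x] >= threshold for x in range(n)]
    # Step 4: connected regions = maximal runs of protruding columns.
    regions, start = [], None
    for x, p in enumerate(protruding):
        if p and start is None:
            start = x
        elif not p and start is not None:
            regions.append((start, x - 1))
            start = None
    if start is not None:
        regions.append((start, n - 1))
    # Step 5: form information (width and peak elevation per region).
    return [{"span": r,
             "width": r[1] - r[0] + 1,
             "peak": max(approx[x] - layer_depth[x]
                         for x in range(r[0], r[1] + 1))}
            for r in regions]
```

A flat profile with a local elevation yields a single region whose span, width, and peak elevation form a minimal example of the form information.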
 このような眼底解析方法によれば、上記実施形態と同様に、眼底の断層像に基づいてドルーゼンを効果的に検出することが可能である。なお、この実施形態に係る眼底解析方法(又は装置の制御方法)は、上記実施形態の眼底解析装置1が実行可能な任意の処理を含んでいてもよい。 According to such a fundus analysis method, drusen can be detected effectively based on a tomographic image of the fundus as in the above embodiment. Note that the fundus analysis method (or device control method) according to this embodiment may include any process that can be executed by the fundus analysis device 1 of the above embodiment.
 この発明に係る他の眼底解析方法について説明する。この眼底解析方法は、眼底の層構造を描写する3次元断層像を解析するものであり、次のようなステップを含んで構成される。 Another fundus analysis method according to the present invention will be described. This fundus analysis method analyzes a three-dimensional tomographic image that describes the layer structure of the fundus, and includes the following steps.
 (第1のステップ)
 3次元断層像の画素の画素値に基づいて、網膜色素上皮層に相当する当該断層像中の層領域を特定する。
 (第2のステップ)
 層領域の形状に基づく近似曲面を求める。
 (第3のステップ)
 近似曲面が層領域よりも眼底の奥行方向に位置し、かつ近似曲面と層領域との間の奥行方向における距離が所定閾値以上である突出領域を特定する。
 (第4のステップ)
 突出領域を含み、かつ近似曲面及び層領域に囲まれた連結領域を特定する。
 (第5のステップ)
 連結領域の形態を表す形態情報を生成する。
(First step)
A layer region in the tomographic image corresponding to the retinal pigment epithelium layer is specified based on the pixel value of the pixel of the three-dimensional tomographic image.
(Second step)
An approximate curved surface based on the shape of the layer region is obtained.
(Third step)
A protruding region is specified in which the approximate curved surface lies deeper than the layer region in the depth direction of the fundus and the distance in the depth direction between the approximate curved surface and the layer region is equal to or greater than a predetermined threshold.
(Fourth step)
A connected area including the protruding area and surrounded by the approximate curved surface and the layer area is specified.
(Fifth step)
Form information representing the form of the connected area is generated.
 このような眼底解析方法によれば、上記実施形態と同様に、眼底の3次元断層像に基づいてドルーゼンを効果的に検出することが可能である。なお、この実施形態に係る眼底解析方法(又は装置の制御方法)は、上記実施形態の眼底解析装置1が実行可能な任意の処理を含んでいてもよい。 According to such a fundus analysis method, drusen can be detected effectively based on a three-dimensional tomographic image of the fundus as in the above embodiment. Note that the fundus analysis method (or device control method) according to this embodiment may include any process that can be executed by the fundus analysis device 1 of the above embodiment.
1 眼底解析装置
2 眼底カメラユニット
41 光路長変更部
42 ガルバノスキャナ
100 OCTユニット
101 光源ユニット
115 CCDイメージセンサ
200 演算制御ユニット
210 制御部
211 主制御部
212 記憶部
220 画像形成部
230 画像処理部
231 層領域特定部
232 近似曲線演算部
232a 特徴部位特定部
232b 特徴部位補間部
232c 自由曲線演算部
232d 補正部
232e 直線部位特定部
232f 第1変形部
232g 曲線部位特定部
232h 第2変形部
232i 突出判定部
232j 第3変形部
233 突出領域特定部
233a 距離算出部
233b 距離判断部
233c 画像領域特定部
234 連結領域特定部
235 形態情報生成部
235a 乳頭領域判定部
235b 分布画像形成部
235c カウント部
235d サイズ算出部
240A 表示部
240B 操作部
300 層領域
310、320、330、340 特徴部位
400 自由曲線
400a 内挿曲線
410、420、430 制御点
500 推定曲線
500a 外挿曲線
510 補正線分
600 突出領域
700 連結領域
800 内境界膜領域
900 分布画像
E 被検眼
Ef 眼底
LS 信号光
LR 参照光
LC 干渉光
G 断層像
F 断層像のフレーム
DESCRIPTION OF SYMBOLS 1 Fundus analysis apparatus 2 Fundus camera unit 41 Optical path length changing unit 42 Galvano scanner 100 OCT unit 101 Light source unit 115 CCD image sensor 200 Arithmetic control unit 210 Control unit 211 Main control unit 212 Storage unit 220 Image forming unit 230 Image processing unit 231 Layer region specifying unit 232 Approximate curve calculation unit 232a Feature site specifying unit 232b Feature site interpolation unit 232c Free curve calculation unit 232d Correction unit 232e Straight site specifying unit 232f First deformation unit 232g Curved site specifying unit 232h Second deformation unit 232i Protrusion determination unit 232j Third deformation unit 233 Protruding region specifying unit 233a Distance calculation unit 233b Distance determination unit 233c Image region specifying unit 234 Connected region specifying unit 235 Form information generation unit 235a Papilla region determination unit 235b Distribution image forming unit 235c Counting unit 235d Size calculation unit 240A Display unit 240B Operation unit 300 Layer region 310, 320, 330, 340 Feature sites 400 Free curve 400a Interpolation curve 410, 420, 430 Control points 500 Estimated curve 500a Extrapolation curve 510 Correction line segment 600 Protruding region 700 Connected region 800 Inner limiting membrane region 900 Distribution image E Eye to be examined Ef Fundus LS Signal light LR Reference light LC Interference light G Tomographic image F Frame of tomographic image

Claims (23)

  1.  眼底の層構造を描写する断層像を記憶する記憶部と、
     前記断層像の画素の画素値に基づいて、網膜色素上皮層に相当する当該断層像中の層領域を特定する層領域特定部と、
     前記層領域の形状に基づく近似曲線を求める近似曲線演算部と、
     前記近似曲線が前記層領域よりも前記眼底の奥行方向に位置し、かつ前記奥行方向における前記近似曲線と前記層領域との間の距離が所定閾値以上である突出領域を特定する突出領域特定部と、
     前記突出領域を含み、かつ前記近似曲線及び前記層領域に囲まれた連結領域を特定する連結領域特定部と、
     前記連結領域の形態を表す形態情報を生成する形態情報生成部と
     を有する眼底解析装置。
    A storage unit for storing a tomographic image depicting the layer structure of the fundus;
    Based on the pixel value of the pixel of the tomographic image, a layer region specifying unit that specifies a layer region in the tomographic image corresponding to the retinal pigment epithelium layer;
    An approximate curve calculation unit for obtaining an approximate curve based on the shape of the layer region;
    A protruding region specifying unit that specifies a protruding region in which the approximate curve lies deeper than the layer region in the depth direction of the fundus and the distance between the approximate curve and the layer region in the depth direction is equal to or greater than a predetermined threshold;
    A connected region specifying unit that specifies the connected region that includes the protruding region and is surrounded by the approximate curve and the layer region;
    A fundus analysis apparatus comprising: a form information generation unit that generates form information representing the form of the connected region.
  2.  前記近似曲線演算部は、
     前記層領域の形状に基づいて、前記層領域中の複数の特徴部位を特定する特徴部位特定部を含み、
     前記複数の特徴部位に基づいて前記近似曲線を求める
     ことを特徴とする請求項1に記載の眼底解析装置。
    The approximate curve calculation unit includes:
    Based on the shape of the layer region, including a feature site identification unit that identifies a plurality of feature sites in the layer region,
    The fundus analysis apparatus according to claim 1, wherein the approximate curve is obtained based on the plurality of characteristic portions.
  3.  前記特徴部位特定部は、
     前記層領域の形状に基づいて前記奥行方向における前記層領域中の最深部位を特定して特徴部位とし、
     前記最深部位を通過しかつ前記層領域に接する直線を求め、
     前記層領域と前記直線との接点を更なる特徴部位とする
     ことを特徴とする請求項2に記載の眼底解析装置。
    The characteristic site specifying part is:
    Identify the deepest part in the layer area in the depth direction based on the shape of the layer area as a characteristic part,
    Find a straight line that passes through the deepest part and touches the layer region,
    The fundus analysis apparatus according to claim 2, wherein a contact point between the layer region and the straight line is a further characteristic part.
  4.  前記特徴部位特定部は、前記最深部位を通過する直線を前記最深部位を中心として回転させていくことにより接点を順次に特定する
     ことを特徴とする請求項3に記載の眼底解析装置。
    The fundus analysis apparatus according to claim 3, wherein the characteristic part specifying unit sequentially specifies a contact point by rotating a straight line passing through the deepest part around the deepest part.
  5.  前記特徴部位特定部は、
     前記最深部位を通過する直線を前記最深部位を中心として回転させて接点を特定し、
     特定された接点を通過する直線を当該接点を中心として回転させて更なる接点を特定する
     ことを特徴とする請求項3に記載の眼底解析装置。
    The characteristic site specifying part is:
    Rotate a straight line passing through the deepest part around the deepest part to identify a contact point,
    The fundus analysis apparatus according to claim 3, wherein a further contact is specified by rotating a straight line passing through the specified contact around the contact.
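The tangent-rotation search of claims 3 to 5 can be sketched as follows for a 1-D depth profile (larger value = deeper). Rotating a line about the current contact until it next touches the layer is equivalent to walking the deep-side convex envelope, which this assumed `feature_points` helper does by choosing, at each step, the neighbour that keeps the supporting line as deep as possible.

```python
def feature_points(depth):
    """Sketch of claims 3-5: start from the deepest point of the layer
    profile and walk a supporting line along the deep side; each new
    contact becomes the next feature site."""
    n = len(depth)
    x0 = max(range(n), key=lambda x: depth[x])  # deepest point (claim 3)
    feats = [x0]
    # Rotate the line rightwards: the next contact maximises the slope,
    # keeping every intermediate layer point on the shallow side.
    cur = x0
    while cur < n - 1:
        nxt = max(range(cur + 1, n),
                  key=lambda x: (depth[x] - depth[cur]) / (x - cur))
        feats.append(nxt)
        cur = nxt
    # Mirror the walk to the left of the deepest point.
    cur = x0
    while cur > 0:
        nxt = min(range(cur),
                  key=lambda x: (depth[cur] - depth[x]) / (cur - x))
        feats.insert(0, nxt)
        cur = nxt
    return feats
```

For a profile with two elevations, the returned indices skip the elevated columns and retain only the deep baseline contacts, which is what the later spline fit needs.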
  6.  前記近似曲線演算部は、隣接する2つの特徴部位の間の距離を求め、当該距離が所定値以上である場合に当該2つの特徴部位の間に新たな特徴部位を追加する特徴部位補間部を含む
     ことを特徴とする請求項2~請求項5のいずれか一項に記載の眼底解析装置。
    The fundus analysis apparatus according to any one of claims 2 to 5, wherein the approximate curve calculation unit includes a feature site interpolation unit that obtains the distance between two adjacent feature sites and adds a new feature site between the two when the distance is equal to or greater than a predetermined value.
  7.  前記近似曲線演算部は、前記複数の特徴部位に基づく自由曲線を求め、前記自由曲線に基づいて前記近似曲線を求める
     ことを特徴とする請求項2~請求項6のいずれか一項に記載の眼底解析装置。
    The fundus analysis apparatus according to any one of claims 2 to 6, wherein the approximate curve calculation unit obtains a free curve based on the plurality of feature sites and obtains the approximate curve based on the free curve.
  8.  前記近似曲線演算部は、前記自由曲線としてスプライン曲線を求める
     ことを特徴とする請求項7に記載の眼底解析装置。
    The fundus analysis apparatus according to claim 7, wherein the approximate curve calculation unit obtains a spline curve as the free curve.
  9.  前記近似曲線演算部は、前記スプライン曲線として3次スプライン曲線を求める
     ことを特徴とする請求項8に記載の眼底解析装置。
    The fundus analysis apparatus according to claim 8, wherein the approximate curve calculation unit obtains a cubic spline curve as the spline curve.
  10.  前記近似曲線演算部は、
     前記層領域において実質的に直線的な部位を特定する直線部位特定部と、
     前記自由曲線が当該直線部位よりも前記奥行方向に位置する場合に、前記自由曲線の当該部位を当該直線部位の位置に合わせるように前記自由曲線を変形する第1変形部と
     を含み、
     前記第1変形部による変形結果に基づいて前記近似曲線を求める
     ことを特徴とする請求項7~請求項9のいずれか一項に記載の眼底解析装置。
    The approximate curve calculation unit includes:
    A linear part specifying part for specifying a substantially linear part in the layer region;
    A first deformation unit that, when the free curve lies deeper than the straight portion in the depth direction, deforms the free curve so as to align that part of the free curve with the position of the straight portion; and
    The fundus analysis apparatus according to any one of claims 7 to 9, wherein the approximate curve is obtained based on a deformation result by the first deformation unit.
  11.  前記近似曲線演算部は、
     前記自由曲線において前記層領域よりも前記奥行方向の逆方向に位置する部位を特定する曲線部位特定部と、
     当該特定部位を前記層領域の位置に合わせるように前記自由曲線を変形する第2変形部と
     を含み、
     前記第2変形部による変形結果に基づいて前記近似曲線を求める
     ことを特徴とする請求項7~請求項10のいずれか一項に記載の眼底解析装置。
    The approximate curve calculation unit includes:
    A curved site specifying unit that specifies a portion of the free curve located in the direction opposite to the depth direction relative to the layer region; and
    A second deforming part that deforms the free curve so as to align the specific part with the position of the layer region,
    The fundus analysis apparatus according to any one of claims 7 to 10, wherein the approximate curve is obtained based on a deformation result by the second deformation unit.
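The second deformation of claim 11 amounts to snapping any part of the free curve that has drifted to the shallow side back onto the layer region. A minimal sketch, assuming per-column depth arrays with larger values meaning deeper:

```python
def clamp_to_layer(curve, layer):
    """Sketch of claim 11's second deformation: wherever the free curve
    lies on the shallow side of the layer region (smaller depth value),
    snap it back onto the layer."""
    return [max(c, l) for c, l in zip(curve, layer)]
```

After this step the approximate curve never lies shallower than the segmented layer, so the protruding-region distances are never negative.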
  12.  前記近似曲線演算部は、
     前記断層像のフレームの端部近傍において前記層領域が前記奥行方向の逆方向に突出しているか判定する突出判定部と、
     前記層領域が突出していると判定された場合に、当該突出部位に対応する前記自由曲線の部位を、前記フレームの中央側からの前記自由曲線の延長線に置換することで、前記自由曲線を変形する第3変形部と
     を含み、
     前記第3変形部による変形結果に基づいて前記近似曲線を求める
     ことを特徴とする請求項7~請求項11のいずれか一項に記載の眼底解析装置。
    The approximate curve calculation unit includes:
    A protrusion determination unit that determines whether the layer region protrudes in the direction opposite to the depth direction in the vicinity of the end of the frame of the tomographic image;
    A third deformation unit that, when it is determined that the layer region protrudes, deforms the free curve by replacing the portion of the free curve corresponding to the protruding portion with an extension of the free curve from the center side of the frame; and
    The fundus analysis apparatus according to any one of claims 7 to 11, wherein the approximate curve is obtained based on a deformation result by the third deformation unit.
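The third deformation of claim 12 can be sketched as follows for the right-hand frame edge; `margin` (the assumed width of the edge zone) and `rise_threshold` are parameters invented for this sketch, and a real implementation would treat both edges.

```python
def fix_edge(curve, layer, margin, rise_threshold):
    """Sketch of claim 12's third deformation: if the layer region
    protrudes toward the shallow side (smaller depth) near the right
    frame edge, replace the edge portion of the curve with the straight
    extension of the curve's interior trend."""
    out = list(curve)
    n = len(out)
    interior = n - margin  # first column of the edge zone (assumption)
    # Does the layer protrude toward the shallow side near the edge?
    if layer[interior - 1] - min(layer[interior:]) >= rise_threshold:
        # Extend the last interior slope of the curve across the edge.
        slope = out[interior - 1] - out[interior - 2]
        for i in range(interior, n):
            out[i] = out[interior - 1] + slope * (i - interior + 1)
    return out
```

When the layer near the edge stays flat, the curve is returned unchanged; only a detected shallow-side protrusion triggers the replacement.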
  13.  前記突出領域特定部は、
     前記奥行方向における前記近似曲線上の各点と前記層領域との間の距離を算出し、
     算出された距離が前記所定閾値以上であるか判断し、
     前記所定閾値以上であると判断された前記近似曲線上の点の集合と前記層領域とに挟まれた画像領域を特定して前記突出領域とする
     ことを特徴とする請求項1~請求項12のいずれか一項に記載の眼底解析装置。
    The protruding region specifying part is:
    Calculating the distance between each point on the approximate curve in the depth direction and the layer region;
    Determining whether the calculated distance is greater than or equal to the predetermined threshold;
    The fundus analysis apparatus according to any one of claims 1 to 12, wherein the image region sandwiched between the layer region and the set of points on the approximate curve for which the distance is determined to be equal to or greater than the predetermined threshold is specified as the protruding region.
  14.  前記形態情報生成部は、
     前記断層像を解析して、前記眼底の視神経乳頭に相当する乳頭領域が当該断層像に含まれるか判定する乳頭領域判定部を含み、
     前記乳頭領域が含まれると判定された場合に、前記連結領域特定部により特定された前記連結領域のうち当該乳頭領域の近傍に位置する連結領域を除外して前記形態情報を生成する
     ことを特徴とする請求項1~請求項13のいずれか一項に記載の眼底解析装置。
    The fundus analysis apparatus according to any one of claims 1 to 13, wherein the form information generation unit includes a papilla region determination unit that analyzes the tomographic image and determines whether a papilla region corresponding to the optic disc of the fundus is included in the tomographic image, and, when it is determined that the papilla region is included, generates the form information by excluding connected regions located in the vicinity of the papilla region from among the connected regions specified by the connected region specifying unit.
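A minimal sketch of the exclusion in claim 14, assuming connected regions and the detected optic-disc area are given as column spans and `margin` defines the neighbourhood; these representations are assumptions of the sketch.

```python
def exclude_near_papilla(regions, papilla_span, margin):
    """Sketch of claim 14: drop connected regions lying within `margin`
    columns of the detected papilla span before generating form info."""
    lo, hi = papilla_span[0] - margin, papilla_span[1] + margin
    # Keep only regions entirely outside the widened papilla span.
    return [r for r in regions if r[1] < lo or r[0] > hi]
```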
  15.  前記乳頭領域判定部は、
     前記断層像を解析して前記眼底の所定の特徴層に相当する特徴層領域を特定し、
     特定された特徴層領域の形状に基づいて前記乳頭領域が含まれるか判定する
     ことを特徴とする請求項14に記載の眼底解析装置。
    The fundus analysis apparatus according to claim 14, wherein the papilla region determination unit analyzes the tomographic image to identify a feature layer region corresponding to a predetermined feature layer of the fundus, and determines whether the papilla region is included based on the shape of the identified feature layer region.
  16.  前記所定の特徴層は、眼底において硝子体との境界をなす内境界膜であり、
     前記乳頭領域判定部は、
     前記特徴層領域において前記奥行方向に陥没している部位が存在するか判定し、
     当該陥没部位が存在すると判定された場合に前記乳頭領域が含まれると判定する
     ことを特徴とする請求項15に記載の眼底解析装置。
    The fundus analysis apparatus according to claim 15, wherein the predetermined feature layer is the inner limiting membrane that forms the boundary with the vitreous body at the fundus, and the papilla region determination unit determines whether a portion depressed in the depth direction exists in the feature layer region and determines that the papilla region is included when such a depressed portion is found.
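The depression test of claims 15 and 16 can be sketched as follows; using the profile median as the baseline is an assumption of this sketch, not the embodiment's method, and larger depth values again mean deeper.

```python
def has_papilla(ilm_depth, dip_threshold):
    """Sketch of claims 15-16: decide whether the inner limiting
    membrane profile contains a depression toward the depth direction,
    taken as evidence that the optic disc is in the frame."""
    baseline = sorted(ilm_depth)[len(ilm_depth) // 2]  # median baseline
    # A column dipping below the baseline by the threshold or more
    # counts as the depressed portion.
    return max(d - baseline for d in ilm_depth) >= dip_threshold
```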
  17.  前記乳頭領域判定部は、被検眼が左眼であるか右眼であるかを示す左右眼情報の入力を受けて、
     被検眼が左眼である場合、前記断層像のフレームの少なくとも左側端部近傍を解析することにより前記判定を行い、
     被検眼が右眼である場合、前記フレームの少なくとも右側端部近傍を解析することにより前記判定を行う
     ことを特徴とする請求項14~請求項16のいずれか一項に記載の眼底解析装置。
    The nipple region determination unit receives input of left and right eye information indicating whether the eye to be examined is the left eye or the right eye,
    When the eye to be examined is the left eye, the determination is performed by analyzing at least the left end vicinity of the frame of the tomographic image,
    The fundus analysis apparatus according to any one of claims 14 to 16, wherein when the eye to be examined is a right eye, the determination is performed by analyzing at least the vicinity of the right end of the frame.
  18.  光源からの光を信号光と参照光とに分割し、被検眼の眼底を経由した信号光と参照光路を経由した参照光とを重畳させて干渉光を生成して検出する光学系と、
     干渉光の検出結果に基づいて眼底の断層像を形成する画像形成部と
     を更に備え、
     前記記憶部は、前記画像形成部により形成された断層像を記憶する
     ことを特徴とする請求項1~請求項17のいずれか一項に記載の眼底解析装置。
    An optical system that generates and detects interference light by dividing light from a light source into signal light and reference light, and superimposing signal light passing through the fundus of the eye to be examined and reference light passing through the reference optical path;
    An image forming unit that forms a tomographic image of the fundus based on the detection result of the interference light,
    The fundus analysis apparatus according to any one of claims 1 to 17, wherein the storage unit stores a tomographic image formed by the image forming unit.
  19.  眼底の層構造を描写する3次元断層像を記憶する記憶部と、
     前記3次元断層像の画素の画素値に基づいて、網膜色素上皮層に相当する当該断層像中の層領域を特定する層領域特定部と、
     前記層領域の形状に基づく近似曲面を求める近似曲面演算部と、
     前記近似曲面が前記層領域よりも前記眼底の奥行方向に位置し、かつ前記奥行方向における前記近似曲面と前記層領域との間の距離が所定閾値以上である突出領域を特定する突出領域特定部と、
     前記突出領域を含み、かつ前記近似曲面及び前記層領域に囲まれた連結領域を特定する連結領域特定部と、
     前記連結領域の形態を表す形態情報を生成する形態情報生成部と
     を有する眼底解析装置。
    A storage unit for storing a three-dimensional tomographic image depicting the layer structure of the fundus;
    A layer region specifying unit for specifying a layer region in the tomographic image corresponding to the retinal pigment epithelium layer based on the pixel value of the pixel of the three-dimensional tomographic image;
    An approximate curved surface calculation unit for obtaining an approximate curved surface based on the shape of the layer region;
    A protruding region specifying unit that specifies a protruding region in which the approximate curved surface lies deeper than the layer region in the depth direction of the fundus and the distance between the approximate curved surface and the layer region in the depth direction is equal to or greater than a predetermined threshold;
    A connected region specifying unit that specifies the connected region that includes the protruding region and is surrounded by the approximate curved surface and the layer region;
    A fundus analysis apparatus comprising: a form information generation unit that generates form information representing the form of the connected region.
  20.  眼底の層構造を描写する断層像を記憶する記憶部を有するコンピュータを、
     前記断層像の画素の画素値に基づいて、網膜色素上皮層に相当する当該断層像中の層領域を特定する層領域特定部、
     前記層領域の形状に基づく近似曲線を求める近似曲線演算部、
     前記近似曲線が前記層領域よりも前記眼底の奥行方向に位置し、かつ前記近似曲線と前記層領域との間の前記奥行方向における距離が所定閾値以上である突出領域を特定する突出領域特定部、
     前記突出領域を含み、かつ前記近似曲線及び前記層領域に囲まれた連結領域を特定する連結領域特定部、及び、
     前記連結領域の形態を表す形態情報を生成する形態情報生成部
     として機能させる眼底解析プログラム。
    A computer having a storage unit for storing a tomographic image depicting the layer structure of the fundus;
    A layer region specifying unit for specifying a layer region in the tomographic image corresponding to the retinal pigment epithelium layer based on the pixel value of the pixel of the tomographic image;
    An approximate curve calculation unit for obtaining an approximate curve based on the shape of the layer region,
    A protruding region specifying unit that specifies a protruding region in which the approximate curve lies deeper than the layer region in the depth direction of the fundus and the distance in the depth direction between the approximate curve and the layer region is equal to or greater than a predetermined threshold;
    A connected region specifying unit that specifies the connected region that includes the protruding region and is surrounded by the approximate curve and the layer region; and
    A fundus analysis program that functions as a form information generation unit that generates form information representing the form of the connected region.
  21.  眼底の層構造を描写する3次元断層像を記憶する記憶部を有するコンピュータを、
     前記3次元断層像の画素の画素値に基づいて、網膜色素上皮層に相当する当該断層像中の層領域を特定する層領域特定部、
     前記層領域の形状に基づく近似曲面を求める近似曲面演算部、
     前記近似曲面が前記層領域よりも前記眼底の奥行方向に位置し、かつ前記奥行方向における前記近似曲面と前記層領域との間の距離が所定閾値以上である突出領域を特定する突出領域特定部、
     前記突出領域を含み、かつ前記近似曲面及び前記層領域に囲まれた連結領域を特定する連結領域特定部、及び、
     前記連結領域の形態を表す形態情報を生成する形態情報生成部
     として機能させる眼底解析プログラム。
    A computer having a storage unit for storing a three-dimensional tomographic image depicting a layer structure of the fundus;
    A layer region specifying unit that specifies a layer region in the tomographic image corresponding to the retinal pigment epithelium layer based on the pixel value of the pixel of the three-dimensional tomographic image;
    An approximate curved surface calculation unit for obtaining an approximate curved surface based on the shape of the layer region;
    A protruding region specifying unit that specifies a protruding region in which the approximate curved surface lies deeper than the layer region in the depth direction of the fundus and the distance between the approximate curved surface and the layer region in the depth direction is equal to or greater than a predetermined threshold;
    A connected region specifying part that includes the protruding region and specifies a connected region surrounded by the approximate curved surface and the layer region; and
    A fundus analysis program that functions as a form information generation unit that generates form information representing the form of the connected region.
  22.  眼底の層構造を描写する断層像を解析する眼底解析方法であって、
     前記断層像の画素の画素値に基づいて、網膜色素上皮層に相当する当該断層像中の層領域を特定するステップと、
     前記層領域の形状に基づく近似曲線を求めるステップと、
     前記近似曲線が前記層領域よりも前記眼底の奥行方向に位置し、かつ前記近似曲線と前記層領域との間の前記奥行方向における距離が所定閾値以上である突出領域を特定するステップと、
     前記突出領域を含み、かつ前記近似曲線及び前記層領域に囲まれた連結領域を特定するステップと、
     前記連結領域の形態を表す形態情報を生成するステップと
     を含む眼底解析方法。
    A fundus analysis method for analyzing a tomographic image depicting a layer structure of the fundus,
    Identifying a layer region in the tomographic image corresponding to the retinal pigment epithelial layer based on the pixel value of the pixel of the tomographic image;
    Obtaining an approximate curve based on the shape of the layer region;
    Identifying a protruding region in which the approximate curve lies deeper than the layer region in the depth direction of the fundus and the distance in the depth direction between the approximate curve and the layer region is equal to or greater than a predetermined threshold;
    Identifying a connected region that includes the protruding region and is surrounded by the approximate curve and the layer region;
    Generating form information representing the form of the connected region.
  23.  眼底の層構造を描写する3次元断層像を解析する眼底解析方法であって、
     前記3次元断層像の画素の画素値に基づいて、網膜色素上皮層に相当する当該断層像中の層領域を特定するステップと、
     前記層領域の形状に基づく近似曲面を求めるステップと、
     前記近似曲面が前記層領域よりも前記眼底の奥行方向に位置し、かつ前記奥行方向における前記近似曲面と前記層領域との間の距離が所定閾値以上である突出領域を特定するステップと、
     前記突出領域を含み、かつ前記近似曲面及び前記層領域に囲まれた連結領域を特定するステップと、
     前記連結領域の形態を表す形態情報を生成するステップと
     を含む眼底解析方法。
    A fundus analysis method for analyzing a three-dimensional tomographic image depicting a layer structure of the fundus,
    Identifying a layer region in the tomographic image corresponding to the retinal pigment epithelial layer based on the pixel values of the pixels of the three-dimensional tomographic image;
    Obtaining an approximate curved surface based on the shape of the layer region;
    Identifying a protruding region in which the approximate curved surface lies deeper than the layer region in the depth direction of the fundus and the distance between the approximate curved surface and the layer region in the depth direction is equal to or greater than a predetermined threshold;
    Identifying a connected region including the protruding region and surrounded by the approximate curved surface and the layer region;
    Generating form information representing the form of the connected region.
PCT/JP2013/063628 2012-07-30 2013-05-16 Fundus analysis device, fundus analysis program, and fundus analysis method WO2014020966A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2012168872A JP5996959B2 (en) 2012-07-30 2012-07-30 Fundus analyzer
JP2012-168872 2012-07-30

Publications (1)

Publication Number Publication Date
WO2014020966A1 true WO2014020966A1 (en) 2014-02-06

Family

ID=50027660

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2013/063628 WO2014020966A1 (en) 2012-07-30 2013-05-16 Fundus analysis device, fundus analysis program, and fundus analysis method

Country Status (2)

Country Link
JP (1) JP5996959B2 (en)
WO (1) WO2014020966A1 (en)


Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10143370B2 (en) * 2014-06-19 2018-12-04 Novartis Ag Ophthalmic imaging system with automatic retinal feature detection
JP6616673B2 (en) * 2015-11-27 2019-12-04 株式会社トプコン Corneal inspection device
JP7286422B2 (en) * 2019-06-11 2023-06-05 株式会社トプコン Ophthalmic information processing device, ophthalmic device, ophthalmic information processing method, and program
JP7387343B2 (en) * 2019-09-04 2023-11-28 キヤノン株式会社 Image processing device, image processing program

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009061203A (en) * 2007-09-10 2009-03-26 Univ Of Tokyo Fundus observation apparatus, and fundus image processing device and program
WO2011013315A1 (en) * 2009-07-30 2011-02-03 株式会社トプコン Fundus analysis device and fundus analysis method
JP2011224264A (en) * 2010-04-22 2011-11-10 Canon Inc Tomographic image observation apparatus, and method and program for controlling display of tomographic image
JP2012061337A (en) * 2011-12-26 2012-03-29 Canon Inc Image forming apparatus, method for controlling the same, and computer program
JP2012075938A (en) * 2012-01-06 2012-04-19 Canon Inc Image processor and method for controlling the same, and computer program


Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2018086157A1 (en) * 2016-11-09 2018-05-17 苏州微清医疗器械有限公司 Ultra-wide ocular fundus imaging system
US10575730B2 (en) 2016-11-09 2020-03-03 Suzhou Microclear Medical Instruments Co., Ltd. Ultra wide field fundus imaging system
CN110363782A (en) * 2019-06-13 2019-10-22 平安科技(深圳)有限公司 A kind of area recognizing method based on limb recognition algorithm, device and electronic equipment
CN110363782B (en) * 2019-06-13 2023-06-16 平安科技(深圳)有限公司 Region identification method and device based on edge identification algorithm and electronic equipment

Also Published As

Publication number Publication date
JP5996959B2 (en) 2016-09-21
JP2014023867A (en) 2014-02-06

Similar Documents

Publication Publication Date Title
JP5474435B2 (en) Fundus analysis apparatus and fundus analysis program
JP4971872B2 (en) Fundus observation apparatus and program for controlling the same
JP5061380B2 (en) Fundus observation apparatus, ophthalmologic image display apparatus, and program
JP5324839B2 (en) Optical image measuring device
JP6009935B2 (en) Ophthalmic equipment
JP5415902B2 (en) Ophthalmic observation device
JP5941761B2 (en) Ophthalmic photographing apparatus and ophthalmic image processing apparatus
JP5543171B2 (en) Optical image measuring device
JP5936254B2 (en) Fundus observation apparatus and fundus image analysis apparatus
JP2008206684A (en) Fundus oculi observation device, fundus oculi image processing device and program
JP2016041221A (en) Ophthalmological photographing device and control method thereof
JP5996959B2 (en) Fundus analyzer
JP6101475B2 (en) Ophthalmic observation device
JP5144579B2 (en) Ophthalmic observation device
JP6452977B2 (en) Ophthalmic imaging apparatus and control method thereof
JP6099782B2 (en) Ophthalmic imaging equipment
JP6158535B2 (en) Fundus analyzer
JP6503040B2 (en) Ophthalmic observation device
JP6557388B2 (en) Ophthalmic imaging equipment
JP6254729B2 (en) Ophthalmic imaging equipment
JP6106300B2 (en) Ophthalmic imaging equipment
JP6106299B2 (en) Ophthalmic photographing apparatus and ophthalmic image processing apparatus

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 13826122

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 13826122

Country of ref document: EP

Kind code of ref document: A1