WO2017090361A1 - Corneal examination device - Google Patents

Corneal examination device

Info

Publication number
WO2017090361A1
WO2017090361A1 (PCT/JP2016/081577)
Authority
WO
WIPO (PCT)
Prior art keywords
image
unit
corneal
cornea
light
Prior art date
Application number
PCT/JP2016/081577
Other languages
English (en)
Japanese (ja)
Inventor
釣 滋孝
秋葉 正博
Original Assignee
Topcon Corporation (株式会社トプコン)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Topcon Corporation (株式会社トプコン)
Publication of WO2017090361A1

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B3/00 Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B3/10 Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
    • A61B3/13 Ophthalmic microscopes

Definitions

  • the present invention relates to a cornea inspection device.
  • a specular microscope is known as an apparatus for inspecting the cornea (see, for example, Patent Document 1).
  • a specular microscope is used to determine the health of the corneal endothelium.
  • the corneal endothelium is a very thin sheet-like tissue with a structure in which hexagonal corneal endothelial cells are arranged like paving stones. Endothelial cells have no regenerative capacity; once they are damaged, the surrounding endothelial cells enlarge to fill in the damaged site. When the density of endothelial cells decreases, the cornea swells and its transparency can no longer be maintained.
  • the specular microscope projects illumination light onto the cornea from an oblique direction, detects the reflected light from the cornea with an image sensor, and analyzes the acquired image to determine the size and shape of the corneal endothelial cells.
  • the size of the corneal endothelial cell is determined, for example, by determining the density of the corneal endothelial cell.
  • the calculation of the density includes a process of counting the number of corneal endothelial cells depicted in the captured image and a process of statistically calculating the density from the obtained number and the area of the imaging range.
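The density calculation described above (count the cells in the captured image, then divide by the area of the imaging range) can be sketched as follows; the function name, the field size, and the example cell count are illustrative assumptions, not values from the patent:

```python
def endothelial_cell_density(cell_count: int, field_area_mm2: float) -> float:
    """Statistically estimate corneal endothelial cell density (cells/mm^2)
    from the number of cells counted in the captured image and the area
    of the imaging range."""
    if field_area_mm2 <= 0:
        raise ValueError("imaging area must be positive")
    return cell_count / field_area_mm2

# e.g. 650 cells counted in a 0.5 mm x 0.5 mm imaging range
density = endothelial_cell_density(650, 0.5 * 0.5)
print(density)  # 2600.0
```

The cell-counting step itself (detecting and counting cells in the image) is a separate image-analysis problem and is not shown here.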
  • the shape of the corneal endothelial cell is specified by, for example, determining the area of the corneal endothelial cell.
  • FF-OCT: full-field optical coherence tomography
  • Patent Document 3 discloses anterior ocular imaging using Fourier domain OCT (FD-OCT). Since FD-OCT does not require a mechanical scan in the depth direction as in FF-OCT, the scanning speed can be increased, and it is suitable for photographing a living eye with eye movement.
  • FD-OCT includes spectral domain OCT (SD-OCT), which performs interference measurement using a low-coherence light source and a spectrometer, and swept source OCT (SS-OCT), which performs interference measurement using a wavelength-tunable light source and a balanced photodiode.
  • SD-OCT: spectral domain OCT
  • SS-OCT: swept source OCT
  • the examination of the corneal endothelium using a specular microscope or FF-OCT has a problem that the range that can be photographed at one time is narrow.
  • therefore, a panoramic image is created by combining a plurality of images obtained through repeated imaging.
  • as a result, the examination time becomes long.
  • in FF-OCT, a device for removing the influence of reflection at the cornea surface is required.
  • FD-OCT has not been applied to corneal endothelium examination until now.
  • An object of the present invention is to provide a cornea inspection apparatus capable of quickly performing a wide range inspection of the cornea.
  • the corneal inspection apparatus of the embodiment includes a data collection unit and a processing unit.
  • the data collection unit includes an optical deflector, an interferometer, and a detector.
  • the optical deflector is configured to be capable of two-dimensionally deflecting measurement light projected onto the eye to be examined.
  • the interferometer is configured to cause the return light of the measurement light from the eye to be interfered with the reference light.
  • the detector is configured to detect interference light generated by the interferometer.
  • the data collection unit is configured to collect data of a three-dimensional region of the eye to be examined including at least a part of the cornea.
  • the processing unit calculates an evaluation value representing the state of the corneal endothelium by processing the data collected by the data collecting unit.
  • Schematic showing an example of the configuration of the cornea examination apparatus according to the embodiment.
  • Schematic for explaining an example of the operation of the cornea examination apparatus according to the embodiment.
  • Schematic showing an example of the configuration of the cornea examination apparatus according to a modification.
  • the corneal examination apparatus is an ophthalmologic apparatus having a function of performing optical coherence tomography (OCT) of the anterior segment (cornea).
  • OCT: optical coherence tomography
  • the type of OCT is not limited to the swept source OCT, and may be a spectral domain OCT.
  • any type of OCT is applicable to the embodiment as long as it includes an optical deflector capable of two-dimensionally deflecting the measurement light projected onto the eye to be examined and has a configuration capable of executing a three-dimensional scan for collecting data of a three-dimensional region of the anterior segment.
  • the cornea inspection device of the embodiment may or may not have a function of acquiring a photograph (digital photograph) of the eye to be examined such as a fundus camera.
  • the cornea examination apparatus of the embodiment may have, instead of the fundus camera, a function capable of photographing the surface of the eye to be examined from the front or from an oblique direction, like a slit lamp microscope, an anterior segment camera, or a surgical microscope.
  • the corneal examination apparatus 1 shown in FIG. 1 has a function of photographing the fundus Ef and performing OCT of the eye E, and a function of photographing the anterior segment Ea and performing OCT.
  • the cornea inspection device 1 includes a fundus camera unit 2, an OCT unit 100, and an arithmetic control unit 200.
  • the fundus camera unit 2 is provided with an optical system that is substantially the same as that of a conventional fundus camera.
  • the OCT unit 100 is provided with an optical system and a mechanism for performing OCT.
  • the arithmetic control unit 200 includes a processor.
  • a chin rest and a forehead support for supporting the face of the subject are provided at positions facing the fundus camera unit 2.
  • the "processor" means, for example, a circuit such as a CPU (Central Processing Unit), a GPU (Graphics Processing Unit), an ASIC (Application Specific Integrated Circuit), or a programmable logic device (e.g., an SPLD (Simple Programmable Logic Device), a CPLD (Complex Programmable Logic Device), or an FPGA (Field Programmable Gate Array)).
  • the processor implements the functions according to the embodiment by reading and executing a program stored in a storage circuit or a storage device.
  • the fundus camera unit 2 is provided with an optical system and mechanism for photographing the fundus oculi Ef of the eye E to be examined. Images obtained by photographing the fundus oculi Ef (called fundus images, fundus photographs, etc.) include observation images and photographed images. The observation image is obtained, for example, by moving image shooting using near infrared light. The captured image is, for example, a color image or monochrome image obtained using visible flash light, or a monochrome image obtained using near-infrared flash light.
  • the fundus camera unit 2 may be able to acquire a fluorescein fluorescence image, an indocyanine green fluorescence image, an autofluorescence image, or the like.
  • the fundus camera unit 2 includes an illumination optical system 10 and a photographing optical system 30.
  • the illumination optical system 10 irradiates the eye E with illumination light.
  • the imaging optical system 30 detects the return light of the illumination light from the eye E.
  • the measurement light from the OCT unit 100 is guided to the eye E through the optical path in the fundus camera unit 2, and the return light is guided to the OCT unit 100 through the same optical path.
  • the observation light source 11 of the illumination optical system 10 is, for example, a halogen lamp or an LED (Light Emitting Diode).
  • the light (observation illumination light) output from the observation light source 11 is reflected by the reflection mirror 12 having a curved reflection surface, passes through the condensing lens 13, and passes through the visible cut filter 14 to become near-infrared light. Further, the observation illumination light is once converged in the vicinity of the photographing light source 15, reflected by the mirror 16, and passes through the relay lenses 17 and 18, the diaphragm 19, and the relay lens 20.
  • the observation illumination light is reflected by the peripheral part of the perforated mirror 21 (the area around the hole), passes through the dichroic mirror 46, and is refracted by the objective lens 22 to illuminate the eye E (in particular, the fundus Ef).
  • the return light of the observation illumination light from the eye E is refracted by the objective lens 22, passes through the dichroic mirror 46, passes through the hole formed in the central region of the perforated mirror 21, and passes through the dichroic mirror 55.
  • it then travels through the photographing focusing lens 31 and is reflected by the mirror 32. Further, the return light passes through the half mirror 33A, is reflected by the dichroic mirror 33, and forms an image on the light receiving surface of the CCD image sensor 35 via the condenser lens.
  • the CCD image sensor 35 detects the return light at a predetermined frame rate, for example. Note that an observation image of the fundus oculi Ef is obtained when the photographing optical system 30 is focused on the fundus oculi Ef, and an anterior eye observation image is obtained when the focus is on the anterior eye segment.
  • the imaging light source 15 is a visible light source including, for example, a xenon lamp or an LED.
  • the light (imaging illumination light) output from the imaging light source 15 is applied to the fundus oculi Ef through the same path as the observation illumination light.
  • the return light of the imaging illumination light from the eye E is guided to the dichroic mirror 33 through the same path as the return light of the observation illumination light, passes through the dichroic mirror 33, is reflected by the mirror 36, and forms an image on the light receiving surface of the CCD image sensor 38 via the condenser lens 37.
  • the anterior eye lens 23 is inserted into the optical path when photographing the anterior segment of the eye E (in particular, for OCT), and moves the focus position of the light projected onto the eye E to the anterior segment or its vicinity.
  • the anterior eye lens 23 may be configured as an attachment that can be attached to the fundus camera unit 2, or may be configured to be inserted / retracted with respect to the optical path by an anterior eye lens moving mechanism 23A described later.
  • a fundus lens that is inserted into the optical path when photographing the fundus Ef (in particular, for OCT) may be provided.
  • the fundus lens may be an attachment, or may be configured to be inserted / retracted with respect to the optical path by a fundus lens moving mechanism (not shown).
  • the LCD 39 displays a fixation target for fixing the eye E to be examined.
  • part of the light beam (fixation light beam) output from the LCD 39 is reflected by the half mirror 33A, reflected by the mirror 32, passes through the photographing focusing lens 31 and the dichroic mirror 55, and then passes through the hole of the perforated mirror 21.
  • the fixation light beam that has passed through the hole of the perforated mirror 21 passes through the dichroic mirror 46, is refracted by the objective lens 22, and is projected onto the fundus Ef.
  • by changing the display position of the fixation target on the LCD 39, the fixation position of the eye E can be changed.
  • a matrix LED in which a plurality of LEDs are two-dimensionally arranged, or a combination of a light source and a variable aperture (liquid crystal aperture or the like) can be used as the fixation light output unit.
  • the fundus camera unit 2 is provided with an alignment optical system 50 and a focus optical system 60.
  • the alignment optical system 50 generates an alignment index used for alignment of the optical system with respect to the eye E.
  • the focus optical system 60 generates a split index used for focus adjustment with respect to the eye E.
  • Alignment light output from the LED 51 of the alignment optical system 50 is reflected by the dichroic mirror 55 via the apertures 52 and 53 and the relay lens 54, and passes through the hole of the perforated mirror 21.
  • the light that has passed through the hole of the perforated mirror 21 passes through the dichroic mirror 46 and is projected onto the eye E by the objective lens 22.
  • the cornea-reflected light of the alignment light passes through the objective lens 22, the dichroic mirror 46, and the hole of the perforated mirror 21; part of it passes through the dichroic mirror 55, travels through the photographing focusing lens 31, is reflected by the mirror 32, passes through the half mirror 33A, is reflected by the dichroic mirror 33, and is projected onto the light receiving surface of the CCD image sensor 35 via the condenser lens.
  • based on the received light image (alignment index image) from the CCD image sensor 35, manual alignment and auto alignment can be performed in the same manner as in the conventional art.
  • the focus optical system 60 is moved along the optical path (illumination optical path) of the illumination optical system 10 in conjunction with the movement of the imaging focusing lens 31 along the optical path (imaging optical path) of the imaging optical system 30.
  • the reflection rod 67 can be inserted into and removed from the illumination optical path.
  • the reflecting surface of the reflection rod 67 is obliquely disposed in the illumination optical path.
  • the focus light output from the LED 61 passes through the relay lens 62, is split into two light beams by the split target plate 63, passes through the two-hole aperture 64, is reflected by the mirror 65, is once imaged on the reflecting surface of the reflection rod 67 by the condenser lens 66, and is reflected. Further, the focus light passes through the relay lens 20, is reflected by the perforated mirror 21, passes through the dichroic mirror 46, is refracted by the objective lens 22, and is projected onto the fundus Ef.
  • the fundus-reflected light of the focus light is detected by the CCD image sensor 35 through the same path as the cornea-reflected light of the alignment light. Based on the received image (split index image) from the CCD image sensor 35, manual focusing and autofocusing can be performed in the same manner as in the conventional art.
  • the photographing optical system 30 includes diopter correction lenses 70 and 71.
  • the diopter correction lenses 70 and 71 can be selectively inserted into a photographing optical path between the perforated mirror 21 and the dichroic mirror 55.
  • the diopter correction lens 70 is a plus (+) lens for correcting high hyperopia, for example, a +20D (diopter) convex lens.
  • the diopter correction lens 71 is a minus (-) lens for correcting high myopia, for example, a -20D concave lens.
  • the diopter correction lenses 70 and 71 are mounted on, for example, a turret plate. The turret plate is formed with a hole for the case where none of the diopter correction lenses 70 and 71 is applied.
  • the dichroic mirror 46 combines the optical path for fundus imaging and the optical path for OCT.
  • the dichroic mirror 46 reflects light in a wavelength band used for OCT and transmits light for fundus photographing.
  • a collimator lens unit 40, an optical path length changing unit 41, an optical scanner 42, an OCT focusing lens 43, a mirror 44, and a relay lens 45 are provided in this order from the OCT unit 100 side.
  • the optical path length changing unit 41 is movable in the direction of the arrow shown in FIG. 1, and changes the optical path length of the optical path for OCT. This change in the optical path length is used for correcting the optical path length according to the axial length of the eye E or adjusting the interference state.
  • the optical path length changing unit 41 includes, for example, a corner cube and a mechanism for moving the corner cube.
  • the optical scanner 42 is disposed at a position optically conjugate with the pupil of the eye E to be examined.
  • the optical scanner 42 changes the traveling direction of the measurement light LS that passes through the optical path for OCT. Thereby, the eye E is scanned with the measurement light LS.
  • the optical scanner 42 can deflect the measurement light LS in an arbitrary direction on the xy plane, and includes, for example, a galvanometer mirror that deflects the measurement light LS in the x direction and a galvanometer mirror that deflects the measurement light LS in the y direction.
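The two-galvanometer arrangement just described amounts to a fast x sweep nested inside a slow y step. A minimal sketch of the resulting raster of deflection targets follows; the grid size and the normalized deflection ranges are illustrative assumptions, not parameters from the patent:

```python
def raster_scan_points(n_x, n_y, x_range=(-1.0, 1.0), y_range=(-1.0, 1.0)):
    """Yield (x, y) deflection targets for a raster scan: the x galvanometer
    sweeps fast along each scan line, the y galvanometer steps slowly
    between lines."""
    if n_x < 2 or n_y < 2:
        raise ValueError("need at least a 2 x 2 grid")
    xs = [x_range[0] + i * (x_range[1] - x_range[0]) / (n_x - 1) for i in range(n_x)]
    ys = [y_range[0] + j * (y_range[1] - y_range[0]) / (n_y - 1) for j in range(n_y)]
    for y in ys:          # slow axis: one step per scan line
        for x in xs:      # fast axis: one A-line per point
            yield (x, y)

pts = list(raster_scan_points(4, 3))
print(len(pts))          # 12 A-line positions
print(pts[0], pts[-1])   # (-1.0, -1.0) (1.0, 1.0)
```

Each yielded point corresponds to one A-line acquisition; a full pass over the grid produces the data for one three-dimensional scan.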
  • the optical scanner 42 is an example of an optical deflector (described above) capable of two-dimensionally deflecting the measurement light LS projected on the eye to be examined.
  • the optical scanner 42 is optically conjugate with the fundus oculi Ef, and can scan the anterior eye part Ea or the xy plane in the vicinity thereof with the measurement light LS.
  • the OCT unit 100 is provided with an optical system for performing OCT of the eye E.
  • the configuration of this optical system is the same as that of a conventional swept source OCT. That is, this optical system includes an interference optical system that splits the light from a wavelength-swept (wavelength-scanning) light source into measurement light and reference light, generates interference light by causing the return light of the measurement light from the eye E to interfere with the reference light that has traveled via the reference optical path, and detects the interference light.
  • a detection result (detection signal) obtained by the interference optical system is a signal indicating the spectrum of the interference light, and is sent to the arithmetic control unit 200.
  • the light source unit 101 includes a wavelength swept type (wavelength scanning type) light source that changes the wavelength of the emitted light at high speed, like a general swept source OCT.
  • the wavelength sweep type light source is, for example, a near infrared laser light source.
  • the light L0 output from the light source unit 101 is guided to the polarization controller 103 by the optical fiber 102 and its polarization state is adjusted. Further, the light L0 is guided to the fiber coupler 105 by the optical fiber 104 and is divided into the measurement light LS and the reference light LR.
  • the reference light LR is guided to the collimator 111 by the optical fiber 110, converted into a parallel light beam, and guided to the corner cube 114 via the optical path length correction member 112 and the dispersion compensation member 113.
  • the optical path length correction member 112 acts to match the optical path length of the reference light LR and the optical path length of the measurement light LS.
  • the dispersion compensation member 113 acts to match the dispersion characteristics between the reference light LR and the measurement light LS.
  • the corner cube 114 turns the traveling direction of the incident reference light LR in the reverse direction.
  • the incident direction and the emitting direction of the reference light LR with respect to the corner cube 114 are parallel to each other.
  • the corner cube 114 is movable in the incident direction of the reference light LR, and thereby the optical path length of the reference light LR is changed.
  • in this embodiment, both the optical path length changing unit 41 for changing the length of the optical path of the measurement light LS (measurement optical path, measurement arm) and the corner cube 114 for changing the length of the optical path of the reference light LR (reference optical path, reference arm) are provided; however, only one of them may be provided. It is also possible to change the difference between the measurement optical path length and the reference optical path length using an optical member other than these.
  • the reference light LR that has passed through the corner cube 114 is converted from a parallel light beam into a converged light beam by the collimator 116 via the dispersion compensation member 113 and the optical path length correction member 112, and enters the optical fiber 117.
  • the reference light LR incident on the optical fiber 117 is guided to the polarization controller 118 and its polarization state is adjusted.
  • the reference light LR is guided to the attenuator 120 by the optical fiber 119 and its light amount is adjusted, and it is then guided to the fiber coupler 122 by the optical fiber 121.
  • the measurement light LS generated by the fiber coupler 105 is guided by the optical fiber 127 and converted into a parallel light beam by the collimator lens unit 40, and the optical path length changing unit 41, the optical scanner 42, the OCT focusing lens 43, and the mirror 44. Then, the light passes through the relay lens 45, is reflected by the dichroic mirror 46, is refracted by the objective lens 22 (and the anterior eye lens 23), and enters the eye E to be examined. The measurement light LS is scattered and reflected at various depth positions of the eye E. The return light of the measurement light LS from the eye E travels in the reverse direction on the same path as the forward path, is guided to the fiber coupler 105, and reaches the fiber coupler 122 via the optical fiber 128.
  • the fiber coupler 122 combines (causes to interfere) the measurement light LS incident through the optical fiber 128 and the reference light LR incident through the optical fiber 121 to generate interference light.
  • the fiber coupler 122 generates a pair of interference light LC by branching the interference light at a predetermined branching ratio (for example, 1: 1).
  • the pair of interference lights LC are guided to the detector 125 through optical fibers 123 and 124, respectively.
  • the detector 125 is, for example, a balanced photodiode (Balanced Photo Diode).
  • the balanced photodiode has a pair of photodetectors that respectively detect the pair of interference lights LC, and outputs a difference between detection results obtained by these.
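The differencing performed by the balanced photodiode can be sketched numerically: the pair of interference light beams carry antiphase fringe components on a common background, so subtracting the two detector signals cancels the common-mode part and doubles the fringe. All signal values below are illustrative:

```python
def balanced_output(i_plus, i_minus):
    """Difference of the two photodetector signals: the common-mode
    background cancels while the antiphase fringe components add."""
    return [a - b for a, b in zip(i_plus, i_minus)]

# Pair of interference light beams: fringe f and -f on a shared DC level
dc, fringe = 5.0, [0.3, -0.1, 0.4]
i_plus = [dc + f for f in fringe]
i_minus = [dc - f for f in fringe]
out = [round(v, 6) for v in balanced_output(i_plus, i_minus)]
print(out)  # [0.6, -0.2, 0.8]: twice the fringe, no DC
```

In practice this suppresses intensity noise of the light source as well as the DC offset, which is why balanced detection is standard in swept source OCT.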
  • the detector 125 sends the detection result (detection signal) to a DAQ (Data Acquisition System) 130.
  • DAQ: Data Acquisition System
  • the clock KC is supplied from the light source unit 101 to the DAQ 130.
  • the clock KC is generated in synchronization with the output timing of each wavelength swept within a predetermined wavelength range by the wavelength sweep type light source in the light source unit 101.
  • the light source unit 101, for example, optically delays one of the two beams obtained by splitting the light L0 of each output wavelength, and generates the clock KC based on the result of detecting the combined light of these two beams.
  • the DAQ 130 samples the detection signal input from the detector 125 based on the clock KC.
  • the DAQ 130 sends the sampling result of the detection signal from the detector 125 to the arithmetic control unit 200.
  • the arithmetic control unit 200 controls each part of the fundus camera unit 2, the display device 3, and the OCT unit 100.
  • the arithmetic control unit 200 executes various arithmetic processes.
  • the arithmetic control unit 200 performs signal processing such as a Fourier transform on the spectral distribution based on the detection result obtained by the detector 125 for each series of wavelength sweeps (i.e., for each A-line), thereby forming a reflection intensity profile for each A-line.
  • the arithmetic control unit 200 forms image data by imaging the reflection intensity profile of each A line.
  • the arithmetic processing for that is the same as the conventional swept source OCT.
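The per-A-line processing described above (spectral interferogram per wavelength sweep, Fourier transform, reflection intensity profile along depth) can be sketched with NumPy. The DC removal and the Hann window are common swept source OCT practice rather than details taken from this patent, and the simulated fringe is an illustrative input:

```python
import numpy as np

def a_line_profile(spectrum: np.ndarray) -> np.ndarray:
    """Convert one k-clock-sampled spectral interferogram into a
    reflection intensity profile along depth (one A-line)."""
    spectrum = spectrum - spectrum.mean()            # remove DC term
    windowed = spectrum * np.hanning(spectrum.size)  # suppress side lobes
    depth = np.fft.fft(windowed)
    return np.abs(depth[: spectrum.size // 2])       # keep positive depths

# Simulate a single reflector: a pure fringe at "depth" bin 40
k = np.arange(1024)
fringe = np.cos(2 * np.pi * 40 * k / 1024)
profile = a_line_profile(fringe)
print(int(profile.argmax()))  # 40: the reflector's depth bin
```

Imaging the profiles of all A-lines of one scan line side by side yields the B-scan image data mentioned later.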
  • the arithmetic control unit 200 includes, for example, a processor, a RAM (Random Access Memory), a ROM (Read Only Memory), a hard disk drive, a communication interface, and the like. Various computer programs are stored in a storage device such as a hard disk drive.
  • the arithmetic control unit 200 may include an operation device, an input device, a display device, and the like.
  • Control system: A configuration example of the control system of the cornea inspection device 1 is shown in FIG.
  • Control unit 210 controls each unit of the corneal examination apparatus 1.
  • Control unit 210 includes a processor.
  • the control unit 210 is provided with a main control unit 211 and a storage unit 212.
  • the main control unit 211 performs various controls.
  • the main control unit 211 controls the imaging focusing lens 31, the CCD image sensors 35 and 38, the LCD 39, the optical path length changing unit 41, the optical scanner 42, the OCT focusing lens 43, the focus optical system 60, the reflection rod 67, the light source unit 101, the reference driving unit 114A, the detector 125, the DAQ 130, and the like.
  • the reference driving unit 114A moves the corner cube 114 provided in the reference optical path. Thereby, the length of the reference optical path is changed.
  • the fundus camera unit 2 is provided with an anterior eye lens moving mechanism 23A.
  • the anterior eye lens moving mechanism 23A inserts the anterior eye lens 23 into the optical path between the objective lens 22 and the eye E and retracts the anterior eye lens 23 from the optical path.
  • the anterior eye lens moving mechanism 23A includes, for example, a pulse motor that generates driving force under the control of the main control unit 211, and a mechanism that transmits the generated driving force to the anterior eye lens 23.
  • the cornea inspection apparatus 1 may include an optical system moving mechanism (not shown).
  • the optical system moving mechanism moves the fundus camera unit 2 (or at least a part of the optical system stored therein), for example, three-dimensionally.
  • the optical system moving mechanism may be capable of moving the OCT unit 100 (or at least a part of the optical system stored therein) three-dimensionally.
  • the storage unit 212 stores various data. Examples of the data stored in the storage unit 212 include image data of an OCT image, image data of a fundus image, and eye information to be examined.
  • the eye information includes subject information such as patient ID and name, left / right eye identification information, electronic medical record information, and the like.
  • the image forming unit 220 forms image data of a cross-sectional image of the fundus oculi Ef or the anterior segment Ea based on the sampling result of the detection signal input from the DAQ 130.
  • This processing includes signal processing such as noise removal (noise reduction), filter processing, and FFT (Fast Fourier Transform) as in the case of the conventional swept source OCT.
  • the image data formed by the image forming unit 220 is a group of image data (A-scan image data) obtained by imaging the reflection intensity profiles of a plurality of A-lines (lines along the z direction) arranged along the scan line.
  • the image forming unit 220 includes, for example, at least one of a processor and a dedicated circuit board.
  • in this specification, image data and the "image" based on it may be treated as the same. Similarly, a part of the eye E to be examined and an image representing it may be treated as the same.
  • the data processing unit 230 performs image processing and analysis processing on the image formed by the image forming unit 220. For example, the data processing unit 230 executes correction processing such as image luminance correction and dispersion correction. Further, the data processing unit 230 performs image processing and analysis processing on an image (fundus image, anterior eye image, etc.) obtained by the fundus camera unit 2.
  • the data processing unit 230 includes, for example, at least one of a processor and a dedicated circuit board.
  • the data processing unit 230 includes a three-dimensional (3D) image forming unit 231, a partial image specifying unit 232, and an evaluation value calculating unit 233.
  • Three-dimensional image forming unit 231: When a three-dimensional scan (raster scan or the like) of the eye E is performed, the image forming unit 220 forms a two-dimensional cross-sectional image (B-scan image) corresponding to each scan line.
  • the three-dimensional image forming unit 231 forms a three-dimensional image based on these two-dimensional cross-sectional images.
  • a three-dimensional image means image data in which pixel positions are defined by a three-dimensional coordinate system. Examples of 3D images include stack data and volume data.
  • Stack data is image data obtained by three-dimensionally arranging a plurality of cross-sectional images corresponding to a plurality of scan lines based on the positional relationship of those scan lines. That is, stack data is image data obtained by expressing a plurality of cross-sectional images, originally defined in individual two-dimensional coordinate systems, in a single three-dimensional coordinate system (that is, by embedding them in one three-dimensional space).
  • Volume data (voxel data) is formed by executing known image processing such as interpolation processing for interpolating pixels between a plurality of cross-sectional images included in the stack data to form voxels.
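The interpolation step described here (filling in pixels between the cross-sectional images of the stack data to form voxels) can be sketched as linear interpolation between adjacent slices. The array shapes, the upsampling factor, and the toy slice values are illustrative assumptions:

```python
import numpy as np

def stack_to_volume(stack: np.ndarray, factor: int = 2) -> np.ndarray:
    """Insert linearly interpolated slices between each pair of adjacent
    cross-sectional images in stack data (shape: slices x H x W),
    producing denser voxel data."""
    slices = [stack[0]]
    for a, b in zip(stack[:-1], stack[1:]):
        for step in range(1, factor + 1):
            t = step / factor
            slices.append((1 - t) * a + t * b)  # linear interpolation
    return np.stack(slices)

stack = np.zeros((3, 4, 4))
stack[1] += 1.0
stack[2] += 2.0
vol = stack_to_volume(stack, factor=2)
print(vol.shape)     # (5, 4, 4)
print(vol[1, 0, 0])  # 0.5, interpolated between slice 0 and slice 1
```

Real voxelization would also resample within each slice and use the known scan-line spacing; this sketch shows only the between-slice interpolation idea.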
  • a pseudo three-dimensional image (rendered image) is formed by performing rendering processing (volume rendering, MIP (Maximum Intensity Projection), etc.) on the volume data.
  • a two-dimensional cross-sectional image can be formed from a group of pixels arranged in an arbitrary cross section in the volume data. This image processing is called multi-planar reconstruction (MPR).
  • MPR: multi-planar reconstruction
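For axis-aligned cuts, the MPR described above reduces to indexing the volume array (an oblique cross section would additionally require resampling along the cut plane). The volume contents and the chosen planes here are illustrative:

```python
import numpy as np

def mpr_slice(volume: np.ndarray, axis: int, index: int) -> np.ndarray:
    """Extract a two-dimensional cross-sectional image from volume data
    along one coordinate axis (a simple, axis-aligned MPR)."""
    return np.take(volume, index, axis=axis)

vol = np.arange(2 * 3 * 4).reshape(2, 3, 4)   # (z, y, x) voxels
print(mpr_slice(vol, axis=0, index=1).shape)  # (3, 4): an xy plane
print(mpr_slice(vol, axis=2, index=0).shape)  # (2, 3): a zy plane
```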
  • <Partial image specifying unit 232> When a three-dimensional image of the anterior segment Ea is formed by the three-dimensional image forming unit 231, the partial image specifying unit 232 analyzes the three-dimensional image to specify the partial image in it corresponding to the corneal endothelium (corneal endothelium image). Some examples of this process are described below.
  • the partial image specifying unit 232 performs processing for dividing the three-dimensional image of the anterior segment Ea into a plurality of partial images.
  • An example of such processing is segmentation.
  • the partial image specifying unit 232 divides the three-dimensional image of the anterior segment Ea into at least a corneal endothelial image and other partial images.
  • the segmentation in this example is a process for specifying the boundary between the corneal endothelium image and another partial image. As in conventional segmentation, this process is executed based on changes in the pixel values of the three-dimensional image of the anterior segment Ea. For example, segmentation is performed by identifying a characteristic value among the values of the pixel group arranged along each A-line and selecting the pixel having that value as a pixel of the corneal endothelium image (or of its boundary).
  • the partial image specifying unit 232 may obtain an approximate curve of the boundary between the corneal endothelial image and another partial image.
  • This approximate curve can be obtained by an arbitrary method, and examples thereof include a linear approximate curve, a logarithmic approximate curve, a polynomial approximate curve, a power approximate curve, an exponential approximate curve, and a moving average approximate curve.
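A polynomial approximate curve of a segmented boundary, one of the options listed above, can be sketched as follows; the boundary points are synthetic, and the use of `numpy.polyfit` is an illustrative choice rather than a method mandated by the text.

```python
import numpy as np

# Synthetic boundary points (x, z): a parabola-like corneal back surface
# plus noise, standing in for a segmented boundary. A degree-2 polynomial
# approximate curve is then fitted with numpy.polyfit.
rng = np.random.default_rng(1)
x = np.linspace(-3.0, 3.0, 50)              # lateral position (mm)
z_true = 0.08 * x**2 + 2.5                  # boundary depth (mm)
z_noisy = z_true + rng.normal(0.0, 0.01, x.size)

coeffs = np.polyfit(x, z_noisy, deg=2)      # polynomial approximate curve
z_fit = np.polyval(coeffs, x)
max_err = float(np.max(np.abs(z_fit - z_true)))
```

A linear, logarithmic, power, exponential, or moving-average approximation, as listed in the text, would substitute a different model at the fitting step.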
  • Alternatively, the corneal endothelium image may be specified in each of the plurality of two-dimensional cross-sectional images corresponding to the plurality of scanning lines of the three-dimensional scan, and a three-dimensional image (stack data, volume data, etc.) may be formed based on them. That is, the corneal endothelium image may be specified in a three-dimensional image after the three-dimensional image is formed from a plurality of two-dimensional cross-sectional images, or a three-dimensional image may be formed from a plurality of two-dimensional cross-sectional images after the corneal endothelium image is specified in each of them. Both of these processes are included in the present invention.
  • the partial image specified by the partial image specifying unit 232 is not limited to the corneal endothelium image.
  • it is possible to specify a partial image corresponding to the entire cornea, the corneal interfaces (corneal surface, corneal back surface), the corneal epithelium or its interface, Bowman's membrane or its interface, the corneal stroma or its interface, the Dua layer or its interface, Descemet's membrane or its interface, the crystalline lens or its boundary surface, the iris or its boundary surface, and the like.
  • Corneal endothelial cells are monolayer cells located at the deepest part of the cornea and arranged in a paving stone shape.
  • the shape of corneal endothelial cells is generally pentagonal to heptagonal, and most are hexagonal.
  • Corneal endothelial cells usually have a diameter of about 20 μm and an area of about 300 to 350 μm².
  • a horizontal cross-sectional image formed by MPR or the like is used.
  • the horizontal cross-sectional image is an image representing a cross section (xy cross section) orthogonal to the z direction.
  • the horizontal cross-sectional image is formed from the three-dimensional image by the data processing unit 230 (for example, the partial image specifying unit 232).
  • the partial image specifying unit 232 first extracts an image area (cell area) of a cell depicted in the horizontal cross-sectional image based on the pixel values of the pixels constituting the horizontal cross-sectional image.
  • in the horizontal cross-sectional image, a cell boundary region has high luminance, while a cell inner region has low luminance.
  • the partial image specifying unit 232 specifies the image region corresponding to the cell boundary regions by performing threshold processing based on these characteristics, thereby extracting the cell regions. Note that the process of extracting cell regions is not limited to this; any known technique for extracting a predetermined image region from an image can be applied, for example binarization or filter processing.
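The threshold processing just described can be sketched as below; the image is a toy grid pattern standing in for a cobblestone-like endothelial mosaic, and all luminance values and the threshold are illustrative assumptions.

```python
import numpy as np

# Toy horizontal cross-sectional image: bright grid lines stand in for
# high-luminance cell boundary regions and dark squares for the
# low-luminance cell interiors. All values are illustrative.
img = np.full((30, 30), 40.0)         # dark cell interiors
img[::6, :] = 200.0                   # bright horizontal boundaries
img[:, ::6] = 200.0                   # bright vertical boundaries

# Threshold processing (binarization): pixels above the threshold are
# taken as the cell boundary region, the rest as cell regions.
threshold = 120.0
boundary_mask = img > threshold
cell_mask = ~boundary_mask
```

On real OCT data the threshold would typically be chosen adaptively rather than fixed, but the binarization step itself is the same.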
  • the partial image specifying unit 232 analyzes the cell region extracted by the above processing, and generates cell information indicating the form (size or shape) of the cell region.
  • the size (diameter, circumference, area, etc.) of the cell region is calculated with reference to the measurement magnification of the image.
  • the shape of a cell region can be specified based on the arrangement of the pixels constituting the image region corresponding to the cell boundaries specified in the above processing. Alternatively, a wire model can be created by thinning the image region corresponding to the cell boundaries, and the cell shape (the shape in the xy cross section, etc.) can be obtained based on this wire model. Note that such a wire model generally includes the boundary regions of a plurality of cells.
  • the boundary of a single cell can then be specified by searching for a loop-shaped image region that does not contain any part of the wire model in its interior. The shape can be determined, for example, by calculating the differential coefficient at each position along the loop-shaped image region, or by pattern matching or the like.
  • the process for determining whether the corneal endothelium is depicted in a horizontal cross-sectional image determines, for example, whether the shape of each specified cell region is substantially hexagonal (for example, by image correlation with a hexagonal template image); when substantially hexagonal cell regions are present at or above a predetermined ratio, the horizontal cross-sectional image is determined to be an image of the corneal endothelium (an image including the corneal endothelium).
  • a corneal endothelium image can be specified by such processing.
  • In addition, it is possible to specify a corneal endothelium image by determining, by pattern matching or the like, whether the plurality of specified cell regions have the characteristic arrangement (paving-stone arrangement) of the corneal endothelium.
  • the partial image specifying unit 232 performs the above processing on each of a plurality of horizontal cross-sectional images having different depth positions formed from the three-dimensional image, so that each horizontal cross-sectional image is a corneal endothelium image ( Whether a corneal endothelium image is included) is determined.
  • a group of horizontal cross-sectional images determined to be corneal endothelium images corresponds to the whole corneal endothelium image depicted in the three-dimensional image.
  • To specify the corneal endothelium image (or another part), partial images of other characteristic parts can be referred to. For example, the fact that one boundary of the corneal endothelium image (the boundary surface on the crystalline lens side) corresponds to the corneal back surface can be used. As a specific example, an image corresponding to the corneal back surface (corneal back image) is first specified; this process specifies the boundary between the cornea and the anterior chamber (aqueous humor). The boundary of the corneal endothelium image on the corneal surface side can then be specified by analyzing the change in pixel values from the specified corneal back surface toward the corneal surface.
  • the evaluation value calculation unit 233 calculates an evaluation value representing the state of the corneal endothelium of the eye E by analyzing the corneal endothelium image specified by the partial image specifying unit 232.
  • the evaluation value may include a parameter that can be acquired by a specular microscope.
  • Such parameters include the size and shape of corneal endothelial cells.
  • the size of the corneal endothelial cell can be obtained, for example, by determining the density of the corneal endothelial cell.
  • the calculation of the density includes a process of counting the number of corneal endothelial cells depicted in the captured image and a process of statistically calculating the density from the obtained number and the area of the imaging range.
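The counting-and-dividing step of the density calculation can be sketched as a trivial helper; the function name and the sample numbers are illustrative assumptions, not values from the source.

```python
# A hypothetical helper for the density calculation described above:
# count the endothelial cells in the captured image and divide by the
# area of the imaging range. Function name and numbers are illustrative.
def endothelial_cell_density(cell_count, area_mm2):
    """Cells per square millimetre."""
    return cell_count / area_mm2

# e.g. 135 cells counted in a 0.05 mm^2 field:
density = endothelial_cell_density(135, 0.05)
```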
  • the shape of the corneal endothelial cell is specified by, for example, determining the area of the corneal endothelial cell.
  • the evaluation value calculation unit 233 executes image processing and calculation processing similar to those in the conventional art.
  • the evaluation value may include a parameter that is different from a parameter that can be acquired by the specular microscope.
  • For example, the volume, surface area, weight (estimated value), three-dimensional shape, etc. of the corneal endothelium image can be calculated as evaluation values.
  • the evaluation value calculation unit 233 of the present embodiment includes an image conversion unit 2331 and a calculation unit 2332.
  • the corneal surface and the corneal back are curved, and the same applies to the corneal endothelium.
  • data of a rectangular parallelepiped three-dimensional region including the cornea is collected, and a three-dimensional image formed based on the data also has a rectangular parallelepiped definition region.
  • the corneal endothelium image in such a rectangular parallelepiped three-dimensional image is a curved two-dimensional partial image, or a three-dimensional partial image having curved front and back surfaces.
  • the image conversion unit 2331 converts such a curved corneal endothelium image into a flat image (flat image). For example, the image conversion unit 2331 forms a flat image by deforming a curved corneal endothelium image. Alternatively, the image conversion unit 2331 forms a flat image by projecting a curved corneal endothelium image onto a predetermined plane (for example, an xy plane). The number of flat images to be formed is arbitrary.
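One plausible reading of "projecting a curved corneal endothelium image onto the xy plane" can be sketched as follows: given a detected depth map of the endothelial layer, sample the voxel at that depth for every lateral position. The volume, depth map, and all sizes are hypothetical stand-ins.

```python
import numpy as np

# Given a toy volume indexed (z, y, x) and a detected depth map
# z_surf(y, x) of the endothelial layer, sample the voxel at that depth
# for every (y, x) to obtain a flat image in the xy plane.
rng = np.random.default_rng(2)
vol = rng.random((40, 16, 16))                    # toy OCT volume
yy, xx = np.meshgrid(np.arange(16), np.arange(16), indexing="ij")
z_surf = (10 + 0.05 * ((yy - 8) ** 2 + (xx - 8) ** 2)).astype(int)
z_surf = np.clip(z_surf, 0, vol.shape[0] - 1)     # curved surface depth

flat = vol[z_surf, yy, xx]                        # flat image, shape (y, x)
```

Deforming (unrolling) the curved image rather than projecting it would preserve lateral distances better; the projection above is only the simpler of the two options the text mentions.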
  • the flat image is formed, for example, by deforming or projecting an arbitrary cross section (curved cross section) in the three-dimensional partial image.
  • This cross section is, for example, a boundary surface of the corneal endothelium image (at least one of its front surface and back surface), or an arbitrary surface located between the front surface and the back surface of the corneal endothelium image.
  • the designation of the cross section is executed by the image conversion unit 2331 or the user.
  • the image conversion unit 2331 can form a flat image for each of one or more cross sections of the three-dimensional image.
  • Further, the image conversion unit 2331 can perform processing for accumulating at least a part of the pixel values of the corneal endothelium image in a predetermined direction to obtain a two-dimensional image (projection image, shadowgram). The accumulation direction is, for example, the z direction. The two-dimensional image thus formed can be used as the flat image.
  • a flat image can be formed by deforming or projecting the formed two-dimensional image.
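The shadowgram-style projection above reduces to a sum along the z axis; the sketch below uses a hypothetical partial image and illustrative sizes.

```python
import numpy as np

# Sketch of a projection image (shadowgram): pixel values of a toy
# corneal endothelium partial image, indexed (z, y, x), are accumulated
# in the z direction to obtain a two-dimensional image in the xy plane.
rng = np.random.default_rng(3)
endo = rng.random((8, 20, 20))          # hypothetical partial image

shadowgram = endo.sum(axis=0)           # accumulate along z -> (y, x)
projection_mean = endo.mean(axis=0)     # a normalized variant
```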
  • the calculation unit 2332 calculates an evaluation value representing the state of the corneal endothelium of the eye E by analyzing the flat image formed by the image conversion unit 2331.
  • the evaluation value includes at least one type.
  • the calculation unit 2332 executes processing based on the same image processing program or calculation program as that mounted on the specular microscope.
  • the calculation unit 2332 executes processing based on an image processing program or an arithmetic program provided in advance according to the evaluation value.
  • the processing for calculating the volume, surface area, and weight based on the image data may be the same as in conventional image analysis.
  • In calculating the weight, the weight per unit volume (standard value, statistical value, etc.) of the corneal endothelium or corneal endothelial cells is referred to.
  • the user interface 240 includes a display unit 241 and an operation unit 242.
  • the display unit 241 includes the display device 3.
  • the operation unit 242 includes various operation devices and input devices.
  • the user interface 240 may include a device such as a touch panel in which a display function and an operation function are integrated. Embodiments that do not include at least a portion of the user interface 240 can also be constructed.
  • the display device may be an external device connected to the corneal examination apparatus.
  • <Operation mode> The operation of the cornea inspection device 1 will be described. An example of the operation is shown in FIG.
  • OCT of the anterior segment Ea of the eye E is performed.
  • the anterior segment lens 23 is disposed in the measurement optical path.
  • a three-dimensional region including the cornea is scanned and data is collected.
  • This three-dimensional scan is assumed to be a raster scan in which a plurality of linear scanning lines are arranged in parallel to each other.
  • Each scanning line has, for example, a line segment shape extending in the x direction, and the plurality of scanning lines are arranged at equal intervals in the y direction.
  • the image forming unit 220 forms a plurality of B scan images corresponding to the plurality of scanning lines based on the data collected in step S1.
  • the B scan image of this example is, for example, a two-dimensional cross-sectional image representing an xz cross section.
  • the three-dimensional image forming unit 231 forms a three-dimensional image (stack data, volume data, etc.) based on the plurality of B scan images formed in step S2.
  • the partial image specifying unit 232 specifies the corneal endothelium image corresponding to the corneal endothelium of the eye E by analyzing the three-dimensional image formed in step S3.
  • the image conversion unit 2331 flattens the corneal endothelium image specified in step S4. Thereby, a flat image representing the corneal endothelium is formed.
  • the calculating unit 2332 calculates an evaluation value representing the state of the corneal endothelium of the eye E by analyzing the flat image formed in step S5.
  • the type of evaluation value to be calculated is set in advance.
  • the main control unit 211 causes the display unit 241 to display the flat image formed in step S5 and the evaluation value calculated in step S6.
  • FIG. 5A shows an example of a three-dimensional image formed by the three-dimensional image forming unit 231 in step S3.
  • the three-dimensional image V includes a corneal image GC, which is an image region corresponding to the cornea of the eye E.
  • in step S4, the partial image specifying unit 232 analyzes the three-dimensional image V to specify the corneal endothelium image corresponding to the corneal endothelium of the eye E.
  • the corneal endothelium image is located on the +z side of the corneal image GC.
  • the corneal endothelium image GE shown in FIG. 5B is an example of the partial region specified by the partial image specifying unit 232.
  • in step S5, the image conversion unit 2331 converts the corneal endothelium image GE to form a flat image defined by the xy plane.
  • FIG. 5C shows an example of such image conversion.
  • a flat image GF representing the corneal endothelium of the eye E is thereby formed.
  • in the flat image GF, as shown in FIG. 5D, a plurality of corneal endothelial cells arranged in a cobblestone shape are depicted.
  • an evaluation value representing the state of the corneal endothelium of the eye E is calculated (step S6).
  • in step S7, the main controller 211 displays the flat image GF formed in step S5 and the evaluation value calculated in step S6 on the display unit 241.
  • processing such as contour emphasis may be applied to the flat image GF.
  • This outline emphasis display may be, for example, an image representing the outline of a corneal endothelial cell.
  • the main controller 211 can display the contour image overlapped on the flat image GF.
  • FIG. 6 shows a part of the cornea inspection apparatus according to the first modification.
  • a data processing unit 230A shown in FIG. 6 is applied instead of the data processing unit 230 (see FIG. 3) of the above embodiment. That is, the cornea examination apparatus according to this modification includes the configuration shown in FIGS. 1 and 2, the configuration other than the data processing unit 230 shown in FIG. 3, and the data processing unit 230A shown in FIG.
  • the data processing unit 230A includes a three-dimensional image forming unit 231, a partial image specifying unit 232, and an evaluation value calculating unit 233, as in the above embodiment.
  • the evaluation value calculation unit 233 of the present modification may include an image conversion unit 2331 and a calculation unit 2332 as in the above embodiment, or may include other configurations.
  • the partial image specifying unit 232 analyzes the three-dimensional image formed by the three-dimensional image forming unit 231 to specify the corneal endothelium image (a partial image of the three-dimensional image) corresponding to the corneal endothelium of the eye E. Further, the partial image specifying unit 232 analyzes the three-dimensional image to specify at least one of the corneal surface image corresponding to the corneal surface of the eye E and the corneal back image corresponding to the corneal back surface in the three-dimensional image. Both the corneal surface image and the corneal back image are partial images of the three-dimensional image. The specification of the corneal surface image and that of the corneal back image may each be executed by image processing similar to the specification of the corneal endothelium image.
  • the data processing unit 230A is further provided with a shape information generation unit 234.
  • the shape information generation unit 234 generates shape information of at least one of the corneal surface image and the corneal back image specified by the partial image specifying unit 232.
  • the shape information may include, for example, at least one of the curvature and the curvature radius of the cornea of the eye E. Further, the shape information may include at least one of a curvature and a curvature radius at a plurality of positions of the cornea.
  • the curvature and the radius of curvature are calculated, for example, from an arbitrary measurement point in the B-scan image or the three-dimensional image and the shape of its neighboring region. This neighboring region is, for example, a region centered on the measurement point and having a predetermined radius.
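One way to obtain a radius of curvature from a measurement point's neighborhood is an algebraic least-squares circle fit; the Kasa-style fit below is an illustrative choice, not a method prescribed by the text, and the arc data are synthetic.

```python
import numpy as np

# Sketch: estimate the radius of curvature from boundary points (x, z)
# in a measurement point's neighborhood via an algebraic least-squares
# circle fit (Kasa method).
def fit_circle_radius(x, z):
    """Fit x^2 + z^2 + D*x + E*z + F = 0 and return the circle radius."""
    A = np.column_stack([x, z, np.ones_like(x)])
    b = -(x ** 2 + z ** 2)
    D, E, F = np.linalg.lstsq(A, b, rcond=None)[0]
    return float(np.sqrt(D ** 2 / 4 + E ** 2 / 4 - F))

# Synthetic neighborhood: an arc of a circle of radius 7.8 mm (a typical
# anterior corneal radius of curvature), apex at the origin.
theta = np.linspace(-0.3, 0.3, 25)
radius = fit_circle_radius(7.8 * np.sin(theta), 7.8 * (1.0 - np.cos(theta)))
```

The curvature is then simply the reciprocal of the fitted radius; repeating the fit at many measurement points yields a curvature map.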
  • image processing such as smoothing may be performed on the corneal surface image or the corneal back surface image before generating the shape information.
  • an evaluation value representing the state of the corneal endothelium of the eye E and shape information of at least one of the corneal surface and the corneal back surface can be obtained.
  • FIG. 7 shows a part of a cornea inspection device according to a second modification.
  • the data processing unit 230B illustrated in FIG. 7 is applied instead of the data processing unit 230 (see FIG. 3) of the above embodiment.
  • the data processing unit 230B includes a three-dimensional image forming unit 231, a partial image specifying unit 232, and an evaluation value calculating unit 233, as in the above embodiment.
  • in this modification, the OCT scan is performed on a three-dimensional region including, in addition to the cornea of the eye E, at least a part of the surface of the crystalline lens (the surface on the cornea side).
  • the partial image specifying unit 232 analyzes the three-dimensional image formed by the three-dimensional image forming unit 231 to specify the corneal endothelium image (a partial image of the three-dimensional image) corresponding to the corneal endothelium of the eye E. Further, the partial image specifying unit 232 analyzes the three-dimensional image to specify, in the three-dimensional image, the corneal back image corresponding to the back surface of the cornea and the lens surface image corresponding to the surface of the crystalline lens.
  • the specification of the corneal back surface image and the specification of the lens surface image may each be executed by image processing similar to the specification of the corneal endothelial image.
  • the data processor 230B is further provided with a distance calculator 235.
  • the distance calculation unit 235 calculates the distance between the corneal back image and the lens surface image specified by the partial image specifying unit 232. The distance is obtained, for example, by counting the number of pixels located on a line segment (measurement line segment) connecting an arbitrary pixel in the corneal back image and an arbitrary pixel in the lens surface image, and multiplying the number of pixels by a predetermined unit distance.
  • the measurement line segment is set parallel to the z direction, for example.
  • the measurement line segment may be set so as to pass through at least one of the pixel located at the center (vertex) of the cornea back surface and the pixel located at the center (vertex) of the crystalline lens surface.
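The count-and-multiply distance calculation above can be sketched as a trivial helper; the function name, pixel rows, and the 7 μm pixel pitch are illustrative assumptions.

```python
# Hypothetical sketch of the distance calculation: the number of pixels
# on a measurement line segment parallel to z, between the corneal back
# surface and the lens surface, multiplied by a unit distance per pixel.
def anterior_chamber_depth(z_cornea_back, z_lens_front, unit_distance_mm):
    """Distance in mm between two boundary pixels along the z axis."""
    n_pixels = abs(z_lens_front - z_cornea_back)
    return n_pixels * unit_distance_mm

# e.g. boundaries at pixel rows 120 and 520, 7 micrometres per pixel:
depth = anterior_chamber_depth(120, 520, 0.007)
```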
  • an evaluation value indicating the state of the corneal endothelium of the eye E and a distance (anterior chamber depth) between the cornea and the lens can be obtained.
  • FIG. 8 shows a part of a cornea inspection device according to a third modification.
  • Adaptive optics is applied to the cornea inspection device of this modification.
  • Adaptive optics is an aberration correction technique that detects the wavefront aberration caused by the measurement object and the medium, and gives the measurement light an aberration that cancels it.
  • in the eye, aberration occurs due to distortion of the surfaces of the cornea and the crystalline lens and nonuniformity of the refractive index distribution. Since such aberration factors vary among individuals, they need to be corrected individually.
  • Adaptive optics is generally realized by a wavefront sensor that detects the wavefront aberration, a wavefront deformer that deforms the wavefront, and a processor that controls the wavefront deformer so as to cancel the wavefront aberration detected by the wavefront sensor.
  • the optical system shown in FIG. 8 is applied in place of a part of the measurement arm of the optical system shown in FIG. 1 (the collimator lens unit 40, optical path length changing unit 41, optical scanner 42, OCT focusing lens 43, mirror 44, relay lens 45, and dichroic mirror 46).
  • the differences from the measurement arm of the above embodiment are that a beam splitter 47 and a wavefront sensor 48 are added, and that the mirror 44 is replaced with a deformable mirror 49.
  • the control unit 210 of this modification includes a wavefront control unit 213.
  • the beam splitter 47 forms an optical path branched from the measurement optical path. More specifically, the beam splitter 47 branches a part of the return light of the measurement light LS from the eye E from the measurement optical path.
  • the beam splitter 47 is, for example, a half mirror having an arbitrary division ratio.
  • a wavefront sensor 48 is provided in the optical path branched from the measurement optical path by the beam splitter 47.
  • the wavefront sensor 48 detects the wavefront aberration of the return light of the measurement light LS from the eye E.
  • a Shack-Hartmann sensor is used as the wavefront sensor 48.
  • the Shack-Hartmann sensor includes a microlens array arranged in a matrix and an area sensor (two-dimensional imaging device) arranged at a position separated from the microlens array by its focal length, and detects the amount of aberration from the displacements of the plurality of projected spots.
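The relation between spot displacement and local wavefront slope in a Shack-Hartmann sensor can be sketched as a one-line calculation; the function name, units, and values below are illustrative assumptions.

```python
import numpy as np

# Sketch of the Shack-Hartmann principle: the displacement of each
# projected spot from its reference position, divided by the lenslet
# focal length, gives the local wavefront slope for that sub-aperture.
def local_slopes(spot_xy, ref_xy, focal_length):
    """Per-sub-aperture wavefront slopes (dW/dx, dW/dy), in radians."""
    return (np.asarray(spot_xy) - np.asarray(ref_xy)) / focal_length

# e.g. one spot displaced by (2 um, -1 um) behind a 5 mm focal length
# lenslet (working in millimetres):
slopes = local_slopes([[0.002, -0.001]], [[0.0, 0.0]], 5.0)
```

A full reconstructor would integrate these per-lenslet slopes into a wavefront map before computing the mirror control amounts.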
  • the wavefront sensor 48 is not limited to the Shack-Hartmann sensor, and for example, a curvature sensor or a pyramid sensor can be used.
  • a detection signal from the wavefront sensor 48 is input to the wavefront controller 213.
  • the deformable mirror 49 is an optical device configured to change the shape of its light reflecting surface.
  • the deformable mirror 49 is disposed at a position optically conjugate with the optical scanner 42.
  • a face sheet deformable mirror is used as the deformable mirror 49.
  • the face sheet deformable mirror includes a flexible sheet type mirror and a plurality of actuators provided on the back side thereof.
  • each actuator is, for example, a piezoelectric element, or an actuator that operates by electrostatic or magnetic force. By controlling the plurality of actuators individually, the sheet-type mirror is locally deformed into concave or convex shapes.
  • the deformable mirror 49 is not limited to the face sheet deformable mirror, and for example, a bimorph deformable mirror or a MEMS deformable mirror can be used.
  • the deformable mirror 49 is controlled by the wavefront controller 213.
  • the wavefront control unit 213 changes the shape of the light reflecting surface of the deformable mirror 49 based on the detection signal from the wavefront sensor 48 (that is, the wavefront aberration of the return light of the measurement light LS).
  • this control includes a process of calculating, based on the detection signal from the wavefront sensor 48, the control amount of the deformable mirror 49 (for example, the operation amount of each actuator) for canceling the wavefront aberration of the return light, and a process of controlling the deformable mirror 49 (for example, each actuator) based on this control amount.
  • in this modification, both the measurement light LS directed toward the eye E and its return light pass through the deformable mirror 49.
  • the deformable mirror may be provided in at least one of the forward path and the return path. In general, the optical system can be configured so that at least one of the wavefronts of the measurement light and its return light is deformed by the deformable mirror.
  • the cornea inspection device includes a data collection unit and a processing unit.
  • the data collection unit includes an optical deflector, an interferometer, and a detector.
  • the optical deflector can deflect the measurement light projected on the eye to be examined two-dimensionally.
  • the optical scanner 42 of the above embodiment is an example.
  • the interferometer causes the return light of the measurement light from the eye to be interfered with the reference light.
  • in the above embodiment, the fiber coupler 105 (an optical element that splits the light from the light source unit 101 into the measurement light LS and the reference light LR), the measurement arm, and the reference arm constitute an example of an interferometer.
  • the detector detects the interference light generated by the interferometer.
  • the detector 125 (and DAQ 130) of the above embodiment is an example.
  • the data collection unit collects data of a three-dimensional region of the eye to be examined including at least a part of the cornea.
  • the processing unit calculates an evaluation value representing the state of the corneal endothelium by processing the data collected by the data collecting unit.
  • the combination of the image forming unit 220 and the data processing unit 230 in the above embodiment is an example.
  • the processing unit may include a three-dimensional image forming unit, a partial image specifying unit, and an evaluation value calculating unit.
  • the three-dimensional image forming unit forms a three-dimensional image representing the three-dimensional region of the eye to be examined based on the data collected by the data collecting unit.
  • the three-dimensional image forming unit 231 of the above embodiment is an example.
  • the partial image specifying unit specifies a partial image in the three-dimensional image corresponding to the corneal endothelium by analyzing the three-dimensional image formed by the three-dimensional image forming unit.
  • the partial image specifying unit 232 of the above embodiment is an example.
  • the evaluation value calculation unit calculates an evaluation value representing the state of the corneal endothelium by analyzing the partial image specified by the partial image specifying unit.
  • the evaluation value calculation unit 233 of the above embodiment is an example.
  • the evaluation value calculation unit may include an image conversion unit and a calculation unit.
  • the image conversion unit converts the partial image specified by the partial image specifying unit into a flat image.
  • the image conversion unit 2331 of the above embodiment is an example.
  • the calculation unit calculates the evaluation value by analyzing the flat image formed by the image conversion unit.
  • the calculation unit 2332 of the above embodiment is an example.
  • the state of the corneal endothelium can be evaluated based on data collected by a three-dimensional scan using SD-OCT or SS-OCT.
  • the range that can be scanned at once by SD-OCT and SS-OCT is sufficiently wider than the single imaging range by the specular microscope and the scan range by FF-OCT. Therefore, it is possible to perform a wider range of inspections more quickly than when using a specular microscope or FF-OCT.
  • the cornea inspection apparatus may include a display control unit that performs various display controls.
  • the display control unit causes the display unit to display a flat image formed by the image conversion unit.
  • the display control unit causes the display unit to display the evaluation value calculated by the evaluation value calculation unit.
  • the main control unit 211 of the above embodiment is an example of a display control unit.
  • the display means may be provided in the corneal examination apparatus (for example, the display unit 241) or a display device provided outside the corneal examination apparatus.
  • an image representing the corneal endothelium (flat image) and an evaluation value representing the state can be presented.
  • the output mode of the flat image and the evaluation value is not limited to the display output.
  • a flat image or an evaluation value can be transmitted to an external computer or storage device, recorded on a recording medium, or printed on a printing medium.
  • the cornea inspection device may be configured to inspect the shape of the cornea in addition to the evaluation of the corneal endothelium.
  • the partial image specifying unit analyzes the three-dimensional image formed by the three-dimensional image forming unit to specify at least one of the corneal surface image corresponding to the corneal surface and the corneal back image corresponding to the corneal back surface in the three-dimensional image.
  • the cornea inspection device includes a shape information generation unit that generates shape information of at least one of a cornea surface image and a cornea back surface image.
  • the shape information generation unit 234 of the first modification is an example.
  • the corneal examination apparatus may be configured to measure the anterior chamber depth in addition to the evaluation of the corneal endothelium.
  • the partial image specifying unit analyzes the three-dimensional image formed by the three-dimensional image forming unit and thereby identifies, within that image, the corneal back surface image corresponding to the back surface of the cornea and the lens front surface image corresponding to the front surface of the crystalline lens.
  • the cornea inspection device includes a distance calculation unit that calculates the distance between the cornea back surface image and the lens surface image.
  • the distance calculation unit 235 of the second modification is an example.
  • the measurable intraocular distance is not limited to the anterior chamber depth.
  • it can be configured to measure corneal thickness, lens thickness, axial length, and the like.
  • the measurable parameter is not limited to a one-dimensional scale (distance), and may be a two-dimensional scale (area, angle, etc.) or a three-dimensional scale (volume, solid angle, etc.).
  • the cornea inspection device may have an adaptive optics function.
  • a wavefront aberration detection unit and a wavefront deformation unit are provided in the data collection unit.
  • the wavefront aberration detector detects the wavefront aberration of the return light of the measurement light from the eye to be examined.
  • the wavefront sensor 48 of the above embodiment is an example.
  • the wavefront deforming unit deforms the wavefront of at least one of the measurement light and its return light, based on the wavefront aberration detected by the wavefront aberration detecting unit.
  • the combination of the deformable mirror 49 and the wavefront controller 213 in the above embodiment is an example.
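For illustration, the evaluation value mentioned above could be any of the standard specular-microscopy indices for the corneal endothelium: cell density, coefficient of variation of cell area (polymegethism), and hexagonal-cell ratio (pleomorphism). The sketch below assumes per-cell segmentation results are already available (cell areas in µm² and per-cell hexagonality flags); the function name and inputs are hypothetical, not taken from the patent:

```python
from statistics import mean, pstdev

def endothelium_metrics(cell_areas_um2, hexagonal_flags):
    """Compute standard corneal-endothelium evaluation values from
    per-cell measurements (areas in um^2, hexagonality booleans).
    Illustrative only; the patent does not specify the evaluation
    value calculation at this level of detail."""
    n = len(cell_areas_um2)
    avg_area = mean(cell_areas_um2)            # mean cell area (um^2)
    # Cell density in cells/mm^2 (1 mm^2 = 1e6 um^2)
    density = 1e6 / avg_area
    # Coefficient of variation of cell area, in percent
    cv = 100.0 * pstdev(cell_areas_um2) / avg_area
    # Hexagonality: fraction of six-sided cells, in percent
    hex_pct = 100.0 * sum(hexagonal_flags) / n
    return {"density_cells_per_mm2": density,
            "mean_area_um2": avg_area,
            "cv_percent": cv,
            "hex_percent": hex_pct}
```

In clinical practice, densities below roughly 500–1000 cells/mm² are associated with the corneal swelling described in the background section, which is why a density-style evaluation value is a natural output of the data processing unit.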

Landscapes

  • Life Sciences & Earth Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Biophysics (AREA)
  • Ophthalmology & Optometry (AREA)
  • Engineering & Computer Science (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Physics & Mathematics (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Eye Examination Apparatus (AREA)

Abstract

An embodiment of the present invention relates to a corneal examination device equipped with a data collection unit and a data processing unit. The data collection unit includes a light deflector, an interferometer, and a detector. The light deflector is configured to deflect, in two dimensions, light illuminating an eye under examination. The interferometer is configured to cause the measurement light returning from the examined eye to interfere with reference light. The detector is configured to detect the interference light produced by the interferometer. With these elements, the data collection unit is configured to collect data from a three-dimensional region of the examined eye that includes at least part of the cornea. The processing unit calculates an evaluation value expressing the state of the corneal endothelium by processing the data collected by the data collection unit.
PCT/JP2016/081577 2015-11-27 2016-10-25 Dispositif d'examen de la cornée WO2017090361A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2015232189A JP6616673B2 (ja) 2015-11-27 2015-11-27 角膜検査装置
JP2015-232189 2015-11-27

Publications (1)

Publication Number Publication Date
WO2017090361A1 true WO2017090361A1 (fr) 2017-06-01

Family

ID=58764161

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2016/081577 WO2017090361A1 (fr) 2015-11-27 2016-10-25 Dispositif d'examen de la cornée

Country Status (2)

Country Link
JP (1) JP6616673B2 (fr)
WO (1) WO2017090361A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023189793A1 (fr) * 2022-03-29 2023-10-05 ソニーグループ株式会社 Dispositif d'observation médicale et dispositif de traitement d'informations

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6942536B2 (ja) * 2017-06-29 2021-09-29 株式会社トプコン レーザ治療装置
JP2019042304A (ja) * 2017-09-05 2019-03-22 株式会社ニデック 眼科用画像処理プログラム
JP7148114B2 (ja) * 2018-06-12 2022-10-05 株式会社トーメーコーポレーション 眼科装置
JP7149519B2 (ja) * 2018-09-27 2022-10-07 国立大学法人 筑波大学 眼測定装置及び方法

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009022502A (ja) * 2007-07-19 2009-02-05 Topcon Corp 角膜観察装置
JP2010268916A (ja) * 2009-05-20 2010-12-02 Topcon Corp 前眼部観察装置
JP2014023867A (ja) * 2012-07-30 2014-02-06 Topcon Corp 眼底解析装置、眼底解析プログラム及び眼底解析方法
JP2014104289A (ja) * 2012-11-29 2014-06-09 Canon Inc 補償光学装置、撮像装置、補償光学装置の制御方法およびプログラム
JP2014155875A (ja) * 2014-06-03 2014-08-28 Topcon Corp 眼科観察装置、その制御方法、及びプログラム

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2015529101A (ja) * 2012-08-15 2015-10-05 オプトビュー,インコーポレーテッド 角膜実質マッピング

Also Published As

Publication number Publication date
JP6616673B2 (ja) 2019-12-04
JP2017093992A (ja) 2017-06-01

Similar Documents

Publication Publication Date Title
JP6009935B2 (ja) 眼科装置
WO2017090361A1 (fr) Dispositif d'examen de la cornée
JP6607346B2 (ja) 前眼部光干渉断層撮影装置および前眼部光干渉断層撮影方法
JP6580448B2 (ja) 眼科撮影装置及び眼科情報処理装置
JP6703839B2 (ja) 眼科計測装置
JP5517571B2 (ja) 撮像装置および撮像方法
JP6552200B2 (ja) 光断層撮像装置、その制御方法、及びプログラム
JP2022040372A (ja) 眼科装置
JP2022075772A (ja) 眼科装置
WO2021153087A1 (fr) Dispositif ophthalmique, procédé de commande associé et support de stockage
JP6898716B2 (ja) 光断層撮像装置
JP6637743B2 (ja) 眼科装置
CN115334953A (zh) 多模态视网膜成像平台
JP7215862B2 (ja) 眼科撮影装置、その制御方法、プログラム、及び記録媒体
JP2019171221A (ja) 眼科撮影装置及び眼科情報処理装置
WO2021153086A1 (fr) Appareil ophthalmique, procédé de commande associé et support de stockage
JP6793416B2 (ja) 前眼部光干渉断層撮影装置および前眼部光干渉断層撮影方法
JP6775995B2 (ja) 光断層撮像装置、光断層撮像装置の作動方法、及びプログラム
JP7288110B2 (ja) 眼科装置
JP6108810B2 (ja) 眼科装置およびその制御方法
JP7096391B2 (ja) 眼科装置
JP6664992B2 (ja) 眼科撮影装置
WO2021176893A1 (fr) Dispositif ophtalmique, procédé de commande associé et support de stockage
JP7236832B2 (ja) 眼科撮影装置、その制御方法、プログラム、及び記録媒体
WO2020066324A1 (fr) Dispositif d'imagerie ophtalmologique, procédé de commande, programme, et support d'enregistrement

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16868310

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16868310

Country of ref document: EP

Kind code of ref document: A1