WO2023008385A1 - Anterior Segment Analysis Device, Anterior Segment Analysis Method, and Program - Google Patents
Anterior Segment Analysis Device, Anterior Segment Analysis Method, and Program
- Publication number: WO2023008385A1 (international application PCT/JP2022/028662)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords: edge, edges, cornea, unit, anterior segment
Classifications
- A61B3/10 — Apparatus for testing the eyes; objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
- A61B3/102 — Objective types for optical coherence tomography [OCT]
- A61B3/117 — Objective types for examining the anterior chamber or the anterior chamber angle, e.g. gonioscopes
- G06T7/0012 — Image analysis; biomedical image inspection
- G06T7/12 — Segmentation; edge-based segmentation
- G06T2207/10101 — Image acquisition modality: optical tomography; optical coherence tomography [OCT]
- G06T2207/30041 — Subject of image: eye; retina; ophthalmic
Definitions
- the present invention relates to an anterior segment analysis device, an anterior segment analysis method, and a program.
- Patent Document 1 describes a method of obtaining the shape of the tissue of the eye to be inspected by performing segmentation processing on a tomographic image of the eye to be inspected.
- Patent Document 2 describes a method of determining the surface shape of the interior of the eye, assuming that the eye, which is an object, moves or rotates.
- Patent Document 3 describes an ophthalmologic information processing apparatus that generates an analysis map of the thickness of the layer tissue of the eye.
- According to one aspect of the present invention, an anterior segment analysis device comprises: an acquisition unit that acquires a tomographic image of the anterior segment, including the cornea, of a subject's eye formed by OCT measurement; a detection unit that detects a plurality of edges included in the tomographic image; a first joining unit that joins edges based on a first joining condition for each of the plurality of edges; a selection unit that selects a first edge and a second edge, based on length, from among the edges joined by the first joining unit; a second joining unit that joins edges based on a second joining condition using the first edge and the second edge as references; and a determination unit that determines boundaries of the layers of the cornea in the tomographic image using the edges joined by the second joining unit.
- According to another aspect, an anterior segment analysis method comprises: an acquisition step of acquiring a tomographic image of the anterior segment, including the cornea, of a subject's eye formed by OCT measurement; a detection step of detecting a plurality of edges included in the tomographic image; a first joining step of joining edges based on a first joining condition for each of the plurality of edges; a selection step of selecting a first edge and a second edge, based on length, from among the edges joined in the first joining step; a second joining step of joining edges based on a second joining condition using the first edge and the second edge as references; and a determination step of determining boundaries of the layers of the cornea in the tomographic image using the edges joined in the second joining step.
- Another aspect of the present invention has the following configuration. That is, a program causes a computer to execute: an acquisition step of acquiring a tomographic image of the anterior segment, including the cornea, of a subject's eye formed by OCT measurement; a detection step of detecting a plurality of edges included in the tomographic image; a first joining step of joining edges based on a first joining condition for each of the plurality of edges; a selection step of selecting a first edge and a second edge, based on length, from among the edges joined in the first joining step; a second joining step of joining edges based on a second joining condition using the first edge and the second edge as references; and a determination step of determining boundaries of the layers of the cornea in the anterior segment using the edges joined in the second joining step.
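The claimed two-stage joining flow can be sketched as follows. This is a minimal illustration only: edges are represented as x-sorted point lists, and the concrete joining conditions used here (endpoint gap for the first joining, mean-depth offset from a reference edge for the second) are hypothetical stand-ins for the conditions the claims leave abstract.

```python
def endpoint_gap(chain, edge):
    """Euclidean distance from the tail of `chain` to the head of `edge`."""
    (x1, y1), (x2, y2) = chain[-1], edge[0]
    return ((x2 - x1) ** 2 + (y2 - y1) ** 2) ** 0.5

def first_joining(edges, max_gap):
    """First joining condition (assumed): chain edges whose endpoints lie within max_gap."""
    chains = []
    for edge in sorted(edges, key=lambda e: e[0][0]):   # process left to right
        for chain in chains:
            if endpoint_gap(chain, edge) <= max_gap:
                chain.extend(edge)
                break
        else:
            chains.append(list(edge))
    return chains

def select_references(chains):
    """Select the first and second reference edges: the two longest joined edges."""
    longest = sorted(chains, key=len, reverse=True)
    return longest[0], longest[1]

def second_joining(chains, refs, max_offset):
    """Second joining condition (assumed): attach leftover edges whose mean depth
    lies within max_offset of a reference edge's mean depth."""
    boundaries = [list(r) for r in refs]
    for chain in chains:
        if any(chain is r for r in refs):
            continue
        for i, boundary in enumerate(boundaries):
            ref_y = sum(y for _, y in boundary) / len(boundary)
            y = sum(y for _, y in chain) / len(chain)
            if abs(y - ref_y) <= max_offset:
                boundaries[i] = sorted(boundary + chain, key=lambda p: p[0])
                break
    return boundaries

# Synthetic edge fragments: two corneal boundaries plus one stray fragment.
fragments = [
    [(0, 10), (1, 10), (2, 10)], [(3, 10), (4, 10), (5, 10)],   # anterior boundary
    [(0, 30), (1, 30), (2, 30)], [(3, 30), (4, 30)],            # posterior boundary
    [(6, 31)],                                                   # stray fragment
]
chains = first_joining(fragments, max_gap=2.0)
boundaries = second_joining(chains, select_references(chains), max_offset=3.0)
```

In this toy run the six anterior fragments chain into one edge, the posterior fragments into another, and the stray fragment is absorbed by the posterior reference edge in the second joining pass.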
- FIG. 1 is a functional configuration diagram showing an example of a system configuration according to one embodiment of the present invention.
- FIG. 4 is a diagram for explaining an example of conventional detection of a corneal layer.
- FIG. 4 is a flowchart of anterior segment analysis processing according to an embodiment of the present invention.
- FIG. 4 is a diagram for explaining transitions of an image to be processed according to one embodiment of the present invention.
- FIG. 4 is a diagram for explaining transitions of an image to be processed according to one embodiment of the present invention.
- FIG. 4 is a diagram for explaining transitions of an image to be processed according to one embodiment of the present invention.
- FIG. 4 is a flowchart of processing for identifying a boundary of Bowman's membrane according to an embodiment of the present invention.
- FIG. 4 is a diagram for explaining the inside and outside of a region according to one embodiment of the present invention.
- FIG. 4 is a diagram for explaining edge joining conditions according to an embodiment of the present invention.
- FIG. 4 is a diagram for explaining edge joining conditions according to an embodiment of the present invention.
- FIG. 4 is a diagram for explaining edge joining conditions according to an embodiment of the present invention.
- FIG. 4 is a diagram for explaining improvement of artifacts occurring in an OCT image.
- FIG. 4 is a diagram for explaining improvement of artifacts occurring in an OCT image.
- FIG. 4 is a diagram for explaining improvement of artifacts occurring in an OCT image.
- FIG. 4 is a diagram showing a display example of analysis results according to one embodiment of the present invention.
- FIG. 4 is a diagram showing a display example of analysis results according to one embodiment of the present invention.
- FIG. 4 is a diagram showing a display example of analysis results according to one embodiment of the present invention.
- Tissues in the eye to be examined include tissues in the anterior segment and tissues in the posterior segment.
- Tissues in the anterior segment include the cornea, iris, crystalline lens, ciliary body, zonule of Zinn, and the iridocorneal angle.
- Tissues in the posterior segment of the eye include the fundus (predetermined layer region in the fundus).
- The cornea, located in the anterior segment of the eye and the focus of the present invention, is formed of a plurality of layers. More specifically, it is a layered structure consisting of, in order from the front side of the cornea, the corneal epithelium, Bowman's membrane, the corneal stroma, Descemet's membrane, and the corneal endothelium.
- In the posterior segment, the boundaries of the layers of the fundus can be imaged relatively clearly, so separation into single layers can be performed relatively easily from the captured image.
- In the anterior segment, by contrast, it is difficult to separate the plurality of layers described above because a clear image is hard to obtain during imaging.
- an anterior segment analysis method for more accurately detecting the layer structure of the anterior segment, particularly the cornea will be described below.
- a tomographic image of the subject's eye is obtained and used.
- a tomogram is obtained, for example, by performing optical coherence tomography (OCT) on the subject's eye using an external ophthalmic system.
- Data obtained by OCT may be collectively referred to as OCT data.
- a measurement operation for forming OCT data may be called OCT measurement, and a scan for performing OCT measurement may be called OCT scan.
- In the present embodiment, an ophthalmologic system includes an anterior segment analysis device.
- the ophthalmic system includes an OCT apparatus, and obtains a tomographic image of the eye by performing OCT measurement on the eye using the OCT apparatus.
- The anterior segment analysis device may include an interface for transmitting and receiving data to and from an external device or recording medium via a network, and may be configured to acquire OCT data, tomographic images, and the like from an external ophthalmic system.
- In the following, the left-right direction (horizontal direction) orthogonal to the optical axis (measurement optical axis, inspection optical axis) of the device optical system of the ophthalmic system is defined as the X direction,
- the up-down direction (vertical direction) orthogonal to the optical axis is defined as the Y direction,
- and the direction of the optical axis (depth direction, front-rear direction) is defined as the Z direction.
- The correspondence between the three-dimensional coordinate system in real space and the three-dimensional coordinate system inside the system or in the three-dimensional data is not particularly limited; it is assumed that they are associated in advance.
- FIG. 1 shows a configuration example of an ophthalmologic system 1 including functions for executing the anterior segment analysis method according to the present embodiment.
- In FIG. 1, only the parts of the system to which the anterior segment analysis method according to the present invention applies are shown representatively; the ophthalmologic system 1 may further include components other than those shown in FIG. 1.
- the ophthalmic system 1 is an examination apparatus including an objective refraction measurement device (refraction measurement section) and an OCT device (OCT section).
- The ophthalmologic system 1 includes a measurement unit 10, a control processing unit 50, a moving mechanism 90, an imaging unit 100, and a UI (User Interface) unit 110.
- The measurement unit 10 includes the refraction measurement unit 20, the OCT unit 30, the light projection unit 40, the beam splitter BS1, and the beam splitter BS2.
- The control processing unit 50 includes an image forming unit 60, a data processing unit 70, and a control unit 80.
- the refraction measurement unit 20 receives a control instruction from the control unit 80 and objectively measures the refractive power of the eye E to be examined.
- the refraction measurement unit 20 includes an optical system provided with one or more optical members for performing objective refraction measurement.
- the refraction measurement unit 20 has, for example, a configuration similar to that of a known refractometer.
- a typical refractometer includes a projection system and a light receiving system, as disclosed in Japanese Patent Application Laid-Open No. 2016-077774.
- the projection system of the refraction measurement unit 20 projects the light emitted from the light source onto the fundus Ef of the eye E to be examined.
- the projection system for example, projects light from the light source onto the fundus oculi Ef through a collimator lens, a focusing lens, a relay lens, a pupil lens, a perforated prism, a decentered prism, an objective lens, and the like.
- The light receiving system of the refraction measurement unit 20 receives the reflected light from the fundus oculi Ef with an imaging device (not shown) through an objective lens, a decentered prism, a perforated prism, another pupil lens, another relay lens, another focusing lens, a conical prism, an imaging lens, and the like. As a result, a ring pattern image formed on the imaging surface of the imaging device is detected.
- The refraction measurement unit 20 may be configured to project ring-shaped light onto the fundus oculi Ef and detect a ring pattern image formed by the reflected light from the fundus oculi Ef. Alternatively, the refraction measurement unit 20 may project a bright spot onto the fundus oculi Ef, convert the reflected light from the fundus oculi Ef into ring-shaped light, and detect a ring pattern image formed by the converted ring-shaped light.
- the OCT unit 30 receives a control instruction from the control unit 80, applies an OCT scan to the eye E to be examined, and obtains OCT data.
- the OCT data may be interference signal data, reflection intensity profile data obtained by applying Fourier transform to interference signal data, or image data obtained by imaging the reflection intensity profile data.
- In the following, an example using image data (hereinafter referred to as an OCT image) will be described.
- the OCT technique that the OCT unit 30 can implement is typically Fourier domain OCT, and may be either spectral domain OCT or swept source OCT.
- Swept-source OCT splits the light from a wavelength-tunable light source into measurement light and reference light, and superimposes the return light of the measurement light projected onto the subject's eye on the reference light to generate interference light. The interference light is detected by a photodetector, and the detection data (interference signal data) collected according to the wavelength sweep and the measurement light scan are subjected to a Fourier transform or the like to form reflection intensity profile data.
- In spectral-domain OCT, light from a low-coherence light source (broadband light source) is split into measurement light and reference light, and the return light of the measurement light projected onto the subject's eye is superimposed on the reference light to generate interference light. The spectral distribution of this interference light is detected by a spectrometer, and the detected data (interference signal data) are subjected to a Fourier transform or the like to form reflection intensity profile data. That is, swept-source OCT acquires the spectral distribution by time division, whereas spectral-domain OCT acquires it by spatial division.
- the OCT section 30 includes an optical system provided with one or more optical members for performing OCT measurement.
- the OCT section 30 has, for example, a configuration similar to that of a known OCT apparatus.
- a typical OCT apparatus includes a light source, an interference optical system, a scanning system, and a detection system, as disclosed in Japanese Patent Application Laid-Open No. 2016-077774.
- the light output from the light source is split into measurement light and reference light by an interference optical system.
- a reference beam is directed to the reference arm.
- the measurement light is projected onto the subject's eye E (for example, the fundus oculi Ef) through the measurement arm.
- a scanning system is provided on the measurement arm.
- the scanning system includes, for example, an optical scanner and can deflect the measurement light one-dimensionally or two-dimensionally.
- Optical scanners include one or more galvo scanners.
- a scanning system deflects the measurement light according to a predetermined scanning mode.
- the control unit 80 provided in the control processing unit 50 can control the scan system according to the scan mode.
- Scan modes include line scan, raster scan (three-dimensional scan), circle scan, concentric circle scan, radial scan, cross scan, multi-cross scan, and spiral scan.
- a line scan is a scan pattern along a linear trajectory.
- a raster scan is a scan pattern consisting of a plurality of line scans arranged parallel to each other.
- a circle scan is a scan pattern along a circular trajectory.
- a concentric circle scan is a scan pattern consisting of a plurality of concentrically arranged circle scans.
- a radial scan is a scan pattern consisting of a plurality of radially arranged line scans.
- a cross-scan is a scan pattern consisting of two line scans arranged orthogonally to each other.
- a multi-cross-scan is a scan pattern consisting of two line-scan groups that are orthogonal to each other (eg, each group contains five lines that are parallel to each other).
- a spiral scan is a scan pattern that extends spirally from the center.
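As an illustration of the scan modes listed above, each pattern reduces to a sequence of XY deflection coordinates handed to the optical scanner. A minimal sketch (units, extents, and sampling densities are arbitrary assumptions, not values from this document):

```python
import numpy as np

def line_scan(p0, p1, n_points):
    """Line scan: n_points equally spaced along the segment p0 -> p1."""
    t = np.linspace(0.0, 1.0, n_points)[:, None]
    return (1 - t) * np.asarray(p0, float) + t * np.asarray(p1, float)

def raster_scan(width, height, n_lines, n_points):
    """Raster scan: n_lines parallel horizontal line scans stacked vertically."""
    ys = np.linspace(-height / 2, height / 2, n_lines)
    return np.stack([line_scan((-width / 2, y), (width / 2, y), n_points) for y in ys])

def radial_scan(radius, n_lines, n_points):
    """Radial scan: n_lines line scans through the center at evenly spaced angles."""
    angles = np.linspace(0.0, np.pi, n_lines, endpoint=False)
    lines = []
    for a in angles:
        d = np.array([np.cos(a), np.sin(a)])
        lines.append(line_scan(-radius * d, radius * d, n_points))
    return np.stack(lines)
```

A cross scan is then simply a two-line radial scan, and a concentric circle scan replaces the line segments with circular trajectories of increasing radius.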
- the measurement light projected onto the fundus oculi Ef is reflected and scattered at various depth positions (layer boundaries, etc.) of the fundus oculi Ef.
- the return light of the measurement light from the subject's eye E is synthesized with the reference light by the interference optical system.
- the return light of the measurement light and the reference light generate interference light according to the principle of superposition.
- This interference light is detected by a detection system.
- the detection system typically includes a spectrometer for spectral-domain OCT, and a balanced photodiode and a data acquisition system (DAQ) for swept-source OCT.
- the light projection unit 40 projects light onto the eye E to be inspected for aligning the eye E and the measurement unit 10 (OCT unit 30, apparatus optical system).
- The light projection section 40 includes a light source and a collimator lens. The optical path of the light projection section 40 is coupled to the optical path of the refraction measurement section 20 by the beam splitter BS2. The light output from the light source passes through the collimator lens, is reflected by the beam splitter BS2, and is projected onto the subject's eye E through the optical path of the refraction measurement section 20.
- The reflected light from the cornea Ec (anterior segment) of the subject's eye E travels back through the optical path of the refraction measurement section 20 and is guided to the light receiving system of the refraction measurement section 20.
- An image (bright point image) based on reflected light from the cornea Ec of the subject's eye E is included in the anterior segment image acquired by the imaging unit 100 .
- the control processing unit 50 causes the display screen of the display unit (not shown) to display the anterior segment image including the bright spot image and the alignment mark.
- the user can move the optical system so as to guide the bright spot image within the alignment mark.
- the user can move the optical system while referring to the anterior segment image displayed on the display screen of the UI unit 110 .
- Alternatively, the control unit 80 may control the moving mechanism 90 so as to cancel the displacement between the alignment mark and the position of the bright spot image, thereby moving the measurement unit 10 (optical system) relative to the subject's eye E. Further, the control unit 80 may control the moving mechanism 90 so that a predetermined alignment completion condition based on the position of a predetermined part of the subject's eye E (for example, the pupil center position) and the position of the bright spot image is satisfied, thereby moving the measurement unit 10 (optical system) relative to the subject's eye E.
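The alignment behavior described above, repeatedly measuring the displacement between the bright spot image and the alignment mark and commanding the moving mechanism to cancel it, can be sketched as a simple feedback loop. This is an illustrative sketch only: `get_spot`, `move`, the 50% mechanism gain, and the tolerance are hypothetical stand-ins for the imaging unit 100, the moving mechanism 90, and the completion condition.

```python
def alignment_offset(spot_xy, mark_xy):
    """XY displacement of the bright spot image from the alignment mark center (pixels)."""
    return (mark_xy[0] - spot_xy[0], mark_xy[1] - spot_xy[1])

def align(get_spot, move, mark_xy, tol=1.0, max_iter=50):
    """Drive the moving mechanism until the bright spot sits within tol of the mark."""
    for _ in range(max_iter):
        dx, dy = alignment_offset(get_spot(), mark_xy)
        if (dx * dx + dy * dy) ** 0.5 <= tol:
            return True                 # completion condition satisfied
        move(dx, dy)                    # command the mechanism to cancel the displacement
    return False

# Simulated mechanism: each move command is executed with 50% gain,
# so the loop needs several iterations to converge.
spot = [10.0, 4.0]

def _get_spot():
    return tuple(spot)

def _move(dx, dy):
    spot[0] += 0.5 * dx
    spot[1] += 0.5 * dy

ok = align(_get_spot, _move, mark_xy=(0.0, 0.0), tol=1.0)
```

The same loop structure applies whether the target is the alignment mark center or a characteristic position such as the pupil center.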
- the beam splitter BS1 coaxially couples the optical path of the optical system (projection system and light receiving system) of the refraction measurement section 20 with the optical path of the optical system (interference optical system, etc.) of the OCT section 30 .
- a dichroic mirror is used as the beam splitter BS1.
- the beam splitter BS2 coaxially couples the optical path of the optical system of the light projection section 40 with the optical path of the optical system (projection system and light receiving system) of the refraction measurement section 20 .
- a half mirror is used as the beam splitter BS2.
- the ophthalmologic system 1 may have a function (fixation projection system) of receiving a control instruction from the control unit 80 and presenting a fixation target to the eye E to guide the line of sight of the eye E to be examined.
- the fixation target may be an internal fixation target presented to the subject's eye E, or an external fixation target presented to the fellow eye.
- An optical path coupling member (for example, a beam splitter) arranged between the OCT section 30 and the beam splitter BS1 may coaxially couple the optical path of the fixation projection system to the optical path of the interference optical system of the OCT section 30.
- The fixation projection system can receive a control instruction from the control unit 80 and change the projection position of the fixation target on the fundus oculi Ef.
- the fixation target may be projected onto the measurement optical axis of the optical system of the refraction measurement section 20 and the optical system of the OCT section 30 that are coaxially coupled.
- the fixation target may be projected at a position off the measurement optical axis on the fundus oculi Ef.
- the photographing unit 100 includes one or more anterior segment cameras for photographing the anterior segment of the eye E to be examined.
- the photographing unit 100 acquires an anterior segment image, which is a front image of the eye E to be examined.
- At least one anterior segment illumination light source (such as an infrared light source) may be provided in the vicinity of one or more anterior segment cameras.
- For example, an anterior segment illumination light source may be provided above and below each anterior segment camera.
- the ophthalmologic system 1 can align the measurement unit 10 (optical system) with the subject's eye E using the front image acquired by the imaging unit 100 .
- The ophthalmologic system 1 may identify the three-dimensional position of the subject's eye E by analyzing a front image obtained by photographing the anterior segment of the subject's eye E, and may perform alignment by relatively moving the measurement unit 10 based on the identified three-dimensional position.
- the ophthalmologic system 1 may perform alignment such that the displacement between the characteristic position of the subject's eye E and the position of the image formed by the light projected by the light projection unit 40 is cancelled.
- the imaging unit 100 includes one or more anterior segment cameras.
- In this case, the ophthalmologic system 1 analyzes the acquired front image and specifies the two-dimensional position of the subject's eye E on the plane perpendicular to the optical axis of the measurement unit 10, i.e., the plane defined by the horizontal direction (X direction) and the vertical direction (Y direction).
- the ophthalmologic system 1 is provided with an optical system for specifying the position of the subject's eye E in the direction of the optical axis of the measurement unit 10 .
- an optical system for example, there is an optical lever type optical system disclosed in Japanese Patent Application Laid-Open No. 2016-077774.
- Using such an optical system, the ophthalmologic system 1 can specify the three-dimensional position of the subject's eye E from the position of the subject's eye E in the (measurement) optical-axis direction of the measurement unit 10 and the two-dimensional position described above.
- When two or more anterior segment cameras are provided, they photograph the anterior segment of the subject's eye E from different directions.
- Two or more anterior segment cameras can image the anterior segment substantially simultaneously from two or more different directions. “Substantially simultaneously” means, for example, that, in imaging with two or more anterior eye cameras, a deviation in imaging timing to the extent that eye movement can be ignored is allowed.
- two or more anterior eye cameras can acquire images when the subject's eye E is in substantially the same position (orientation).
- In this case, the ophthalmologic system, for example as disclosed in Japanese Patent Application Laid-Open No. 2013-248376, analyzes the acquired front images to identify characteristic positions of the subject's eye E, and specifies the three-dimensional position of the subject's eye E from the positions of the two or more anterior segment cameras and the identified characteristic positions.
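Specifying a three-dimensional position from the known positions of two cameras and a characteristic position seen in both images amounts to triangulating two viewing rays. The following generic ray-midpoint sketch is not the specific method of Japanese Patent Application Laid-Open No. 2013-248376; the camera geometry is simplified, by assumption, to a known center plus a unit ray direction per camera:

```python
import numpy as np

def triangulate(p1, d1, p2, d2):
    """Midpoint of the closest points of two camera rays (center p, unit direction d)."""
    p1, d1, p2, d2 = map(np.asarray, (p1, d1, p2, d2))
    w0 = p1 - p2
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    d, e = d1 @ w0, d2 @ w0
    denom = a * c - b * b                       # zero only for parallel rays
    t1 = (b * e - c * d) / denom
    t2 = (a * e - b * d) / denom
    return (p1 + t1 * d1 + p2 + t2 * d2) / 2    # midpoint of the two closest points

# Two hypothetical anterior segment cameras observing a feature at (0, 0, 5):
cam_a, cam_b = np.array([-3.0, 0.0, 0.0]), np.array([3.0, 0.0, 0.0])
feature = np.array([0.0, 0.0, 5.0])
ray_a = (feature - cam_a) / np.linalg.norm(feature - cam_a)
ray_b = (feature - cam_b) / np.linalg.norm(feature - cam_b)
point = triangulate(cam_a, ray_a, cam_b, ray_b)
```

Taking the midpoint rather than an exact intersection makes the estimate tolerant of the small ray mismatch that residual eye movement between the two exposures introduces.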
- Shooting with two or more anterior segment cameras may be video shooting or still image shooting.
- In the case of video shooting, substantially simultaneous anterior segment imaging as described above can be realized by controlling the shooting start timings to match, or by controlling the frame rate and the shooting timing of each frame.
- In the case of still image shooting, this can be achieved by controlling the shooting timings to match.
- the control processing unit 50 executes various calculations and various controls for operating the ophthalmologic system 1 .
- The control processing unit 50 includes one or more processors and one or more storage devices. Storage devices include a RAM (Random Access Memory), a ROM (Read Only Memory), an HDD (Hard Disk Drive), an SSD (Solid State Drive), and the like.
- Various computer programs are stored in the storage device, and calculation and control according to the present embodiment are realized by the processor operating based on the programs.
- control processing unit 50 realizes each function of the image forming unit 60, the data processing unit 70, and the control unit 80 by having the processor execute various programs.
- the block configuration of the functions realized by the control processing unit 50 is an example, and may be further divided in detail corresponding to each process described above.
- the image forming unit 60 forms an image (such as a tomographic image) of the eye E to be inspected based on OCT data obtained by performing OCT measurement on the eye E to be inspected.
- The image forming section 60 constructs OCT data (typically image data) based on detection data obtained by the detection system of the OCT section 30. Similar to conventional OCT data processing, the image forming unit 60 applies filtering, fast Fourier transform (FFT), and the like to the detection data to construct reflection intensity profile data for each A-line (the path of the measurement light within the subject's eye E). Further, the image forming section 60 constructs image data (A-scan data) for each A-line by applying imaging processing (image representation) to the reflection intensity profile data. Note that part of the functions of the image forming section 60 may be provided in the OCT section 30.
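The per-A-line processing described above (filtering, fast Fourier transform, imaging) can be sketched generically as follows. The DC-term suppression, windowing, and log-scale display steps are common OCT practice assumed here for illustration, not details taken from this document:

```python
import numpy as np

def a_line_profile(interference, window=True):
    """One A-line reflection intensity profile from a spectral interference signal."""
    sig = interference - interference.mean()     # suppress the DC (zero-depth) term
    if window:
        sig = sig * np.hanning(len(sig))         # reduce side lobes before the FFT
    return np.abs(np.fft.rfft(sig))              # magnitude = reflectivity vs depth bin

def b_scan_image(profiles, floor=1e-6):
    """Stack A-lines and log-scale them into a displayable tomographic image (dB)."""
    return 20 * np.log10(np.maximum(np.stack(profiles), floor))

# Simulated interference signal from a single reflector at depth bin 100:
n = np.arange(1024)
signal = np.cos(2 * np.pi * 100 * n / 1024)
profile = a_line_profile(signal, window=False)
img = b_scan_image([profile])
```

Stacking the profiles of successive A-lines along the scan direction yields the tomographic image (B-scan) that the anterior segment analysis then operates on.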
- the image forming unit 60 may be configured as an acquisition unit that acquires OCT data via a network (not shown).
- the data processing unit 70 can execute processing for positioning the measuring unit 10 with respect to the eye E to be examined.
- the processing for performing alignment includes, for example, analysis processing of the front image of the subject eye E acquired using the imaging unit 100, processing of calculating the position of the subject eye E, and processing of calculating the displacement of the measuring unit 10 with respect to the subject eye E.
- the data processing unit 70 identifies the surface shape of the cornea Ec of the eye E to be examined from the tomographic image obtained by performing OCT measurement after alignment, and can further generate shape data representing the structure of the cornea Ec. For example, the shape data is obtained by subjecting a tomogram, as an OCT image, to segmentation processing. In this embodiment, analysis processing including segmentation processing is performed on the cornea Ec included in the anterior segment, and display processing is performed based on the analysis results. Details will be described later.
- the control section 80 controls each section of the ophthalmologic system 1 .
- the control unit 80 includes a storage device (not shown) as described above, and can store various types of information.
- the information stored in the storage device includes, for example, a program for controlling each part of the ophthalmologic system 1, information on the subject, information on the eye to be examined, measurement data obtained by the measurement unit 10, and processing results from the data processing unit 70, but is not particularly limited to these.
- the control unit 80 can control the UI unit 110.
- the UI unit 110 functions as part of a user interface, functions as a display device that displays information and display screens in response to control instructions from the control unit 80, and functions as an operation device that receives operations from the user.
- the UI unit 110 may include, for example, a liquid crystal display (LCD) or an organic light-emitting diode (OLED) display in order to function as a display device.
- the control unit 80 can control the ophthalmic system 1 according to signals input via the UI unit 110 .
- the UI unit 110 may include various hardware keys (joystick, button, switch, etc.) provided in the ophthalmic system 1 in order to function as an operating device.
- the UI unit 110 may also include various peripheral devices (keyboard, mouse, joystick, operation panel, etc.) connected to the ophthalmologic system 1 .
- the UI unit 110 may also include various software keys (buttons, icons, menus, etc.) displayed on the touch panel.
- the moving mechanism 90 is a mechanism for moving the measuring unit 10, which houses the optical systems (apparatus optical system) such as the refraction measuring unit 20, the OCT unit 30, the light projection unit 40, and the beam splitters BS1 and BS2, in the vertical, horizontal, and front-rear directions. The moving mechanism 90 can receive a control instruction from the control unit 80 and move the measuring unit 10 relative to the eye E to be examined.
- the moving mechanism 90 is provided with an actuator (not shown) that generates a driving force for moving the measuring section 10 and a transmission mechanism (not shown) that transmits this driving force.
- the actuator is composed of, for example, a pulse motor.
- the transmission mechanism is configured by, for example, a combination of gears, a rack and pinion, or the like.
- the control unit 80 controls the moving mechanism 90 by sending control signals to the actuators.
- the control over the moving mechanism 90 is used in alignment.
- the control unit 80 acquires the current position of the measurement unit 10 (apparatus optical system).
- the control unit 80 receives information indicating the content of movement control of the moving mechanism 90 and acquires the current position of the measuring unit 10 .
- the control unit 80 controls the moving mechanism 90 at a predetermined timing (when the apparatus is started, when patient information is input, etc.) to move the measuring unit 10 to a predetermined initial position.
- every time the control unit 80 controls the moving mechanism 90, it records the details of the control. As a result, a history of the control contents is accumulated.
- the control unit 80 acquires the control details up to the present by referring to the history, and obtains the current position of the measurement unit 10 based on the control details.
- the control over the moving mechanism 90 may also be used in tracking. Tracking moves the apparatus optical system to follow the eye movement of the eye E to be examined. Alignment and focus adjustment are performed in advance of tracking. Tracking then maintains this suitable positional relationship by causing the position of the apparatus optical system to follow the movement of the eyeball.
- the cornea Ec located in the anterior segment has a layered structure comprising the corneal epithelium, Bowman's membrane, the corneal stroma, Descemet's membrane, and the corneal endothelium.
- FIG. 2 is a diagram for explaining an example of conventional detection of the boundary between layers of the cornea.
- An image 200 shows a portion of the cornea, and here shows an example of detection of a boundary 201 on the front side of the corneal epithelium, a boundary 202 between the corneal epithelium and Bowman's membrane, and a boundary 203 on the fundus side of the corneal endothelium.
- the thickness of the cornea is approximately 0.5 mm.
- the Bowman's membrane has a thickness of about 8 to 14 ⁇ m, which is thinner than the corneal epithelial cells and corneal stroma, which are the other components.
- the anterior segment analysis method according to the present embodiment enables detection of unevenness occurring inside the cornea as described above, and achieves detection with higher accuracy.
- the image forming unit 60 acquires the OCT data of the subject's eye to be processed.
- new OCT data may be acquired by operating the measurement unit 10 and the moving mechanism 90 to photograph the subject's eye, or OCT data held in a storage device or the like may be acquired.
- the OCT data to be acquired here may be specified by the user.
- as preprocessing for the OCT data, gradation conversion processing, image enhancement processing, threshold processing, contrast conversion processing, binarization processing, edge detection processing, image averaging processing, image smoothing processing, filtering processing, region extraction processing, alignment processing, and the like may be performed. Note that the preprocessing is not limited to the above, and other processing may be performed in consideration of each subsequent process.
- the data processing unit 70 detects the boundary on the front side of the corneal epithelium by performing edge detection processing on the OCT data acquired in step S101.
- the boundary detected here corresponds to the boundary 201 shown in FIG. 2, that is, the edge of the cornea located on the frontmost side.
- edge detection methods include, for example, the known Canny edge method, but the method is not particularly limited.
- the data processing unit 70 detects the boundary on the front side of the corneal epithelium using an approximated curve based on the detected edges.
- in step S103, based on the position of the boundary detected in step S102, the data processing unit 70 performs alignment processing on the original OCT image so that the boundary becomes a straight line. That is, the pixels forming the detected boundary are arranged in a straight line, and the positions of the other pixels are changed while maintaining their positional relationship in a predetermined direction with the boundary pixels. Therefore, the positional relationship (distance) in the thickness direction between the boundary pixels and the surrounding pixels does not change from the original image.
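- the flattening operation in step S103 can be sketched as a per-column shift: every column is moved so the detected boundary lands on one row, while each pixel keeps its depth-wise distance to the boundary. This is a minimal illustrative sketch (function name and data layout are assumptions, not from the source).

```python
# Hypothetical sketch of the S103 alignment: shift each column so that the
# detected boundary pixels line up on row 0 of the output image.
def flatten_to_boundary(image, boundary_rows):
    """image: list of rows; boundary_rows[x] = detected boundary row in column x."""
    h, w = len(image), len(image[0])
    flat = [[0] * w for _ in range(h)]
    for x in range(w):
        shift = boundary_rows[x]          # move this column up by `shift`
        for y in range(h):
            src = y + shift               # distance to the boundary is preserved
            flat[y][x] = image[src][x] if src < h else 0
    return flat

# Tiny example: a slanted "boundary" of bright pixels becomes row 0.
img = [[0] * 4 for _ in range(4)]
for x, y in enumerate([2, 1, 1, 0]):      # boundary row per column
    img[y][x] = 9
out = flatten_to_boundary(img, [2, 1, 1, 0])
print(out[0])  # → [9, 9, 9, 9]
```

After this shift, the rectangular region extraction in the next step reduces to cropping a fixed band of rows below row 0.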
- in step S104, the data processing unit 70 extracts a region of a predetermined size, based on the boundary, from the image after the alignment processing in step S103. As a result, a rectangular corneal image 401 as shown in FIG. 4A is obtained.
- the predetermined size of the region to be extracted is not particularly limited, but may be, for example, 50 pixels in the direction orthogonal to the boundary and 1024 pixels in the direction parallel to the boundary. Also, the upper end in the vertical direction corresponds to the straight line obtained at the boundary detected in step S102.
- in step S105, the data processing unit 70 uses the corneal image obtained in step S104 to perform processing for identifying the boundary of Bowman's membrane. Details of this step will be described with reference to FIG. 6.
- in step S106, the data processing unit 70 performs display processing based on the boundary of Bowman's membrane identified in step S105. A configuration example of the screen displayed by the UI unit 110 in the display processing will be described later. This processing flow then ends.
- FIG. 6 is a flow chart showing details of the process of step S105 in FIG.
- in step S201, the data processing unit 70 performs edge detection processing on the corneal image.
- edge detection methods include, for example, the known Canny edge method, but the method is not particularly limited.
- a corneal image 402 including a plurality of edges is obtained as shown in FIG. 4B.
- in step S202, the data processing unit 70 performs region division on the corneal image obtained in step S201.
- the corneal image is divided into a first region on the central side (inner side) of the cornea and a second region on the scleral side (outer side) of the cornea.
- FIG. 6 shows an example of dividing the corneal image 402 shown in FIG. 4B into a first region and a second region.
- for example, the ranges of 0 to 200 and 800 to 1024 pixels in the horizontal direction are treated as the second region, and the range of 201 to 799 pixels is treated as the first region.
- the dividing method here is an example, and is not limited to this. For example, the range may be changed according to the size of the image. Also, although an example of dividing into two regions is shown here, it may be divided into more regions.
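- using the pixel ranges given above, the split into the central first region and the scleral-side second region can be sketched as follows. The function and its default ranges are illustrative only; as noted, the ranges may change with image size.

```python
# Illustrative sketch of the region division, using the example ranges from
# the text (columns 201-799 central, the rest toward the scleral side).
def split_regions(width=1024, inner=(201, 799)):
    """Return the column indices of the first (central) and second regions."""
    first = list(range(inner[0], inner[1] + 1))
    second = list(range(0, inner[0])) + list(range(inner[1] + 1, width))
    return first, second

first, second = split_regions()
print(len(first), len(second))  # → 599 425
```

Note that a 1024-pixel-wide image has columns 0 to 1023, so "800 to 1024" in the text is read here as columns 800 to 1023.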
- in step S203, the data processing unit 70 performs edge-to-edge connection processing on the edges included in the first region.
- FIGS. 7A to 7C are diagrams for explaining the conditions for edge coupling.
- in these figures, the vertical direction corresponds to the thickness direction of the cornea, and the horizontal direction corresponds to the width direction of the cornea.
- FIG. 7A shows that if the distance dx in the x-direction between the closer endpoints of two edges is greater than or equal to a predetermined value, no merging is performed. That is, if the distance dx between two edges is smaller than a predetermined value, the endpoints of the two edges are connected by a straight line.
- FIG. 7B shows that if the distance dy in the y direction between the closer endpoints of the two edges is greater than or equal to a predetermined value, no merging is performed. That is, if the distance dy between two edges is smaller than a predetermined value, the endpoints of the edges are connected by a straight line.
- FIG. 7C shows that when two edges overlap in the x-direction, if the overlap length dl is greater than or equal to a predetermined value, no merging is performed. That is, if the overlap length dl of two edges is smaller than a predetermined value, the end points between the edges are connected by a straight line.
- smoothing processing may be performed before combining, or other combining processing may be performed.
- the edges targeted here are those on the upper side, that is, on the front side of the cornea.
- a method of combining using three conditions is shown here, but combining may be performed using any one of these conditions, or using other conditions. Also, the order in which the three conditions are applied is not particularly limited. Here, the process is repeated while focusing on the two edges whose end points are closest to each other.
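- the three merge tests of FIGS. 7A to 7C can be sketched as a single predicate over two edges. This is an illustrative interpretation, not the actual implementation: edges are assumed to be point lists ordered along x, and the thresholds stand in for Th_dx1, Th_dy1, and Th_dl1 (whose values the source does not give).

```python
# Hypothetical sketch of the edge-coupling conditions in FIGS. 7A-7C.
def can_merge(edge_a, edge_b, th_dx, th_dy, th_dl):
    """Each edge is a list of (x, y) points; True if the edges may be joined."""
    ax0, ax1 = min(p[0] for p in edge_a), max(p[0] for p in edge_a)
    bx0, bx1 = min(p[0] for p in edge_b), max(p[0] for p in edge_b)
    overlap = min(ax1, bx1) - max(ax0, bx0)   # > 0 means x-overlap
    if overlap >= th_dl:                      # FIG. 7C: too much overlap
        return False
    # FIGS. 7A/7B: distances between the closest pair of endpoints.
    ends_a = [edge_a[0], edge_a[-1]]
    ends_b = [edge_b[0], edge_b[-1]]
    dx, dy = min(
        ((abs(p[0] - q[0]), abs(p[1] - q[1])) for p in ends_a for q in ends_b),
        key=lambda d: d[0] ** 2 + d[1] ** 2,
    )
    return dx < th_dx and dy < th_dy

e1 = [(0, 5), (1, 5), (2, 5)]
e2 = [(5, 6), (6, 6), (7, 6)]
print(can_merge(e1, e2, th_dx=4, th_dy=3, th_dl=2))  # → True
```

When the predicate holds, the closer endpoints would be connected by a straight line, as the text describes.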
- the thresholds of the conditions used in step S203 are denoted Th_dx1, Th_dy1, and Th_dl1.
- in step S204, the data processing unit 70 removes edges in the first region, after the processing in step S203, based on length and intensity.
- edges whose length is equal to or less than the threshold Th_l1 are removed.
- edges whose maximum pixel intensity is equal to or less than the threshold Th_d1 are removed.
- the length here may be defined in each of the vertical direction and the horizontal direction, or may be defined by the number of pixels.
- the intensity here may be defined by a pixel value (for example, a luminance value).
- here, edges are removed based on both length and intensity, but they may be removed based on either one alone.
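- the removal step in S204 can be sketched as a filter over candidate edges. The data layout and the threshold values are illustrative assumptions (the source only names the thresholds Th_l1 and Th_d1).

```python
# Hypothetical sketch of the S204 removal: drop edges whose length is at
# most th_len or whose peak pixel intensity is at most th_int.
def remove_weak_edges(edges, th_len, th_int):
    """edges: list of (points, intensities); keep long, strong edges only."""
    return [
        (pts, vals) for pts, vals in edges
        if len(pts) > th_len and max(vals) > th_int
    ]

edges = [
    ([(0, 0), (1, 0), (2, 0), (3, 0)], [80, 90, 85, 70]),     # long, strong
    ([(10, 0), (11, 0)], [90, 95]),                           # too short
    ([(20, 0), (21, 0), (22, 0), (23, 0)], [10, 12, 9, 11]),  # too dim
]
kept = remove_weak_edges(edges, th_len=2, th_int=30)
print(len(kept))  # → 1
```

Dropping either condition from the `if` clause gives the single-criterion variant the text also allows.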
- in step S205, the data processing unit 70 again performs edge combining processing, in the same manner as in step S203, on the first region from which edges were removed in step S204.
- the difference from step S203 is that the threshold used as the connection condition is changed.
- the thresholds of the conditions used in step S205 are denoted Th_dx2, Th_dy2, and Th_dl2.
- here, Th_dy2 > Th_dy1 and Th_dl2 > Th_dl1.
- as a result, edges that are farther apart are also treated as candidates for combination.
- in step S206, the data processing unit 70 specifies, among the edges included in the first region after the processing in step S205, the longest edge as the first edge and the second longest edge as the second edge.
- the length here may be specified by the number of pixels forming the edge, or may be specified based on the length in the horizontal direction in the image.
- in step S207, the data processing unit 70 performs edge combination processing in the first region after the processing in step S205, using the first edge and the second edge as references.
- the combining condition here may be the same as in step S203, but the threshold for combining is made different.
- the thresholds of the conditions used in step S207 are denoted Th_dx3, Th_dy3, and Th_dl3.
- here, Th_dl3 > Th_dl2 > Th_dl1.
- edges that are more distant from the endpoints of the first edge and the second edge are also treated as objects to be combined. This process may result in a combination of the first edge and the second edge.
- in step S208, the data processing unit 70 determines whether or not the first edge and the second edge were combined in step S207. If they were combined (YES in step S208), the data processing unit 70 proceeds to step S210. If they were not combined (NO in step S208), it proceeds to step S209.
- in step S209, the data processing unit 70 performs noise determination processing on the shorter of the two uncombined edges, the first edge and the second edge. That is, the longer of the two edges is treated as the more reliable boundary candidate.
- the noise determination processing here is performed based on the vertical distance between the two edges. If the vertical distance is smaller than the predetermined threshold Th_dx4, the data processing unit 70 determines the shorter edge to be noise. The threshold Th_dx4 is defined in advance and stored in the storage device.
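- the noise check in S209 can be sketched as comparing the vertical separation of the two edges against a predetermined threshold. Measuring that separation via mean y-positions is an assumption made for this sketch, as the source does not specify how the vertical distance is computed.

```python
# Hypothetical sketch of the S209 noise determination: if the two uncombined
# edges lie vertically close together, the shorter one is declared noise.
def noise_edge(edge_long, edge_short, th):
    """Each edge is a list of (x, y); True if the shorter edge is noise."""
    def mean_y(edge):
        return sum(p[1] for p in edge) / len(edge)
    return abs(mean_y(edge_long) - mean_y(edge_short)) < th

first = [(x, 10) for x in range(100)]   # long edge near row 10
second = [(x, 12) for x in range(30)]   # short edge only 2 rows away
print(noise_edge(first, second, th=5))  # → True
```

A short edge far from the long edge would instead survive as a second reference-edge candidate, as described in the following step.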
- in step S210, the data processing unit 70 identifies one or two reference edges. More specifically, if the first edge and the second edge were combined, or if one of them was determined to be noise, the longer edge is identified as the single reference edge. On the other hand, if the first edge and the second edge were not combined and neither was determined to be noise, both edges are identified as reference edges.
- in step S211, the data processing unit 70 performs edge removal processing in the second region based on the positions of the one or two reference edges specified in step S210.
- for example, edges whose distance from the reference edge exceeds a predetermined threshold are removed.
- edges whose length is equal to or less than the threshold Th_l2 are removed.
- edges whose maximum pixel intensity is equal to or less than the threshold Th_d2 are removed. The threshold Th_l2 and the threshold Th_d2 used here are defined in advance and held in the storage device. The length here may be defined in each of the vertical and horizontal directions, or by the number of pixels.
- the intensity here may be defined by a pixel value (for example, a luminance value).
- the threshold Th_l2 and the threshold Th_d2 used in this step may be the same values as the threshold Th_l1 and the threshold Th_d1 used for edge removal in the first region in step S204, or may be different values. In this process, edges are removed based on both length and intensity, but they may be removed based on either one alone. By using the position of the reference edge as a reference, it is possible to efficiently remove edges unlikely to form the boundary, reduce the processing load of the subsequent edge combination, and shorten the processing time.
- in step S212, the data processing unit 70 performs edge combination processing in the second region, after the edge removal processing in step S211, using the one or two reference edges identified in step S210 as a reference.
- the joining condition here may be the same as in steps S203 and S205.
- in step S213, the data processing unit 70 identifies one edge from among the edges remaining after the edge combination processing in step S212.
- at this point, the corneal image contains one or two edges based on the one or two reference edges. If only one edge exists, that edge is specified.
- if two edges remain, the edge located on the upper side, that is, on the front side of the cornea, may be specified.
- alternatively, the closer end points of the two edges may be connected by a straight line to form one edge, which is then specified.
- in step S214, the data processing unit 70 performs smoothing processing on the edge identified in step S213.
- a well-known method may be used for the smoothing process; it is not particularly limited, but the smoothing is adjusted so as not to excessively correct the irregularities shown in the region 204 in FIG. 2.
- in step S215, the data processing unit 70 identifies the edge smoothed in step S214 as the boundary of Bowman's membrane.
- FIG. 4C is a corneal image 420 showing the boundary of Bowman's membrane identified by the processing up to this point.
- in step S216, the data processing unit 70 associates the position (coordinates) of the boundary specified in step S215 in the corneal image with the original OCT data.
- the coordinates and the like are associated based on the respective processing parameters used for the boundary detection of the corneal epithelium (step S102), the alignment processing (step S103), and the region extraction (step S104). This processing flow then ends, and the process proceeds to step S106.
- FIGS. 8A to 8C are diagrams for explaining artifacts that occur when performing OCT measurement.
- a corneal image 800 as shown in FIG. 8A may be obtained as a result of the OCT measurement of the subject's eye.
- a strong vertical line artifact may occur at a position corresponding to the corneal vertex.
- such artifacts can be caused by reflection from surfaces perpendicular to the incident light during OCT measurement.
- as in the corneal image 810 shown in FIG. 8B, the edge in the region 811 where the artifact is located cannot be detected appropriately.
- as a result, the distance between the edges to be detected increases. This distance varies depending on the imaging conditions, so it is difficult to define correction parameters that cover all imaging conditions. In addition, if appropriate conditions are not set, noise and the like are excessively included, lowering the detection accuracy.
- in the anterior segment analysis method described above, the two longest edges (the first edge and the second edge) are specified, and the boundary is detected based on them. Therefore, even if an artifact is included, as in the corneal image 800 of FIG. 8A, the boundary can be specified appropriately.
- [Display screen] An example of a display screen based on the analysis results obtained by the anterior segment analysis method according to the present embodiment will be described.
- the display screen shown below is displayed on the UI unit 110, for example.
- FIGS. 9A and 9B show examples of display screens that superimpose the detected boundaries on OCT data.
- FIG. 9A shows an example of OCT data 900 resulting from performing an OCT measurement on the cornea.
- FIG. 9B is an example of a display image 910 obtained by applying the anterior segment analysis method according to the present embodiment to the OCT data 900 to detect each boundary and superimposing the boundaries.
- the displayed image 910 shows three boundaries, the boundary on the front side of the corneal epithelium, the boundary between the corneal epithelium and Bowman's membrane, and the boundary on the fundus side of the corneal endothelium in order from the top.
- the OCT data as shown in FIG. 9A and the display screen as shown in FIG. 9B may be switchable, or both may be displayed side by side.
- the display method of the detected boundary is not particularly limited, and for example, the line type and line color may be switchable.
- FIG. 10 shows an example of a display screen 1000 for showing the membrane thickness derived based on the boundary obtained by the anterior segment analysis processing according to the present invention.
- a plurality of OCT data obtained by performing a plurality of OCT scans of the eye to be examined are each subjected to the anterior segment analysis processing to identify the boundaries, and the thickness of the corneal epithelium over the entire cornea is derived.
- the thickness of the corneal epithelium corresponds to, for example, the distance from the front boundary of the corneal epithelium to the boundary between the corneal epithelium and Bowman's membrane. The thickness may be determined based on the length per pixel in OCT data.
- the thickness is derived from the boundaries obtained from each of the plurality of OCT data (for example, 12 OCT data obtained by radial scanning), yielding a sparse thickness map. Furthermore, a dense thickness map may be generated by linear interpolation or the like from the thickness at each position on the cornea corresponding to each of the plurality of OCT data. Note that the interpolation method is not limited to linear interpolation, and other methods may be used.
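- the per-position thickness derivation and gap-filling by linear interpolation can be sketched along one scan line as follows. The µm-per-pixel scale, the function name, and the boundary representation are illustrative assumptions; the source only says thickness may be determined from the length per pixel.

```python
# Hypothetical sketch: epithelial thickness per column from two boundaries,
# with columns where detection failed filled by linear interpolation.
UM_PER_PIXEL = 3.9  # illustrative axial scale, not from the source

def epithelium_thickness(front, bowman, um_per_px=UM_PER_PIXEL):
    """front/bowman: boundary row per column; None where detection failed."""
    t = [None if f is None or b is None else (b - f) * um_per_px
         for f, b in zip(front, bowman)]
    known = [i for i, v in enumerate(t) if v is not None]
    for i, v in enumerate(t):
        if v is None:  # linear interpolation between nearest known columns
            lo = max(k for k in known if k < i)
            hi = min(k for k in known if k > i)
            w = (i - lo) / (hi - lo)
            t[i] = t[lo] * (1 - w) + t[hi] * w
    return t

t = epithelium_thickness([10, 10, 10, 10], [24, None, 26, 24])
print([round(v, 1) for v in t])  # → [54.6, 58.5, 62.4, 54.6]
```

Applying this across all scans, and then interpolating between scan lines, would produce the dense map shown as the thickness map 1001.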
- the number of measurements, the direction of measurement, etc. are not particularly limited.
- the OCT data may be obtained by performing radial scanning while rotating the scan line by a predetermined angle about the center of the cornea as the rotation axis.
- OCT data may be acquired by performing raster scanning multiple times so as to be parallel in a predetermined direction (for example, vertical direction).
- OCT data may be acquired based on various scan patterns as described above.
- a thickness map 1001 indicates the entire area of the cornea, and indicates the thickness of the corneal epithelium within the cornea in gradation.
- the cornea is divided into 25 regions, but the number of divided regions is not particularly limited.
- the numerical value shown in each region indicates the average value of the thickness in that region.
- "N" shown on the left side of the thickness map 1001 indicates the nasal side of the subject's eye
- "T” indicated on the right side of the thickness map 1001 indicates the temporal side of the subject's eye.
- a scale 1002 indicates the thickness value for the gradation indicated by the thickness map 1001 .
- here, the thickness is shown in the range of 20 to 80 [µm].
- a scale 1002 indicates the length in the horizontal direction of the thickness map 1001, and indicates a range of -4.5 to 4.5 [mm] with the center of the cornea as 0.
- a parameter group 1004 indicates values of various parameters corresponding to the thickness map 1001 .
- An icon ⁇ corresponding to the minimum thickness value and an icon ⁇ corresponding to the maximum thickness value are shown in the thickness map 1001 to indicate respective detection positions.
- the contents and configuration displayed on the display screen 1000 are examples, and other configurations may be used.
- the parameters to be displayed may be configured so that the user can designate them.
- the numerical values displayed on the thickness map 1001 may be configured to be switchable between display and non-display.
- the thickness map 1001 corresponding to one eye to be examined is displayed, but both eyes or comparison objects (for example, past measurement results) may be displayed side by side.
- as described above, according to the present embodiment, it is possible to appropriately segment the local unevenness of the layers forming the cornea in the anterior segment.
- the present invention can also be implemented by supplying a program or application that realizes the functions of one or more of the embodiments described above to a system or device via a network or a storage medium, and having one or more processors in the computer of the system or device read and execute the program.
- the processor includes, for example, a CPU (Central Processing Unit), GPU (Graphics Processing Unit), ASIC (Application Specific Integrated Circuit), programmable logic device (e.g., SPLD (Simple Programmable Logic Device), CPLD (Complex Programmable Logic Device), or FPGA (Field Programmable Gate Array)), or other general-purpose or dedicated circuit.
- a program for causing a computer to execute not only the method shown as an example above but also the control method for controlling the above device can be stored in any computer-readable recording medium.
- examples of the recording medium include semiconductor memories, optical disks, magneto-optical disks (CD-ROM, DVD-RAM, DVD-ROM, MO, etc.), and magnetic storage media (hard disk, floppy (registered trademark) disk, ZIP, etc.). This program can also be transmitted and received over a network such as the Internet or a LAN.
- the configuration may be such that the operation of the device is controlled based on the result of processing executed by a processing unit arranged on a network or on the basis of a control instruction.
- the present invention is not limited to the above-described embodiments; modifications, adaptations, omissions, and additions that those skilled in the art may make by combining the configurations of the embodiments with each other, or based on the description of the specification and well-known technology, are also contemplated by the present invention and fall within the scope of protection sought.
- an acquisition unit (for example, the image forming unit 60)
- a detection unit (for example, the data processing unit 70)
- a first connecting unit (for example, the data processing unit 70)
- a selection unit (for example, the data processing unit 70)
- a second connecting unit (for example, the data processing unit 70)
- a determination unit (for example, the data processing unit 70)
- the detection unit identifies the front boundary of the corneal epithelium in the tomogram and detects the plurality of edges in a predetermined range from the identified front boundary;
- the anterior segment analysis device characterized in that: According to this configuration, it is possible to reduce the processing load associated with boundary detection and to detect boundaries efficiently by performing detection in a predetermined range for the plurality of layers that constitute the cornea.
- the tomographic image is divided into a first region that is a region on the corneal vertex side of the eye to be examined and a second region that is a region on the sclera side of the cornea of the eye to be examined;
- the first coupling unit performs processing on the first region
- the second coupling unit performs processing on the first region and the second region
- the anterior segment analysis device according to (1) or (2) characterized by: According to this configuration, more accurate detection processing is performed on the vertex side of the cornea, and outer detection is performed based on the result. As a result, it is possible to reduce the processing load associated with boundary detection and efficiently detect boundaries without reducing the detection accuracy in the entire boundary detection area.
- the second combining unit differs in a combining condition for combining edges in the first region from a combining condition for combining edges in the second region;
- the anterior segment analysis device according to (3) characterized by: According to this configuration, it is possible to reduce the processing load associated with boundary detection and efficiently detect boundaries without lowering the detection accuracy in the entire boundary detection area. In particular, edges can be detected more efficiently in the second region.
- the anterior segment analysis device characterized by: With this configuration, it is possible to efficiently identify edges that are more likely to be boundaries based on their length.
- the second connecting portion may perform the combining with the first edge as a reference.
- the anterior segment analysis device according to any one of (3) to (5), characterized by: According to this configuration, even if the edges included in the OCT image are interrupted in the width direction, it is possible to appropriately combine the edges and identify the boundary by interpolating the gap.
- the second coupling part is configured such that the first edge and the second edge of the cornea are in a range defined based on the imaging conditions for the OCT measurement in the first region. If not overlapping in the width direction, combine the first edge and the second edge to determine a boundary of the layers of the cornea;
- the anterior segment analysis device according to (6) characterized by: According to this configuration, even if an OCT image contains an artifact at a position such as the corneal vertex and the edge cannot be extracted due to the artifact, the edge can be interpolated and the boundary can be specified with high accuracy. becomes.
- the first combining condition and the second combining condition differ in at least one of the distance in the thickness direction of the cornea, the distance in the width direction of the cornea, the degree of overlap in the width direction of the cornea, and the edge intensity when combining edges; The anterior segment analysis device according to any one of (1) to (7), characterized by the above. According to this configuration, it is possible to appropriately combine the edges and specify the boundary by performing the combining process on the multiple edges included in the OCT image using the multiple combining conditions.
- the second combining unit further performs a smoothing process on the combined edges;
- the anterior segment analysis apparatus according to any one of (1) to (8), characterized by the above. According to this configuration, even if unevenness such as a step occurs during the edge combining process, the result can be adjusted to approach the actual boundary by performing smoothing.
- the anterior segment analysis apparatus according to any one of (1) to (9), characterized by the above. According to this configuration, the user of the ophthalmic system can easily grasp the detected layer structure of the cornea, promoting efficiency in diagnosis and the like.
- a derivation unit (e.g., data processing unit 70) that derives the thickness of the corneal epithelium of the subject's eye based on the boundary determined by the determination unit; and a generation unit (e.g., data processing unit 70) that generates a thickness map of the corneal epithelium of the subject's eye using the thickness derived by the derivation unit;
- the anterior segment analysis apparatus according to any one of (1) to (9), further comprising the above. According to this configuration, the thickness of the detected corneal layer structure can be expressed more visually, improving convenience for the user of the ophthalmic system.
- An anterior segment analysis method comprising the following steps. According to this configuration, the layers forming the cornea in the anterior segment can be appropriately segmented even where they have local unevenness.
- an acquisition step (e.g., step S101) of acquiring a tomographic image of the anterior segment, including the cornea, of a subject's eye formed by OCT measurement;
- a detection step (e.g., step S201) of detecting a plurality of edges included in the tomographic image;
- a first combining step (e.g., steps S203 and S205) of combining edges, for each of the plurality of edges, based on a first combining condition;
- a selection step (e.g., step S206) of selecting, based on length, a first edge and a second edge from among the edges combined in the first combining step;
- a second combining step (e.g., step S212) of combining edges based on a second combining condition, using each of the first edge and the second edge as a reference;
- a determination step (e.g., steps S213 and S215) of determining a boundary between layers of the cornea of the anterior segment using the edges combined in the second combining step.
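The two-stage flow above can be sketched as follows. This is a minimal illustration under stated assumptions, not the patented implementation: edges are modeled as lists of (x, y) points, the combining conditions are reduced to simple endpoint-distance checks, and all thresholds (`max_dx`, `max_dy`) are invented for the example.

```python
# Sketch of the two-stage edge-combining pipeline (illustrative only).
# Edges are lists of (x, y) points; thresholds are assumptions.

def combine_edges(edges, max_dx, max_dy):
    """Join edges whose adjacent endpoints lie within max_dx pixels in the
    width direction and max_dy pixels in the thickness direction."""
    edges = sorted(edges, key=lambda e: e[0][0])  # sort by left endpoint x
    merged = [edges[0]]
    for edge in edges[1:]:
        tail, head = merged[-1][-1], edge[0]
        if abs(head[0] - tail[0]) <= max_dx and abs(head[1] - tail[1]) <= max_dy:
            merged[-1] = merged[-1] + edge       # join into one edge
        else:
            merged.append(edge)
    return merged

def analyze(edges):
    # First combining step: permissive join of nearby fragments.
    first = combine_edges(edges, max_dx=5, max_dy=2)
    if len(first) < 2:            # a single edge already spans the region
        return first
    # Selection step: the two longest candidates become the references.
    first.sort(key=len, reverse=True)
    ref1, ref2 = first[0], first[1]
    # Second combining step: join the references under a stricter condition.
    return combine_edges([ref1, ref2], max_dx=3, max_dy=1)
```

The first pass joins nearby fragments permissively, the two longest results serve as the first and second edges, and the second pass joins those references under a stricter condition, mirroring the step order above.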
Landscapes
- Health & Medical Sciences (AREA)
- Engineering & Computer Science (AREA)
- Life Sciences & Earth Sciences (AREA)
- Physics & Mathematics (AREA)
- General Health & Medical Sciences (AREA)
- Medical Informatics (AREA)
- Surgery (AREA)
- Molecular Biology (AREA)
- Ophthalmology & Optometry (AREA)
- Biomedical Technology (AREA)
- Animal Behavior & Ethology (AREA)
- Veterinary Medicine (AREA)
- Biophysics (AREA)
- Public Health (AREA)
- Heart & Thoracic Surgery (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Theoretical Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Radiology & Medical Imaging (AREA)
- Quality & Reliability (AREA)
- Eye Examination Apparatus (AREA)
Abstract
Description
An acquisition unit that acquires a tomographic image of the anterior segment, including the cornea, of a subject's eye formed by OCT measurement;
a detection unit that detects a plurality of edges included in the tomographic image;
a first combining unit that, for each of the plurality of edges, combines edges based on a first combining condition;
a selection unit that selects, based on length, a first edge and a second edge from among the edges combined by the first combining unit;
a second combining unit that combines edges based on a second combining condition, using each of the first edge and the second edge as a reference; and
a determination unit that determines a boundary between layers of the cornea in the tomographic image using the edges combined by the second combining unit
are provided.
An acquisition step of acquiring a tomographic image of the anterior segment, including the cornea, of a subject's eye formed by OCT measurement;
a detection step of detecting a plurality of edges included in the tomographic image;
a first combining step of combining edges, for each of the plurality of edges, based on a first combining condition;
a selection step of selecting, based on length, a first edge and a second edge from among the edges combined in the first combining step;
a second combining step of combining edges based on a second combining condition, using each of the first edge and the second edge as a reference; and
a determination step of determining a boundary between layers of the cornea in the tomographic image using the edges combined in the second combining step
are included.
A computer is caused to execute:
an acquisition step of acquiring a tomographic image of the anterior segment, including the cornea, of a subject's eye formed by OCT measurement;
a detection step of detecting a plurality of edges included in the tomographic image;
a first combining step of combining edges, for each of the plurality of edges, based on a first combining condition;
a selection step of selecting, based on length, a first edge and a second edge from among the edges combined in the first combining step;
a second combining step of combining edges based on a second combining condition, using each of the first edge and the second edge as a reference; and
a determination step of determining a boundary between layers of the cornea of the anterior segment using the edges combined in the second combining step.
A case will now be described in which an ophthalmic system according to one embodiment of the present invention includes an anterior segment analysis apparatus. The ophthalmic system includes an OCT apparatus and acquires a tomographic image of a subject's eye by performing OCT measurement on the eye with that OCT apparatus. However, the configuration is not limited to this. For example, the anterior segment analysis apparatus may include an interface for transmitting and receiving data to and from an external device or a recording medium via a network, and may be configured to acquire OCT data, tomographic images, and the like from an external ophthalmic system.
FIG. 1 shows a configuration example of an ophthalmic system 1 that includes functions for executing the anterior segment analysis method according to this embodiment. Representative parts of a system to which the anterior segment analysis method according to the present invention can be applied are shown here, but the ophthalmic system 1 may further include components other than those shown in FIG. 1.
The refraction measurement unit 20 objectively measures the refractive power of the subject's eye E in response to control instructions from the control unit 80. The refraction measurement unit 20 includes an optical system provided with one or more optical members for performing objective refraction measurement. The refraction measurement unit 20 has, for example, the same configuration as a known refractometer. Although not shown, a typical refractometer includes a projection system and a light-receiving system, as disclosed in Japanese Patent Application Laid-Open No. 2016-077774.
The OCT unit 30 applies an OCT scan to the subject's eye E and acquires OCT data in response to control instructions from the control unit 80. The OCT data may be interference signal data, reflection intensity profile data obtained by applying a Fourier transform to the interference signal data, or image data obtained by imaging the reflection intensity profile data. This embodiment describes an example that uses image data (hereinafter, an OCT image).
The light projection unit 40 projects light onto the subject's eye E for aligning the eye E with the measurement unit 10 (the OCT unit 30 and the apparatus optical system). The light projection unit 40 includes a light source and a collimator lens. The optical path of the light projection unit 40 is coupled to the optical path of the refraction measurement unit 20 by the beam splitter BS2. Light output from the light source passes through the collimator lens, is reflected by the beam splitter BS2, and is projected onto the eye E through the optical path of the refraction measurement unit 20.
The beam splitter BS1 coaxially couples the optical path of the optical system of the OCT unit 30 (the interference optical system, etc.) to the optical path of the optical system (the projection system and the light-receiving system) of the refraction measurement unit 20. For example, a dichroic mirror is used as the beam splitter BS1. The beam splitter BS2 coaxially couples the optical path of the optical system of the light projection unit 40 to the optical path of the optical system (the projection system and the light-receiving system) of the refraction measurement unit 20. For example, a half mirror is used as the beam splitter BS2.
The imaging unit 100 includes one or more anterior segment cameras for photographing the anterior segment of the subject's eye E. The imaging unit 100 acquires an anterior segment image, which is a frontal image of the eye E. At least one anterior segment illumination light source (an infrared light source, etc.) may be provided near the one or more anterior segment cameras. For example, an anterior segment illumination light source may be provided above and below each anterior segment camera.
The control processing unit 50 executes the various calculations and controls for operating the ophthalmic system 1. The control processing unit 50 includes one or more processors and one or more storage devices. Examples of the storage devices include RAM (Random Access Memory), ROM (Read Only Memory), HDD (Hard Disk Drive), and SSD (Solid State Drive). Various computer programs are stored in the storage devices, and the calculations and controls according to this embodiment are realized by the processors operating based on those programs.
The image forming unit 60 forms an image (a tomographic image, etc.) of the subject's eye E based on OCT data obtained by performing OCT measurement on the eye E. The image forming unit 60 constructs OCT data (typically, image data) based on detection data from the detection system of the OCT unit 30. As in conventional OCT data processing, the image forming unit 60 constructs reflection intensity profile data for each A-line (the path of the measurement light within the eye E) by applying filtering, a fast Fourier transform (FFT), and the like to the detection data. Furthermore, the image forming unit 60 constructs image data (A-scan data) for each A-line by applying imaging processing (image representation) to the reflection intensity profile data. Note that part of the functions of the image forming unit 60 may be provided in the OCT unit 30.
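The A-line construction described in this paragraph (filtering, FFT, imaging of the reflection intensity profile) can be illustrated with a generic Fourier-domain OCT sketch. This is not the actual processing of the image forming unit 60; the DC removal, Hann window, and dB scaling are assumptions standing in for the unspecified filtering and image-representation steps.

```python
import numpy as np

def a_scan(interference_signal):
    """Build a reflection intensity profile for one A-line from a spectral
    interference signal (generic FD-OCT sketch, not the unit's actual code)."""
    sig = interference_signal - np.mean(interference_signal)  # remove DC term
    sig = sig * np.hanning(len(sig))                          # window (filtering step)
    profile = np.abs(np.fft.fft(sig))[: len(sig) // 2]        # depth-resolved amplitude
    return 20 * np.log10(profile + 1e-12)                     # dB scale for display

def b_scan(spectra):
    """A tomographic image (B-scan) is a stack of A-scans, one per lateral position."""
    return np.stack([a_scan(s) for s in spectra], axis=1)
```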
The data processing unit 70 can execute processing for aligning the measurement unit 10 with the subject's eye E. Examples of this alignment processing include analysis of a frontal image of the eye E acquired with the imaging unit 100, calculation of the position of the eye E, and calculation of the displacement of the measurement unit 10 relative to the eye E.
The control unit 80 controls each part of the ophthalmic system 1. The control unit 80 includes a storage device (not shown) as described above and can store various types of information. The information stored in the storage device includes, but is not limited to, programs for controlling each part of the ophthalmic system 1, information on the subject, information on the subject's eye, measurement data obtained by the measurement unit 10, and processing results from the data processing unit 70.
The movement mechanism 90 is a mechanism for moving the measurement unit 10, which houses the optical systems (the apparatus optical systems) such as the refraction measurement unit 20, the OCT unit 30, the light projection unit 40, and the beam splitters BS1 and BS2, in the vertical, horizontal, and front-back directions. The movement mechanism 90 can move the measurement unit 10 relative to the subject's eye E in response to control instructions from the control unit 80. For example, the movement mechanism 90 is provided with an actuator (not shown) that generates a driving force for moving the measurement unit 10 and a transmission mechanism (not shown) that transmits this driving force. The actuator is composed of, for example, a pulse motor. The transmission mechanism is composed of, for example, a combination of gears or a rack and pinion. The control unit 80 controls the movement mechanism 90 by sending control signals to the actuator.
As described above, the cornea Ec located in the anterior segment has a layered structure, including the corneal epithelium, Bowman's membrane, the corneal stroma, Descemet's membrane, and the corneal endothelium. FIG. 2 is a diagram for explaining a conventional example of detecting the boundaries of the layers of the cornea. Image 200 shows part of a cornea; here, an example is shown of detecting the boundary 201 on the anterior side of the corneal epithelium, the boundary 202 between the corneal epithelium and Bowman's membrane, and the boundary 203 on the fundus side of the corneal endothelium. The thickness of the cornea is approximately 0.5 mm. Of its layers, Bowman's membrane is approximately 8 to 14 μm thick, a thin layer compared with the other components such as the corneal epithelium and the corneal stroma.
The processing of the anterior segment analysis method according to one embodiment of the present invention will now be described. This processing is realized, for example, by a processor of the ophthalmic system 1 reading out and executing the various programs stored in a storage device, thereby operating as the control processing unit 50.
FIG. 6 is a flowchart showing the details of step S105 in FIG. 3.
FIGS. 8A to 8C are diagrams for explaining artifacts that occur during OCT measurement. OCT measurement of a subject's eye may yield a corneal image 800 such as that shown in FIG. 8A. In this case, as shown in region 801, a strong vertical-line artifact may occur at the position corresponding to the corneal vertex. Such artifacts can be caused by reflection from a location perpendicular to the incident light during OCT measurement. When analysis proceeds on an image containing an artifact, the edges in the region 811 where the artifact is located cannot be detected properly, as in the corneal image 810 shown in FIG. 8B. As a result, the edges that should be detected become separated by a gap. This gap varies with the imaging conditions, so it is difficult to define correction parameters or the like in advance that account for all imaging conditions. In addition, if the conditions are not set appropriately, noise and the like are included excessively and the detection accuracy decreases.
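One way to bridge the gap such an artifact leaves between edge fragments is linear interpolation between the surviving endpoints, with the maximum bridgeable gap chosen per the imaging conditions, as the passage notes. The following sketch is illustrative; the linear model and the `max_gap` parameter are assumptions.

```python
def bridge_gap(left_edge, right_edge, max_gap):
    """Join two edge fragments separated by an artifact by linearly
    interpolating across the gap, if the gap is at most max_gap pixels.
    max_gap would be chosen from the imaging conditions (assumption)."""
    x0, y0 = left_edge[-1]   # last point before the artifact
    x1, y1 = right_edge[0]   # first point after the artifact
    gap = x1 - x0
    if gap <= 0 or gap > max_gap:
        return None          # not bridgeable under this condition
    filled = [
        (x, y0 + (y1 - y0) * (x - x0) / gap)  # linear interpolation
        for x in range(x0 + 1, x1)
    ]
    return left_edge + filled + right_edge
```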
An example of a display screen based on the analysis results obtained by the anterior segment analysis method according to this embodiment will now be described. The display screens shown below are displayed, for example, on the UI unit 110.
The present invention can also be realized by processing in which a program or application that realizes the functions of one or more of the embodiments described above is supplied to a system or apparatus via a network, a storage medium, or the like, and one or more processors in a computer of that system or apparatus read out and execute the program.
(1) An acquisition unit (e.g., image forming unit 60) that acquires a tomographic image of the anterior segment, including the cornea, of a subject's eye formed by OCT measurement;
a detection unit (e.g., data processing unit 70) that detects a plurality of edges included in the tomographic image;
a first combining unit (e.g., data processing unit 70) that, for each of the plurality of edges, combines edges based on a first combining condition;
a selection unit (e.g., data processing unit 70) that selects, based on length, a first edge and a second edge from among the edges combined by the first combining unit;
a second combining unit (e.g., data processing unit 70) that combines edges based on a second combining condition, using each of the first edge and the second edge as a reference; and
a determination unit (e.g., data processing unit 70) that determines a boundary between layers of the cornea in the tomographic image using the edges combined by the second combining unit;
an anterior segment analysis apparatus (e.g., ophthalmic system 1, control processing unit 50) characterized by comprising the above.
According to this configuration, the layers forming the cornea in the anterior segment can be appropriately segmented even where they have local unevenness. In particular, the uneven shape of Bowman's membrane can be detected with high accuracy even when imaging of the anterior segment is unclear in the OCT measurement.
The detection unit identifies the anterior-side boundary of the corneal epithelium in the tomographic image, and
detects the plurality of edges within a predetermined range from the identified anterior-side boundary;
the anterior segment analysis apparatus according to (1), characterized by the above.
According to this configuration, by performing detection within a range defined in advance for the plurality of layers forming the cornea, the processing load associated with boundary detection can be reduced and boundaries can be detected efficiently.
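As a rough illustration of restricting detection to a predefined range, the following sketch searches for edges only within a fixed band below the anterior corneal surface. The gradient criterion, the adaptive threshold, and the band width are all assumptions; the actual detection method is not specified here.

```python
import numpy as np

def detect_edges_in_band(image, anterior_y, band):
    """Mark edge pixels only within `band` pixels below the anterior corneal
    surface (anterior_y[x] per column), reducing the search area."""
    grad = np.abs(np.diff(image.astype(float), axis=0))  # vertical intensity gradient
    mask = np.zeros_like(grad, dtype=bool)
    for x in range(grad.shape[1]):
        top = int(anterior_y[x])
        mask[top : top + band, x] = True                 # restrict to the band
    threshold = 0.5 * grad[mask].max()                   # simple adaptive threshold (assumption)
    return (grad > threshold) & mask
```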
The first combining unit performs processing on the first region, and
the second combining unit performs processing on the first region and the second region;
the anterior segment analysis apparatus according to (1) or (2), characterized by the above.
According to this configuration, detection processing with higher accuracy is performed on the vertex side of the cornea, and detection of the outer side is performed based on that result. This makes it possible to reduce the processing load associated with boundary detection and detect boundaries efficiently without lowering the detection accuracy over the entire boundary detection region.
The anterior segment analysis apparatus according to (3), characterized by the above.
According to this configuration, the processing load associated with boundary detection can be reduced and boundaries can be detected efficiently without lowering the detection accuracy over the entire boundary detection region. In particular, edges can be detected more efficiently in the second region.
The anterior segment analysis apparatus according to (3) or (4), characterized by the above.
According to this configuration, edges that are more likely to be boundaries can be identified efficiently based on their length.
The anterior segment analysis apparatus according to any one of (3) to (5), characterized by the above.
According to this configuration, even when edges in the OCT image are interrupted in the width direction, the gap between them can be interpolated, so that the edges can be combined appropriately and the boundary can be identified.
The anterior segment analysis apparatus according to (6), characterized by the above.
According to this configuration, even when the OCT image contains an artifact at a position such as the corneal vertex and an edge cannot be extracted because of the artifact, the edge can be interpolated and the boundary can be identified with high accuracy.
The anterior segment analysis apparatus according to any one of (1) to (7), characterized by the above.
According to this configuration, by performing combining processing on the plurality of edges included in the OCT image using a plurality of combining conditions, the edges can be combined appropriately and the boundary can be identified.
The anterior segment analysis apparatus according to any one of (1) to (8), characterized by the above.
According to this configuration, even if unevenness such as a step occurs during the edge combining process, the result can be adjusted to approach the actual boundary by performing smoothing.
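The smoothing described here could be as simple as a moving average over the boundary's depth values, which spreads out a step left by joining two edges. The window size is an assumption for illustration.

```python
def smooth_boundary(ys, window=5):
    """Moving-average smoothing of a boundary's y-values; steps created
    when edges were joined are flattened toward the true boundary."""
    half = window // 2
    out = []
    for i in range(len(ys)):
        lo, hi = max(0, i - half), min(len(ys), i + half + 1)
        out.append(sum(ys[lo:hi]) / (hi - lo))  # local average, clipped at the ends
    return out
```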
The anterior segment analysis apparatus according to any one of (1) to (9), characterized by the above.
According to this configuration, the user of the ophthalmic system can easily grasp the detected layer structure of the cornea, promoting efficiency in diagnosis and the like.
A generation unit (e.g., data processing unit 70) that generates a thickness map of the corneal epithelium of the subject's eye using the thickness derived by the derivation unit;
the anterior segment analysis apparatus according to any one of (1) to (9), further comprising the above.
According to this configuration, the thickness of the detected corneal layer structure can be expressed more visually, improving convenience for the user of the ophthalmic system.
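As a rough sketch of the derivation and generation units, per-position epithelial thickness can be computed as the axial distance between the anterior-surface boundary and the epithelium/Bowman's-membrane boundary and collected across B-scans into a map. The boundary inputs, pixel scale, and flat map layout are assumptions; an actual thickness map would typically be rendered radially with a color scale.

```python
def epithelial_thickness(anterior_ys, bowman_ys, pixel_mm):
    """Derive corneal-epithelium thickness at each lateral position as the
    axial distance (in mm) between the anterior surface and the boundary
    with Bowman's membrane; this is the input to a thickness map."""
    return [(b - a) * pixel_mm for a, b in zip(anterior_ys, bowman_ys)]

def thickness_map(scans, pixel_mm):
    """One row per B-scan -> a 2-D thickness map (flat layout for brevity)."""
    return [epithelial_thickness(a, b, pixel_mm) for a, b in scans]
```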
A detection step (e.g., step S201) of detecting a plurality of edges included in the tomographic image;
a first combining step (e.g., steps S203 and S205) of combining edges, for each of the plurality of edges, based on a first combining condition;
a selection step (e.g., step S206) of selecting, based on length, a first edge and a second edge from among the edges combined in the first combining step;
a second combining step (e.g., step S212) of combining edges based on a second combining condition, using each of the first edge and the second edge as a reference;
a determination step (e.g., steps S213 and S215) of determining a boundary between layers of the cornea in the tomographic image using the edges combined in the second combining step;
an anterior segment analysis method characterized by comprising the above.
According to this configuration, the layers forming the cornea in the anterior segment can be appropriately segmented even where they have local unevenness. In particular, the uneven shape of Bowman's membrane can be detected with high accuracy even when imaging of the anterior segment is unclear in the OCT measurement.
An acquisition step (e.g., step S101) of acquiring a tomographic image of the anterior segment, including the cornea, of a subject's eye formed by OCT measurement;
a detection step (e.g., step S201) of detecting a plurality of edges included in the tomographic image;
a first combining step (e.g., steps S203 and S205) of combining edges, for each of the plurality of edges, based on a first combining condition;
a selection step (e.g., step S206) of selecting, based on length, a first edge and a second edge from among the edges combined in the first combining step;
a second combining step (e.g., step S212) of combining edges based on a second combining condition, using each of the first edge and the second edge as a reference;
a determination step (e.g., steps S213 and S215) of determining a boundary between layers of the cornea of the anterior segment using the edges combined in the second combining step;
a program for causing a computer to execute the above.
According to this configuration, the layers forming the cornea in the anterior segment can be appropriately segmented even where they have local unevenness. In particular, the uneven shape of Bowman's membrane can be detected with high accuracy even when imaging of the anterior segment is unclear in the OCT measurement.
10…Measurement unit
20…Refraction measurement unit
30…OCT (Optical Coherence Tomography) unit
40…Light projection unit
50…Control processing unit
60…Image forming unit
70…Data processing unit
80…Control unit
90…Movement mechanism
100…Imaging unit
110…UI (User Interface) unit
BS1, BS2…Beam splitters
E…Subject's eye
Ef…Fundus
Ec…Cornea
Claims (13)
- An acquisition unit that acquires a tomographic image of the anterior segment, including the cornea, of a subject's eye formed by OCT measurement;
a detection unit that detects a plurality of edges included in the tomographic image;
a first combining unit that, for each of the plurality of edges, combines edges based on a first combining condition;
a selection unit that selects, based on length, a first edge and a second edge from among the edges combined by the first combining unit;
a second combining unit that combines edges based on a second combining condition, using each of the first edge and the second edge as a reference; and
a determination unit that determines a boundary between layers of the cornea in the tomographic image using the edges combined by the second combining unit;
an anterior segment analysis apparatus characterized by comprising the above. - The detection unit
identifies the anterior-side boundary of the corneal epithelium in the tomographic image, and
detects the plurality of edges within a predetermined range from the identified anterior-side boundary;
the anterior segment analysis apparatus according to claim 1, characterized by the above. - The tomographic image is divided into a first region, which is a region on the corneal vertex side of the subject's eye, and a second region, which is a region on the scleral side of the cornea of the subject's eye;
the first combining unit performs processing on the first region; and
the second combining unit performs processing on the first region and the second region;
the anterior segment analysis apparatus according to claim 1 or 2, characterized by the above. - The second combining unit uses a different combining condition when combining edges in the first region from the combining condition used when combining edges in the second region;
the anterior segment analysis apparatus according to claim 3, characterized by the above. - In the first region, when the first edge and the second edge overlap in the width direction of the cornea by a predetermined length or more, the second combining unit determines the longer of the first edge and the second edge as the boundary between the layers of the cornea;
the anterior segment analysis apparatus according to claim 3 or 4, characterized by the above. - In the first region, when the first edge and the second edge do not overlap in the width direction of the cornea, the second combining unit combines the first edge and the second edge and determines the result as the boundary between the layers of the cornea;
the anterior segment analysis apparatus according to any one of claims 3 to 5, characterized by the above. - When the first edge and the second edge do not overlap in the width direction of the cornea within a range in the first region defined based on the imaging conditions of the OCT measurement, the second combining unit combines the first edge and the second edge and determines the result as the boundary between the layers of the cornea;
the anterior segment analysis apparatus according to claim 6, characterized by the above. - The first combining condition and the second combining condition differ in at least one of the distance in the thickness direction of the cornea, the distance in the width direction of the cornea, the degree of overlap in the width direction of the cornea, and the edge strength when combining edges;
the anterior segment analysis apparatus according to any one of claims 1 to 7, characterized by the above. - The second combining unit further performs smoothing on the combined edges;
the anterior segment analysis apparatus according to any one of claims 1 to 8, characterized by the above. - The apparatus further comprises a display control unit that superimposes the boundary between the layers of the cornea determined by the determination unit on the tomographic image in an identifiable manner;
the anterior segment analysis apparatus according to any one of claims 1 to 9, characterized by the above. - A derivation unit that derives the thickness of the corneal epithelium of the subject's eye based on the boundary determined by the determination unit; and
a generation unit that generates a thickness map of the corneal epithelium of the subject's eye using the thickness derived by the derivation unit;
the anterior segment analysis apparatus according to any one of claims 1 to 9, further comprising the above. - An acquisition step of acquiring a tomographic image of the anterior segment, including the cornea, of a subject's eye formed by OCT measurement;
a detection step of detecting a plurality of edges included in the tomographic image;
a first combining step of combining edges, for each of the plurality of edges, based on a first combining condition;
a selection step of selecting, based on length, a first edge and a second edge from among the edges combined in the first combining step;
a second combining step of combining edges based on a second combining condition, using each of the first edge and the second edge as a reference; and
a determination step of determining a boundary between layers of the cornea in the tomographic image using the edges combined in the second combining step;
an anterior segment analysis method characterized by comprising the above. - A program for causing a computer to execute:
an acquisition step of acquiring a tomographic image of the anterior segment, including the cornea, of a subject's eye formed by OCT measurement;
a detection step of detecting a plurality of edges included in the tomographic image;
a first combining step of combining edges, for each of the plurality of edges, based on a first combining condition;
a selection step of selecting, based on length, a first edge and a second edge from among the edges combined in the first combining step;
a second combining step of combining edges based on a second combining condition, using each of the first edge and the second edge as a reference; and
a determination step of determining a boundary between layers of the cornea of the anterior segment using the edges combined in the second combining step.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP22849446.4A EP4378373A1 (en) | 2021-07-26 | 2022-07-25 | Anterior eye part analysis device, anterior eye part analysis method, and program |
CN202280052549.8A CN117858654A (zh) | 2021-07-26 | 2022-07-25 | 前眼部解析装置、前眼部解析方法以及程序 |
US18/423,609 US20240169547A1 (en) | 2021-07-26 | 2024-01-26 | Anterior segment analysis apparatus, anterior segment analysis method, and non-transitory computer-readable storage medium |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2021121653A JP2023017402A (ja) | 2021-07-26 | 2021-07-26 | 前眼部解析装置、前眼部解析方法、およびプログラム |
JP2021-121653 | 2021-07-26 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US18/423,609 Continuation US20240169547A1 (en) | 2021-07-26 | 2024-01-26 | Anterior segment analysis apparatus, anterior segment analysis method, and non-transitory computer-readable storage medium |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2023008385A1 true WO2023008385A1 (ja) | 2023-02-02 |
Family
ID=85086969
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2022/028662 WO2023008385A1 (ja) | 2021-07-26 | 2022-07-25 | 前眼部解析装置、前眼部解析方法、およびプログラム |
Country Status (5)
Country | Link |
---|---|
US (1) | US20240169547A1 (ja) |
EP (1) | EP4378373A1 (ja) |
JP (1) | JP2023017402A (ja) |
CN (1) | CN117858654A (ja) |
WO (1) | WO2023008385A1 (ja) |
Citations (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2011189113A (ja) * | 2010-02-17 | 2011-09-29 | Canon Inc | 眼科画像表示装置、眼科画像表示方法、プログラム、及び記憶媒体 |
JP2013248376A (ja) | 2012-05-01 | 2013-12-12 | Topcon Corp | 眼科装置 |
JP2014500096A (ja) * | 2010-12-03 | 2014-01-09 | オプトビュー,インコーポレーテッド | 光干渉断層法を用いたスキャン及び処理 |
JP2015506772A (ja) * | 2012-02-10 | 2015-03-05 | カール ツアイス メディテック アクチエンゲゼルシャフト | フルレンジ・フーリエ領域光干渉断層撮影用セグメント化および高性能可視化技術 |
JP2015066084A (ja) * | 2013-09-27 | 2015-04-13 | 株式会社トーメーコーポレーション | 2次元断層画像処理装置、プログラムおよび2次元断層画像処理方法 |
JP2016077774A (ja) | 2014-10-22 | 2016-05-16 | 株式会社トプコン | 眼科装置 |
JP2017182739A (ja) * | 2016-03-31 | 2017-10-05 | 富士通株式会社 | 視線検出装置、視線検出方法及び視線検出用コンピュータプログラム |
JP6580448B2 (ja) | 2015-10-16 | 2019-09-25 | 株式会社トプコン | 眼科撮影装置及び眼科情報処理装置 |
JP2020048857A (ja) | 2018-09-27 | 2020-04-02 | 国立大学法人 筑波大学 | 眼測定装置及び方法 |
JP2020199106A (ja) | 2019-06-11 | 2020-12-17 | 株式会社トプコン | 眼科情報処理装置、眼科装置、眼科情報処理方法、及びプログラム |
JP2021121653A (ja) | 2020-01-31 | 2021-08-26 | Jnc株式会社 | コーティング組成物およびその利用品 |
- 2021-07-26 JP JP2021121653A patent/JP2023017402A/ja active Pending
- 2022-07-25 CN CN202280052549.8A patent/CN117858654A/zh active Pending
- 2022-07-25 EP EP22849446.4A patent/EP4378373A1/en active Pending
- 2022-07-25 WO PCT/JP2022/028662 patent/WO2023008385A1/ja active Application Filing
- 2024-01-26 US US18/423,609 patent/US20240169547A1/en active Pending
Also Published As
Publication number | Publication date |
---|---|
US20240169547A1 (en) | 2024-05-23 |
JP2023017402A (ja) | 2023-02-07 |
CN117858654A (zh) | 2024-04-09 |
EP4378373A1 (en) | 2024-06-05 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9918634B2 (en) | Systems and methods for improved ophthalmic imaging | |
JP6057567B2 (ja) | 撮像制御装置、眼科撮像装置、撮像制御方法及びプログラム | |
US7828437B2 (en) | Fundus oculi observation device and fundus oculi image processing device | |
US8899749B2 (en) | Image processing apparatus, image processing method, image processing system, SLO apparatus, and program | |
JP4971872B2 (ja) | 眼底観察装置及びそれを制御するプログラム | |
US11826102B2 (en) | Ophthalmic device, control method therefor, and recording medium | |
JP7384987B2 (ja) | 眼科装置 | |
JP2023076659A (ja) | 眼科装置 | |
JP2015080679A (ja) | 画像処理装置、画像処理方法およびプログラム | |
JP6497872B2 (ja) | 画像処理装置、画像処理方法及びプログラム | |
WO2020050308A1 (ja) | 画像処理装置、画像処理方法及びプログラム | |
JP7106304B2 (ja) | 画像処理装置、画像処理方法及びプログラム | |
JP7297952B2 (ja) | 情報処理装置、情報処理方法およびプログラム | |
JP6853690B2 (ja) | 眼科撮影装置 | |
WO2023008385A1 (ja) | 前眼部解析装置、前眼部解析方法、およびプログラム | |
JP2019187551A (ja) | 画像処理装置、画像処理方法及びプログラム | |
JP7204345B2 (ja) | 画像処理装置、画像処理方法及びプログラム | |
WO2023282339A1 (ja) | 画像処理方法、画像処理プログラム、画像処理装置及び眼科装置 | |
JP2024099210A (ja) | 光コヒーレンストモグラフィ装置、その制御方法、及びプログラム | |
JP2023148522A (ja) | 眼科画像処理装置、眼科画像処理プログラム | |
JP2019201718A (ja) | 画像処理装置、画像処理方法及びプログラム | |
JP2019084400A (ja) | 画像処理装置、画像処理方法及びプログラム |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 22849446 Country of ref document: EP Kind code of ref document: A1 |
WWE | Wipo information: entry into national phase |
Ref document number: 202280052549.8 Country of ref document: CN |
WWE | Wipo information: entry into national phase |
Ref document number: 2022849446 Country of ref document: EP |
NENP | Non-entry into the national phase |
Ref country code: DE |
ENP | Entry into the national phase |
Ref document number: 2022849446 Country of ref document: EP Effective date: 20240226 |