WO2023182011A1 - Image processing method, image processing device, ophthalmic apparatus, and program - Google Patents
- Publication number
- WO2023182011A1 PCT/JP2023/009452 JP2023009452W
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- light
- optical system
- numerical aperture
- image processing
- dimensional
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N21/00—Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
- G01N21/17—Systems in which incident light is modified in accordance with the properties of the material investigated
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B3/00—Apparatus for testing the eyes; Instruments for examining the eyes
- A61B3/10—Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B3/00—Apparatus for testing the eyes; Instruments for examining the eyes
- A61B3/0016—Operational features thereof
- A61B3/0025—Operational features thereof characterised by electronic signal processing, e.g. eye models
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B3/00—Apparatus for testing the eyes; Instruments for examining the eyes
- A61B3/10—Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
- A61B3/102—Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions for optical coherence tomography [OCT]
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T11/00—2D [Two Dimensional] image generation
Definitions
- The technology of the present disclosure relates to an image processing method, an image processing device, an ophthalmologic apparatus, and a program.
- An apparatus based on optical coherence tomography can measure the structure of the eye to be examined using reflected light from different layers in the fundus, and can obtain one-dimensional depth-direction data, two-dimensional tomographic images, and three-dimensional tomographic images.
- A technique is known that uses optical coherence tomography to generate data regarding the structure of an eye to be examined (US Pat. No. 10,238,281).
- The first aspect of the technology of the present disclosure is an image processing method performed by a processor of an image processing device. Light from a light source is irradiated onto an irradiated object by an optical system having a first numerical aperture, and the return light from the irradiated object due to the irradiated light is propagated as signal light by an optical system having a second numerical aperture. The method comprises: obtaining information indicating interference light obtained by detecting interference between the signal light and reference light split from the light from the light source; a first process of projecting the information indicating the interference light, in a four-dimensional space spanned by the frequency of the light source and the three-dimensional frequency indicating the irradiated object, onto a four-dimensional frequency aperture formed by the optical system having the first numerical aperture and the optical system having the second numerical aperture; and a second process of projecting the projected information onto a three-dimensional space.
- The second aspect of the technology of the present disclosure is an image processing device including a memory and a processor. The processor obtains information indicating interference light obtained by detecting interference between reference light split from the light of a light source and signal light, where the light from the light source is irradiated onto an irradiated object by an optical system having a first numerical aperture and the return light from the irradiated object due to the irradiated light is propagated by an optical system having a second numerical aperture. The processor performs a first process of projecting the information indicating the interference light, in a four-dimensional space spanned by the frequency of the light source and the three-dimensional frequency indicating the irradiated object, onto a four-dimensional frequency aperture formed by the optical system having the first numerical aperture and the optical system having the second numerical aperture, and a second process of projecting the projected information onto a three-dimensional space.
- The third aspect of the technology of the present disclosure is a program that causes a computer to execute the following. Light from a light source is irradiated onto the eye to be examined through an optical system having a first numerical aperture, and the return light from the eye due to the irradiated light is propagated as signal light by an optical system having a second numerical aperture. The program causes the computer to obtain information indicating interference light obtained by detecting interference between the signal light and reference light split from the light from the light source, and to execute a first process of projecting the information indicating the interference light, in a four-dimensional space spanned by the frequency of the light source and the frequency of light from the eye to be examined, onto a four-dimensional frequency aperture formed by the optical system having the first numerical aperture and the optical system having the second numerical aperture, and a second process of projecting the projected information onto a three-dimensional space.
- FIG. 1 is a schematic configuration diagram of an ophthalmologic system according to an embodiment.
- FIG. 1 is a schematic configuration diagram of an ophthalmologic apparatus according to an embodiment.
- Another drawing is a conceptual diagram of an OCT image.
- FIG. 2 is a conceptual diagram of an example showing information regarding an OCT image in an OCT system in a four-dimensional frequency space.
- FIG. 4 is a conceptual diagram showing an example of a four-dimensional aperture A4 .
- FIG. 4 is a conceptual diagram showing a part of a four-dimensional aperture A4 .
- FIG. 2 is an explanatory diagram conceptually illustrating correction processing of an OCT image using a double projection method.
- Another drawing is a conceptual block diagram of the optical system in the OCT system.
- FIG. 3 is an explanatory diagram of functions realized by an image processing program.
- Another drawing is a flowchart showing an example of the flow of image processing.
- FIG. 2 is a conceptual diagram showing an example of an OCT image according to the present embodiment.
- FIG. 2 is a conceptual diagram showing a comparative example of OCT images.
- FIG. 1 shows a schematic configuration of an ophthalmologic system 100.
- The ophthalmologic system 100 includes an ophthalmologic apparatus 110, a server device (hereinafter referred to as "server") 140, and a display device (hereinafter referred to as "viewer") 150.
- the ophthalmologic apparatus 110 acquires fundus images.
- The server 140 stores, in association with each patient ID, a plurality of fundus images obtained by photographing the fundi of a plurality of patients with the ophthalmologic apparatus 110, together with the axial length measured by an axial length measuring device (not shown).
- the viewer 150 displays fundus images and analysis results obtained by the server 140.
- The subject's eye is an example of the "irradiated object" of the technology of the present disclosure.
- the ophthalmological device 110 is an example of an “ophthalmological device” according to the technology of the present disclosure.
- the ophthalmological apparatus is also an example of the "image processing apparatus" of the technology of the present disclosure.
- the ophthalmological apparatus 110, server 140, and viewer 150 are interconnected via a network 130.
- the network 130 is any network such as a LAN, WAN, the Internet, or a wide area Ethernet network.
- a LAN can be used as the network 130.
- The viewer 150 is a client in a client-server system, and a plurality of viewers may be connected via the network. A plurality of servers 140 may also be connected via the network to ensure system redundancy. If the ophthalmologic apparatus 110 is provided with an image processing function and the image viewing function of the viewer 150, fundus image acquisition, image processing, and image viewing become possible in a standalone configuration. Likewise, if the server 140 is provided with the image viewing function of the viewer 150, the combination of the ophthalmologic apparatus 110 and the server 140 enables fundus image acquisition, image processing, and image viewing.
- Other ophthalmic devices (examination devices for visual field measurement, intraocular pressure measurement, and the like) and diagnostic support devices that perform image analysis using AI (Artificial Intelligence) may be connected to the ophthalmologic apparatus 110, the server 140, and the viewer 150 via the network 130.
- SLO: scanning laser ophthalmoscope
- OCT: optical coherence tomography
- In the following, the horizontal direction is the "X direction", the direction perpendicular to the horizontal plane is the "Y direction", and the direction connecting the center of the pupil in the anterior segment of the eye 12 to be examined and the center of the eyeball is the "Z direction". The X, Y, and Z directions are therefore perpendicular to one another.
- the ophthalmological apparatus 110 includes an imaging device 14 and a control device 16.
- the photographing device 14 includes an SLO unit 18 and an OCT unit 20, and acquires a fundus image of the eye 12 to be examined.
- the two-dimensional fundus image acquired by the SLO unit 18 will be referred to as an SLO image.
- a retinal tomographic image, en-face image, or the like created based on OCT data acquired by the OCT unit 20 may be referred to as an OCT image.
- The control device 16 is equipped with a computer having a CPU (Central Processing Unit) 16A, a RAM (Random Access Memory) 16B, a ROM (Read-Only Memory) 16C, and an input/output port (I/O) 16D.
- the control device 16 includes an input/display device 16E connected to the CPU 16A via an I/O port 16D.
- The input/display device 16E has a graphical user interface that displays an image of the eye 12 to be examined and receives various instructions from the user.
- An example of such a graphical user interface is a touch-panel display.
- the control device 16 also includes an image processor 17 connected to the I/O port 16D.
- the image processor 17 generates an image of the eye 12 based on the data obtained by the imaging device 14.
- Control device 16 is connected to network 130 via communication interface 16F.
- the image processor 17 includes a memory 17M, which is a nonvolatile storage device that can store an image processing program to be described later.
- the control device 16 of the ophthalmological device 110 includes the input/display device 16E, but the technology of the present disclosure is not limited thereto.
- the control device 16 of the ophthalmologic device 110 may not include the input/display device 16E, but may include a separate input/display device that is physically independent of the ophthalmologic device 110.
- the display device may include an image processing processor unit, and the image processing processor unit may display the SLO image or the like based on the image signal output from the ophthalmological apparatus 110.
- the photographing device 14 operates under the control of the CPU 16A of the control device 16.
- the imaging device 14 includes an SLO unit 18, an imaging optical system 19, and an OCT unit 20.
- the photographing optical system 19 includes an optical scanner 22 and a wide-angle optical system 30.
- the optical scanner 22 two-dimensionally scans the light emitted from the SLO unit 18 in the X direction and the Y direction.
- The optical scanner 22 may be any optical element that can deflect a light beam; for example, a polygon mirror, a galvano mirror, or a combination thereof can be used.
- the wide-angle optical system 30 combines the light from the SLO unit 18 and the light from the OCT unit 20.
- the wide-angle optical system 30 may be a reflective optical system using a concave mirror such as an elliptical mirror, a refractive optical system using a wide-angle lens, or a catadioptric optical system combining a concave mirror and lenses.
- By using a wide-angle optical system with an elliptical mirror, a wide-angle lens, or the like, it is possible to photograph not only the central part of the fundus but also the retina in the peripheral part of the fundus.
- the wide-angle optical system 30 realizes observation in a wide field of view (FOV) region 12A in the fundus.
- the FOV region 12A indicates the range that can be photographed by the photographing device 14.
- the FOV area 12A may be expressed as a viewing angle.
- the viewing angle may be defined by an internal illumination angle and an external illumination angle.
- The external irradiation angle is the angle of the light beam irradiated from the ophthalmologic apparatus 110 to the eye 12 to be examined, defined with the pupil 27 as a reference.
- The internal illumination angle is the angle of the light beam irradiated to the fundus F, defined with the eyeball center O as a reference.
- the external illumination angle and the internal illumination angle have a corresponding relationship. For example, if the external illumination angle is 120 degrees, the internal illumination angle corresponds to approximately 160 degrees. In this embodiment, the internal illumination angle is 200 degrees.
- An SLO fundus image obtained by photographing at an internal illumination angle of 160 degrees or more is referred to as a UWF-SLO fundus image.
- UWF is an abbreviation for UltraWide Field.
- the wide-angle optical system 30 with a super-wide viewing angle (FOV) of the fundus of the eye can photograph an area extending from the posterior pole of the fundus of the eye 12 to the equator.
- the ophthalmological apparatus 110 can photograph a region 12A with an internal illumination angle of 200° using the eyeball center O of the subject's eye 12 as a reference position.
- An internal illumination angle of 200° corresponds to an external illumination angle of 110° with respect to the pupil of the eye 12 to be examined. That is, the wide-angle optical system 30 irradiates a laser beam from the pupil at an external illumination angle of 110° and photographs a fundus region corresponding to an internal illumination angle of 200°.
- The SLO system is realized by the control device 16, the SLO unit 18, and the photographing optical system 19 described above. Since the SLO system includes the wide-angle optical system 30, it is possible to photograph the fundus over the wide FOV region 12A.
- The SLO unit 18 is equipped with a light source 40 that includes a B light (blue light) source, a G light (green light) source, an R light (red light) source, and an IR light (infrared, e.g. near-infrared) source. Light of each color from the light source 40 is guided along the same optical path.
- The SLO unit 18 is configured to be able to switch among light sources, or combinations of light sources, that emit laser light of different wavelengths, for example between a mode that emits R light and G light and a mode that emits infrared light.
- the technology of the present disclosure is not limited to providing four light sources: a B light source, a G light source, an R light source, and an IR light source.
- The SLO unit 18 may further include a white light source and emit light in various modes, such as a mode that emits G light, R light, and B light, or a mode that emits only white light.
- the light incident on the photographing optical system 19 from the SLO unit 18 is scanned in the X direction and the Y direction by the optical scanner 22.
- the scanning light passes through the wide-angle optical system 30 and the pupil 27 and is irradiated onto the fundus of the eye.
- the light reflected by the fundus enters the SLO unit 18 via the wide-angle optical system 30 and the optical scanner 22.
- the SLO unit 18 includes a beam splitter 60 that guides light from the posterior segment (fundus) of the eye 12 to be examined to the detection element 70.
- the detection element 70 detects light from the posterior segment (fundus) of the eye 12 to be examined.
- the beam splitter 60 and the detection element 70 may be provided for each color.
- Beam splitters for B light, G light, R light, and IR light are arranged on the optical axis, and a detection element for the corresponding color is arranged downstream of each beam splitter.
- The beam splitter for B light reflects B light and transmits light other than B light.
- The beam splitter for G light reflects G light and transmits light other than G light.
- The beam splitter for R light reflects R light and transmits light other than R light.
- The beam splitter for IR light reflects IR light and transmits light other than IR light.
- The image processor 17, operating under the control of the CPU 16A, processes the light of each color incident on the SLO unit 18 via the wide-angle optical system 30 and the optical scanner 22 (i.e., the light reflected by the fundus), and generates a UWF-SLO image using the signals of the detection elements 70.
- When the control device 16 controls the light sources 40 of the respective colors so that they emit light simultaneously, a G-color fundus image, an R-color fundus image, and a B-color fundus image whose pixel positions correspond to one another are obtained, and an RGB color fundus image is generated from them.
- Similarly, when the control device 16 controls the G and R light sources so that they emit light simultaneously, the fundus of the eye 12 to be examined is photographed with the G light and the R light at the same time, yielding a G-color fundus image and an R-color fundus image whose pixel positions correspond to each other; an RG color fundus image is generated from these.
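The channel-composition step described above can be sketched in a few lines of numpy. This is a minimal illustration under the assumption that the simultaneously captured frames arrive as equal-shape 2-D arrays; the function name and the normalization choice are illustrative, not part of the patent.

```python
import numpy as np

def compose_rgb_fundus(r_img, g_img, b_img):
    """Stack simultaneously acquired R, G, B detector frames
    (2-D arrays of equal shape) into one RGB fundus image.

    Because the frames are captured at the same time, their pixel
    positions correspond and no registration step is needed here.
    """
    r_img, g_img, b_img = (np.asarray(a, dtype=np.float64)
                           for a in (r_img, g_img, b_img))
    rgb = np.stack([r_img, g_img, b_img], axis=-1)
    # Normalize jointly to 8-bit for display.
    rgb -= rgb.min()
    peak = rgb.max()
    if peak > 0:
        rgb = rgb / peak * 255.0
    return rgb.astype(np.uint8)

# Tiny synthetic example (real frames come from the detection elements 70).
h, w = 4, 4
demo = compose_rgb_fundus(np.full((h, w), 2.0),
                          np.full((h, w), 1.0),
                          np.zeros((h, w)))
print(demo.shape)           # (4, 4, 3)
print(demo[0, 0].tolist())  # [255, 127, 0]
```

An RG image can be composed the same way by reusing, for example, the G frame for the missing channel.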
- the wide-angle optical system 30 makes it possible to set the field of view (FOV) of the fundus to an ultra-wide angle, and to photograph a region beyond the equator from the posterior pole of the fundus of the eye 12 to be examined.
- The OCT system is realized by the control device 16, the OCT unit 20, and the photographing optical system 19 described above. Since the OCT system includes the wide-angle optical system 30, OCT imaging of the peripheral part of the fundus is possible, as with the SLO fundus imaging described above. That is, by using the wide-angle optical system 30, whose fundus viewing angle (FOV) is set to an ultra-wide angle, it is possible to perform OCT imaging of a region extending from the posterior pole of the fundus of the eye 12 to the equator. OCT data of the peripheral part of the fundus can thus be acquired, and tomographic images and a 3D structure of the fundus can be obtained by image processing of the OCT data.
- The OCT unit 20 includes a light source 20A, a sensor (detection element) 20B, a first optical coupler 20C, a reference optical system 20D, a second optical coupler 200, and an irradiation/detection optical system 20E (FIG. 3).
- the light emitted from the light source 20A is branched by the first optical coupler 20C.
- One of the branched lights is input as measurement light to the photographing optical system 19 via the irradiation/detection optical system 20E.
- the measurement light passes through the wide-angle optical system 30 and the pupil 27 and is irradiated onto the fundus of the eye.
- the measurement light reflected by the fundus enters the OCT unit 20 via the wide-angle optical system 30 and the optical scanner 22, and enters the sensor 20B via the irradiation/detection optical system 20E and the first optical coupler 20C.
- the other light branched by the first optical coupler 20C is incident on the reference optical system 20D, and the returned light is incident on the sensor 20B via the first optical coupler 20C as reference light.
- The light incident on the sensor 20B, that is, the measurement light reflected at the fundus and the reference light, interferes to generate interference light.
- the interference light is received by the sensor 20B.
- The image processor 17, operating under the control of the CPU 16A, generates OCT data from the signal detected by the sensor 20B. A tomographic image and an OCT image can be generated based on the OCT data.
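The step of generating depth information from the detected interference signal can be illustrated with the standard Fourier-domain reconstruction of a depth profile (A-scan) by inverse Fourier transform. This is a generic sketch, not the patent's specific processing; it assumes a spectrum sampled uniformly in wavenumber and uses a synthetic single-reflector fringe.

```python
import numpy as np

def a_scan_from_spectrum(spectrum, dc_remove=True):
    """Reconstruct a depth profile (A-scan) from one interference
    spectrum sampled uniformly in wavenumber k.

    Generic Fourier-domain OCT step: the spectral interference term
    cos(2*k*z) transforms to a peak at the depth bin corresponding to z.
    """
    s = np.asarray(spectrum, dtype=np.float64)
    if dc_remove:
        s = s - s.mean()          # suppress the DC (reference) term
    depth_profile = np.abs(np.fft.ifft(s))
    return depth_profile[: s.size // 2]   # keep positive depths only

# Synthetic spectrum: a single reflector produces a cosine fringe.
n = 1024
k = np.arange(n)
z_true = 100                      # fringe frequency index ~ depth bin
spectrum = 1.0 + 0.5 * np.cos(2 * np.pi * z_true * k / n)
profile = a_scan_from_spectrum(spectrum)
print(int(np.argmax(profile)))    # peak appears at depth bin 100
```

Stacking such A-scans along the scan direction yields a B-scan tomographic image; a volume of B-scans yields the OCT volume data mentioned below.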
- the above-mentioned OCT unit 20 can scan a predetermined range (for example, a rectangular range of 6 mm x 6 mm) in one OCT imaging.
- The predetermined range is not limited to 6 mm x 6 mm; it may be a square range such as 12 mm x 12 mm or 23 mm x 23 mm, or a rectangular range such as 14 mm x 9 mm, 6 mm x 3.5 mm, or any other rectangular range.
- It may also be a circular range with a diameter of 6 mm, 12 mm, 23 mm, or the like.
- The ophthalmologic apparatus 110 can scan the region 12A with an internal illumination angle of 200°; OCT imaging of the predetermined range is performed by controlling the optical scanner 22.
- The ophthalmologic apparatus 110 can generate OCT data through this OCT imaging. The ophthalmologic apparatus 110 can therefore generate a tomographic image (B-scan image) of the fundus as an OCT image, OCT volume data, and an en-face image (a frontal image generated from a cross section of the OCT volume data).
- OCT data (or image data of an OCT image) is sent from the ophthalmological apparatus 110 to the server 140 via the communication interface 16F, and is stored in the storage device.
- FIG. 3 shows an example of an image (OCT image) of dots lined up on the optical axis, obtained by an OCT system using optical systems (e.g., objective lenses) with different numerical apertures (hereinafter, NA).
- In FIG. 3, the axis in the depth direction at the same position on the x-axis is shown, and c indicates a luminous flux.
- the depth of focus of the optical system is indicated as DOF.
- the dot image obtained by the OCT system becomes an incomplete dot image as the position in the depth direction in the fundus moves away from the focal position (e.g. focal depth) of the optical system.
- the resolution gradually decreases as the position from which information regarding the fundus of the eye is acquired moves away from the focal position (for example, depth of focus) of the photographing optical system 19. Therefore, as the distance from the focal depth increases, the image gradually becomes an incomplete dot image having blurring, distortion, and the like.
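The loss of resolution away from the focal position can be illustrated with an idealized Gaussian-beam model, in which the spot radius grows with defocus and the depth of focus is roughly twice the Rayleigh range. The waist and wavelength values below are illustrative and are not taken from the apparatus described here.

```python
import math

def gaussian_beam_radius(z, w0, wavelength):
    """1/e^2 beam radius at defocus z for a waist w0 (same length units)."""
    z_r = math.pi * w0**2 / wavelength     # Rayleigh range
    return w0 * math.sqrt(1.0 + (z / z_r)**2)

w0 = 10e-6       # 10 um focused spot (illustrative value)
lam = 1.05e-6    # ~1 um OCT source wavelength (illustrative value)
z_r = math.pi * w0**2 / lam
dof = 2 * z_r    # depth of focus ~ twice the Rayleigh range
print(f"Rayleigh range: {z_r*1e3:.2f} mm, DOF: {dof*1e3:.2f} mm")
for z_mm in (0.0, 0.3, 0.6, 1.2):
    w = gaussian_beam_radius(z_mm * 1e-3, w0, lam)
    print(f"z = {z_mm:4.1f} mm -> spot radius {w*1e6:6.1f} um")
```

The growing spot radius outside the DOF is what makes the dot images increasingly blurred in the comparison of FIG. 3.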
- the reason why the OCT image restoration by the OCT system is incomplete is considered to be that the information acquired by the OCT system does not reflect all the information regarding the fundus of the eye 12 to be examined and is incomplete.
- FIG. 4 shows an example of information regarding OCT images.
- ω indicates the frequency of light from the light source 20A (in this embodiment, the frequency of the light irradiated to the fundus), and fx, fy, and fz indicate object frequencies (in this embodiment, the frequencies of the light reflected at the fundus).
- The object frequencies (fx, fy, fz) are uniform in the ω direction. Furthermore, in the four-dimensional frequency space, since the frequencies in real space lie within a predetermined range, a window function indicating the four-dimensional aperture A4 (4-D aperture) is defined as an instrument function of the OCT system.
- The four-dimensional aperture A4 is an aperture region in the four-dimensional frequency space. Therefore, among the object frequencies, the OCT system acquires only the frequencies within the four-dimensional aperture A4.
- Moreover, the frequencies (four-dimensional information) within the four-dimensional aperture A4 are not acquired in their entirety; they are integrated in the fz direction at the time of detection and acquired as three-dimensional information. When the frequencies within the four-dimensional aperture A4 are integrated in the fz direction in this way, part of the four-dimensional information disappears.
- this embodiment provides an OCT system that can reduce information loss in which part of the four-dimensional information disappears.
- The frequency of the image is the ω integral of the object frequency within the four-dimensional aperture A4. Therefore, by making use of the ω integral of the object frequency within the four-dimensional aperture A4, it is possible to reduce the information loss in which part of the four-dimensional information disappears.
- an optical system that makes the four-dimensional aperture A4 sufficiently thin is provided as an OCT system that reduces information loss. This makes it possible to reduce information loss in which part of the four-dimensional information disappears even after fz integration.
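The benefit of a thin aperture can be sketched numerically in a reduced two-dimensional (ω, fz) model instead of the full four dimensions. The aperture curve g(ω) and the random object spectrum below are hypothetical; the point is only that fz integration preserves the object values when each ω samples a single fz bin, and mixes them when the aperture is thick in fz.

```python
import numpy as np

rng = np.random.default_rng(0)
n_omega, n_fz = 64, 64

# Object spectrum: uniform along omega (as stated in the text),
# varying along fz. Represented as a 1-D array O(fz).
obj = rng.random(n_fz)
obj_4d = np.tile(obj, (n_omega, 1))       # shape (omega, fz)

# Thin aperture: for each omega, a single fz bin fz = g(omega).
g = np.arange(n_omega) % n_fz             # hypothetical aperture curve
thin = np.zeros((n_omega, n_fz))
thin[np.arange(n_omega), g] = 1.0

# Thick aperture: a wide fz band (11 bins) around the same curve.
thick = np.zeros((n_omega, n_fz))
for i, c in enumerate(g):
    lo, hi = max(0, c - 5), min(n_fz, c + 6)
    thick[i, lo:hi] = 1.0

# Detection integrates over fz (the fz integration in the text).
detected_thin = (obj_4d * thin).sum(axis=1)    # = obj[g(omega)]
detected_thick = (obj_4d * thick).sum(axis=1)  # mixes many fz bins

# With the thin aperture, the object values survive fz integration.
print(np.allclose(detected_thin, obj[g]))      # True
print(np.allclose(detected_thick, obj[g]))     # False: information mixed
```

In the thin case, each ω carries a distinct fz component, so the object spectrum can be read back off across ω even after integration; in the thick case, the sum irreversibly mixes fz components.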
- In this embodiment, the NA of the optical system on the irradiation side, which irradiates light to the eye 12 to be examined, is made different from the NA of the optical system on the detection side, which detects light from the eye 12 to be examined (for example, light reflected from the fundus). For example, one of the optical systems is formed to have an NA of zero.
- Here, an optical system with an NA of zero includes an optical system whose NA is larger than zero but whose aperture angle is small relative to the image height formed by the light beam.
- An optical system having an NA of zero also includes, for example, an optical system formed so that the incident light or the output light becomes a parallel light beam, in which case the NA can be regarded as zero.
- An optical system with zero NA is applicable to an OCT optical system that uses full-field irradiation, which uniformly illuminates the object on the irradiated side, such as the eye 12, or illuminates a predetermined area of the object all at once.
- In this case, the object frequency within the four-dimensional aperture A4 retains a sufficient amount of information (including fz information), and even if fz integration is performed, it is possible to reduce the information loss in which part of the information regarding the eye 12 to be examined (the four-dimensional information) is lost.
- the optical systems are formed for the eye 12 to be examined so that the NA of the irradiation-side optical system and the NA of the detection-side optical system are different.
- an optical system is formed in which the NA of the irradiation side optical system is sufficiently small, that is, the NA is zero, and an optical system is formed in which the NA of the detection side optical system is larger than the NA of the irradiation side optical system.
- As shown in FIG. 5, this makes it possible to form the four-dimensional aperture A4 thinly. More specifically, as shown by the four-dimensional aperture A4 at a predetermined frequency (the frequency ω of the light from the light source 20A) in FIG. 5, the aperture becomes a continuous line-shaped aperture. This continuous line-shaped aperture is the four-dimensional aperture A4 (FIG. 5) in this embodiment.
- the optical systems for forming the four-dimensional aperture A4 thinly are such that one optical system has an NA of zero or close to zero, and the other optical system has an NA larger than the NA of one optical system.
- the four-dimensional aperture A4 formed thinly is an example of the "four-dimensional frequency aperture" of the technology of the present disclosure.
- In this embodiment, the optical system on the side that irradiates light to the eye 12 to be examined has an NA of zero, and the optical system on the side that detects light from the eye 12 to be examined has an NA of a finite value.
- Conversely, the detection-side optical system may have an NA of zero, and the irradiation-side optical system may have an NA of a finite value.
- In short, it suffices that the NA of the irradiation-side optical system and the NA of the detection-side optical system are different; by making the two NAs different, the four-dimensional aperture A4 can be made thinner than when both have the same NA.
- an OCT image is corrected using a four-dimensional aperture A4 that can reduce information loss. That is, the OCT image is corrected by a two-step process (double projection method) using frequency information of the OCT image.
- the first stage process involves projecting the frequency information (I) of the OCT image G1 onto a four-dimensional aperture A4 formed thinly.
- Because the information loss of the information regarding the eye 12 to be examined (the four-dimensional information) is reduced, the double projection method described above can correct, for example, a distorted OCT image into an image equivalent to a normal microscope image. Furthermore, since information loss is reduced, resolution is maintained, making it possible to generate OCT images with higher precision than when the thin four-dimensional aperture A4 is not used.
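The two-stage procedure can be sketched in a reduced (ω, fx) model: each measured sample is first placed on the aperture surface at fz(ω, fx), and the samples are then projected (regridded) onto the object-frequency grid (fx, fz). The dispersion relation fz = k + sqrt(k² − fx²) for zero-NA illumination with finite-NA detection, the normalized units, and the nearest-neighbor binning are assumptions of this sketch, not details taken from the patent.

```python
import numpy as np

def fz_on_aperture(k, fx):
    """Axial frequency on the thin aperture, assuming plane-wave
    (zero-NA) illumination and finite-NA detection:
    fz = k + sqrt(k^2 - fx^2). This relation is an assumption of
    the sketch, not taken from the patent text."""
    return k + np.sqrt(np.maximum(k**2 - fx**2, 0.0))

def double_projection(data, k_axis, fx_axis, fz_axis):
    """Two-stage correction sketch.

    Stage 1: each measured sample (k, fx) is placed on the aperture
    surface at (fx, fz(k, fx)).
    Stage 2: those samples are projected (here: nearest-neighbor
    binned and averaged) onto the object-frequency grid (fx, fz).
    """
    out = np.zeros((fx_axis.size, fz_axis.size))
    cnt = np.zeros_like(out)
    for i, k in enumerate(k_axis):
        for j, fx in enumerate(fx_axis):
            if abs(fx) >= k:
                continue                    # outside the aperture
            fz = fz_on_aperture(k, fx)
            m = int(np.argmin(np.abs(fz_axis - fz)))
            out[j, m] += data[i, j]
            cnt[j, m] += 1
    np.divide(out, cnt, out=out, where=cnt > 0)
    return out

k_axis = np.linspace(0.8, 1.2, 32)           # optical frequencies omega/c
fx_axis = np.linspace(-0.5, 0.5, 21)
fz_axis = np.linspace(1.2, 2.4, 48)
data = np.ones((k_axis.size, fx_axis.size))  # flat measured spectrum
grid = double_projection(data, k_axis, fx_axis, fz_axis)
print(grid.shape)                            # (21, 48)
```

An inverse Fourier transform of such a regridded object spectrum would then yield the corrected three-dimensional image; production implementations would use proper interpolation rather than nearest-neighbor binning.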
- The four-dimensional aperture A4 is a quantity that does not depend on the object frequency; it is defined by the optical system with the first numerical aperture and the optical system with the second numerical aperture.
- the four-dimensional aperture can be expressed as a window function that can express a physical quantity determined by the configuration of the optical system in the OCT system.
- The object frequency is a physical quantity expressing the eye 12 to be examined in terms of light components. Since the object frequency has no ω dependence, it is a four-dimensional physical quantity uniform in the ω direction. Note that the frequency ω is the optical frequency of the light source and is also the optical frequency of the signal light.
- the value of a three-dimensional function, obtained by fz-integrating the four-dimensional function (the obtainable image frequency) formed by multiplying this object frequency by the four-dimensional aperture A4, is obtained as the detected value by the optical system. That is, the sensor of the optical system of the OCT system acquires the value of a three-dimensional function given by the fz integral of the product of the object frequency and the four-dimensional aperture A4.
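- The relation just described, in which the detector acquires the fz integral of the object frequency multiplied by the four-dimensional aperture A4, can be sketched numerically. The snippet below is an illustrative stand-in, not the patent's actual quantities: the object frequency is a random array made uniform along ω, and A4 is modeled as a simple binary window on a reduced (fz, ω) grid.

```python
import numpy as np

# Stand-in frequency grids: fz (axial frequency) x omega (optical frequency).
fz = np.linspace(-1.0, 1.0, 64)
omega = np.linspace(0.9, 1.1, 32)

# Object frequency: no omega dependence, so it is uniform along omega.
rng = np.random.default_rng(0)
obj = np.tile(rng.standard_normal(fz.size)[:, None], (1, omega.size))

# Hypothetical thin four-dimensional aperture A4, modeled as a binary
# window whose fz support tracks omega (the exact shape is an assumption).
A4 = (np.abs(fz[:, None] - 2.0 * (omega[None, :] - 1.0)) < 0.05).astype(float)

# Detected value: the fz integral of (object frequency x A4) leaves a
# lower-dimensional function -- here one value per omega.
detected = (obj * A4).sum(axis=0) * (fz[1] - fz[0])
print(detected.shape)   # (32,)
```

The thinner the support of A4 in fz, the less the fz integral mixes different axial frequencies together, which is the intuition behind the thin-aperture construction.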
- an aperture surface may be defined within the four-dimensional aperture A4 .
- This aberration surface is preferably a predetermined surface that is farthest from the origin within the four-dimensional aperture A4 (for example, the outermost surface of the four-dimensional aperture A4, that is, the surface farthest from the origin within the four-dimensional aperture A4).
- ⁇ indicates the distance in the depth direction of the eye 12 to be examined (in the case of time-domain OCT, the adjustment amount for adjusting the optical path length in the reference optical system 20D, for example, the amount of movement of a mirror).
- the information ( ⁇ I) of the image G2 projected onto the four-dimensional aperture A4 can be expressed by the following equation (2).
- the information ( ⁇ I') of the image G3 projected from the four-dimensional aperture A4 can be expressed by the following equation (3).
- the control device 16 uses the image processor 17 operating under the control of the CPU 16A to generate an OCT image using equations (1) to (3) described above. The process of generating the OCT image will be described later.
- the sensor 20B of the OCT unit 20 includes a pair of lenses 20B-1 and a detection element 20B-2.
- the sensor 20B collimates the light branched by the first optical coupler 20C (that is, the interference light produced by interference between the measurement light reflected from the fundus and the reference light) into parallel light with the pair of lenses 20B-1, and focuses it on the detection element 20B-2.
- the reference optical system 20D includes a pair of lenses 20D-1 and a mirror 20D-2.
- the light branched by the first optical coupler 20C, that is, the measurement light, enters the reference optical system 20D.
- the mirror 20D-2 is configured to be movable in the optical axis direction (the direction indicated by the arrow ⁇ in FIG. 8).
- the light reflected by the mirror 20D-2 is returned to the first optical coupler 20C as a reference light via a pair of lenses 20D-1.
- the OCT system according to this embodiment uses TD-OCT (Time-Domain OCT).
- the OCT system may have a configuration according to the OCT system to which it is applied.
- the mirror 20D-2 may be moved and swept.
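- The TD-OCT sweep just mentioned can be illustrated with a toy A-scan: as the mirror 20D-2 is swept, interference visibility peaks where the reference path length matches a reflector's depth, within the coherence length of the source. All values below (depths, coherence length) are invented for illustration.

```python
import numpy as np

# Toy TD-OCT A-scan: sweep the reference mirror and record the interference
# envelope; a reflector shows up where reference and sample path lengths
# match, within the source coherence length.
z = np.linspace(0.0, 100.0, 2001)   # mirror positions (um)
z_reflector = 40.0                  # sample reflector depth (um)
coherence_len = 5.0                 # source coherence length (um)

envelope = np.exp(-((z - z_reflector) / coherence_len) ** 2)
z_peak = z[np.argmax(envelope)]
print(round(z_peak, 2))   # 40.0
```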
- in SD-OCT (Spectral-Domain OCT), a Fourier-domain type of OCT, spectroscopic detection may be performed with the mirror 20D-2 fixed.
- a light source that emits multiple lights of multiple wavelengths may be used.
- in SS-OCT (Swept-Source OCT), a one-pixel detector, for example, may be used as the detection element 20B-2, the mirror 20D-2 may be fixed, and the wavelength of the broadband light source may be swept.
- a broadband light source such as a laser device that emits a broadband laser beam for wavelength sweeping may be used as the light source 20A.
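- The Fourier-domain variants (SD-OCT and SS-OCT) above recover depth with the mirror fixed because a reflector at depth z modulates the detected spectrum as a function of wavenumber. A minimal sketch, with a single reflector and invented numbers:

```python
import numpy as np

# Sketch of Fourier-domain detection: a reflector at depth z modulates the
# spectrum as cos(2*k*z); a Fourier transform over wavenumber k recovers
# the depth. Numbers are illustrative only.
n = 1024
k = np.linspace(6.0, 8.0, n)        # wavenumber samples (rad/um)
z_true = 50.0                       # reflector depth (um)
spectrum = 1.0 + np.cos(2.0 * k * z_true)

profile = np.abs(np.fft.rfft(spectrum - spectrum.mean()))
dk = k[1] - k[0]
z_axis = np.fft.rfftfreq(n, d=dk) * np.pi   # cycles per k -> depth (z = pi * f)
z_est = z_axis[np.argmax(profile)]
print(round(z_est, 1))   # estimated depth, close to z_true
```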
- an optical system with zero NA (full-field irradiation) is used to uniformly illuminate an object on the irradiated side, such as the eye 12, or to illuminate a predetermined area of the object on the irradiated side all at once.
- the optical system of the OCT system can be applied accordingly. That is, the technology of the present disclosure is not limited to the laser-scanning type optical system that scans the laser beam described above, but can also be applied to an optical system using a CCD (Charge Coupled Device) camera or the like. In this case, a CCD camera including an element such as a CCD may be used as the detection element 20B-2.
- the irradiation/detection optical system 20E includes a second optical coupler 200, an irradiation optical system 210, and a detection optical system 220.
- the second optical coupler 200 has the function of guiding the light emitted from the light source 20A (the measurement light branched by the first optical coupler 20C) to the irradiation optical system 210 as irradiation light, and the function of guiding the return light from the subject's eye 12 as detection light to the first optical coupler 20C (that is, toward the sensor 20B).
- the eye 12 to be examined is irradiated, as light from the light source 20A, with the light propagating through the irradiation optical system 210, and the return light (the reflected light from the subject's eye 12) propagates through the detection optical system 220 and enters the sensor 20B as measurement light.
- the optical path of the irradiation optical system 210 in the irradiation/detection optical system 20E is shown by a solid line
- the optical path of the detection optical system 220 is shown by a dotted line.
- the irradiation optical system 210 includes a pair of lenses, a lens 212 and a lens 214.
- the lens 212 is arranged so that one end surface of the fiber 211 on the irradiation side is located at the focal position on the entrance side of the lens 212 on the optical axis of the irradiation optical system 210 .
- the fiber 211 on the irradiation side is formed of a single mode fiber, and the other end face is connected to the second optical coupler 200.
- the lens 214 is arranged so that the reflective surface of the optical scanner 22 is located at the focal position on the exit side of the lens 214 on the optical axis of the irradiation optical system 210.
- the lens 212 collimates the light from the light source 20A (the measurement light branched by the first optical coupler 20C) into parallel light, and the lens 214 converges the parallel light collimated by the lens 212 onto the reflective surface of the optical scanner 22. The light reflected by the optical scanner 22 is collimated into parallel light by the lens 12B of the eye 12 to be examined and is irradiated onto the fundus 12C. Therefore, the light from the light source 20A via the irradiation optical system 210 is irradiated onto the fundus 12C of the eye 12 to be examined as parallel light.
- the irradiation optical system 210 is an optical system that emits parallel light toward the eye 12 to be examined, thereby forming an optical system with zero NA.
- the NA of the irradiation optical system 210 is an example of the "first numerical aperture” of the technology of the present disclosure. Further, the irradiation optical system 210 is an example of the "first numerical aperture optical system" of the technology of the present disclosure.
- the detection optical system 220 includes a beam splitter 222 and a pair of lenses, a lens 224 and a lens 226.
- the beam splitter 222 is arranged between the lens 212 and the lens 214 in the irradiation optical system 210, and extracts the return light (reflected light) from the eye 12 by a reflection function.
- the beam splitter 222 is arranged so that its reflective surface is located, on the optical axis of the irradiation optical system 210, at the exit-side focal position of the lens 214 for the return light (reflected light) from the eye 12.
- the lens 224 is arranged so that the reflective surface of the beam splitter 222 is located at the focal point on the incident side of the lens 224 on the optical axis of the detection optical system 220. Further, the lens 226 is arranged such that one end surface of the fiber 228 on the detection side is located at the focal position on the output side of the lens 226 on the optical axis of the detection optical system 220.
- the detection side fiber 228 is formed by a single mode fiber, and the other end face is connected to the second optical coupler 200. Therefore, the lens 224 collimates the light from the beam splitter 222 into parallel light, and the lens 226 converges the parallel light collimated by the lens 224 onto the end face of the fiber 228 on the detection side. Therefore, in the detection optical system 220, the light reflected at the fundus 12C of the eye 12 to be examined is emitted toward the sensor 20B.
- the detection optical system 220 is an optical system having a focal point on the subject's eye 12 side, thereby forming an optical system that has a different NA from the irradiation optical system 210 and has a finite NA.
- the NA of the detection optical system 220 is an example of the "second numerical aperture” of the technology of the present disclosure. Further, the detection optical system 220 is an example of the "second numerical aperture optical system" of the technology of the present disclosure.
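- Numerically, the two numerical apertures can be related to the marginal-ray half angle via NA = n·sin θ: the parallel irradiation beam has θ = 0 and hence NA = 0, while the detection side has a finite NA. The sketch below uses 0.9 for the detection side, the value cited later in the FIG. 11 discussion; the half angle itself is back-computed purely for illustration.

```python
import math

# Hedged numeric illustration of the first/second numerical apertures:
# NA = n * sin(theta) for the marginal-ray half angle theta.
def numerical_aperture(half_angle_deg: float, n_medium: float = 1.0) -> float:
    return n_medium * math.sin(math.radians(half_angle_deg))

na_irradiation = numerical_aperture(0.0)                         # parallel light: theta = 0
na_detection = numerical_aperture(math.degrees(math.asin(0.9)))  # NA 0.9 in air
print(na_irradiation, round(na_detection, 2))   # 0.0 0.9
```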
- in the ophthalmological apparatus 110, an OCT image is generated using information with reduced loss of four-dimensional information, obtained by the OCT system including the zero-NA irradiation optical system 210 described above.
- An OCT image is generated by executing an image processing program using the image processor 17 that operates under the control of the CPU 16A.
- the OCT image may be generated in an external device such as the server 140.
- An image processing program shown in FIG. 10 is stored in the ROM 16C of the ophthalmological apparatus 110 or the memory 17M of the image processor 17.
- the ROM 16C and the memory 17M are examples of "memory" in the technology of the present disclosure.
- the CPU 16A is an example of a “processor” in the technology of the present disclosure.
- the image processing program is an example of a "program” in the technology of the present disclosure.
- various functions are realized by the CPU 16A reading and executing an image processing program.
- the image processing program includes a display control function, an image processing function, and a processing function. That is, when the CPU 16A executes an image processing program having each of these functions, the CPU 16A operates as a display control section 204, an image processing section 206, and a processing section 208, as shown in FIG.
- the image processing function includes an image processing function using the above-described two-step processing method (double projection method).
- image processing according to this embodiment will be described in detail using FIG. 10.
- the image processing shown in the flowchart of FIG. 10 is realized by the CPU 16A of the ophthalmological apparatus 110 reading out and executing the image processing program from the ROM 16C or the memory 17M.
- the image processing process shown in FIG. 10 is an example of the image processing method of the present disclosure.
- the image processing unit 206 acquires OCT data from the image processor 17.
- the OCT data is data obtained by performing OCT imaging on the eye 12 to be examined using the ophthalmological apparatus 110.
- the OCT data includes data at positions at different depths in the optical axis direction (Z-axis direction).
- the OCT data is obtained by performing OCT imaging using the ophthalmologic apparatus 110 as described above.
- in step S202, the image processing unit 206 executes the first-stage process of projecting the information (OCT data) obtained by the OCT system onto the four-dimensional aperture. That is, the image processing unit 206 expresses the acquired OCT data using the above equation (1) and projects the frequency information (I) of the OCT image G1 onto the thinly formed four-dimensional aperture A4. Information on the frequency of the OCT image is then derived using the above equation (2). The derived information is temporarily stored in the RAM 16B.
- the image processing unit 206 projects the information of the OCT data projected onto the four-dimensional aperture A4 onto a three-dimensional space using the above equation (3).
- Information on an OCT image after double projection projected onto a three-dimensional space is derived. The derived information is temporarily stored in RAM 16B.
- in step S206, the image processing unit 206 generates an OCT image using the information projected by the double projection method.
- the generated OCT image is stored in the RAM 16B by the processing unit 208.
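- The flow of steps S200 to S206 above can be sketched end to end. The operator forms below are assumptions for illustration, not the patent's exact equations (1) to (3): an FFT stands in for the frequency information of equation (1), a window for the projection onto the four-dimensional aperture (S202), a sum over the last axis for the fz projection to three dimensions (S204), and an inverse FFT for image generation (S206).

```python
import numpy as np

# Minimal sketch of the FIG. 10 flow with stand-in arrays and assumed
# operator forms (not the patent's exact equations).
def double_projection_image(oct_data: np.ndarray, window: np.ndarray) -> np.ndarray:
    freq = np.fft.fftn(oct_data)            # frequency information I (eq. (1), assumed)
    projected = freq * window               # S202: project onto the 4-D aperture
    collapsed = projected.sum(axis=-1)      # S204: project to 3-D space (fz sum)
    return np.abs(np.fft.ifftn(collapsed))  # S206: generate the corrected image

rng = np.random.default_rng(1)
data = rng.standard_normal((16, 16, 8))     # stand-in OCT data
window = np.ones((16, 16, 8))               # trivial all-pass window for the sketch
image = double_projection_image(data, window)
print(image.shape)   # (16, 16)
```

In practice the window would be the thin, NA-dependent four-dimensional aperture rather than an all-pass array; only the data flow of the two projections is shown here.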
- with the correction by the double projection method described above, an OCT image may be generated at a depth corresponding to a predetermined position in the optical axis direction, that is, in the depth direction of the eye 12 to be examined.
- an OCT image may be generated at a position at a predetermined depth, such as a depth 10 times the depth of focus, based on the focal position of the detection optical system 220 on the eye 12 side.
- a plurality of OCT images at different depths may be generated using the focal position as a reference, or each OCT image may be generated at a predetermined number of positions at predetermined intervals.
- each of the plurality of OCT images may be stored in the RAM 16B or the memory 17M.
- the predetermined depth, different depths, and predetermined number can be determined in advance, and can be arbitrarily set by the user, for example.
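- Selecting the depth positions described above (a predetermined number of depths at predetermined intervals, referenced to the focal position) reduces to a short computation; all values below (focal position, spacing of 10 times the depth of focus, count) are illustrative assumptions, not values from the patent.

```python
# Sketch of choosing generation depths relative to the focal position.
depth_of_focus_um = 10.0
focal_position_um = 0.0
count = 5
spacing_um = 10 * depth_of_focus_um   # e.g. steps of 10x the depth of focus

depths = [focal_position_um + i * spacing_um for i in range(count)]
print(depths)   # [0.0, 100.0, 200.0, 300.0, 400.0]
```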
- image processing such as noise removal may be performed in order to improve the sharpness of the image.
- an OCT image is thus generated. Since the generated OCT image has reduced information loss and maintained resolution, it is a highly accurate OCT image that faithfully reflects the condition of the fundus of the eye 12 to be examined.
- processing for generating information on a display screen that displays the generated OCT image may be added.
- the process of generating information for a display screen that displays the OCT image can also generate information for a display screen that displays a plurality of OCT images side by side as predetermined samples.
- an OCT image obtained by correction using the double projection method described above, from one point of information detected by the ophthalmological apparatus 110, is corrected from a distorted OCT image as shown in FIG. 11B to an OCT image equivalent to a normal microscope image as shown in FIG. 11A.
- the irradiation/detection optical system 20E in the OCT unit 20 is formed so that the NA of the irradiation optical system 210 and the NA of the detection optical system 220 are different, and the OCT image is corrected by the double projection method.
- the irradiation optical system 210 irradiates the subject's eye with a parallel light beam, forming an optical system whose NA is zero, while the detection optical system 220 is formed so that its NA is 0.9. Furthermore, FIG. 11A shows an OCT image generated at a depth position 30 times the depth of focus. In FIG. 11B, as a comparative example, the NA of the irradiation optical system 210 and the NA of the detection optical system 220 are both set to 0.9, that is, they are made to match.
- as shown in FIG. 11A, by applying the OCT system according to this embodiment, it becomes possible to obtain an OCT image as a single point image (point image) from information on a single point in the detected eye 12.
- as shown in FIG. 11B, if the NAs of the irradiation optical system 210 and the detection optical system 220 are made to match, the OCT image has a distorted shape, and an image in which the state of the fundus of the subject's eye 12 is deformed is obtained. Therefore, as can be understood from FIGS. 11A and 11B, by forming the four-dimensional aperture A4 thinly and generating an OCT image from information obtained by the double projection method, in which the information is projected via the four-dimensional aperture A4, it becomes possible to generate OCT images with high precision.
- the irradiation optical system 210 and the detection optical system 220 have different NAs, and one of the optical systems (the irradiation optical system 210 in the above example) has an NA of zero.
- This makes it possible to treat the aperture related to the OCT image in the four-dimensional space, with frequency taken into consideration, as a thin aperture (four-dimensional aperture A4) whose influence can be suppressed even when fz integration is performed.
- information loss of information (four-dimensional information) regarding the eye 12 to be examined is reduced.
- although the above embodiment describes a case where the image processing (FIG. 10) is executed by the ophthalmological apparatus 110, the technology of the present disclosure is not limited to this; the image processing (FIG. 10) may be implemented by the ophthalmological apparatus 110, the server 140, the viewer 150, an additional image processing device further provided in the network 130, or any combination of these.
- since the image processing is preferably realized using information obtained using the thinly formed four-dimensional aperture A4, the technology of the present disclosure includes the following technology.
- An image processing device comprising:
- the image processing unit 206 is an example of an “acquisition unit” and a “processing unit” of the technology of the present disclosure.
- a detection unit that detects interference light between signal light, obtained by irradiating a subject's eye with light from a light source, and reference light obtained by splitting the light from the light source; an irradiation optical system formed with a first numerical aperture so as to irradiate the eye to be examined with the light from the light source; a detection optical system formed with a second numerical aperture different from the first numerical aperture so that return light from the eye to be examined, due to the light irradiated by the irradiation optical system, propagates to the detection unit as the signal light; and a processing unit that performs, based on the information indicating the interference light detected by the detection unit, a first process of projecting the information indicating the interference light onto a four-dimensional frequency aperture, formed by the optical system with the first numerical aperture and the optical system with the second numerical aperture, in a four-dimensional space of the frequency of the light source and the frequency of three-dimensional light indicating the subject's eye, and a second process of projecting the projected information onto a three-dimensional space.
- the sensor 20B is an example of the "acquisition unit” of the technology of the present disclosure.
- the irradiation optical system 210 is an example of the “irradiation optical system” of the technology of the present disclosure
- the detection optical system 220 is an example of the “detection optical system” of the technology of the present disclosure.
- the image processing unit 206 is an example of a “processing unit” and an “image generation unit” of the technology of the present disclosure.
- the processing performed by executing a program stored in a storage device such as a memory has been described, but at least part of the processing of the program may be realized by hardware.
- the process flow of the program described in the above embodiment is merely an example; unnecessary steps may be deleted, new steps may be added, or the processing order may be changed within the scope of the main idea.
- a program in which the processes described above are written in code that can be processed by a computer may be stored in a storage medium such as an optical disk and distributed.
- a CPU is used as an example of a general-purpose processor.
- a processor refers to a processor in a broad sense, and may include a general-purpose processor (for example, CPU: Central Processing Unit, etc.) or a dedicated processor (for example, GPU: Graphics Processing Unit, ASIC: Application Specific Integrated Circuit, FPGA: Field Programmable Gate Array, programmable logic devices, etc.).
- the processing in the embodiments described above may be performed not only by one processor but also by multiple processors working together, including multiple processors located at physically separate locations.
Priority Applications (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2024510010A JPWO2023182011A1 (en) | 2022-03-24 | 2023-03-10 | |
| US18/892,126 US20250110040A1 (en) | 2022-03-24 | 2024-09-20 | Image processing method, image processing device, and program |
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2022-048770 | 2022-03-24 | ||
| JP2022048770 | 2022-03-24 |
Related Child Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US18/892,126 Continuation US20250110040A1 (en) | 2022-03-24 | 2024-09-20 | Image processing method, image processing device, and program |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2023182011A1 true WO2023182011A1 (ja) | 2023-09-28 |
Family
ID=88101350
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2023/009452 Ceased WO2023182011A1 (ja) | 2022-03-24 | 2023-03-10 | 画像処理方法、画像処理装置、眼科装置、及びプログラム |
Country Status (3)
| Country | Link |
|---|---|
| US (1) | US20250110040A1 (en) |
| JP (1) | JPWO2023182011A1 (en) |
| WO (1) | WO2023182011A1 (en) |
Citations (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2011528801A (ja) * | 2008-07-21 | 2011-11-24 | Optovue, Inc. | Extended range imaging |
| JP2016504947A (ja) * | 2013-02-01 | 2016-02-18 | Carl Zeiss Meditec AG | Systems and methods for sub-aperture based aberration measurement and correction in interferometric imaging |
| JP2016041222A (ja) * | 2014-08-19 | 2016-03-31 | Topcon Corporation | Fundus imaging apparatus |
| JP2017522066A (ja) * | 2014-06-10 | 2017-08-10 | Carl Zeiss Meditec Inc. | Imaging systems and methods with improved frequency-domain interferometry |
| JP2019512086A (ja) * | 2016-02-12 | 2019-05-09 | The General Hospital Corporation | Apparatus and method for high-speed, long-depth-range imaging using optical coherence tomography |
| JP2019518511A (ja) * | 2016-05-13 | 2019-07-04 | Ecole Polytechnique Federale de Lausanne (EPFL) | Systems, methods, and apparatus for retinal absorption, phase and dark-field imaging with oblique illumination |
- 2023-03-10: JP application JP2024510010A, patent JPWO2023182011A1, status: active, pending
- 2023-03-10: WO application PCT/JP2023/009452, patent WO2023182011A1, status: not active, ceased
- 2024-09-20: US application US18/892,126, patent US20250110040A1, status: active, pending
Also Published As
| Publication number | Publication date |
|---|---|
| US20250110040A1 (en) | 2025-04-03 |
| JPWO2023182011A1 (enExample) | 2023-09-28 |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 23774610 Country of ref document: EP Kind code of ref document: A1 |
|
| ENP | Entry into the national phase |
Ref document number: 2024510010 Country of ref document: JP Kind code of ref document: A |
|
| NENP | Non-entry into the national phase |
Ref country code: DE |
|
| 122 | Ep: pct application non-entry in european phase |
Ref document number: 23774610 Country of ref document: EP Kind code of ref document: A1 |