WO2023022183A1 - Ophthalmic device, control method, and program - Google Patents

Ophthalmic device, control method, and program

Info

Publication number
WO2023022183A1
Authority
WO
WIPO (PCT)
Prior art keywords
image, optical path, scan, OCT, path length
Application number
PCT/JP2022/031142
Other languages
English (en), Japanese (ja)
Inventor
泰士 田邉
真梨子 向井
Original Assignee
株式会社ニコン (Nikon Corporation)
Application filed by 株式会社ニコン
Publication of WO2023022183A1


Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 3/00: Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B 3/10: Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions

Definitions

  • the present invention relates to an ophthalmologic apparatus, control method, and program.
  • tomographic images of the subject's eye can be obtained (see, for example, Japanese Patent Laid-Open No. 2012-148003), but it is desired to obtain OCT (Optical Coherence Tomography) data over a wide range of the subject's eye.
  • OCT Optical Coherence Tomography
  • an ophthalmologic apparatus generates interference light between signal light, which scans an eye to be examined, and reference light, detects the generated interference light, and can change the optical path length of at least one of the signal light and the reference light. It determines whether it is necessary to change the optical path length when scanning the eye to be examined with the signal light in order to acquire OCT data in a specified range of the eye to be examined.
  • a control method generates interference light between signal light, which scans an eye to be examined, and reference light, detects the generated interference light, and changes the optical path length of at least one of the signal light and the reference light.
  • a program generates interference light between signal light, which scans an eye to be examined, and reference light, detects the generated interference light, and changes the optical path length of at least one of the signal light and the reference light.
  • if it is determined that the optical path length needs to be changed, the OCT unit is adjusted so that the optical path length is changed, and the OCT data are transformed based on a scan angle that defines the scan position of the signal light on the eye to be examined and on the distance from the pupil to the fundus of the eye to be examined.
  • FIG. 1 is a block diagram of an ophthalmic system 100;
  • FIG. 2 is a schematic configuration diagram showing an example of the overall configuration of an ophthalmologic apparatus 110.
  • FIG. 3 is a block diagram of the functions of the CPU 16A of the ophthalmologic apparatus 110;
  • FIG. 4 is a flowchart showing a data processing program executed by a CPU 16A of the ophthalmologic apparatus 110;
  • FIG. 5 is a flow chart showing a program of scanning processing with reference length change in step 260 of FIG. 4;
  • FIG. 6 is a diagram showing the positional relationship between the area (420S-420E) in which OCT image information can be acquired and the retinal thickness ML when scanning with the signal light;
  • FIG. 7 is a diagram in which the information of an OCT image acquired by scanning the signal light is arranged and displayed as-is, without considering the actual direction and dimensions; FIG. 8 is a diagram displaying the information of the OCT image acquired by scanning the signal light in consideration of the actual direction and dimensions;
  • FIG. 9 is a diagram showing an SLO image and an OCT image;
  • image (1) in the figure shows a live SLO image 420LG, to which areas R1, R2, R3, and R4 that can be imaged with the same reference length are added for convenience, with an OCT imaging range 422 designated in it;
  • image (2) in the figure shows a B-scan OCT image of the designated OCT imaging range 422;
  • FIG. 10 is a diagram showing a correspondence table 190 of the angles of the scanning positions, the optimum values of the reference length, and the ranges that can be photographed with the same optimum reference length;
  • FIG. 11 is a diagram showing the angle of the scanning position, the optimum reference length, and the range that can be imaged with the same reference length;
  • FIG. 12 shows a projection-corrected OCT image; FIG. 13 is a diagram displaying an acquired OCT image without considering the actual direction and actual shape;
  • FIG. 14 is a diagram showing an image obtained by transforming each pixel position of the image of FIG. 13 so that the image is along the actual direction and the actual dimensions;
  • FIG. 15 is a diagram showing a three-dimensional OCT image displayed without considering the actual direction and actual size;
  • FIG. 16 is a diagram showing a three-dimensional OCT image obtained by transforming each pixel position of the image of FIG. 15 along the actual direction and the actual dimension;
  • FIG. 17 shows OCT images of three regions of the fundus;
  • FIG. 18 is a diagram showing an image obtained by transforming each pixel position of the OCT images of FIG. 17 along the actual direction and the actual dimensions;
  • FIG. 23 is a diagram showing images related to OCT images: images (1) to (3) in the figure show three-dimensional OCT images obtained by dividing the fundus into three regions, and image (4) in the figure shows an image obtained by synthesizing the three-dimensional OCT images of images (1) to (3);
  • FIG. 24 is a diagram in which the images in FIG. 23 are arranged along the actual direction;
  • FIG. 25 shows three three-dimensional OCT images;
  • FIG. 26 is a diagram showing three-dimensional OCT images GPA, GPB, and GPC acquired at three positions A, B, and C by moving the optical head in a fundus imaging apparatus that moves the optical head;
  • FIG. 27 is a diagram showing three-dimensional OCT images GPA, GPB, and GPC acquired at three positions A, B, and C by moving the optical head in another example of a fundus imaging apparatus that moves the optical head;
  • FIG. 28 is a diagram showing an OCT image of the fundus;
  • FIG. 29 is a diagram showing an OCT image obtained by transforming each pixel position of the OCT image of FIG. 28 along the actual direction and the actual dimensions; FIG. 30 is an OCT image obtained by converting the OCT image of FIG. 29 so that the RPE layer 750 is flat;
  • FIG. 31 is a diagram showing an OCT image 850 acquired at a low resolution and how two regions 852 and 854 are specified in the OCT image 850;
  • FIG. 32 is a high resolution OCT image 852A of the first region 852 of FIG. 31;
  • FIG. 33 is a high-resolution OCT image 854A of the second region 854 of the OCT image 850 of FIG. 31;
  • FIG. 34 is a diagram showing a display screen, in which image (1) in the figure shows an SLO image 870, and images (2) to (4) and images (5) to (6) in the figure each show an image obtained by designating a region for acquiring an OCT image and displaying the OCT image of the designated region.
  • scanning laser ophthalmoscope Scanning Laser Ophthalmoscope
  • OCT optical coherence tomography
  • as shown in FIG. 1, an ophthalmic system 100 includes an ophthalmologic apparatus 110, an eye axial length measuring device 120, a management server device (hereinafter referred to as "server") 140, and an image display device (hereinafter referred to as "viewer") 150.
  • the ophthalmologic device 110 acquires a fundus image.
  • the axial length measuring device 120 measures the axial length of the subject.
  • the server 140 stores the fundus image obtained by photographing the fundus of the subject with the ophthalmologic apparatus 110 in association with the ID of the subject.
  • the viewer 150 displays medical information such as fundus images acquired from the server 140 .
  • Network 130 is any network such as a LAN (local area network), a WAN (wide area network), the Internet, or a wide-area Ethernet network.
  • other ophthalmologic equipment (inspection equipment for visual field measurement, intraocular pressure measurement, and the like) and a diagnosis support device that performs image analysis using artificial intelligence may be connected via the network 130 to the ophthalmologic apparatus 110, the eye axial length measuring device 120, the server 140, and the viewer 150.
  • the horizontal direction is the "X direction”
  • the direction perpendicular to the horizontal plane is the "Y direction".
  • the ophthalmologic device 110 includes an imaging device 14 and a control device 16 .
  • the imaging device 14 includes an SLO unit 18, an OCT unit 20, and an imaging optical system 19, and acquires a fundus image of the eye 12 to be examined.
  • the two-dimensional fundus image acquired by the SLO unit 18 is hereinafter referred to as an SLO fundus image.
  • a tomographic image of the retina, a front image (en-face image), and the like created based on the OCT data acquired by the OCT unit 20 are referred to as OCT images.
  • the control device 16 comprises a computer having a CPU (Central Processing Unit) 16A, a RAM (Random Access Memory) 16B, a ROM (Read-Only Memory) 16C, and an input/output (I/O) port 16D.
  • the ROM 16C stores an image processing program, which will be described later.
  • the control device 16 may further include an external storage device and store the image processing program in the external storage device.
  • the image processing program is an example of the "program” of the technology of the present disclosure.
  • the ROM 16C (or external storage device) is an example of the “memory” and “computer-readable storage medium” of the technology of the present disclosure.
  • the CPU 16A is an example of the “processor” of the technology of the present disclosure.
  • the control device 16 is an example of the "computer program product” of the technology of the present disclosure.
  • the control device 16 has an input/display device 16E connected to the CPU 16A via an I/O port 16D.
  • the input/display device 16E has a graphical user interface that displays images of the subject's eye 12 and receives various instructions from the user; the graphical user interface includes a touch-panel display.
  • the control device 16 has a communication interface (I/F) 16F connected to the I/O port 16D.
  • the ophthalmologic apparatus 110 is connected to an axial length measuring instrument 120 , a server 140 and a viewer 150 via a communication interface (I/F) 16F and a network 130 .
  • the control device 16 of the ophthalmic device 110 includes the input/display device 16E, but the technology of the present disclosure is not limited to this.
  • the controller 16 of the ophthalmic device 110 may not have the input/display device 16E, but may have a separate input/display device physically separate from the ophthalmic device 110.
  • the display device comprises an image processor unit operating under the control of the CPU 16A of the control device 16.
  • the image processor unit may display an SLO fundus image, an OCT image, etc., based on the image signal output by the CPU 16A.
  • the imaging device 14 operates under the control of the CPU 16A of the control device 16.
  • the imaging device 14 includes an SLO unit 18 , an imaging optical system 19 and an OCT unit 20 .
  • the imaging optical system 19 includes a first optical scanner 22 , a second optical scanner 24 and a wide-angle optical system 30 .
  • the first optical scanner 22 two-dimensionally scans the light emitted from the SLO unit 18 in the X direction and the Y direction.
  • the second optical scanner 24 two-dimensionally scans the light emitted from the OCT unit 20 in the X direction and the Y direction.
  • the first optical scanner 22 and the second optical scanner 24 may be optical elements capable of deflecting light beams, such as polygon mirrors or galvanometer mirrors, or a combination thereof.
  • the wide-angle optical system 30 includes an objective optical system (not shown in FIG. 2) having a common optical system 28, and a synthesizing section 26 that synthesizes the light from the SLO unit 18 and the light from the OCT unit 20.
  • the objective optical system of the common optical system 28 may be a reflective optical system using a concave mirror such as an elliptical mirror, a refractive optical system using a wide-angle lens, or a catadioptric system combining concave mirrors and lenses.
  • by using a wide-angle optical system with an elliptical mirror or a wide-angle lens, it becomes possible to image not only the central part of the fundus, where the optic disc and macula exist, but also the equatorial part of the eyeball and the peripheral part of the fundus, where vortex veins exist.
  • the wide-angle optical system 30 realizes observation in a wide field of view (FOV: Field of View) 12A at the fundus.
  • the FOV 12A indicates a range that can be photographed by the photographing device 14.
  • the FOV 12A can be expressed as a viewing angle.
  • a viewing angle can be defined by an internal illumination angle and an external illumination angle in this embodiment.
  • the external irradiation angle is the irradiation angle of the light beam irradiated from the ophthalmologic apparatus 110 to the eye 12 to be examined, defined with the pupil 27 as the reference.
  • the internal illumination angle is the illumination angle of the luminous flux that illuminates the fundus, defined with the center O of the eyeball as the reference.
  • the external illumination angle and the internal illumination angle correspond to each other. For example, an external illumination angle of 120 degrees corresponds to an internal illumination angle of approximately 160 degrees. In this embodiment, the internal illumination angle is 200 degrees.
  • an SLO fundus image obtained by photographing at an internal irradiation angle of view of 160 degrees or more is referred to as a UWF-SLO fundus image.
  • UWF is an abbreviation for UltraWide Field.
  • the SLO system is implemented by the control device 16, the SLO unit 18, and the imaging optical system 19 shown in FIG. 2. Since the SLO system includes the wide-angle optical system 30, it enables fundus imaging with a wide FOV 12A.
  • the SLO unit 18 includes a plurality of light sources, for example a B light (blue light) source 40, a G light (green light) source 42, an R light (red light) source 44, and an IR light (infrared, for example near-infrared) source 46, as well as optical systems 48, 50, 52, 54, and 56 that reflect or transmit the light from the light sources 40, 42, 44, and 46 and guide it into one optical path.
  • Optical systems 48, 50, 56 are mirrors and optical systems 52, 54 are beam splitters.
  • the B light is reflected by the optical system 48, transmitted through the optical system 50, and reflected by the optical system 54; the G light is reflected by the optical systems 50 and 54; the R light is transmitted through the optical systems 52 and 54; and the IR light is reflected by the optical systems 56 and 52. Each is thereby guided into one optical path.
  • the SLO unit 18 is configured so that the light source that emits laser light, or the combination of light sources that emit light, can be switched among modes of different wavelengths, for example a mode that emits G light, R light, and B light and a mode that emits only infrared light.
  • although the example of FIG. 2 includes four light sources (the B light source 40, the G light source 42, the R light source 44, and the IR light source 46), the SLO unit 18 may further include a white light source so as to emit light in various modes, such as a mode that emits only white light.
  • the light incident on the imaging optical system 19 from the SLO unit 18 is scanned in the X direction and the Y direction by the first optical scanner 22 .
  • the scanning light passes through the wide-angle optical system 30 and the pupil 27 and irradiates the posterior segment of the eye 12 to be examined.
  • Reflected light reflected by the fundus enters the SLO unit 18 via the wide-angle optical system 30 and the first optical scanner 22 .
  • the SLO unit 18 includes a beam splitter 64 that, of the light from the posterior segment (for example, the fundus) of the eye 12 to be examined, reflects the B light and transmits light other than the B light, and a beam splitter 58 that, of the light transmitted through the beam splitter 64, reflects the G light and transmits light other than the G light.
  • the SLO unit 18 has a beam splitter 60 that reflects the R light and transmits other than the R light out of the light transmitted through the beam splitter 58 .
  • the SLO unit 18 has a beam splitter 62 that reflects IR light out of the light transmitted through the beam splitter 60 .
  • the SLO unit 18 has a plurality of photodetection elements corresponding to a plurality of light sources.
  • the SLO unit 18 includes a B light detection element 70 that detects B light reflected by the beam splitter 64 and a G light detection element 72 that detects G light reflected by the beam splitter 58 .
  • the SLO unit 18 includes an R photodetector element 74 that detects R light reflected by the beam splitter 60 and an IR photodetector element 76 that detects IR light reflected by the beam splitter 62 .
  • reflected light from the fundus is, in the case of B light, reflected by the beam splitter 64 and detected by the B light detection element 70.
  • G light is transmitted through the beam splitter 64 , reflected by the beam splitter 58 , and received by the G light detection element 72 .
  • in the case of R light, the incident light passes through the beam splitters 64 and 58, is reflected by the beam splitter 60, and is received by the R light detection element 74.
  • in the case of IR light, the incident light passes through the beam splitters 64, 58, and 60, is reflected by the beam splitter 62, and is received by the IR photodetector 76.
  • the CPU 16A uses the signals detected by the B photodetector 70, G photodetector 72, R photodetector 74, and IR photodetector 76 to generate a UWF-SLO fundus image.
  • a UWF-SLO fundus image (also referred to as a UWF fundus image or an original fundus image, as described later) includes a UWF-SLO fundus image obtained by photographing the fundus in G color (G-color fundus image) and a UWF-SLO fundus image obtained by photographing the fundus in R color (R-color fundus image).
  • the UWF-SLO fundus image also includes a UWF-SLO fundus image obtained by photographing the fundus in B color (B-color fundus image) and a UWF-SLO fundus image obtained by photographing the fundus with IR light (IR fundus image).
  • the control device 16 controls the light sources 40, 42, and 44 to emit light simultaneously. By photographing the fundus of the subject's eye 12 simultaneously with the B light, G light, and R light, a G-color fundus image, an R-color fundus image, and a B-color fundus image whose respective positions correspond to each other are obtained.
  • An RGB color fundus image is obtained from the G color fundus image, the R color fundus image, and the B color fundus image.
  • the control device 16 controls the light sources 42 and 44 to emit light simultaneously. By photographing the fundus of the subject's eye 12 simultaneously with the G light and the R light, a G-color fundus image and an R-color fundus image whose respective positions correspond to each other are obtained.
  • An RG color fundus image is obtained from the G color fundus image and the R color fundus image.
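As a rough illustration of this channel composition (not the apparatus's actual pipeline), the co-registered single-channel captures can be stacked into color images; the array names, shapes, and the choice of an empty blue channel for the RG image are assumptions made for this sketch:

```python
import numpy as np

def compose_rgb(r_img, g_img, b_img):
    """Stack co-registered single-channel fundus captures (H x W, uint8)
    into an H x W x 3 RGB color fundus image."""
    return np.stack([r_img, g_img, b_img], axis=-1)

def compose_rg(r_img, g_img):
    """RG color fundus image from the R and G captures only; leaving the
    blue channel empty is an assumption for illustration."""
    return np.stack([r_img, g_img, np.zeros_like(r_img)], axis=-1)

# Example with dummy 2x2 captures
r = np.full((2, 2), 200, dtype=np.uint8)
g = np.full((2, 2), 120, dtype=np.uint8)
b = np.full((2, 2), 40, dtype=np.uint8)
rgb = compose_rgb(r, g, b)   # shape (2, 2, 3)
rg = compose_rg(r, g)
```

The key point is only that each color image is a per-pixel combination of captures whose positions correspond to each other, which is why the simultaneous emission described above matters.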
  • UWF-SLO fundus images include B-color fundus images, G-color fundus images, R-color fundus images, IR fundus images, RGB color fundus images, and RG color fundus images.
  • Each image data of the UWF-SLO fundus image is transmitted from the ophthalmologic apparatus 110 to the server 140 via the communication interface (I/F) 16F together with the subject information input via the input/display device 16E.
  • each image data of the UWF-SLO fundus image and the information of the subject are stored in the storage device in association with each other.
  • the information of the subject includes, for example, subject ID, name, age, visual acuity, distinction between right eye/left eye, and the like.
  • the subject information is entered by the operator through the input/display device 16E.
  • the OCT system is implemented by the control device 16, the OCT unit 20, and the imaging optical system 19 shown in FIG. 2. Since the OCT system includes the wide-angle optical system 30, it enables fundus imaging with a wide FOV 12A, as in the SLO fundus imaging described above.
  • the OCT unit 20 includes a light source 20A, a sensor (detection element) 20B, a first optical coupler 20C, a reference optical system 20D, a collimating lens 20E, and a second optical coupler 20F.
  • the light emitted from the light source 20A is split by the first optical coupler 20C.
  • One of the split beams is collimated by the collimating lens 20E and then enters the imaging optical system 19 as measurement light.
  • the measurement light is scanned in the X and Y directions by the second optical scanner 24 .
  • the scanning light passes through the wide-angle optical system 30 and the pupil 27 and illuminates the fundus.
  • the measurement light reflected by the fundus enters the OCT unit 20 via the wide-angle optical system 30 and the second optical scanner 24, passes through the collimating lens 20E and the first optical coupler 20C, and enters the second optical coupler 20F.
  • the other light emitted from the light source 20A and split by the first optical coupler 20C enters the reference optical system 20D as reference light, passes through the reference optical system 20D, and enters the second optical coupler 20F, where it interferes with the measurement light; the resulting interference light is detected by the sensor 20B.
  • the CPU 16A performs signal processing such as Fourier transform on the detection signal detected by the sensor 20B to generate OCT data.
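The Fourier-transform step can be illustrated with a minimal synthetic example: in swept-source OCT, a reflector at a given depth produces a sinusoidal fringe across the wavenumber sweep, and an FFT of the detector signal recovers a peak at the corresponding depth bin. The signal model and parameters below are illustrative assumptions, not the apparatus's actual processing:

```python
import numpy as np

# Synthetic SS-OCT fringe: a single reflector produces a cosine fringe
# across N wavenumber samples of the sweep (parameters are assumed).
N = 1024                 # samples per A-scan
depth_bin = 100          # reflector depth expressed in FFT bins
k = np.arange(N)
fringe = np.cos(2 * np.pi * depth_bin * k / N)

# A-scan: magnitude of the FFT of the detected interference signal.
a_scan = np.abs(np.fft.rfft(fringe))

peak = int(np.argmax(a_scan[1:])) + 1   # skip the DC term
```

Repeating this per scan position yields the B-scan tomographic image; the en-face image is built from the same volume data along the depth axis.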
  • the CPU 16A generates OCT images such as tomographic images and en-face images based on the OCT data.
  • the OCT system acquires OCT data of the imaging area realized by the wide-angle optical system 30.
  • the OCT data, tomographic images, and en-face images generated by the CPU 16A are transmitted from the ophthalmologic apparatus 110 to the server 140 via the communication interface (I/F) 16F together with information on the subject.
  • Various OCT images such as OCT data, tomographic images and en-face images are associated with subject information and stored in a storage device.
  • a wavelength-swept light source for SS-OCT (Swept-Source OCT) is exemplified as the light source 20A, but light sources for SD-OCT (Spectral-Domain OCT), TD-OCT (Time-Domain OCT), and the like may also be used.
  • SS-OCT Swept-Source OCT
  • SD-OCT Spectral-Domain OCT
  • TD-OCT Time-Domain OCT
  • the axial length measuring device 120 measures the length of the subject's eye 12 in the axial direction (the eye axial length).
  • the axial length measuring device 120 transmits the measured axial length to the server 140 .
  • the server 140 stores the subject's eye axial length in association with the subject ID.
  • the control program for the ophthalmic equipment has an imaging control function, a display control function, an image processing function, and a processing function.
  • when the CPU 16A executes the control program for the ophthalmologic equipment having these functions, the CPU 16A functions as an imaging control unit 202, a display control unit 204, an image processing unit 206, and a processing unit 208, as shown in FIG. 3.
  • the imaging control unit 202 is an example of the “determination unit” and the “control unit” of the technology of the present disclosure.
  • the image processing unit 206 is an example of the “conversion unit” and the “image forming unit” of the technology of the present disclosure.
  • the CPU 16A of the control device 16 of the ophthalmologic apparatus 110 executes the image processing program of the ophthalmologic apparatus, thereby realizing the control of the ophthalmologic apparatus shown in the flowchart of FIG.
  • the data processing shown in FIG. 4 is realized by the CPU 16A of the control device 16 of the ophthalmologic apparatus 110 executing the data processing program.
  • the processing unit 208 performs initial settings.
  • the initial setting specifically refers to, for example, a process of adjusting the optical system for focus, alignment, etc. in the ophthalmologic apparatus 110 .
  • at step 254, the imaging control unit 202 acquires an SLO fundus image by executing imaging with the SLO system, and then the data processing proceeds to step 256.
  • the live SLO image 420LG is displayed as a live-view image on the input/display device 16E.
  • in FIG. 9, the characters of areas 1 to 4, circular frames indicating the boundaries between adjacent areas, and an OCT imaging range 422 are shown for convenience of explanation; these characters, the circular frames, and the OCT imaging range 422 are not actually displayed on the live-view image.
  • the imaging control unit 202 and the image processing unit 206 acquire OCT images. Specifically, first, when step 254 ends, the imaging control unit 202 accepts specification of the OCT imaging range: the staff operates the mouse while viewing the live SLO image 420LG displayed as a live-view image on the input/display device 16E, as shown in FIG. 9, and specifies the OCT imaging range 422. At step 256, the imaging control unit 202 accepts the designation of the OCT imaging range 422. In the example shown in FIG. 9, a B-scan line, which will be described later, is designated as the OCT imaging range 422. Note that a rectangular C-scan range, which will also be described later, may instead be designated as the OCT imaging range.
  • at step 258, the imaging control unit 202 determines, based on the specified OCT imaging range 422, whether or not the specified OCT imaging range 422 is an area in which the reference length needs to be changed. This principle is explained next.
  • the imaging control unit 202 performs tomography with interference light obtained by causing the return light, which is obtained by irradiating the fundus with the signal light through the pupil of the eye 12 to be examined, to interfere with the reference light. By performing tomography with the interference light in this way, a tomographic region is imaged, and an OCT image showing the tomographic region is acquired by the imaging control unit 202.
  • in principle, imaging by OCT is performed with the optical path lengths of the signal light and the reference light equal. However, because the fundus is spherical and the signal light reaches the fundus through the center of the pupil, not from the center of the eyeball, scanning the signal light about the center of the pupil causes the total optical path length of the signal light to change according to the scanning position, so that it no longer corresponds to the optical path length of the reference light (also called the "reference length"). Therefore, it is necessary to change the reference length according to the scanning position.
  • the scanning position is geometrically determined by an angle centered on the center of the pupil, with the optical axis passing through the center of the pupil and the center of the eyeball as the reference.
  • the relationship between the angle defining each scanning position and the optimum value of the reference length at that scanning position is also predetermined.
  • a correspondence table 190 of these relationships is shown in FIG. 10.
  • the correspondence table 190 associates the angle that determines each scanning position, the optimum value of the reference length at each scanning position, and the range that can be photographed with the optimum reference length.
  • the correspondence table 190 is stored in the ROM 16C of the control device 16.
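The lookup that the correspondence table 190 supports can be sketched as a small data structure; all numeric values below (angles, reference lengths, range widths) are placeholder assumptions for illustration, not values from the disclosure:

```python
# Each row: (angle theta_n [deg] defining a scanning position,
#            optimum reference length LR_n [mm] at that position,
#            imageable range [deg] with that optimum reference length).
# All numeric values are placeholders.
CORRESPONDENCE_TABLE = [
    (0.0,  30.0, (-12.0, 12.0)),
    (24.0, 30.5, (12.0, 36.0)),
    (48.0, 31.2, (36.0, 60.0)),
]

def lookup(theta):
    """Return (optimum reference length, imageable range) for the row
    whose imageable range contains the scan-position angle theta."""
    for theta_n, lr_n, (lo, hi) in CORRESPONDENCE_TABLE:
        if lo <= theta <= hi:
            return lr_n, (lo, hi)
    raise ValueError("angle outside tabulated ranges")

lr, rng = lookup(20.0)
```

Storing the table in ROM, as described above, corresponds to keeping such a constant structure in non-volatile memory and consulting it at determination time.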
  • the OCT imaging range is shown with reference to the scanning position P at the center of the location where OCT imaging is performed.
  • the imageable range refers to a range in which, even if the optical path lengths of the signal light and the reference light are not equal, the return light and the reference light still interfere and imaging by OCT is possible, provided the difference is within a predetermined range.
  • FIG. 11 shows the range of angles Δθn in which the return light and the reference light interfere with each other and imaging by OCT is possible when the reference length is fixed at LRn.
  • the range of angles Δθn in which imaging by OCT is possible is (θn - Δθn2) to (θn + Δθn1).
  • area 2 is defined as the range of angle Δθn centered on angle θn; area 1 is defined as the range inside area 2, that is, the range in which the angle θ is smaller; area 3 is defined as the range outside area 2, that is, the range in which the angle θ is larger; and area 4 is defined as the range outside area 3.
  • areas 1 to 4 are also illustrated in FIG. 9.
  • if the OCT imaging range 422 is within the range of area 2 shown in FIG. 11, there is no need to change the reference length for imaging by OCT. However, if the range includes at least part of area 1 or area 3 adjacent to area 2, the reference length needs to be changed in order to perform imaging by OCT. This is because, with the reference length fixed at LRn, the return light and the reference light are less likely to interfere in those areas, so the OCT image becomes unclear or cannot be obtained at all. Therefore, it is necessary to change the reference length. That is, when the angle of view of the OCT imaging range 422 is larger than the predetermined angle of view Δθn, it is determined that the reference length needs to be changed.
  • the correspondence table 190 predetermines the range that can be photographed with the reference length of the optimum value for each angle that determines the scanning position.
  • the imaging control unit 202 determines whether it is necessary to change the reference length by determining whether the part of the fundus to be imaged by OCT, which is determined by the OCT imaging range 422, exceeds the range of Δθn shown in FIG. 11. Specifically, the imaging control unit 202 first determines the scanning position P at the center of the OCT imaging range 422 from the OCT imaging range 422, and geometrically calculates the angle θn that determines the determined scanning position.
  • the imaging control unit 202 acquires from the correspondence table 190 the imageable range, which is the range in which imaging can be performed with the optimum reference length corresponding to the calculated angle θn.
  • the imaging control unit 202 determines whether the location to be imaged by OCT exceeds the imageable range. If it does not, the determination in step 258 is negative and the data processing proceeds to step 264; if it does, the determination in step 258 is affirmative and the data processing proceeds to step 260.
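The determination made at step 258 can be sketched as a simple interval check: given the imageable range for the center angle of the designated range, the reference length must be changed exactly when the designated range extends beyond that imageable range. The function name and all angle values below are illustrative assumptions:

```python
def needs_reference_length_change(scan_range, imageable_range):
    """scan_range: (start, end) angles [deg] of the designated OCT
    imaging range; imageable_range: (lo, hi) angles [deg] imageable
    with the optimum reference length for the range's center.
    Returns True when the scan extends outside the imageable range,
    i.e. when a single fixed reference length would not suffice."""
    start, end = scan_range
    lo, hi = imageable_range
    return start < lo or end > hi

# A scan confined to the imageable range needs no change...
ok = needs_reference_length_change((14.0, 30.0), (12.0, 36.0))
# ...while a scan spilling into an adjacent area does.
wide = needs_reference_length_change((10.0, 40.0), (12.0, 36.0))
```

A negative result corresponds to proceeding to step 264 (scan as-is); a positive result corresponds to proceeding to step 260 (scanning with reference length change).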
  • at step 260, the imaging control unit 202 executes the scanning processing with reference length change, and at step 262, the image processing unit 206 executes the image processing with projection correction. The data processing then proceeds to step 268.
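The projection correction of step 262 can be pictured as mapping each A-scan sample from scan coordinates (scan angle, depth along the beam) to anatomical coordinates: since the signal light pivots about the center of the pupil, a sample at distance d along a beam deflected by angle θ lies at roughly (x, z) = (d·sin θ, d·cos θ) relative to the pupil center. This two-dimensional model, and the 22 mm pupil-to-fundus distance used in the example, are illustrative assumptions rather than the patent's exact transform:

```python
import math

def correct_pixel(theta_deg, depth_mm):
    """Map an A-scan sample to approximate anatomical coordinates.
    theta_deg: scan angle about the pupil center (0 = optical axis);
    depth_mm: distance from the pupil along the beam (e.g. the
    pupil-to-fundus distance plus the in-tissue depth).
    Returns (x, z) in mm relative to the pupil center."""
    theta = math.radians(theta_deg)
    return depth_mm * math.sin(theta), depth_mm * math.cos(theta)

# On-axis sample at an assumed pupil-to-fundus distance of 22 mm:
x0, z0 = correct_pixel(0.0, 22.0)
# The same distance at a 30-degree scan angle lands off-axis:
x1, z1 = correct_pixel(30.0, 22.0)
```

Applying such a mapping to every pixel is what lets the transformed images (for example, FIG. 14 relative to FIG. 13) follow the actual direction and dimensions of the eye instead of the raw scan raster.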
  • step 264 the imaging control unit 202 scans the designated OCT imaging range 422, and in step 266, the image processing unit 206 applies a predetermined Apply image processing. The data processing then proceeds to step 268 .
  • the display control section 204 causes the image (FIGS. 12, 15 to 34) obtained by the image processing performed by executing the processing of step 262 or step 266 to be processed by the input/display device 16E. display for The data processing then proceeds to step 270 .
  • step 270 the processing unit 208 transmits image data representing an image obtained by performing image processing by executing the processing in step 262 or step 266 together with patient identification information to the server 140, and then , terminate this data processing.
  • the server 140 receives the image data and applies image processing to the received image data. Then, the server 140 stores the image data obtained by the image processing in a predetermined storage area of the secondary storage device in association with the patient's identification information. The server 140 transmits image data obtained by image processing to the viewer 150 .
  • the viewer 150 displays the OCT image on the display based on the image data transmitted from the server 140.
  • the doctor diagnoses the subject's eye 12 while looking at the OCT image displayed on the display. Details of the OCT image and the like displayed on the display by the viewer 150 will be described later.
  • the image processor 206 determines the initial reference length L0, which is the optical path length of the reference light.
  • the initial reference length L0 may be derived using an initial reference length derivation table (not shown) that is a table that inputs the stored value L and the predetermined distance ⁇ L and outputs the initial reference length L0. .
  • the initial reference length L0 is the OCT image information from a position shifted from the surface of the fundus by a predetermined distance ⁇ L toward the center of the pupil to a position a predetermined distance (thickness) away in the depth direction including the thickness of the retina. used to obtain If the length of the reference length is matched to the surface of the fundus oculi, there is a possibility that the information of the OCT image will be acquired from the inside of the fundus due to an error in the configuration of the device, and as a result the information of the surface of the fundus will not be acquired.
  • the patient's axial length (L) is not limited to the value obtained by measuring with the axial length measuring device 120, and based on information such as myopia, hyperopia, gender difference, and/or regional characteristics, Values derived from stage model values may be used.
  • geographical features refer to features based on regional differences between Japan and other countries (eg, Europe, the United States, etc.), for example.
  • the axial length (L) of the patient's eye is, for example, 24 mm in the case of Japanese myopic eyes.
  • the measured length may be used, or the optical thickness obtained by multiplying the refractive index may be used.
  • the axial length measuring device 120 can be omitted.
  • the image processing unit 206 executes OCT A-scan. Then, the image processing unit 206 acquires the information of the OCT image of the retina obtained by executing the OCT A-scan.
  • A-scan means that the current scanning position of the fundus is irradiated with light through the center of the pupil, and a predetermined thickness in the depth direction along the direction in which the irradiated light travels from the current scanning position. It is to obtain the information of the OCT image (A-scan image) of the retina.
  • the current scanning position is determined by the current scanning angle in the X direction of the second optical scanner 24 and the current scanning angle in the Y direction.
  • B-scan is to scan the light in a predetermined direction through the center of the pupil to obtain information on the A-scan image of each scanning position along the predetermined direction of the fundus. In other words, it is to obtain an OCT image (B-scan image) of the retina of a plane including a predetermined direction in which light is scanned.
  • C-scan is to acquire a three-dimensional OCT image (C-scan image) obtained by scanning in a predetermined direction of B-scan in a direction perpendicular to the predetermined direction.
  • the imaging optical system 19 scans the signal light in the Y direction with the second optical scanner 24 to obtain an OCT image, and scans the signal light in the Y direction.
  • the scanning signal light is scanned in the X direction.
  • each position on the OCT imaging range 422 is irradiated with signal light during scanning in the X direction and the Y direction, so that OCT image information is obtained via the sensor 20B of the OCT unit 20. .
  • the image processing unit 206 determines whether or not the entire designated OCT imaging range has been scanned. At step 286 , if the entire OCT imaging range has not been scanned, the determination is negative, and the scanning process involving the change of this reference length proceeds to step 288 . At step 288, the image processing unit 206 changes the scanning angle of the second optical scanner 24 so that the designated OCT imaging range is scanned by the signal light.
  • the image processing unit 206 acquires the reference length corresponding to the scan angle changed at step 288 from the correspondence table 190 shown in FIG. Then, the image processing unit 206 calculates the amount of deviation from the previous length, and checks the reference optical system 20D (see FIG. 2) for the calculated amount of deviation. The scanning process with reference length change then returns to step 284 .
  • step 286 if the entire OCT imaging range has been scanned, the determination is affirmative, and the scanning process that accompanies this reference length change ends. Data processing then proceeds to step 262, as described above.
  • the scanning is performed in step 260. Change the reference length according to the angle (scanning position).
  • the image processing unit 206 stores each A obtained by scanning the entire OCT imaging range 422 in predetermined storage areas (M11 to M1n, . . . Mm1 to Mmn) of the RAM 16B.
  • predetermined storage areas M11 to M1n, . . . Mm1 to Mmn
  • the predetermined storage area of the RAM 16B has n ⁇ m storage areas of n rows and m columns. In the leftmost column of FIG. 7, there are n storage areas M11 to M1n.
  • the image processing unit 206 stores each pixel value (n pieces) of the A-scan image at the leftmost position of the OCT imaging range 422 in FIG. . Specifically, the image processing unit 206 stores the pixel value of the acquisition start position 420S (see also FIG. 6) in the storage area M11, stores the pixel value of the position next to the acquisition start position 420S in the next storage area, and so on. The pixel value at the pixel position of the acquisition end position 520E is stored in the storage area M1n. The image processing unit 206 performs this from the A-scan image at the leftmost position of the OCT imaging range 422 in FIG. 9 to the A-scan image at the rightmost position.
  • a B-scan image is obtained as shown in image (2) in FIG.
  • the B-scan image is generated by arranging the acquisition start positions of the OCT images of the A-scan image so that they are aligned on a straight line and the acquisition end positions are also aligned on a straight line.
  • the acquisition start position 420S nor the acquisition end position 420E of each OCT image of the A-scan image are aligned on a straight line. Therefore, the obtained B-scan image is different in direction and size from the actual tomographic area of the retina. As shown in the image (2) in FIG. 9, the B-scan image differs from the actual retinal tomographic area in the actual direction and actual size. cannot be grasped.
  • the image processing unit 206 corrects the obtained B-scan image so that it corresponds to the actual tomographic area of the retina in direction and size. This correction is called projection correction.
  • each pixel position of the B-scan image be I ini (x B-scan , y B-scan ).
  • Projection correction is performed by converting pixel positions I ini (x B-scan , y B-scan ) of the B-scan image to pixel positions (x proj , y proj ) as follows.
  • X proj (R( ⁇ )+Y B-scan ) sin(X B-scan / ⁇ )
  • Y proj (R( ⁇ )+Y B-scan )cos(X B-scan / ⁇ )
  • the scan angle ⁇ is the angle that defines the scan range, and is the angle formed with the eye axis of a straight line containing the center 400 of the pupil and the pixel position I ini (x B-scan , y B-scan ).
  • the distance R( ⁇ ) is the distance from the center 400 of the pupil of the subject's eye 12 to the scanning position P(0) of the fundus when the pixel position I ini (x B-scan , y B-scan ) is obtained. Yes, and varies with the scan angle ⁇ .
  • the axial length is obtained as described above, and the shape of the eye to be examined is determined from the obtained axial length and patient data corresponding to the patient ID, such as age, sex, and race (for example, Japanese).
  • the distance R( ⁇ ) from the pupil of the subject's eye 12 to the fundus for each angle is determined in advance and stored in the secondary storage device 17 .
  • is a constant and represents the number of horizontal pixels per degree when scanning at the scan angle
  • the OCT image of the retina is an image corresponding to the actual tomographic region of the retina (hereinafter also referred to as "real OCT image" for convenience of explanation) (see also FIGS. 8 and 12).
  • Image data for making the OCT image of the retina in the specified OCT imaging range into a real OCT image is used in step 268 in FIG.
  • the real OCT image of the retina is displayed on the display of input/display device 16E.
  • the OCT imaging range designated by executing the process of step 256 in FIG. 4 is linearly designated as the OCT imaging range 422 in image (1) in FIG. ing.
  • the technology of the present disclosure is not limited to this, and can also be applied when a region having a certain area is specified. If a region with a constant area is specified, a C-scan image is obtained. For the same reason as described above, the C-scan image differs in direction and size from the actual three-dimensional tomographic area of the retina, as shown in FIG. Therefore, also in this case, the image processing unit 206 performs projection correction on the C-scan image so as to correspond to the three-dimensional tomographic area, direction, and size of the actual retina.
  • the conversion formula in this case is as follows.
  • the scan angle is an angle ⁇ based on the reference direction (X direction) on a plane (horizontal plane) perpendicular to the plane corresponding to the angle ⁇ .
  • the pre-transformation pixel position Z ini (x ini , y ini , z ini ) is transformed to the post-transformation pixel position (x proj , y proj , z proj ) as follows.
  • the OCT image of the retina in the OCT imaging range obtained based on the image data obtained by converting to the pixel positions (x proj , y proj , z roj ) is a three-dimensional image of the actual retina as shown in FIG. It results in an OCT image that corresponds to the orientation and dimensions of the tomographic region.
  • the scanning angle is reflected, and the Z axis is in an oblique direction.
  • step 288 the scanning process proceeds to step 284, which differs from the scanning process with reference length change of step 260.
  • step 266 differs from the image processing involving projection correction of step 266 in that the projection correction processing is omitted.
  • the image processing in step 262 may be the same as the image processing involving projection correction in step 262 .
  • the return light reflected by the retina of the fundus of the eye and the reference light branched from the signal light interfere with each other.
  • the optical path length of the reference light is updated so that interference occurs when a tomographic region of the retina in the depth direction is imaged.
  • a B-scan image is generated by arranging sets of pixel values of the OCT image in the depth direction at each scanning position in the same direction over each scanning position.
  • Projection correction which is correction corresponding to the shape and size of the retina, is then performed on the generated B-scan image.
  • the interference between the return light reflected by the retina and the reference light is maintained, and the obtained OCT image is can be displayed on the input/display device 16E to correspond to the shape of the retina.
  • the optical path length of the reference light (LR) it is determined whether or not the optical path length of the reference light (LR) needs to be changed, and if it is determined that the optical path length needs to be changed, the optical path length is changed. . Therefore, a clear OCT image can be obtained. Further, in the first embodiment, based on the scan angle ( ⁇ ) that determines the scanning position of the eye to be examined by the signal light (LS) and the distance from the pupil of the eye to be examined to the fundus (R ( ⁇ )), OCT data , the OCT image can be made to correspond to the actual appearance of the fundus. As described above, in the first embodiment, it is possible to obtain an OCT image that corresponds to the actual appearance of the fundus, that is, in which the distance and angle of the fundus object are correctly reproduced on orthogonal coordinates (Cartesian coordinates).
  • the scanning process with reference length change (step 260 in FIG. 4) according to the second embodiment differs from the scanning process with reference length change according to the first embodiment in the following points.
  • the process of step 288 (change of scanning angle) is executed between the process of step 284 and the process of step 286. be.
  • the imaging control unit 202 determines whether the reference length (LR) needs to be changed. determine whether or not
  • the scan angle (scanning position ) when it is determined in step 258 in FIG. 4 that the designated OCT imaging range is an area requiring a change in the reference length, the scan angle (scanning position ), the reference length is updated accordingly.
  • step 260 it is determined in the process of step 260 whether or not the reference length (LR) needs to be changed, and if it is determined that the reference length (LR) needs to be changed, , to change the reference length.
  • LR2 is the optimum reference length determined according to the scanning position of the center of the OCT imaging range 422 shown in FIG. Therefore, even if the processing of step 288 (changing the scanning angle) is executed after the processing of step 284, if the range captured at the changed scanning angle is the range captured at the current reference length, No need to change the reference length.
  • the scanning process involving the reference length change proceeds to step 284 .
  • the range to be photographed at the changed scanning angle is changed to area 3, although the range of area 2 has been scanned up to now, it is necessary to change the reference length (LR). be judged.
  • the scan process with reference length change proceeds to step 290 where the reference length is changed.
  • the imaging control unit 202 adjusts the reference optical system 20D (see FIG. 2) of the OCT unit 20 so as to obtain the changed reference length.
  • the example described above is an example in which the acquisition start position 420S of each A-scan in the OCT imaging range 422 and the acquisition end position 422E of each A-scan are the same.
  • the A-scan OCT image information is obtained from the A-scan acquisition start position 42 as described above. It is acquired with a constant width from 0S to the acquisition end position 422E of A-scan. Therefore, even if the scanning position changes to some extent, as shown in the scanning directions 410 and 412 in FIG. A-scan OCT image information can be obtained with a length reference length.
  • the acquisition start position of A-scan will be an acquisition start position 420X that has entered the retina by a distance 420L, as shown in the scanning direction 414. Therefore, part of the A-scan OCT image information cannot be acquired at the same acquisition start position.
  • the imaging control unit 202 determines whether or not the scanning position at the changed scanning angle is the scanning position at which the acquisition start position needs to be changed. to decide.
  • the imaging control unit 202 starts acquisition. Do the following without changing the position. If it is determined that the scanning position at the changed scanning angle after the process of step 288 (changing the scanning angle) is the scanning position at which the acquisition start position needs to be changed, the imaging control unit 202 starts acquisition. Adjust the reference optical system 20D (see FIG. 2) of the OCT unit 20 so that the position is changed.
  • the image processing unit 206 aligns the acquisition start position and the acquisition end position of each OCT image for each set of a plurality of A-scan images obtained with the same reference length in the OCT imaging range. align to The image processing unit 206 arranges each pixel value of the A-scan image aligned in this way in the RAM 16B storage area of the control device 16 . As a result, a B-scan image is synthesized for each set of A-scan images.
  • the image processing unit 206 executes projection correction for each set of A-scan images obtained with the B-scan images having the same reference length.
  • the image processing unit 206 combines a plurality of images obtained by performing projection correction on the B-scan image for each set of A-scan images as a correction target, and performs all OCT in the specified OCT imaging range. form an image.
  • FIG. 9 the specific content of image processing involving projection correction will be described, taking as an example a case where the OCT imaging range 422 includes an area 2 and two areas 3 sandwiching the area 2.
  • FIG. A first scan of a portion corresponding to area 2 of the OCT imaging range 422, a second scan of a portion corresponding to area 2, and a third scan of a portion corresponding to area 3 are performed.
  • a first B-scan image 530A, a second B-scan image 532A, and a third B-scan image 534A are obtained as shown in FIG.
  • the reference lengths in the second B-scan image 532A and the third B-scan image 534A are the same.
  • the reference length in the first B-scan image 530A is the first optical path length (LR1)
  • the reference length in the second B-scan image 532A is the second optical path length (LR2)
  • the reference length in the third B-scan image 534A is Let the reference length be the third optical path length (LR3).
  • Second optical path length (LR2) Third optical path length (LR3)
  • projection correction is performed on each of the first B-scan image 530A, the second B-scan image 532A, and the third B-scan image 534A.
  • a first OCT image 530B, a second OCT image 532B, and a third OCT image 534B obtained by executing the projection correction are combined.
  • the scanning angle is changed so that areas 531A and 533A where the first B-scan image 530A, the second B-scan image 532A, and the third B-scan image 534A overlap are generated. do.
  • the signal light scanned in the vertical direction is scanned horizontally from the left side to the right side of the paper surface of FIG. Acquire OCT image information at each corresponding location.
  • each position corresponding to the OCT imaging range 422 is scanned horizontally from the left end to the right end of the second B-scan image 532A shown in FIG. Acquire OCT image information at . This results in a set of A-scan images corresponding to the second B-scan image 532A.
  • the horizontal scanning start position of the signal light scanned in the vertical direction is located at a position a predetermined distance back from the right end of the second B-scan image 532A. to change the scan angle.
  • OCT image information at each position corresponding to the OCT imaging range 422 is acquired while horizontally scanning the signal light scanned in the vertical direction to the right end of the first B-scan image 530A. This results in a set of A-scan images corresponding to the first B-scan image 530A.
  • the horizontal scanning start position of the signal light scanned in the vertical direction is located at a position a predetermined distance back from the right end of the first B-scan image 530A. to change the scan angle.
  • OCT image information at each position is acquired while horizontally scanning the signal light scanned in the vertical direction to the right end of the third B-scan image 534A. This results in a set of A-scan images corresponding to the third B-scan image 534A.
  • the overlapping portion may be subjected to image processing so that luminance and/or color changes are smooth (that is, they change at a constant rate of change).
  • a continuous tomographic image is obtained by synthesizing the first OCT image 530B, the second OCT image 532B, and the third OCT image 534B.
  • the first OCT image 530B is an example of the "first tomographic image” of the technique of the present disclosure.
  • the second OCT image 532B is an example of the "second tomographic image" of the technique of the present disclosure.
  • the scanning angle may be changed so that overlapping regions 531A and 533A do not occur as described above.
  • the brightness and/or color of adjacent portions of first B-scan image 530A, second B-scan image 532A, and third B-scan image 534A are smoothed (ie, change at a constant rate of change). may be subjected to image processing.
  • the retina has a different layer structure between the central part and the outermost part.
  • the central portion has seven layers, while the outermost portion has three layers. Therefore, when segmentation is performed to determine the boundaries of each layer, the calculation conditions may be switched so that seven layers are used in the central portion and three layers are used in the peripheral portion.
  • the OCT imaging range in step 256 of FIG. 7 is designated linearly like the OCT imaging range 422 shown in FIG. 9, but the technology of the present disclosure is not limited to this. , can also be applied when a region with a certain area is specified.
  • each C-scan image OCT volume image
  • the image processing unit 206 performs projection correction on each C-scan image.
  • a three-dimensional OCT image 550 corresponding to the central region is obtained, as shown in image (1) in FIG. Also, as shown in image (2) in FIG.
  • Template matching or feature point correspondence processing can be used for alignment processing when a plurality of images are pasted together.
  • SIFTT Scale-Invariant Feature Transform
  • SURF Speeded Up Robust Features
  • B-scan image composition processing, projection correction processing, and processing of combining a plurality of images are performed in this order, but the technology of the present disclosure is not limited to this.
  • the processes may be executed in the order of the B-scan image composition process, the process of combining a plurality of images, and the projection correction process.
  • B-scan images 602A, 602B, 604A, 604B, 606A, and 606B are generated by the image processing unit 206 with the same reference length. can get.
  • B-scan images 602A, 602B, 604A, 604B, 606A, and 606B are stitched together by the image processing unit 206 as shown in FIG.
  • the image processing unit 206 performs projection correction on the entire image obtained by pasting together the B-scan images 602A, 602B, 604A, 604B, 606A, and 606B. executed.
  • continuous regions are specified as the OCT imaging range. good too.
  • multiple B-scan images 602A, 606A, 606B are obtained that are spaced from each other. Projection correction is then performed on the separated B-scan images 602A, 606A, 606B as shown in FIG.
  • each OCT image corresponds to the depth direction at each scanning position with reference to an arc 650 corresponding to the innermost side (the side closest to the center of the eyeball) assumed in the cross section of the fundus. Rotate it so that it does, and then glue it together to synthesize it.
  • three-dimensional OCT images 592, 594, and 596 may be pasted together for synthesis.
  • an optical head may be used that includes at least optical and opto-mechanical components responsible for directing and scanning light to the fundus.
  • the example shown in FIG. 26 is an example of moving the optical head 802 using a combination of rotational movement and vertical movement.
  • the optical head 802 when photographing a peripheral area located below the central area of the fundus, the optical head 802 is moved upward by a moving mechanism (not shown) and placed at position A.
  • a moving mechanism (not shown) rotates the optical head 802 so that the optical axis of the optical head 802 coincides with the line connecting the center of the pupil 702 of the eye 701 and the center of the lower peripheral region.
  • the optical head 802 When imaging the central region of the fundus, the optical head 802 is moved downward by a moving mechanism (not shown) and placed at position B. A moving mechanism (not shown) rotates the optical head 802 so that the optical axis of the optical head 802 coincides with the line connecting the center of the pupil 702 of the eye 701 and the center of the eyeball.
  • the optical head 802 when photographing a peripheral region located above the central region of the fundus, the optical head 802 is moved downward by a moving mechanism (not shown) and placed at position C.
  • a moving mechanism (not shown) rotates the optical head 802 so that the optical axis of the optical head 802 coincides with the line connecting the center of the pupil 702 of the eye 701 and the center of the upper peripheral region.
  • the imaging control unit 202 obtains A-scan images while adjusting the reference length, and uses the obtained A-scan images to obtain B-scan images (obtained at positions A, B, and C). After generating the projected images GPA, GPB, GPC), projection correction is performed on the B-scan image.
  • the example shown in FIG. 27 differs from the example shown in FIG. 26 in that it has a second rotating shaft 803 and a link 804 .
  • the first rotation operation and the second rotation operation are selectively performed.
  • a first rotation operation refers to an operation in which the optical head 802 is rotated around the first rotation axis 801 .
  • a second rotating operation refers to an operation of rotating the optical head 802 around the second rotating shaft 803 .
  • the tip of the optical head 802 (the end close to the eye) is moved to the position B by rotating the link 804 around the second rotation axis 803 using a moving mechanism (not shown). Deploy. Next, a moving mechanism (not shown) rotates the optical head 802 about the first rotation axis 801 so that the optical axis of the optical head 802 coincides with the line connecting the center of the pupil 702 of the eye 701 and the center of the eyeball.
  • the movement mechanism rotates the link 804 around the second rotation shaft 803 to move the tip of the optical head 802. (End close to the eye) Move 801 upward.
  • the optical head 802 is moved along the first rotating shaft 801 so that the optical axis of the optical head 802 coincides with the line connecting the center of the pupil 702 of the eye 701 and the center of the lower peripheral region. rotate around.
  • the tip of the optical head 802 ( The end closest to the eye) 801 is moved downward and placed at position C.
  • a moving mechanism (not shown) moves the optical head 802 so that the optical axis of the optical head 802 coincides with the line connecting the center of the pupil 702 of the eye 701 and the center of the upper peripheral region. Rotate to center.
  • the imaging control unit 202 controls B-scan images (images GPA, GPB, and GPC acquired at positions A, B, and C) while adjusting the reference length. , and then perform projection correction on the B-scan image.
  • the display control unit 204 executes the process of step 268 in FIG. It is displayed on the display device 16E. Further, a layer image representing a reference layer in the retina, for example, a retinal pigment epithelium (RPE) layer with higher brightness than other layers, is corrected so as to follow a flat surface or a spherical surface with a predetermined curvature. and the corrected layer image may be displayed on the input/display device 16E.
  • RPE retinal pigment epithelium
  • projection correction is performed by the image processing unit 206 on the B-scan image shown in FIG. 28 as shown in FIG.
  • segmentation processing is performed to discriminate the boundaries of a plurality of layers in the OCT image, and the layer image showing the RPE layer 750 is corrected along the plane as shown in FIG.
  • the layer image 752 indicating the RPE layer 750 is used as a reference for the layer image 752 indicating the other layer, and the relative positional relationship (that is, the positional relationship based on the separation distance) with respect to the layer image 752 indicating the RPE layer 750 is maintained. is corrected so that These layer images are displayed on the input/display device 16E.
  • the RPE layer 750 is an example of the "reference layer" of the technology of the present disclosure.
  • the B-scan image shown in FIG. 28 is subjected to projection correction by the image processing unit 206 as shown in FIG.
  • the RPE layer 750 is then corrected to follow a non-planar spherical surface of predetermined curvature.
  • the layer images indicating other layers are corrected so that the relative positional relationship with respect to the corrected layer image is maintained with the corrected layer image of the RPE layer 750 as a reference. These layer images are displayed on the input/display device 16E.
  • scanning is performed at a predetermined resolution (step 260 in FIG. 4), but the technology of the present disclosure is not limited to this.
  • the imaging control unit 202 scans the signal light at a resolution lower than a predetermined resolution, and displays the obtained OCT image on the input/display device 16E.
  • the imaging control unit 202 scans only the designated portion of the displayed OCT image with the signal light at a higher resolution than the predetermined resolution.
  • the image processing unit 206 performs projection correction on the B-scan image obtained by scanning the signal light at high resolution.
  • the display control unit 204 displays the image obtained by executing the projection correction on the input/display device 16E.
  • the display control unit 204 may display the B-scan image obtained by scanning at high resolution on the input/display device 16E without performing projection correction.
  • an OCT image 850 obtained by scanning at low resolution is displayed on the input/display device 16E.
  • the imaging control unit 202 scans the designated areas with the signal light at high resolution.
  • the image processing unit 206 performs projection correction on B-scan images 852A and 854A obtained by scanning at high resolution.
  • the display control unit 204 displays the image obtained by executing the projection correction on the input/display device 16E.
  • the display control unit 204 may display the B-scan images 852A and 854A obtained by scanning at high resolution on the input/display device 16E without performing projection correction.
  • the number of areas to be scanned at high resolution may be one or more.
  • the user may specify the central area, the peripheral area, or an area in which the central area and the peripheral area are mixed.
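The coarse-then-fine protocol above can be sketched compactly. The function and field names here are illustrative only (the actual imaging control unit 202 drives the scanner hardware), and the step sizes stand in for the "predetermined resolution":

```python
def plan_scans(full_deg, coarse_step_deg, fine_step_deg, regions_deg):
    """Plan a low-resolution preview over the full scan angle, then
    high-resolution rescans limited to the designated angular regions
    (one or more, central and/or peripheral)."""
    preview = {"start_deg": 0.0, "stop_deg": full_deg,
               "n_ascans": round(full_deg / coarse_step_deg) + 1}
    details = [{"start_deg": a, "stop_deg": b,
                "n_ascans": round((b - a) / fine_step_deg) + 1}
               for a, b in regions_deg]
    return preview, details
```

For example, a 100-degree preview at a 1-degree pitch followed by two 10-degree regions rescanned at a 0.1-degree pitch yields a cheap overview plus dense B-scans only where the user asked for them.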
  • the correspondence between the first region 852 and the second region 854 specified by the user in the OCT image 850 (see FIG. 31) obtained by scanning at low resolution and the corresponding B-scan images 852A and 854A may be clarified as follows: for example, the frames of the first region 852 and of its corresponding B-scan image 852A are displayed in a first color, while the frames of the second region 854 and of its corresponding B-scan image 854A are displayed in a second color different from the first color.
  • the image processing unit 206 corrects the layer image showing the RPE layer 750 so as to follow a spherical surface with a predetermined curvature, based on the image data obtained by executing the projection correction.
  • the image processing unit 206 corrects the image representing the other layer based on the layer image representing the RPE layer 750 so that the relative positional relationship with respect to the layer image representing the RPE layer 750 is maintained.
  • the display control unit 204 displays these images on the input/display device 16E.
  • the imaging control unit 202 further scans only the designated portion of the displayed OCT image with the signal light at high resolution, and the image processing unit 206 performs projection correction on the obtained B-scan images 862A and 864A.
  • the display control unit 204 displays the image obtained by executing the projection correction on the input/display device 16E.
  • the image processing unit 206 may display the B-scan images 862A and 864A obtained by scanning on the input/display device 16E without performing projection correction.
  • the display control unit 204 displays the SLO fundus image 870 as shown in image (1) in FIG. 34.
  • the display control unit 204 displays an OCT image 872A obtained by scanning at low resolution.
  • the imaging control unit 202 scans the designated area at high resolution
  • the image processing unit 206 performs projection correction on the resulting B-scan image 874B, as shown in image (4) in FIG. 34.
  • the display control unit 204 displays the image obtained by executing the projection correction on the input/display device 16E. Note that the display control unit 204 may display the B-scan image 874B on the input/display device 16E without performing projection correction.
  • the imaging control unit 202 scans the designated area at high resolution.
  • the image processing unit 206 performs projection correction on the obtained B-scan image 876A, as shown in image (6) in FIG. 34.
  • the display control unit 204 displays the image obtained by the projection correction on the input/display device 16E.
  • Image processing unit 206 may display B-scan image 876A on input/display device 16E without performing projection correction.
  • although the screens of FIGS. 12 and 15 to 34 are displayed on the input/display device 16E of the ophthalmologic apparatus 110 in the above description, the technology of the present disclosure is not limited to this.
  • the viewer 150 may instead display the screens of FIGS. 12 and 15 to 34 on its own display based on the image data transmitted from the server 140; in that case, the viewer 150 displays the OCT image after the transformation (step 262).
  • as in the relationship between image (2) and image (3) in FIG. 34, and between image (5) and image (6) in FIG. 34, the viewer 150 displays the SLO fundus image and the converted OCT image with the position of the OCT image corresponding to the position of the SLO fundus image.
  • Viewer 150 displays the transformed OCT image and the flattened OCT image (FIG. 30).
  • in the system configuration of the technology of the present disclosure, the axial length measuring device 120 may be omitted, in which case a value based on information such as sex differences and regional characteristics may be used as the patient's axial length (L). The viewer may also be omitted.
  • the processing performed by the viewer is performed by the ophthalmologic apparatus 110.
  • the viewer and server 140 may be omitted.
  • the processing performed by the viewer and server 140 is performed by the ophthalmologic apparatus 110 .
  • the ophthalmologic apparatus 110 executes all steps of the data processing in FIG. 4, but the technology of the present disclosure is not limited to this.
  • the ophthalmologic apparatus 110 may perform the processing from step 252 to step 260 or step 264 of FIG. 4, and the server 140 may perform step 262 or step 266.
  • The ophthalmic system 100 in this case is an example of the "system" of the technology of the present disclosure.
  • in the embodiments above, the reference length is changed so as to correspond to the optical path lengths of the signal light and its return light; however, the technology of the present disclosure is not limited to this, and the optical path lengths of the signal light and the return light may instead be changed so as to correspond to the reference length.
  • a plurality of optical elements, for example first to fourth mirrors, are arranged on the optical axis of the signal light as follows.
  • the signal light passes through the first mirror, is reflected by the second mirror, then by the third mirror, and is then reflected by the fourth mirror back to the original optical path.
  • the first to fourth mirrors are arranged so that the optical path length of the signal light can be changed by moving the second and third mirrors away from or closer to the first and fourth mirrors.
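With this geometry, displacing the second/third mirror pair by a distance d lengthens both the outgoing and the returning leg of the folded path by d, so the optical path length changes by 2d. A minimal sketch (function names are illustrative, not from the original):

```python
def path_length_change(pair_shift_mm: float) -> float:
    # The light travels out to the movable second/third mirror pair and
    # back toward the fourth mirror, so a shift d adds d on each leg:
    # delta_L = 2 * d.
    return 2.0 * pair_shift_mm

def required_shift(delta_l_mm: float) -> float:
    # Inverse relation: mirror-pair displacement needed for a desired
    # optical path length change.
    return delta_l_mm / 2.0
```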
  • each component may be provided singly or in plural.
  • image processing may be performed only by a hardware configuration such as FPGA (Field-Programmable Gate Array) or ASIC (Application Specific Integrated Circuit).
  • since the technology of the present disclosure covers both the case in which image processing is realized by a software configuration using a computer and the case in which it is not, it includes the following technology.
  • a computer program product for image processing, comprising a computer-readable storage medium that is not itself a transitory signal, the storage medium storing a program that causes a computer of an ophthalmologic apparatus to execute the following steps, the apparatus having an OCT unit configured to generate interference light between signal light (LS) that scans the subject's eye and reference light (LR), to detect the generated interference light, and to change the optical path length of at least one of the signal light (LS) and the reference light (LR): determining whether the optical path length needs to be changed when the eye to be examined is scanned with the signal light (LS) in order to acquire OCT data of a specified range of the eye; adjusting the OCT unit so that the optical path length is changed when it is determined that the change is necessary; and converting the OCT data.
  • the present disclosure includes the following technical items.
  • the present disclosure aims at obtaining sharp OCT images corresponding to the real appearance of the fundus.
  • (Item 1) An ophthalmologic apparatus comprising: an OCT unit configured to generate interference light between signal light that scans an eye to be inspected and reference light, detect the generated interference light, and change the optical path length of at least one of the signal light and the reference light; a determination unit that determines whether or not the optical path length needs to be changed when the eye to be inspected is scanned with the signal light in order to acquire OCT data of a specified range of the eye to be inspected; a control unit that adjusts the OCT unit so that the optical path length is changed when it is determined that the optical path length needs to be changed; and a conversion unit that converts the OCT data based on a scan angle that determines the scan position of the eye to be inspected by the signal light and a distance from the pupil of the eye to be inspected to the fundus.
  • (Item 2) The ophthalmologic apparatus according to Item 1, wherein the determination unit determines that the optical path length needs to be changed when the angle of view of the specified range is larger than a predetermined angle of view.
  • (Item 3)
  • (Item 4) The ophthalmologic apparatus according to any one of Items 1 to 3, wherein the scan includes at least a first scan and a second scan having different optical path lengths.
  • (Item 7) The ophthalmologic apparatus according to Item 6, wherein the third optical path length is the same as the second optical path length.
  • (Item 8) The ophthalmologic apparatus according to any one of Items 4 to 7, further comprising an image forming unit that forms a first tomographic image using the first OCT data obtained by the first scan and converted by the conversion unit and a second tomographic image using the second OCT data obtained by the second scan and converted by the conversion unit, wherein the image forming unit synthesizes the first tomographic image and the second tomographic image and performs image processing so as to form a continuous tomographic image.
  • (Item 9) The ophthalmologic apparatus according to any one of Items 1 to 8, wherein the OCT data is data for forming a two-dimensional tomographic image or a three-dimensional tomographic image.
  • (Item 10) The ophthalmologic apparatus according to any one of Items 1 to 9, further comprising an image forming unit that forms a tomographic image using the OCT data converted by the conversion unit.
  • (Item 11) The ophthalmologic apparatus wherein the image forming unit performs segmentation processing for determining each boundary of a plurality of layers in the tomographic image, and performs a process of deforming the tomographic image so that a layer serving as a reference among the plurality of layers is displayed along a surface of a predetermined curvature and the layers other than the reference layer are displayed corresponding to their distance from the reference layer.
  • (Item 12) A control method executed by a processor of an ophthalmologic apparatus comprising an OCT unit configured to generate interference light between signal light that scans an eye to be inspected and reference light, detect the generated interference light, and change the optical path length of at least one of the signal light and the reference light, the control method comprising: determining whether it is necessary to change the optical path length when scanning the eye to be inspected with the signal light in order to acquire OCT data of a specified range of the eye to be inspected; adjusting the OCT unit so that the optical path length is changed if it is determined that the optical path length needs to be changed; and converting the OCT data based on a scan angle that determines the scan position of the eye to be inspected by the signal light and a distance from the pupil of the eye to be inspected to the fundus.
  • (Item 13) A program that causes a computer of an ophthalmologic apparatus comprising an OCT unit configured to generate interference light between signal light that scans an eye to be inspected and reference light, detect the generated interference light, and change the optical path length of at least one of the signal light and the reference light, to execute: determining whether it is necessary to change the optical path length when scanning the eye to be inspected with the signal light in order to acquire OCT data of a specified range of the eye to be inspected; adjusting the OCT unit so that the optical path length is changed if it is determined that the optical path length needs to be changed; and converting the OCT data based on a scan angle that determines the scan position of the eye to be inspected by the signal light and a distance from the pupil of the eye to be inspected to the fundus.
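The conversion based on scan angle and pupil-to-fundus distance recited in Items 1, 12, and 13 can be illustrated as a fan-to-Cartesian remapping of each A-scan sample. This is only a geometric sketch assuming the scanning beam pivots at the pupil; the names and units are illustrative and not from the original disclosure.

```python
import math

def ascan_sample_to_xz(scan_angle_deg: float, range_from_pivot_mm: float):
    """Map one A-scan sample from fan coordinates (scan angle, distance
    along the beam from the pivot) to Cartesian coordinates with the
    pivot at the pupil. Without such a conversion, wide-angle B-scans
    rendered as flat rasters misrepresent fundus curvature."""
    theta = math.radians(scan_angle_deg)
    x = range_from_pivot_mm * math.sin(theta)  # lateral position
    z = range_from_pivot_mm * math.cos(theta)  # axial position
    return x, z
```

Samples at the nominal pupil-to-fundus distance (for example, one derived from the axial length L) then fall on an arc rather than a line, matching the real shape of the fundus.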

Landscapes

  • Life Sciences & Earth Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Biophysics (AREA)
  • Ophthalmology & Optometry (AREA)
  • Engineering & Computer Science (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Physics & Mathematics (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Eye Examination Apparatus (AREA)

Abstract

The ophthalmologic apparatus of the present invention comprises: an OCT unit configured to generate interference light from reference light and signal light used to scan an examined eye, to detect the generated interference light, and to be able to change the optical path length of at least one of the signal light and the reference light; a determination unit that determines whether the optical path length must be changed when the examined eye is scanned with the signal light in order to acquire OCT data over a specified range of the examined eye; a control unit that adjusts the OCT unit so that the optical path length is changed when it is determined that the optical path length must be changed; and a conversion unit that converts the OCT data on the basis of a scan angle that defines the scan position of the examined eye with the signal light and the distance from the pupil to the fundus of the examined eye.
PCT/JP2022/031142 2021-08-18 2022-08-17 Dispositif ophtalmique, procédé de commande et programme WO2023022183A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2021-133342 2021-08-18
JP2021133342 2021-08-18

Publications (1)

Publication Number Publication Date
WO2023022183A1 true WO2023022183A1 (fr) 2023-02-23

Family

ID=85239861

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2022/031142 WO2023022183A1 (fr) 2021-08-18 2022-08-17 Dispositif ophtalmique, procédé de commande et programme

Country Status (1)

Country Link
WO (1) WO2023022183A1 (fr)


Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009183332A (ja) * 2008-02-04 2009-08-20 Topcon Corp 眼底観察装置、眼底画像処理装置及びプログラム
JP2011224264A (ja) * 2010-04-22 2011-11-10 Canon Inc 断層像観察装置、表示制御方法及びプログラム
JP2018171168A (ja) * 2017-03-31 2018-11-08 株式会社ニデック Oct装置
JP2020502501A (ja) * 2016-12-20 2020-01-23 ノバルティス アーゲー 広角光干渉断層撮影法のためのシステム及び方法
WO2020044712A1 (fr) * 2018-08-29 2020-03-05 株式会社トプコン Dispositif d'ophtalmologie, et procédé de commande associé
JP2020110256A (ja) * 2019-01-09 2020-07-27 キヤノン株式会社 眼科撮影装置、その制御装置、制御方法、およびプログラム
WO2020174856A1 (fr) * 2019-02-27 2020-09-03 株式会社トプコン Dispositif de traitement d'informations ophtalmiques, dispositif ophtalmique, procédé de traitement d'informations ophtalmiques et programme


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116548910A (zh) * 2023-05-19 2023-08-08 北京至真互联网技术有限公司 一种眼科相干断层扫描仪的分辨率自适应调节方法及系统
CN116548910B (zh) * 2023-05-19 2023-12-08 北京至真互联网技术有限公司 一种眼科相干断层扫描仪的分辨率自适应调节方法及系统

Similar Documents

Publication Publication Date Title
JP6217185B2 (ja) 眼科撮影装置及び眼科画像処理プログラム
JP5913999B2 (ja) 眼科撮像装置およびその制御方法
JP2023009530A (ja) 画像処理方法、画像処理装置、及びプログラム
WO2021106987A1 (fr) Dispositif de traitement d'image médicale, dispositif de tomographie à cohérence optique, procédé de traitement d'image médicale et programme
US11013400B2 (en) Ophthalmic apparatus
JP7134324B2 (ja) 眼科撮影装置、その制御方法、プログラム、及び記録媒体
JP7186587B2 (ja) 眼科装置
JP7090438B2 (ja) 眼科撮影装置、その制御方法、プログラム、及び記録媒体
US10321819B2 (en) Ophthalmic imaging apparatus
JP2021023686A (ja) 眼科装置、その制御方法、プログラム、及び、記録媒体
WO2023022183A1 (fr) Dispositif ophtalmique, procédé de commande et programme
US11311189B2 (en) Ophthalmic imaging apparatus, controlling method thereof, ophthalmic imaging method, and recording medium
US11089956B2 (en) Ophthalmologic apparatus and method of controlling the same
JP7378557B2 (ja) 眼科撮影装置、その制御方法、プログラム、及び記録媒体
JP2018068630A (ja) 断層画像取得装置及び方法
JP2020048857A (ja) 眼測定装置及び方法
US20210196116A1 (en) Ophthalmic imaging apparatus, controlling method of the same, and recording medium
JP7050488B2 (ja) 眼科撮影装置、その制御方法、プログラム、及び記録媒体
WO2023282339A1 (fr) Procédé de traitement d'image, programme de traitement d'image, dispositif de traitement d'image et dispositif ophtalmique
JP2020130266A (ja) 眼科装置
JP7472914B2 (ja) 画像処理方法、画像処理装置、及びプログラム
WO2022085501A1 (fr) Dispositif ophtalmique, son procédé de commande, programme, et support d'enregistrement
US20210204809A1 (en) Ophthalmic imaging apparatus, controlling method of the same, and recording medium
JP6942626B2 (ja) 眼科撮影装置、その制御方法、プログラム、及び記録媒体
WO2023182011A1 (fr) Procédé de traitement d'image, dispositif de traitement d'image, dispositif ophtalmologique, et programme

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22858501

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE