WO2021074960A1 - Image processing method, image processing device, and image processing program - Google Patents


Info

Publication number
WO2021074960A1
WO2021074960A1 (PCT/JP2019/040481)
Authority
WO
WIPO (PCT)
Prior art keywords
image
image processing
oct
fundus
light
Application number
PCT/JP2019/040481
Other languages
English (en)
Japanese (ja)
Inventor
真梨子 廣川
泰士 田邉
Original Assignee
株式会社ニコン (Nikon Corporation)
Application filed by 株式会社ニコン (Nikon Corporation)
Priority to PCT/JP2019/040481
Priority to JP2021552009A
Publication of WO2021074960A1

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 3/00 Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B 3/10 Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
    • A61B 3/12 Objective types for looking at the eye fundus, e.g. ophthalmoscopes

Definitions

  • the present invention relates to an image processing method, an image processing device, and an image processing program.
  • U.S. Pat. No. 7,857,449 discloses an ophthalmic apparatus that captures a tomographic image of the fundus. It is desired to acquire a tomographic image at a desired position of the fundus.
  • the image processing method of the first aspect of the technique of the present disclosure includes a step of detecting the vortex vein position from the fundus image and a step of setting a scanning area for acquiring a tomographic image including the vortex vein position.
  • The image processing apparatus of the second aspect of the technique of the present disclosure includes a vortex vein position detection unit that detects the vortex vein position from the fundus image, a scanning area setting unit that sets a scanning area for acquiring a tomographic image including the vortex vein position, and an image acquisition unit that scans the scanning area to acquire a tomographic image.
  • The image processing program of the third aspect of the technique of the present disclosure causes a computer to function as a vortex vein position detection unit that detects the vortex vein position from the fundus image and as a scanning area setting unit that sets a scanning area for acquiring a tomographic image including the vortex vein position.
  • The ophthalmology system 100 includes an ophthalmic apparatus 110, an axial length measuring device 120, a management server device (hereinafter referred to as "management server") 140, and an image display device (hereinafter referred to as "image viewer") 150.
  • the ophthalmic apparatus 110 acquires a fundus image.
  • the axial length measuring device 120 measures the axial length of the patient.
  • the management server 140 stores a plurality of fundus images, axial lengths, and tomographic images obtained by photographing the fundus of a plurality of patients by the ophthalmologic apparatus 110, corresponding to the IDs of the patients.
  • the image viewer 150 displays a fundus image acquired from the management server 140.
  • the image viewer 150 includes a display 156, a mouse 155M, and a keyboard 155K.
  • the ophthalmic apparatus 110, the axial length measuring instrument 120, the management server 140, and the image viewer 150 are connected to each other via the network 130.
  • the image viewer 150 is a client in a client-server system, and a plurality of image viewers 150 are connected via a network. Further, a plurality of servers 140 may be connected via a network in order to ensure system redundancy.
  • The ophthalmic apparatus 110 may have the image processing function and the image viewing function of the image viewer 150; in that case, the ophthalmic apparatus 110 can acquire fundus images, process images, and view images in a stand-alone state.
  • Alternatively, when the server 140 is provided with the image viewing function of the image viewer 150, the ophthalmic apparatus 110 and the server 140 together can acquire fundus images, perform image processing, and view images.
  • A diagnostic support device that performs image analysis using other ophthalmic instruments (inspection devices for visual field measurement, intraocular pressure measurement, and the like) or AI (Artificial Intelligence) may be connected to the ophthalmic apparatus 110, the server 140, and the image viewer 150 via the network 130.
  • the axial length measuring device 120 has two modes, a first mode and a second mode, for measuring the axial length, which is the length of the eye to be inspected 12 in the axial direction.
  • In the first mode, after light from a light source (not shown) is guided to the eye 12 to be inspected, interference light between the reflected light from the fundus and the reflected light from the cornea is received, an interference signal indicating the received interference light is generated, and the axial length is measured based on this signal.
  • the second mode is a mode in which the axial length is measured using ultrasonic waves (not shown).
  • the axial length measuring device 120 transmits the axial length measured by the first mode or the second mode to the management server 140.
  • The axial length may be measured in both the first mode and the second mode; in that case, the average of the axial lengths measured in the two modes is transmitted to the management server 140 as the axial length.
  • the axial length is stored as patient information in the management server 140 as one of the subject's data, and is also used for fundus image analysis.
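The mode handling described above can be sketched as a small helper. The function name and the idea of passing the per-mode measurements as optional arguments are illustrative assumptions, not part of the disclosure; only the rule "average the two modes when both were measured" comes from the text.

```python
from statistics import mean

def axial_length_to_store(optical_mm=None, ultrasound_mm=None):
    """Return the axial length value to send to the management server.

    optical_mm: first-mode (light interference) measurement, if taken.
    ultrasound_mm: second-mode (ultrasonic) measurement, if taken.
    When both modes were used, the average of the two is stored.
    """
    values = [v for v in (optical_mm, ultrasound_mm) if v is not None]
    if not values:
        raise ValueError("no axial length measurement available")
    return mean(values)
```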
  • In the following, SLO stands for scanning laser ophthalmoscope, and OCT stands for optical coherence tomography.
  • The horizontal direction is the "X direction", the direction perpendicular to the horizontal plane is the "Y direction", and the direction connecting the center of the pupil of the anterior segment of the eye 12 to the center of the eyeball is the "Z direction". The X, Y, and Z directions are therefore mutually perpendicular.
  • the ophthalmic device 110 includes a photographing device 14 and a control device 16.
  • the imaging device 14 includes an SLO unit 18 and an OCT unit 20 to acquire a fundus image of the fundus of the eye to be inspected 12.
  • the two-dimensional fundus image acquired by the SLO unit 18 is referred to as an SLO image.
  • A tomographic image of the retina, a frontal image (en-face image), or the like created based on the OCT data acquired by the OCT unit 20 is referred to as an OCT image.
  • the OCT image corresponds to the "tomographic image" of the technique of the present disclosure.
  • The control device 16 includes a computer having a CPU (Central Processing Unit) 16A, a RAM (Random Access Memory) 16B, a ROM (Read-Only Memory) 16C, and an input/output (I/O) port 16D.
  • the control device 16 includes an input / display device 16E connected to the CPU 16A via the I / O port 16D.
  • The input/display device 16E has a graphical user interface that displays an image of the eye 12 to be inspected and receives various instructions from the user.
  • The graphical user interface includes a touch panel display.
  • control device 16 includes an image processing device 17 connected to the I / O port 16D.
  • the image processing device 17 generates an image of the eye to be inspected 12 based on the data obtained by the photographing device 14.
  • the control device 16 is connected to the network 130 via the communication interface 16F.
  • the control device 16 of the ophthalmic device 110 includes the input / display device 16E, but the technique of the present disclosure is not limited to this.
  • the control device 16 of the ophthalmic apparatus 110 may not include the input / display device 16E, but may include an input / display device that is physically independent of the ophthalmic apparatus 110.
  • the display device includes an image processing processor unit that operates under the control of the display control unit 204 of the CPU 16A of the control device 16.
  • the image processing processor unit may display an SLO image or the like based on the image signal output instructed by the display control unit 204.
  • the photographing device 14 operates under the control of the photographing control unit 202 of the control device 16.
  • the photographing apparatus 14 includes an SLO unit 18, a photographing optical system 19, and an OCT unit 20.
  • the photographing optical system 19 includes a first optical scanner 22, a second optical scanner 24, and a wide-angle optical system 30.
  • Fixation targets 92U and 92L, each composed of a light-emitting device (for example, an LED) that is lit to guide the patient's line of sight, are provided at positions shifted upward and downward from the optical axis of the photographing optical system 19.
  • the first optical scanner 22 two-dimensionally scans the light emitted from the SLO unit 18 in the X direction and the Y direction.
  • the second optical scanner 24 two-dimensionally scans the light emitted from the OCT unit 20 in the X direction and the Y direction.
  • The first optical scanner 22 and the second optical scanner 24 may be any optical elements capable of deflecting a luminous flux; for example, a polygon mirror, a galvanometer mirror, or a combination thereof can be used.
  • The wide-angle optical system 30 includes an objective optical system of the common optical system 28 (not shown in FIG. 2) and a compositing unit 26 that combines the light from the SLO unit 18 with the light from the OCT unit 20.
  • The objective optical system of the common optical system 28 may be a reflective optical system using a concave mirror such as an elliptical mirror, a refractive optical system using a wide-angle lens or the like, or a catadioptric system combining a concave mirror and a lens.
  • By using a wide-angle optical system with an elliptical mirror, a wide-angle lens, or the like, it is possible to photograph the retina not only at the central part of the fundus but also at the fundus periphery.
  • the wide-angle optical system 30 enables observation in the fundus with a wide field of view (FOV: Field of View) 12A.
  • the FOV 12A indicates a range that can be photographed by the photographing device 14.
  • The FOV 12A can be expressed as a viewing angle.
  • the viewing angle can be defined by an internal irradiation angle and an external irradiation angle in the present embodiment.
  • the external irradiation angle is an irradiation angle in which the irradiation angle of the luminous flux emitted from the ophthalmic apparatus 110 to the eye 12 to be inspected is defined with reference to the pupil 27.
  • the internal irradiation angle is an irradiation angle in which the irradiation angle of the light flux irradiated to the fundus F is defined with reference to the center O of the eyeball.
  • the external irradiation angle and the internal irradiation angle have a corresponding relationship. For example, when the external irradiation angle is 120 degrees, the internal irradiation angle corresponds to about 160 degrees. In this embodiment, the internal irradiation angle is set to 200 degrees.
  • A fundus image obtained by photographing with an internal irradiation angle of 160 degrees or more is referred to as a UWF-SLO fundus image.
  • UWF is an abbreviation for UltraWide Field (ultra-wide-angle).
  • The SLO system is realized by the control device 16, the SLO unit 18, and the photographing optical system 19 shown in FIG. 2. Since the SLO system includes the wide-angle optical system 30, it enables fundus photography with a wide FOV 12A.
  • The SLO unit 18 includes a B light (blue light) source 40, a G light (green light) source 42, an R light (red light) source 44, and an IR light (infrared light, for example near-infrared light) source 46, as well as optical systems 48, 50, 52, 54, and 56 that reflect or transmit the light from the light sources 40, 42, 44, and 46 and guide it into a single optical path.
  • The optical systems 48 and 56 are mirrors, and the optical systems 50, 52, and 54 are beam splitters.
  • B light is reflected by the optical system 48, transmitted through the optical system 50, and reflected by the optical system 54; G light is reflected by the optical systems 50 and 54; R light is transmitted through the optical systems 52 and 54; and IR light is reflected by the optical systems 52 and 56. Each is thus guided into the single optical path.
  • The SLO unit 18 is configured to be able to switch among combinations of light sources that emit laser light of different wavelengths, such as a mode that emits R light and G light and a mode that emits infrared light.
  • Although the example shown in FIG. 2 includes three light sources, the G light source 42, the R light source 44, and the IR light source 46, the technique of the present disclosure is not limited thereto.
  • The SLO unit 18 may further include a B light (blue light) source and a white light source, and may emit light in various modes, such as a mode that emits G light, R light, and B light or a mode that emits only white light.
  • the light incident on the photographing optical system 19 from the SLO unit 18 is scanned in the X direction and the Y direction by the first optical scanner 22.
  • the scanning light is applied to the fundus through the wide-angle optical system 30 and the pupil 27.
  • the reflected light reflected by the fundus is incident on the SLO unit 18 via the wide-angle optical system 30 and the first optical scanner 22.
  • The SLO unit 18 includes a beam splitter 64 that, of the light from the posterior segment (fundus) of the eye 12 to be examined, reflects B light and transmits light other than B light, and a beam splitter 58 that, of the light transmitted through the beam splitter 64, reflects G light and transmits light other than G light.
  • the SLO unit 18 includes a beam splitter 60 that reflects R light and transmits other than R light among the light transmitted through the beam splitter 58.
  • the SLO unit 18 includes a beam splitter 62 that reflects IR light among the light transmitted through the beam splitter 60.
  • The SLO unit 18 includes a B light detection element 70 that detects the B light reflected by the beam splitter 64, a G light detection element 72 that detects the G light reflected by the beam splitter 58, an R light detection element 74 that detects the R light reflected by the beam splitter 60, and an IR light detection element 76 that detects the IR light reflected by the beam splitter 62.
  • Of the light incident on the SLO unit 18 via the wide-angle optical system 30 and the first optical scanner 22, B light is reflected by the beam splitter 64 and received by the B light detection element 70.
  • G light passes through the beam splitter 64, is reflected by the beam splitter 58, and is received by the G light detection element 72.
  • R light passes through the beam splitters 64 and 58, is reflected by the beam splitter 60, and is received by the R light detection element 74.
  • IR light passes through the beam splitters 64, 58, and 60, is reflected by the beam splitter 62, and is received by the IR light detection element 76.
  • The image processing device 17, operating under the control of the CPU 16A, generates a UWF-SLO image using the signals detected by the B light detection element 70, the G light detection element 72, the R light detection element 74, and the IR light detection element 76.
  • The UWF-SLO images include a UWF-SLO image (G color fundus image) 502GG obtained by photographing the fundus with G light and a UWF-SLO image (R color fundus image) 504RG obtained by photographing the fundus with R light.
  • The UWF-SLO images also include a UWF-SLO image (B color fundus image) 506BG obtained by photographing the fundus with B light and a UWF-SLO image (IR fundus image) 508IRG obtained by photographing the fundus with IR light.
  • When the control device 16 controls the light sources 40, 42, and 44 so as to emit light at the same time, a G color fundus image 502GG, an R color fundus image 504RG, and a B color fundus image 506BG whose positions correspond to each other are obtained, and an RGB color fundus image can be generated from them.
  • Similarly, when the control device 16 controls the light sources 42 and 44 so as to emit light at the same time and the fundus of the eye 12 to be inspected is photographed simultaneously with G light and R light, a G color fundus image 502GG and an R color fundus image 504RG whose positions correspond to each other are obtained, and an RG color fundus image can be generated from them.
  • The UWF-SLO images also include a UWF-SLO image (moving image) 510ICGG photographed by ICG (indocyanine green) fluorescence.
  • The UWF-SLO image (moving image) 510ICGG is a moving image covering the period from when indocyanine green (ICG) injected into a blood vessel reaches the retina until after it has passed through the choroid.
  • Each image data of the B color fundus image 506BG, G color fundus image 502GG, R color fundus image 504RG, IR fundus image 508IRG, RGB color fundus image, RG color fundus image, and UWF-SLO image 510ICGG is sent from the ophthalmologic apparatus 110 to the management server 140 via a communication IF (not shown).
  • The OCT system is realized by the control device 16, the OCT unit 20, and the photographing optical system 19 shown in FIG. 2. Since the OCT system includes the wide-angle optical system 30, it enables OCT imaging of the fundus periphery in the same manner as the acquisition of the SLO fundus images described above. That is, the wide-angle optical system 30, which gives an ultra-wide viewing angle (FOV) of the fundus, enables OCT imaging of a region extending beyond the equator from the posterior pole of the fundus of the eye 12 to be inspected. OCT data of structures existing at the fundus periphery, such as vortex veins, can be acquired, and tomographic images of vortex veins and the 3D structure of vortex veins can be obtained by image processing of the OCT data.
  • the OCT unit 20 includes a light source 20A, a sensor (detection element) 20B, a first optical coupler 20C, a reference optical system 20D, a collimating lens 20E, and a second optical coupler 20F.
  • the light emitted from the light source 20A is branched by the first optical coupler 20C.
  • One of the branched beams is collimated by the collimating lens 20E as measurement light and then enters the photographing optical system 19.
  • the measurement light is scanned in the X and Y directions by the second optical scanner 24.
  • the scanning light is applied to the fundus through the wide-angle optical system 30 and the pupil 27.
  • The measurement light reflected by the fundus enters the OCT unit 20 via the wide-angle optical system 30 and the second optical scanner 24, and is incident on the second optical coupler 20F through the collimating lens 20E and the first optical coupler 20C.
  • The other beam emitted from the light source 20A and branched by the first optical coupler 20C is incident on the reference optical system 20D as reference light, and enters the second optical coupler 20F via the reference optical system 20D.
  • the image processing device 17 that operates under the control of the image processing unit 206 generates an OCT image such as a tomographic image or an en-face image based on the OCT data detected by the sensor 20B.
  • an OCT image obtained by photographing with an internal irradiation angle of 160 degrees or more, or an OCT image obtained by scanning the peripheral part of the fundus is referred to as a UWF-OCT image.
  • the image data of the UWF-OCT image is sent from the ophthalmic apparatus 110 to the management server 140 via a communication IF (not shown) and stored in the storage apparatus 254.
  • Although a wavelength-swept SS-OCT (Swept-Source OCT) is exemplified as the light source 20A, OCT systems of various types, such as SD-OCT (Spectral-Domain OCT) and TD-OCT (Time-Domain OCT), may be used.
  • the management server 140 includes a computer main body 252.
  • the computer body 252 has a CPU 262, a RAM 266, a ROM 264, and an input / output (I / O) port 268.
  • a storage device 254, a display 256, a mouse 255M, a keyboard 255K, and a communication interface (I / F) 258 are connected to the input / output (I / O) port 268.
  • the storage device 254 is composed of, for example, a non-volatile memory.
  • the input / output (I / O) port 268 is connected to the network 130 via the communication interface (I / F) 258. Therefore, the management server 140 can communicate with the ophthalmic apparatus 110, the axial length measuring device 120, and the image viewer 150.
  • the management server 140 stores each data received from the ophthalmic device 110 and the axial length measuring device 120 in the storage device 254.
  • the image processing program includes a display control function, an image processing function, and a processing function.
  • the CPU 262 executes an image processing program having each of these functions, the CPU 262 functions as a display control unit 204, an image processing unit 206, and a processing unit 208.
  • the image processing unit 206 corresponds to the "vortex vein position detection unit", “scanning area setting unit”, and "image acquisition unit” of the technique of the present disclosure.
  • The image processing unit 206 functions as a vortex vein position analysis unit 2060 that specifies the vortex vein position and an OCT acquisition position setting unit 2062 that specifies the position at which a tomographic image is acquired by OCT.
  • FIG. 5 is a sequence diagram showing the cooperation of the doctor terminal, the image viewer 150, the SLO unit 18, the OCT unit 20, and the management server 140 in the present embodiment.
  • In step S1, the doctor gives an instruction to photograph the vortex vein to the image viewer 150 via the doctor's terminal. Alternatively, the photographing instruction may be given orally to the operator who operates the image viewer 150.
  • In step S2, the image viewer 150 gives a photographing instruction to the SLO unit 18.
  • In step S3, the SLO unit 18 captures a UWF-SLO image.
  • In step S4, the captured UWF-SLO image is transmitted to the management server 140.
  • In step S5, the management server 140 detects the position of the vortex vein from the UWF-SLO image, as described later.
  • In step S6, the management server 140 creates a display image described later, and in step S7 it transmits the created display image to the image viewer 150.
  • In step S8, the image viewer 150 displays the display image, which shows the detection result of the vortex vein, on the display 156.
  • the display 156 corresponds to the "image display unit" of the technique of the present disclosure.
  • step S9 the doctor gives an instruction to take an OCT image to the image viewer 150 via the doctor's terminal.
  • When the doctor's terminal is not used, as in step S1, the photographing instruction may instead be given orally to the operator who operates the image viewer 150.
  • In step S10, the image viewer 150 gives a photographing instruction to the OCT unit 20.
  • In step S11, the OCT unit 20 captures an OCT image, and in step S12, the captured OCT image is transmitted to the management server 140.
  • In step S13, the management server 140 creates a display image described later, and in step S14, the management server 140 transmits the created display image to the image viewer 150.
  • In step S15, the image viewer 150 displays the display image on the display 156.
  • FIG. 6 is a flowchart showing image processing in the image processing unit 206 of the management server 140.
  • First, the image processing unit 206 acquires a UWF-SLO image, which is a fundus image, from the SLO unit 18.
  • In step 302, a choroidal blood vessel image is created.
  • the image processing unit 206 creates a choroidal blood vessel image (FIG. 14) from the acquired UWF-SLO image as follows.
  • the choroidal blood vessel image is an image in which the choroidal blood vessel obtained by image processing the UWF-SLO image is visualized.
  • The UWF-SLO images to be image-processed are, for example, the R color fundus image 504RG and the G color fundus image 502GG. Further, in order to reliably detect the vortex veins at the fundus periphery, a choroidal blood vessel image may be generated from a montage UWF-SLO image obtained by combining an upward-gaze UWF-SLO image, taken with the eye to be inspected directed upward, and a downward-gaze UWF-SLO image, taken with the eye directed downward, using the fixation targets 92U and 92L.
  • the generation of montage images is disclosed in US Patent Application Publication No. 2012/0133888.
  • the structure of the eye is such that the vitreous body is covered with multiple layers having different structures.
  • the layers include the retina, choroid, and sclera from the innermost to the outermost side of the vitreous body. Since R light has a long wavelength, it passes through the retina and reaches the choroid. Therefore, the R-color fundus image 504RG includes information on blood vessels existing in the retina (retinal blood vessels) and information on blood vessels existing in the choroid (choroidal blood vessels). On the other hand, G light has a shorter wavelength than R light, so it reaches only the retina. Therefore, the G-color fundus image 502GG contains only information on blood vessels (retinal blood vessels) existing in the retina. Therefore, a choroidal blood vessel image (FIG. 14) can be obtained by extracting a retinal blood vessel from the G-color fundus image 502GG and removing the retinal blood vessel from the R-color fundus image 504RG. Specifically, the choroidal blood vessel image is generated as follows.
  • the image processing unit 206 emphasizes the retinal blood vessels by applying a top hat filter process to the G-color fundus image 502GG.
  • the image processing unit 206 binarizes the G-color fundus image 502GG subjected to the top hat filter processing and extracts the retinal blood vessels.
  • the image processing unit 206 removes the retinal blood vessels from the R color fundus image 504RG by an inpainting process or the like. That is, the retinal blood vessel structure of the R color fundus image 504RG is filled with the same value as the surrounding pixels using the position information of the retinal blood vessels extracted from the G color fundus image 502GG to generate a retinal blood vessel removal image.
  • the image processing unit 206 performs a process of removing the uneven brightness of the retinal blood vessel removal image. Specifically, the brightness unevenness of the retinal blood vessel removal image is removed by dividing the retinal blood vessel removal image by the low frequency component image obtained by applying a Gaussian filter or the like to the retinal blood vessel removal image. Then, the choroidal blood vessel image (simplified image diagram) shown in FIG. 14 is created by performing the binarization processing on the image in which the choroidal blood vessel is further emphasized by the line enhancement processing.
  • the choroidal blood vessels and the optic disc ONH are white pixels, and the other fundus regions are black pixels.
  • the created choroidal blood vessel image is stored in the storage device 254. A fluorescently photographed fundus image may be used to generate the choroidal blood vessel image.
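As a rough illustration, the choroidal-vessel pipeline described above (top-hat emphasis of retinal vessels in the G image, binarization, removal of the retinal vessels from the R image, division by a low-frequency image to flatten brightness, final binarization) might be sketched as follows with NumPy/SciPy. All kernel sizes and thresholds, and the local-median stand-in for the inpainting step, are assumptions; the patent does not specify them.

```python
import numpy as np
from scipy import ndimage

def choroid_vessel_image(r_img, g_img):
    """Sketch of the choroidal blood vessel image pipeline (illustrative)."""
    # 1) Top-hat filtering emphasizes thin structures; black_tophat picks up
    #    dark vessels against a brighter fundus background (an assumption).
    tophat = ndimage.black_tophat(g_img.astype(float), size=15)
    # 2) Binarize to obtain a retinal-vessel mask (fixed statistical threshold
    #    as a stand-in for the unspecified binarization).
    retinal_mask = tophat > tophat.mean() + 2.0 * tophat.std()
    # 3) Inpainting stand-in: fill masked retinal-vessel pixels of the R image
    #    with a local median, i.e. "the same value as the surrounding pixels".
    r = r_img.astype(float)
    local_med = ndimage.median_filter(r, size=15)
    r_no_retina = np.where(retinal_mask, local_med, r)
    # 4) Remove uneven brightness by dividing by a Gaussian low-pass image.
    low_freq = ndimage.gaussian_filter(r_no_retina, sigma=30) + 1e-6
    flat = r_no_retina / low_freq
    # 5) Final binarization: choroidal vessels become True (white) pixels.
    return flat < flat.mean() - 0.5 * flat.std()
```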
  • In step 304, the image processing unit 206 analyzes the choroidal blood vessel image and detects the vortex vein positions. Details of the detection of the vortex vein position will be described later.
  • the acquisition position of the OCT image (OCT image acquisition area) is set.
  • the acquisition position of the OCT image is, for example, a predetermined range centered on the vortex vein detected in the analysis in step 304.
  • the image processing unit 206 uses a rectangular region of 6 ⁇ 6 mm centered on the vortex vein as an OCT image acquisition position.
  • a predetermined range centered on the macula or the optic nerve head may be set as the acquisition position of the OCT image.
  • Alternatively, a line (line segment) scan range for acquiring OCT tomographic data may be set.
  • the scanning method is determined.
  • The OCT scanning methods include, for example, a B scan, which obtains a cross-sectional image perpendicular to the retinal surface, and a C scan, which obtains three-dimensional volume data of the retina.
  • the image processing unit 206 sets a scanning method such as a B scan (line scan) or a C scan (raster scan, circle scan, etc.) designated from the image viewer 150.
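A minimal sketch of setting the OCT acquisition area described above: a 6 mm × 6 mm raster (C scan) of B-scan lines centered on a detected vortex vein position. The coordinate convention, the line count, and the per-line sampling density are hypothetical; only the 6 × 6 mm region centered on the vortex vein comes from the text.

```python
import numpy as np

def set_oct_scan(vv_xy_mm, size_mm=6.0, n_lines=128, samples_per_line=512):
    """Build a raster of B-scan sample positions centered on vv_xy_mm.

    Returns an array of shape (n_lines, samples_per_line, 2); each row is
    one B-scan line of (x, y) positions (in mm on the fundus) that would
    be handed to the OCT scanner.
    """
    cx, cy = vv_xy_mm
    half = size_mm / 2.0
    xs = np.linspace(cx - half, cx + half, samples_per_line)  # fast axis
    ys = np.linspace(cy - half, cy + half, n_lines)           # slow axis
    # grid[i, j] = (xs[j], ys[i]): line i is a horizontal B-scan.
    grid = np.stack(np.meshgrid(xs, ys), axis=-1)
    return grid
```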
  • the OCT image obtained by scanning is stored in the storage device 254 or the like of the management server 140.
  • the OCT image may be stored in the storage device of the ophthalmology device 110 or the image viewer 150 in addition to the storage device 254 of the management server 140.
  • In step 312, the choroidal blood vessel image and the OCT image are output to the image viewer 150 or the like in the form of the display image described later, and the process is completed.
  • the display image output to the image viewer 150 is displayed on the display 156 or the like.
  • the display image may be displayed on the display 256 or the like of the management server 140.
  • FIG. 7 is a flowchart of the vortex vein position detection process in step 304 of FIG. 6.
  • First, the image processing unit 206 reads the choroidal blood vessel image generated in step 302 of FIG. 6.
  • the image processing unit 206 detects the position of the vortex vein from the choroidal blood vessel image.
  • the choroidal vascular structure vascular network
  • the image processing unit 206 detects the position of the vortex vein from the choroidal blood vessel image.
  • the choroidal vascular structure vascular network
  • it is calculated based on the traveling direction of the choroidal blood vessel. Since the vortex vein is an outflow tract of blood flow, a plurality of choroidal blood vessels are connected to the vortex vein. That is, the coordinates of the position of the vortex vein can be obtained by searching the point where a plurality of blood vessels are connected from the choroidal blood vessel image from the blood vessel traveling direction.
  • the detected coordinates of the position of the vortex vein are stored in a storage device 254 or the like.
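The search for points where multiple vessels converge can be sketched on a thinned (skeletonized) binary vessel map: a skeleton pixel with three or more skeleton neighbours is a convergence candidate. This is a minimal illustration under assumed preprocessing (the skeletonization itself is done elsewhere), not the disclosed algorithm.

```python
import numpy as np

def junction_points(skeleton):
    """Pixels where three or more vessel branches meet.

    `skeleton` is a binary (0/1) array of the thinned choroidal vessel
    network. A vortex vein candidate is a point to which several vessels
    converge, i.e. a skeleton pixel with >= 3 skeleton neighbours in its
    8-neighbourhood.
    """
    s = np.pad(skeleton.astype(np.uint8), 1)  # zero border avoids wrap-around
    n = sum(np.roll(np.roll(s, dy, 0), dx, 1)
            for dy in (-1, 0, 1) for dx in (-1, 0, 1)
            if (dy, dx) != (0, 0))
    n = n[1:-1, 1:-1]  # crop back to the original size
    return np.argwhere((skeleton > 0) & (n >= 3))
```

On a toy cross-shaped skeleton, the crossing point is reported as a junction while line endpoints are not.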
  • In step 404, the image processing unit 206 reads the G-color fundus image 502GG. If the G-color fundus image 502GG acquired by the ophthalmic apparatus 110 is stored in advance in the storage device 254 or the like of the management server 140, the image processing unit 206 reads it from there.
  • In step 406, the image processing unit 206 detects the macula and the optic nerve head from the G-color fundus image 502GG. Specifically, since the macula appears as a dark region in the G-color fundus image 502GG, the image processing unit 206 detects, as the macula, a region of a predetermined number of pixels having the smallest pixel values in the G-color fundus image 502GG read in step 404. The central position of the region containing the darkest pixels is calculated as the coordinates of the macula and stored in the storage device 254.
  • Further, the image processing unit 206 detects the position of the optic nerve head in the G-color fundus image 502GG by pattern matching a predetermined optic nerve head image against the G-color fundus image 502GG. The detected position is calculated as the coordinates of the optic nerve head and stored in the storage device 254.
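The two detections above can be sketched as follows. The pixel count and the sum-of-squared-differences matcher are illustrative assumptions (a real implementation would likely use normalized cross-correlation for the pattern matching):

```python
import numpy as np

def detect_macula(g_image, n_pixels=100):
    """Centroid of the darkest n_pixels pixels of a G-color fundus image."""
    flat = np.argsort(g_image, axis=None)[:n_pixels]
    ys, xs = np.unravel_index(flat, g_image.shape)
    return float(ys.mean()), float(xs.mean())

def detect_optic_disc(g_image, template):
    """Top-left position minimising the SSD to a disc template (pattern matching)."""
    th, tw = template.shape
    H, W = g_image.shape
    best, best_pos = None, (0, 0)
    for y in range(H - th + 1):
        for x in range(W - tw + 1):
            d = float(((g_image[y:y + th, x:x + tw] - template) ** 2).sum())
            if best is None or d < best:
                best, best_pos = d, (y, x)
    return best_pos
```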
  • In step 408, the image processing unit 206 sets, on the choroidal blood vessel image, the coordinates of the vortex vein position detected in step 402 and the coordinates of the macula and the optic nerve head detected in step 406, and calculates the distance from the macula to the vortex vein and the distance from the optic nerve head to the vortex vein.
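In the simplest reading, the distances in step 408 are Euclidean distances in choroidal-image coordinates; the coordinate values below are hypothetical examples, not values from the disclosure.

```python
import math

def fundus_distance(p, q):
    """Euclidean distance between two points in image coordinates (pixels)."""
    return math.hypot(p[0] - q[0], p[1] - q[1])

macula = (210.0, 400.0)      # hypothetical macula coordinates
vortex_vein = (90.0, 240.0)  # hypothetical vortex vein coordinates
d = fundus_distance(macula, vortex_vein)
```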
  • the image processing unit 206 visualizes the position of the vortex vein.
  • the position of the vortex vein is visualized, for example, by showing the position of the vortex vein on the choroidal blood vessel image displayed on the display 156 of the image viewer 150 (see FIG. 8).
  • The display image 500 is displayed on the display 156 of the image viewer 150, the display image is stored in the storage device 254 or the like of the management server 140, and the process ends.
  • the processes shown in FIGS. 6 and 7 may be executed by the CPU included in the image viewer 150 or the image processing device 17 of the ophthalmic apparatus 110.
  • FIG. 8 is a schematic view showing a display image 600 to be displayed on the display 156 of the image viewer 150.
  • the display image 600 has an information display area 602 and an image display area 604.
  • The information display area 602 includes a patient ID display area 612, a patient name display area 614, an age display area 616, a right eye / left eye display area 618, an axial length display area 620, a visual acuity display area 622, and an imaging date / time display area 624.
  • the display 156 of the image viewer 150 displays each information from the patient ID display area 612 to each display area of the imaging date / time display area 624 based on the received information.
  • the image display area 604 has a choroidal blood vessel image display area 650 and a display switching icon 660.
  • a choroidal blood vessel image is displayed in the choroidal blood vessel image display area 650.
  • FIG. 8 displays a choroidal blood vessel image generated from a montage image obtained by stitching together UWF-SLO images obtained by moving the line of sight of a patient up and down using the fixation markers 92U and 92L.
  • the display mode of the choroidal blood vessel image displayed in the choroidal blood vessel image display area 650 can be switched from the 2D mode to the 3D mode.
  • the choroidal blood vessel image in the 3D mode is displayed by clicking the display switching icon 660. If the display switching icon 660 is clicked when the 3D mode choroidal blood vessel image is displayed, the 2D mode choroidal blood vessel image is displayed.
  • the OCT image acquisition areas 672A, 672B, 672C, and 672D set in step 306 of FIG. 6 are superimposed on the choroidal blood vessel image displayed in the choroidal blood vessel image display area 650.
  • the positions of the OCT image acquisition areas 672A, 672B, 672C, and 672D can be arbitrarily changed by the position change tool 670.
  • With the position change tool 670, one of the OCT image acquisition areas 672A, 672B, 672C, and 672D can be focused by clicking the left button of the mouse 155M, and the focused area can be moved by dragging the mouse 155M.
  • the OCT image acquisition areas 672A, 672B, 672C, and 672D correspond to "scanning areas" for acquiring tomographic images of one vortex vein of the technique of the present disclosure.
  • The size of the OCT image acquisition areas 672A, 672B, 672C, and 672D can be changed by clicking the scan size button 634 on the right side of the image display area 604 with the mouse 155M. For example, when the scan size button 634 is clicked, a pull-down menu listing the selectable sizes of the OCT image acquisition areas 672A, 672B, 672C, and 672D is displayed, and the size of the OCT image acquisition areas 672A, 672B, 672C, and 672D can be selected from it.
  • the size of the OCT image acquisition areas 672A, 672B, 672C, and 672D is 6 ⁇ 6 mm in FIG. 8, but can be expanded up to, for example, 12 ⁇ 12 mm.
  • the OCT image acquisition area can be arbitrarily added by clicking the shooting position addition button 640.
  • the added OCT image acquisition area can be moved to an arbitrary position by the above-mentioned position change tool 670.
  • the OCT image acquisition areas 672A, 672B, 672C, and 672D can be deleted by clicking the shooting position deletion button 642.
  • The OCT image acquisition areas 672A, 672B, 672C, and 672D are deleted, for example, by clicking the shooting position deletion button 642 while focusing on any of the OCT image acquisition areas 672A, 672B, 672C, and 672D with the position change tool 670 described above.
  • the scan mode button 632 is a button for selecting a target for acquiring an OCT image.
  • In FIG. 8, the state of scanning the vortex vein (VV) in three dimensions (3D) is set, but by clicking the scan mode button 632, a state of scanning the macula in 3D or a state of scanning the optic nerve head in 3D can be selected.
  • When the scan mode button 632 is clicked to switch to the state of scanning the macula in 3D, the OCT image acquisition area is set to a predetermined range centered on the macula. Similarly, when the scan mode button 632 is clicked to switch to the state of scanning the optic disc in 3D, the OCT image acquisition area is set to a predetermined range centered on the optic disc.
  • the added average number button 636 is a button for changing the number of choroidal blood vessel images used in the process of averaging the brightness values of each pixel of a plurality of existing choroidal blood vessel images.
  • FIG. 8 shows a state in which the brightness values of the pixels are averaged using 10 choroidal blood vessel images.
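The averaging process can be sketched as a pixel-wise mean over the selected number of images; registration between the images is assumed to have been done beforehand, and the function name is illustrative.

```python
import numpy as np

def average_choroid_images(images):
    """Pixel-wise mean of equally sized, pre-registered choroidal vessel images."""
    stack = np.stack([np.asarray(im, dtype=np.float64) for im in images])
    return stack.mean(axis=0)
```

With the added average number set to 10, `images` would hold the ten choroidal blood vessel images mentioned above.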
  • the photographing button 644 is a button for acquiring OCT images of the OCT image acquisition areas 672A, 672B, 672C, and 672D displayed on the choroidal blood vessel image.
  • When the photographing button 644 is clicked, an imaging command is transmitted to the ophthalmic apparatus 110 via the network 130, and the ophthalmic apparatus 110 acquires OCT images of the OCT image acquisition areas 672A, 672B, 672C, and 672D.
  • The remarks column 680 is an area for describing matters related to the patient and the like.
  • an arbitrary character string can be input using the keyboard 155K or the like of the image viewer 150.
  • FIG. 9 is a flowchart showing acquisition of an OCT image by the ophthalmic apparatus 110.
  • The image processing device 17 acquires the montage image (or choroidal blood vessel image) obtained by stitching together the UWF-SLO images captured while moving the patient's line of sight up and down using the fixation markers 92U and 92L, and the position information of the vortex vein detected in step 402 of FIG. 7 and stored in the storage device 254 of the management server 140.
  • the image processing device 17 determines the scanning method according to the input from the input / display device 16E.
  • the scanning method includes the above-mentioned B scan and C scan, and further, the area to be scanned can be selected.
  • FIG. 10 is a schematic view showing a case where a C scan is performed in the OCT image acquisition areas 672A, 672B, 672C, and 672D, and a B scan is performed in the OCT image acquisition area 674.
  • FIG. 11 is a schematic view showing a case where a ring-shaped region 676 including the OCT image acquisition regions 672A, 672B, 672C, and 672D is C-scanned. By C-scanning the annular region 676, a wide region of the fundus can be scanned, and tomographic images of a plurality of vortex veins can be acquired.
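One way to realise such an annular scan is to place scan centres evenly on a circle around the posterior pole so that the ring passes through the vortex vein regions; the radius and position count below are illustrative assumptions.

```python
import math

def ring_scan_centers(center, radius, n_positions=360):
    """Scan centres evenly spaced on an annulus around a given centre point."""
    cx, cy = center
    return [(cx + radius * math.cos(2.0 * math.pi * k / n_positions),
             cy + radius * math.sin(2.0 * math.pi * k / n_positions))
            for k in range(n_positions)]
```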
  • In step 704, the image processing device 17 executes scanning in response to the click of the shooting button 644 shown in FIG. 8 or an input from the input / display device 16E.
  • That is, the OCT image acquisition areas 672A, 672B, 672C, and 672D, or the annular area 676 including them, are scanned and C scan data are acquired. Such a scan produces a 3D tomographic image.
  • In step 706, the image processing device 17 determines whether or not the entire area, such as the OCT image acquisition areas 672A, 672B, 672C, and 672D, has been scanned. If the entire area has been scanned, the procedure proceeds to step 708; otherwise, it returns to step 704 to continue the scan.
  • In step 708, the image processing device 17 causes the input / display device 16E to display the acquired OCT image.
  • The image processing device 17 creates OCT data having a header and a data area in a format conforming to DICOM (Digital Imaging and Communications in Medicine).
  • In creating the OCT data, for example, a 3D polygon, an en-face image, OCT pixels, a blood flow visualization image, or the like may be generated from the tomographic image of the vortex vein to visualize the vortex vein and the choroidal blood vessels connected to it.
  • In step 712, the image processing device 17 transmits the created OCT data to the management server 140 and ends the process.
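The header/data-area split of the OCT data described above can be illustrated with a toy serializer. This is not actual DICOM encoding (real code would use a library such as pydicom with proper tags and transfer syntaxes); it only shows a length-prefixed header followed by a data area.

```python
import json
import struct

def pack_oct_data(header, volume_bytes):
    """Serialise OCT data as a header block followed by a data area.

    Mimics only the header/data-area split described in the text,
    using a big-endian 4-byte length prefix for the JSON header.
    """
    h = json.dumps(header).encode("utf-8")
    return struct.pack(">I", len(h)) + h + volume_bytes

def unpack_oct_data(blob):
    """Inverse of pack_oct_data: recover the header dict and the data area."""
    (hlen,) = struct.unpack(">I", blob[:4])
    header = json.loads(blob[4:4 + hlen].decode("utf-8"))
    return header, blob[4 + hlen:]
```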
  • FIG. 12 is a schematic view showing a display image 800 including an OCT image to be displayed on the display 156 of the image viewer 150.
  • The display image 800 has an information display area 802 and an image display area 804. Similar to the display image 600 shown in FIG. 8, the information display area 802 includes a patient ID display area 812, a patient name display area 814, an age display area 816, a right eye / left eye display area 818, an axial length display area 820, a visual acuity display area 822, and a shooting date / time display area 824.
  • the display 156 of the image viewer 150 displays each information from the patient ID display area 812 to each display area of the imaging date / time display area 824 based on the received information.
  • the information display area 802 is provided with a data selection button 830.
  • When the data selection button 830 is clicked, a pull-down menu or the like for selecting a related image to be displayed in the related image display area 850, described later, is displayed.
  • In the pull-down menu, already acquired images of the fundus of the eye 12 to be inspected, such as a pseudo-color (RGB 3-color) image, an RG color image, a blue monochromatic image, a green monochromatic image, and a red monochromatic image, are listed.
  • FIG. 12 shows how the RG color image is displayed in the related image display area 850.
  • the image display area 804 has a related image display area 850, a display switching icon 860, a choroidal blood vessel image display area 870, a display switching icon 880, and an OCT image display area 890.
  • In the choroidal blood vessel image display area 870, a choroidal blood vessel image is displayed in the same manner as in the choroidal blood vessel image display area 650 of FIG. 8.
  • By clicking the display switching icon 880, the display mode of the choroidal blood vessel image display area 870 can be switched from the 2D mode to the 3D mode in the same manner as in the choroidal blood vessel image display area 650 of FIG. 8.
  • The choroidal blood vessel image in the 3D mode is displayed by clicking the display switching icon 880; if the display switching icon 880 is clicked while the 3D mode choroidal blood vessel image is displayed, the 2D mode choroidal blood vessel image is displayed.
  • a display switching icon 860 is also provided in the related image display area 850.
  • By clicking the display switching icon 860, the pseudo-color (RGB 3-color) image, RG color image, blue monochromatic image, green monochromatic image, red monochromatic image, or the like displayed in the related image display area 850 can be switched from the 2D mode to the 3D mode, or from the 3D mode to the 2D mode.
  • On the right side of the image display area 804, a scan mode button 882, a scan size button 884, and an added average number button 886 are provided, as in the display image 600 of FIG. 8. Since the functions of the scan mode button 882, the scan size button 884, and the added average number button 886 are the same as in FIG. 8, detailed description thereof is omitted.
  • In the OCT image display area 890, OCT images 892A, 892B, 892C, and 892D corresponding to the respective OCT image acquisition areas 872A, 872B, 872C, and 872D displayed overlaid on the choroidal blood vessel image of the choroidal blood vessel image display area 870 are displayed.
  • The OCT images 892A, 892B, 892C, and 892D display, for example, the shape of the vortex vein visualized as a tomographic image by scanning the OCT image acquisition regions 872A, 872B, 872C, and 872D. The reference numerals No. 1, No. 2, No. 3, and No. 4 attached to the upper part of the OCT images 892A, 892B, 892C, and 892D correspond to the reference numerals No. 1, No. 2, No. 3, and No. 4 assigned to the OCT image acquisition regions 872A, 872B, 872C, and 872D, respectively. Therefore, the display image 800 can display the OCT image acquisition regions 872A, 872B, 872C, and 872D in association with the OCT images 892A, 892B, 892C, and 892D showing the shape of the visualized vortex vein (or the vortex vein and the choroidal blood vessels connected to it).
  • An image showing the shape of the vortex vein is displayed in the OCT image display area 890 of the display image 800, but the present embodiment is not limited to this.
  • When the OCT image of the macula is acquired, a cross-sectional image of the macula is displayed in the OCT image display area 890, and when the OCT image of the optic disc is acquired, a cross-sectional image of the optic disc is displayed there.
  • a display switching button 894 is provided in the OCT image display area 890, and when the display switching button 894 is clicked, a pull-down menu for switching the display of the OCT images 892A, 892B, 892C, and 892D is displayed.
  • the display of the OCT images 892A, 892B, 892C, and 892D can be switched by selecting, for example, a 3D polygon, an en-face image, an OCT pixel, a blood flow visualization image, or the like from the displayed pull-down menu.
  • the display image 800 is provided with a remarks column 900 in the same manner as the display image 600 shown in FIG.
  • In the remarks column 900, an arbitrary character string, such as matters related to the patient, can be input using the keyboard 155K or the like of the image viewer 150.
  • As described above, in the present embodiment, by setting the OCT image acquisition region based on the vortex vein position detected from the fundus image, it is possible to acquire a tomographic image of a vortex vein position other than a lesion candidate.
  • In the example described above, the data processing is realized by a software configuration using a computer, but the data processing may instead be executed only by a hardware configuration such as an FPGA (Field-Programmable Gate Array) or an ASIC (Application Specific Integrated Circuit). Alternatively, part of the data processing may be performed by the software configuration and the rest by the hardware configuration.

Landscapes

  • Life Sciences & Earth Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Biophysics (AREA)
  • Ophthalmology & Optometry (AREA)
  • Engineering & Computer Science (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Physics & Mathematics (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Eye Examination Apparatus (AREA)

Abstract

The invention relates to an image processing method comprising a step of detecting the positions of vortex veins from a fundus image, and a step of setting a scanning region for obtaining a tomographic image including the vortex vein positions.
PCT/JP2019/040481 2019-10-15 2019-10-15 Procédé de traitement d'images, dispositif de traitement d'images et programme de traitement d'images WO2021074960A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/JP2019/040481 WO2021074960A1 (fr) 2019-10-15 2019-10-15 Procédé de traitement d'images, dispositif de traitement d'images et programme de traitement d'images
JP2021552009A JPWO2021074960A1 (fr) 2019-10-15 2019-10-15

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2019/040481 WO2021074960A1 (fr) 2019-10-15 2019-10-15 Procédé de traitement d'images, dispositif de traitement d'images et programme de traitement d'images

Publications (1)

Publication Number Publication Date
WO2021074960A1 true WO2021074960A1 (fr) 2021-04-22

Family

ID=75538688

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/040481 WO2021074960A1 (fr) 2019-10-15 2019-10-15 Procédé de traitement d'images, dispositif de traitement d'images et programme de traitement d'images

Country Status (2)

Country Link
JP (1) JPWO2021074960A1 (fr)
WO (1) WO2021074960A1 (fr)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7334922B1 (ja) * 2023-04-20 2023-08-29 株式会社中京メディカル 画像処理装置、画像処理方法及びプログラム
WO2023199848A1 (fr) * 2022-04-13 2023-10-19 株式会社ニコン Procédé de traitement d'image, dispositif de traitement d'image, et programme

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2018117693A (ja) * 2017-01-23 2018-08-02 株式会社トプコン 眼科装置
JP2019118421A (ja) * 2017-12-28 2019-07-22 株式会社トプコン 眼科撮影装置、その制御方法、プログラム、及び記録媒体
WO2019181981A1 (fr) * 2018-03-20 2019-09-26 株式会社ニコン Procédé de traitement d'image, programme, instrument ophtalmologique et procédé de génération d'image de vaisseau sanguin choroïdien

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2018117693A (ja) * 2017-01-23 2018-08-02 株式会社トプコン 眼科装置
JP2019118421A (ja) * 2017-12-28 2019-07-22 株式会社トプコン 眼科撮影装置、その制御方法、プログラム、及び記録媒体
WO2019181981A1 (fr) * 2018-03-20 2019-09-26 株式会社ニコン Procédé de traitement d'image, programme, instrument ophtalmologique et procédé de génération d'image de vaisseau sanguin choroïdien

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
OHNO-MATSUI KYOKO, AKIBA MASAHIRO, ISHIBASHI TATSURO, MORIYAMA MUKA: "Observations of Vascular Structures within and Posterior to Sclere in Eyes with Pathologic Myopia by Swept-Source Optical Coherence Tomography", INVESTIGATIVE OPHTHALMOLOGY & VISUAL SCIENCE, vol. 53, no. 11, 2012, pages 7290 - 7298, XP055819085 *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023199848A1 (fr) * 2022-04-13 2023-10-19 株式会社ニコン Procédé de traitement d'image, dispositif de traitement d'image, et programme
JP7334922B1 (ja) * 2023-04-20 2023-08-29 株式会社中京メディカル 画像処理装置、画像処理方法及びプログラム

Also Published As

Publication number Publication date
JPWO2021074960A1 (fr) 2021-04-22

Similar Documents

Publication Publication Date Title
US11941788B2 (en) Image processing method, program, opthalmic device, and choroidal blood vessel image generation method
JP2023009530A (ja) 画像処理方法、画像処理装置、及びプログラム
WO2019203311A1 (fr) Procédé de traitement d'image, programme et dispositif de traitement d'image
WO2021075062A1 (fr) Procédé de traitement d'image, dispositif de traitement d'image et programme
WO2021074960A1 (fr) Procédé de traitement d'images, dispositif de traitement d'images et programme de traitement d'images
JP2024041773A (ja) 画像処理装置、眼科装置、画像処理方法、及び画像処理プログラム
JP7347641B2 (ja) 画像処理方法、プログラム、画像処理装置、及び眼科システム
WO2020240867A1 (fr) Procédé de traitement d'image, dispositif de traitement d'image, et programme de traitement d'image
JP2020058627A (ja) 画像処理方法、画像処理装置、画像処理プログラム、及び血管径算出装置
JP7419946B2 (ja) 画像処理方法、画像処理装置、及び画像処理プログラム
WO2021075026A1 (fr) Procédé de traitement d'images, dispositif de traitement d'images et programme de traitement d'images
WO2021038847A1 (fr) Procédé de traitement d'image, dispositif de traitement d'image et programme
WO2022181729A1 (fr) Procédé de traitement d'image, dispositif de traitement d'image et programme de traitement d'image
WO2021210281A1 (fr) Méthode de traitement d'image, dispositif de traitement d'image et programme de traitement d'image
WO2021111840A1 (fr) Procédé de traitement d'images, dispositif de traitement d'images et programme
WO2022177028A1 (fr) Procédé de traitement d'image, dispositif de traitement d'image et programme
WO2023282339A1 (fr) Procédé de traitement d'image, programme de traitement d'image, dispositif de traitement d'image et dispositif ophtalmique
JP7264177B2 (ja) 画像処理方法、画像表示方法、画像処理装置、画像表示装置、画像処理プログラム、及び画像表示プログラム
JP7416083B2 (ja) 画像処理方法、画像処理装置、およびプログラム
US11954872B2 (en) Image processing method, program, and image processing device
WO2023199847A1 (fr) Procédé de traitement d'image, dispositif de traitement d'image, et programme
JP2019058493A (ja) レーザ治療装置、眼科情報処理装置、及び眼科システム
WO2021074961A1 (fr) Procédé de traitement d'image, dispositif de traitement d'image et programme
US20230000337A1 (en) Image processing method, image processing device, and program
WO2020240629A1 (fr) Procédé de création de données, dispositif de création de données et programme de création de données

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19949345

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2021552009

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19949345

Country of ref document: EP

Kind code of ref document: A1