WO2011158849A1 - Tomographic image processing apparatus and method, and optical coherence tomographic image diagnostic apparatus


Info

Publication number
WO2011158849A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
light
tomographic image
layer
specific structure
Prior art date
Application number
PCT/JP2011/063641
Other languages
English (en)
Japanese (ja)
Inventor
和弘 広田
敏景 千
Original Assignee
FUJIFILM Corporation
Priority date
Filing date
Publication date
Application filed by FUJIFILM Corporation
Publication of WO2011158849A1


Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/0059 Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence
    • A61B5/0062 Arrangements for scanning
    • A61B5/0066 Optical coherence imaging
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/68 Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
    • A61B5/6846 Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient specially adapted to be brought in contact with an internal body part, i.e. invasive
    • A61B5/6847 Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient specially adapted to be brought in contact with an internal body part, i.e. invasive mounted on an invasive device
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00 Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/17 Systems in which incident light is modified in accordance with the properties of the material investigated
    • G01N21/47 Scattering, i.e. diffuse reflection
    • G01N21/4795 Scattering, i.e. diffuse reflection spatially resolved investigating of object in scattering medium
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/10 Segmentation; Edge detection
    • G06T7/12 Edge-based segmentation
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00 Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/17 Systems in which incident light is modified in accordance with the properties of the material investigated
    • G01N2021/178 Methods for obtaining spatial resolution of the property being measured
    • G01N2021/1785 Three dimensional
    • G01N2021/1787 Tomographic, i.e. computerised reconstruction from projective measurements
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10072 Tomographic images
    • G06T2207/10101 Optical tomography; Optical coherence tomography [OCT]
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30004 Biomedical image processing
    • G06T2207/30101 Blood vessel; Artery; Vein; Vascular

Definitions

  • the present invention relates to a tomographic image processing apparatus and method, and to an optical coherence tomographic diagnostic apparatus, and in particular to an image processing technique suitable for identifying blood vessels in tomographic images acquired by a tomographic method typified by optical coherence tomography (OCT), and for extracting and drawing a three-dimensional image of such a structure.
  • OCT measurement has been used as one of non-invasive methods for obtaining a tomographic image inside a living body.
  • OCT measurement has the advantage of a resolution of about 10 μm, which is higher than that of ultrasonic measurement, so that a detailed tomographic image of the inside of the living body can be obtained.
  • a three-dimensional tomographic image can be obtained by acquiring a plurality of images while shifting the position in a direction perpendicular to the tomographic image.
  • time-domain OCT has been proposed, in which light output from a low-coherence light source is scanned to obtain a tomographic image of the subject (Patent Document 1).
  • frequency domain OCT has also been proposed (Patent Document 2; Non-Patent Document 1).
  • two representative types of frequency domain OCT are SD-OCT (Spectral Domain OCT) and SS-OCT (Swept Source OCT).
  • the SD-OCT apparatus uses broadband low-coherence light, such as an SLD (Super Luminescent Diode), an ASE (Amplified Spontaneous Emission) light source, or white light, as its light source.
  • in a Michelson-type interferometer, this light is split into measurement light and reference light; the measurement light is irradiated onto the measurement object, the reflected light returning from it is made to interfere with the reference light, and the interference light is decomposed into frequency components by a spectrometer.
  • the interference light intensity of each frequency component is measured with a detector array in which elements such as photodiodes are arranged in a row, and the resulting spectral interference intensity signal is Fourier-transformed by a computer to construct an optical tomographic image.
  • the SS-OCT apparatus uses, as its light source, a laser that temporally sweeps the optical frequency, causes the reflected light and the reference light to interfere at each wavelength, and measures the time waveform of the interference signal corresponding to the temporal change of the optical frequency.
  • An optical tomographic image is constructed by Fourier-transforming the spectral interference intensity signal thus obtained with a computer.
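As an illustration of the Fourier-domain principle shared by SD-OCT and SS-OCT, the sketch below converts a simulated spectral interference signal into a depth profile. This is a minimal, assumed simplification (function and variable names are invented for the example): real systems also perform background subtraction, resampling to uniformly spaced optical frequencies, dispersion compensation, and spectral windowing.

```python
import numpy as np

def reconstruct_a_scan(spectral_interference: np.ndarray) -> np.ndarray:
    """Convert one spectral interference intensity signal (sampled uniformly
    in optical frequency) into a depth-resolved reflectivity profile (A-scan)."""
    # Remove the DC (non-interferometric) background term.
    signal = spectral_interference - spectral_interference.mean()
    # The inverse FFT maps spectral fringes to depth; magnitude gives reflectivity.
    depth_profile = np.abs(np.fft.ifft(signal))
    # Keep only the positive-depth half (the FFT output is mirrored).
    return depth_profile[: len(depth_profile) // 2]

# A single reflector at one depth produces a cosine fringe across the spectrum.
k = np.arange(1024)                            # sample index in optical frequency
fringe = 1.0 + 0.5 * np.cos(2 * np.pi * 100 * k / 1024)
a_scan = reconstruct_a_scan(fringe)
print(int(np.argmax(a_scan)))                  # peak appears at depth bin 100
```

A fringe with 100 cycles across the sweep reconstructs to a peak at depth bin 100, which is the basic depth-encoding mechanism the two paragraphs above describe.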
  • Patent Document 3 discloses an image diagnostic apparatus that not only generates a diagnostic image of one cross section by OCT but also performs a three-dimensional scan and draws a three-dimensional image, so that a lesion can be diagnosed three-dimensionally.
  • by combining OCT measurement with three-dimensional computer graphics technology, it is possible to display a three-dimensional structure model composed of structural information of the measurement object with micrometer-order resolution.
  • this three-dimensional structure model by OCT measurement is referred to as three-dimensional volume data.
  • cancerous tissue has the feature that many new blood vessels concentrate in it; therefore, if a blood vessel image can be extracted from a tomographic image and displayed three-dimensionally, it is very effective for cancer diagnosis.
  • however, the acquired tomographic image has a tail drawn from tissue such as a blood vessel toward the deeper region (see reference numerals 860 and 861 in FIG. 23), which makes it difficult to grasp the tissue structure, such as blood vessels, accurately.
  • FIG. 22 is a graph showing an example of an optical coherence tomographic image signal.
  • FIG. 22 is a graph in which the signal intensity at the measurement position (pixel position) where the blood vessel exists is plotted against the position in the depth direction.
  • the horizontal axis represents the position in the depth direction
  • the vertical axis represents the signal intensity.
  • the background A is a dark signal; the signal rises at the position B of the living body surface, and then the signal intensity gradually decreases as the position advances in the depth direction.
  • at the blood vessel position C (the upper end of the blood vessel), the signal intensity drops sharply, and it decreases still further at greater depths.
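One simple, assumed way to locate the sharp drop at the upper end of a blood vessel (position C) in such a depth profile is to take the most negative first difference of the signal; the patent itself does not prescribe this detector, and all names here are illustrative.

```python
import numpy as np

def vessel_top_position(a_scan: np.ndarray) -> int:
    """Return the depth index at which the signal drops most sharply,
    a crude stand-in for detecting the vessel's upper end (position C)."""
    return int(np.argmin(np.diff(a_scan)))

# Toy profile matching the description: dark background A, rise at the
# surface B, gradual decay, then a sharp drop at the vessel position C.
profile = np.concatenate([
    np.full(10, 0.05),             # background A
    np.linspace(0.9, 0.7, 20),     # surface B and gradual decay
    np.full(10, 0.1),              # strong attenuation below the vessel
])
print(vessel_top_position(profile))
```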
  • FIG. 23 is a schematic diagram illustrating an example of an OCT image of an image including a blood vessel.
  • the depth direction of the subject (measurement target) 800 is defined as the Z axis
  • the primary scanning direction is defined as the X axis
  • the secondary scanning direction is defined as the Y axis (see FIG. 7).
  • a cross-sectional tomographic image 880 along the XZ plane is obtained by scanning measurement light emitted from above in FIG. 23 toward the subject 800 in the X-axis direction.
  • by repeating this scan while shifting the position in the Y-axis direction, XZ-plane tomographic images at different Y positions can be obtained.
  • blood vessels 810 and 811 in the mucosa to be measured by OCT are distributed along the mucosal surface 820 in a generally parallel plane direction.
  • much of the light is attenuated because of the optical characteristics of blood shown in FIG. 22, so the light reaching the layers below (deeper than) the blood vessels 810 and 811 is weaker than in the surrounding area.
  • as shown in FIG. 23, each tomographic image 880 therefore has the problem that a tail is drawn downward from the blood vessels 810 and 811.
  • the portions indicated by reference numerals 860 and 861 are the tail (reflection) components caused by the blood vessels 810 and 811.
  • the present invention has been made in view of such circumstances, and its object is to provide a tomographic image processing apparatus and a tomographic image processing method capable of obtaining an image in which a specific structure such as a blood vessel can be clearly identified, and an optical coherence tomography diagnostic apparatus to which that image processing technology is applied.
  • the tomographic image processing apparatus according to the first aspect comprises: a tomographic image acquisition unit that acquires tomographic image data, including information on the depth direction of the measurement object, generated on the basis of the reflected wave returned from the measurement object when a measurement wave is irradiated toward it; a cross-sectional image generation unit that, from the tomographic image data acquired via the tomographic image acquisition unit, generates, for each of a plurality of layers at different depth-direction positions, a cross-sectional image of a cross section perpendicular to the incident direction of the measurement wave; a specific structure image extraction unit that extracts the region of a specific structure from the cross-sectional image of each layer; a reflection component removal unit that, when the region of the specific structure is extracted layer by layer from the cross-sectional images, performs processing to remove the reflection component caused by the specific structure extracted in layers above the layer of interest; and a display image generation unit that generates, on the basis of the image signal from which the reflection component has been removed, a display image for drawing the specific structure.
  • a plurality of cross-sectional images having different positions in the depth direction are generated from the tomographic image data acquired from the tomographic image acquisition unit, and the specific structure region is extracted from the cross-sectional images of the respective layers.
  • the tomographic image acquired via the tomographic image acquisition unit contains, because the measurement wave is abruptly attenuated by the specific structure, a reflection component of the specific structure that trails as a tail into regions deeper than the specific structure itself.
  • when the region of the specific structure is extracted from the cross-sectional image of each layer, the information on the specific structure extracted in layers above the layer of interest is taken into consideration, and processing is performed to remove the reflection component caused by those upper-layer structures. For this reason, the structure information of the specific structure can be accurately extracted from the cross-sectional (slice) image data of each layer, and, based on the specific structure extraction images of the layers thus obtained, the specific structure can be drawn as a display image in which it is clearly identifiable.
  • as the “measurement wave”, for example, light, other electromagnetic waves, or ultrasonic waves can be used.
  • examples of the “tomographic image acquisition unit” include a data input interface and a signal input terminal that receive already generated tomographic image data.
  • a configuration is also possible that includes an emission/reception unit that emits the measurement wave toward the measurement object and receives the reflected wave, together with a signal processing unit that generates the tomographic image data based on the received information.
  • preferably, an integration extraction image generation unit is provided that generates an integrated image by integrating, in the depth direction, the tomographic image data acquired through the tomographic image acquisition unit, and that generates an integration extraction image in which the region of the specific structure is extracted from the integrated image; the specific structure image extraction unit then preferably extracts the specific structure region of each layer by taking the correlation between the cross-sectional image of each layer and the integration extraction image.
  • by integrating in the depth direction, the signal of the specific structure and its reflection signal are accumulated together, and a two-dimensional image of the specific structure projected onto the plane perpendicular to the incident direction of the measurement wave is obtained.
  • a specific structure region is extracted from the integrated image by a technique such as binarization processing, and a specific structure image (“integrated extracted image”) indicating the extracted specific structure region is generated.
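The integration and binarization steps above can be sketched as follows. The fixed threshold, the toy data, and the assumption that vessel columns integrate to lower values (the vessel and its tail attenuate the whole column beneath it) are illustrative choices, not the patent's specified processing.

```python
import numpy as np

def integrated_extracted_image(volume, threshold):
    """Integrate a (z, y, x) OCT volume along the depth axis and binarize.
    A vessel and its tail attenuate the entire column beneath the vessel,
    so vessel columns show markedly lower integrated intensity; a fixed
    threshold stands in for whatever binarization is actually used."""
    integrated = volume.sum(axis=0)                   # integrated XY image
    return (integrated < threshold).astype(np.uint8)  # 1 = vessel region

# Toy volume: uniform tissue signal with one attenuated (shadowed) vessel run.
vol = np.ones((64, 8, 8))
vol[:, 3, 3:6] = 0.2                  # vessel plus its depthward tail
mask = integrated_extracted_image(vol, threshold=40.0)
# mask is 1 exactly on the three shadowed columns.
```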
  • the region of the specific structure in each layer can be extracted by correlating the images between the integral extraction image and the cross-sectional image of each layer and extracting the correlated portion.
  • when the correlation between the cross-sectional image of each layer and the integration extraction image is taken, a reflection component removal process is performed: the reflection component removal unit removes, from the integration extraction image, the information on the specific structure regions already extracted in the cross-sectional images above the layer of interest, and the specific structure image extraction unit preferably extracts the specific structure region of the layer of interest by taking the correlation between the cross-sectional image of that layer and the integration extraction image from which those upper-layer extraction results have been removed.
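One way to realize this layer-by-layer extraction with upper-layer removal is sketched below. Binary masks and a logical AND stand in for the thresholded slice images and the image correlation; these are assumed simplifications, not the patent's exact method.

```python
import numpy as np

def extract_layers(slice_masks, integral_mask):
    """Extract the specific structure layer by layer, shallow to deep,
    removing upper-layer extraction results from the integration
    extraction image so that a vessel's downward reflection tail is
    not extracted again in deeper layers."""
    remaining = integral_mask.copy()   # integration extraction image, minus claims
    layer_vessels = []
    for mask in slice_masks:           # ordered from the shallowest layer down
        vessel = mask & remaining      # 'correlation' with the unclaimed region
        remaining = remaining & ~vessel  # suppress this vessel's tail below it
        layer_vessels.append(vessel)
    return layer_vessels

# Toy data: one vessel pixel in layer 0 whose tail repeats in layers 1 and 2.
integral = np.zeros((4, 4), dtype=bool)
integral[1, 1] = True
slices = [integral.copy() for _ in range(3)]   # the tail mimics the vessel
layers = extract_layers(slices, integral)
# The vessel is assigned to the shallowest layer only; the tail is removed.
```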
  • it is also preferable to include an information acquisition unit that acquires information about the thickness of the specific structure, and an interlayer distance variable setting unit that sets the interlayer distance between the layers based on the information acquired from the information acquisition unit.
  • the information acquisition unit is configured by a user interface or the like that allows an operator (user) to input a desired numerical value by operating an input device or select corresponding information from selection candidates stored in advance in a memory or the like.
  • a configuration may be used in which data of a tomographic image is analyzed and information is automatically acquired from the analysis result.
  • the thickness information of the blood vessel can be set based on biological knowledge.
  • it is also preferable to provide an estimation unit that estimates the thickness of the specific structure from information on the specific structure extracted from an image of a cross section parallel to the incident direction of the measurement wave, and to use this estimation unit as the information acquisition unit.
  • it is preferable to provide a restoration processing unit that restores information on the specific structure that was removed by the reflection removal processing even though the specific structure actually exists in the layer of interest.
  • when specific structures overlap in the depth direction, the signal of the structure on the lower side may be erroneously removed as a reflection signal.
  • such an erroneously removed portion can be restored, and the specific structure can be accurately extracted.
  • for example, the erroneously removed portion is determined using the thickness information of the specific structure, and the image signal is restored so that the cut ends of the removed portion are smoothly reconnected.
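A minimal sketch of such restoration, under the assumption that the expected vessel thickness bounds the depth gap that may be refilled; the per-pixel gap-filling rule is an assumed simplification of the patent's reconnection processing.

```python
import numpy as np

def restore_removed(layer_vessels, max_thickness):
    """Refill depth-direction gaps shorter than the expected vessel
    thickness, reconnecting vessel portions that the reflection removal
    step cut out by mistake."""
    stack = np.stack(layer_vessels)          # (layer, y, x), boolean
    restored = stack.copy()
    n = len(layer_vessels)
    for z in range(1, n - 1):
        for dz in range(1, max_thickness):
            hi = z + dz
            if hi >= n:
                break
            # Pixel absent at z, but vessel present just above (possibly
            # already restored) and again within max_thickness below.
            bridge = restored[z - 1] & ~stack[z] & stack[hi]
            restored[z] |= bridge
    return [restored[z] for z in range(n)]

# Toy data: a vertical vessel present at layers 0 and 3, erased at 1 and 2.
vessels = [np.zeros((2, 2), dtype=bool) for _ in range(4)]
vessels[0][0, 0] = True
vessels[3][0, 0] = True
out = restore_removed(vessels, max_thickness=3)
# Layers 1 and 2 are refilled at (0, 0), smoothly reconnecting the vessel.
```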
  • it is preferable that the tomographic image is an optical coherence tomographic image using light as the measurement wave.
  • the seventh aspect is suitable as an image processing technique for extracting information on a specific structure from an optical coherence tomographic image and outputting it to a display device or the like.
  • the optical coherence tomographic image is preferably a tomographic image generated as follows: light emitted from a wavelength-swept light source is divided into measurement light and reference light, the measurement light is irradiated onto the measurement object, the reflected light from the measurement object is combined with the reference light, the interference light produced when they are combined is detected as an interference signal, and the tomographic image is generated from the detected interference signal.
  • preferably, the tomographic image data acquired via the tomographic image acquisition unit is three-dimensional tomographic image data obtained by scanning the measurement object with the measurement wave, and the display image generation unit generates, as the display image, a three-dimensional display image for drawing the three-dimensional structure of the specific structure.
  • the ninth aspect is suitable as an image processing technique for extracting the information of the three-dimensional structure of the specific structure from the three-dimensional tomographic image data and outputting it to a display device or the like.
  • for example, when the incident direction of the measurement wave is taken as the Z axis and the X and Y axes are set on a plane perpendicular to it (the X, Y, and Z axes being mutually orthogonal), an integrated image (XY-plane image) is generated by integrating the three-dimensional tomographic image data in the Z direction.
  • a specific structure region is extracted from the integrated image to generate an integrated extracted image.
  • cross-sectional images of multiple XY planes (slice planes) at different Z-direction positions are extracted from the three-dimensional tomographic image data, and for each layer the correlation between the cross-sectional (slice) image and the integration extraction image is taken while removing the influence of the portions already extracted in upper layers, thereby generating the specific structure image of each layer.
  • the three-dimensional structure of the specific structure can be drawn by connecting the specific structure images of the respective layers thus obtained in the Z direction.
  • the setting of the coordinate axes is not necessarily limited to the orthogonal XYZ system; any coordinate system whose basis vectors span three-dimensional space may be used, including a cylindrical coordinate system and various others, and coordinate conversion can be handled by known mathematical processing.
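As one concrete example of such a coordinate conversion, a radial scan from a rotary probe yields a slice indexed by (angle, radius), which can be resampled onto Cartesian XY axes via x = r*cos(theta), y = r*sin(theta). The nearest-neighbour lookup and all names below are illustrative simplifications; a real pipeline would interpolate.

```python
import numpy as np

def radial_to_cartesian(polar, out_size):
    """Resample a radial-scan slice, indexed (angle, radius), onto a
    square Cartesian XY grid centred on the probe axis."""
    n_theta, n_r = polar.shape
    out = np.zeros((out_size, out_size))
    c = (out_size - 1) / 2.0                      # grid centre = probe axis
    ys, xs = np.mgrid[0:out_size, 0:out_size]
    dx, dy = xs - c, ys - c
    r = np.hypot(dx, dy)
    theta = np.mod(np.arctan2(dy, dx), 2 * np.pi)
    ri = np.round(r / c * (n_r - 1)).astype(int)  # radius -> sample index
    ti = np.round(theta / (2 * np.pi) * n_theta).astype(int) % n_theta
    inside = ri < n_r                             # drop pixels beyond max radius
    out[inside] = polar[ti[inside], ri[inside]]
    return out

# A uniform polar slice maps to a filled disc on the Cartesian grid.
cartesian = radial_to_cartesian(np.ones((16, 8)), out_size=9)
```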
  • the specific structure may be a blood vessel.
  • the tenth aspect is suitable, for example, as a technique for generating a three-dimensional image (blood vessel image) of blood vessels near the mucosal surface.
  • the eleventh aspect of the present invention provides a method for achieving the above object. That is, the tomographic image processing method according to the eleventh aspect includes: a tomographic image acquisition step of acquiring tomographic image data, including information on the depth direction of the measurement object, generated based on the reflected wave from the measurement object of a measurement wave irradiated toward the measurement object; a cross-sectional image generation step of generating, for each of a plurality of layers at different depth-direction positions, a cross-sectional image of a cross section perpendicular to the incident direction of the measurement wave; a specific structure image extraction step of extracting the region of the specific structure from the cross-sectional image of each layer; a reflection component removal processing step of removing the reflection component caused by the specific structure extracted in layers above the layer of interest; and a display image generation step of generating, based on the image signal from which the reflection component has been removed, a display image for drawing the specific structure.
  • the twelfth aspect of the present invention provides an optical coherence tomographic image diagnostic apparatus that achieves the above object. That is, the apparatus according to the twelfth aspect includes: a wavelength-swept light source that emits light while sweeping its wavelength at a constant period; a light dividing unit that divides the light emitted from the wavelength-swept light source into measurement light and reference light; a multiplexing unit that combines the reflected light, returned when the measurement light divided by the light dividing unit is irradiated onto the measurement object, with the reference light; an optical interference detection unit that detects the interference light between the reflected light and the reference light combined by the multiplexing unit; a tomographic image acquisition unit that acquires, from the interference signal detected by the optical interference detection unit, tomographic image data including information on the depth direction of the measurement object; and, as in the first aspect, a cross-sectional image generation unit that generates, from the acquired tomographic image data, for each of a plurality of layers at different depth-direction positions, a cross-sectional image of a cross section perpendicular to the incident direction of the measurement light, together with the specific structure image extraction unit, the reflection component removal unit, and the display image generation unit described above.
  • the twelfth aspect corresponds to an optical coherence tomographic image diagnostic apparatus to which the tomographic image processing apparatus according to the first aspect is applied.
  • the tomographic image processing apparatus described in any one of the second to ninth aspects can be applied.
  • according to the present invention, the influence of reflection can be reduced and a specific structure such as a blood vessel can be clearly identified.
  • FIG. 1 is an external view of an image diagnostic apparatus to which a tomographic image processing apparatus according to an embodiment of the present invention is applied;
  • FIG. 2 is a block diagram showing the internal configuration of the OCT processor of FIG. 1;
  • FIG. 3 is a cross-sectional view of the OCT probe of FIG. 2;
  • FIG. 4 is a diagram showing a scan surface of a tomographic image when the optical scanning is a radial scan with respect to the measurement target;
  • FIG. 5 is a diagram showing three-dimensional volume data constructed from the tomographic image of FIG. 4;
  • FIG. 6 is a diagram showing a state in which a tomographic image is obtained using an OCT probe led out from the forceps port of the endoscope of FIG. 1;
  • FIG. 7 is a diagram showing a configuration for obtaining a tomographic image by performing a sector scan on the measurement object S using a galvanometer mirror;
  • FIG. 8 is a diagram showing three-dimensional volume data constructed by the tomographic image of FIG. 7;
  • FIG. 9 is a block diagram showing the configuration of the signal processing unit of FIG. 2;
  • FIG. 10 is a diagram showing an image (integrated image) obtained by integrating the three-dimensional tomographic image of FIG. 23 in the depth direction;
  • FIG. 11 is a diagram showing a blood vessel image (integrated extracted image) extracted from the integrated image of FIG. 10;
  • FIG. 12 is a flowchart showing an example of a blood vessel region extraction process;
  • FIG. 13 is a diagram showing an example of a plurality of cross-sectional images (slice images) having different positions in the depth direction;
  • FIG. 14 is a schematic diagram showing a method for generating a blood vessel image of each layer from the cross-sectional images of the layers shown in FIG. 13 and the integrated extracted image (FIG. 11);
  • FIG. 15 is a flowchart of the three-dimensional blood vessel structure extraction process in the present embodiment;
  • FIG. 16 is a flowchart of a process for generating a blood vessel image of each layer from the cross-sectional image of each layer and the integral extraction image (FIG. 11);
  • FIG. 17 is a schematic diagram showing an example of an OCT tomographic image obtained from a subject including a three-dimensionally intersecting blood vessel;
  • FIG. 18 is a diagram showing an image (integrated image) obtained by integrating, in the depth direction, the three-dimensional tomographic image obtained when blood vessels intersect three-dimensionally (FIG. 17);
  • FIG. 19 is a diagram showing a blood vessel image (integrated extracted image) extracted from the integrated image of FIG. 18;
  • FIG. 20 is a schematic diagram when a blood vessel image of each layer is generated by the same method as FIG. 14;
  • FIG. 21 is an explanatory diagram of a process for restoring a blood vessel site that has been removed by mistake;
  • FIG. 22 is a schematic diagram of an optical coherence tomographic image signal;
  • FIG. 23 is a schematic diagram illustrating an example of a tomographic image obtained by OCT measurement of a subject including blood vessels.
  • FIG. 1 is an external view of an image diagnostic apparatus to which a tomographic image processing apparatus according to an embodiment of the present invention is applied.
  • the diagnostic imaging apparatus 10 mainly includes an endoscope 100, an endoscope processor 200, a light source device 300, an OCT processor 400 as a tomographic image processing device, and a monitor device 500.
  • the endoscope processor 200 may be configured to incorporate the light source device 300.
  • the endoscope 100 includes a hand operation unit 112 and an insertion unit 114 connected to the hand operation unit 112. The surgeon grasps and operates the hand operation unit 112 and performs observation by inserting the insertion unit 114 into the body of the subject.
  • the hand operation part 112 is provided with a forceps insertion part 138, and the forceps insertion part 138 communicates with the forceps port 156 of the distal end part 144.
  • the OCT probe 600 is led out from the forceps port 156 by inserting it through the forceps insertion portion 138.
  • the OCT probe 600 includes an insertion portion 602 that is inserted through the forceps insertion portion 138 and led out from the forceps port 156, an operation portion 604 for the operator to operate the OCT probe 600, and a cable 606 connected to the OCT processor 400 via the connector 610.
  • Endoscope: At the distal end portion 144 of the endoscope 100, an observation optical system 150, an illumination optical system 152, and a CCD (not shown) are disposed.
  • the observation optical system 150 forms an object image on a light receiving surface of a CCD (not shown), and the CCD converts the object image formed on the light receiving surface into an electric signal by each light receiving element.
  • the CCD of this embodiment is a color CCD in which color filters of the three primary colors red (R), green (G), and blue (B) are arranged pixel by pixel in a predetermined pattern (Bayer arrangement or honeycomb arrangement).
  • Reference numeral 154 denotes a cleaning nozzle for supplying cleaning liquid and pressurized air toward the observation optical system 150.
  • the light source device 300 causes visible light to enter a light guide (not shown). One end of the light guide is connected to the light source device 300 via the LG connector 120, and the other end of the light guide faces the illumination optical system 152. The light emitted from the light source device 300 is emitted from the illumination optical system 152 via the light guide, and illuminates the visual field range of the observation optical system 150.
  • Endoscope processor: An image signal output from the CCD is input to the endoscope processor 200 via the electrical connector 110.
  • the analog image signal is converted into a digital image signal in the endoscope processor 200, and necessary processing for displaying on the screen of the monitor device 500 is performed.
  • observation image data obtained by the endoscope 100 is output to the endoscope processor 200, and an image is displayed on the monitor device 500 connected to the endoscope processor 200.
  • FIG. 2 is a block diagram showing an internal configuration of the OCT processor of FIG.
  • the OCT processor 400 and the OCT probe 600 shown in FIG. 2 are for acquiring an optical tomographic image of a measurement object by an optical coherence tomography (OCT) measurement method.
  • the OCT processor 400 includes: a first light source unit 12 that emits light La for measurement; an optical fiber coupler (branching/combining unit) 14 that branches the light La emitted from the first light source unit 12 into measurement light (first light flux) L1 and reference light L2, and that combines the return light L3 from the measurement target S (the subject) with the reference light L2 to generate interference light L4; a fixed-side optical fiber FB2 that guides the measurement light L1 branched by the optical fiber coupler 14 to the optical connector 18 of the OCT probe 600 and guides the return light L3 conveyed by the rotation-side optical fiber FB1 in the OCT probe 600; an interference light detection unit 20 that detects the interference light L4 generated by the optical fiber coupler 14 as an interference signal; and a signal processing unit 22 that processes the interference signal detected by the interference light detection unit 20 to acquire an optical tomographic image (hereinafter simply referred to as a "tomographic image").
  • the OCT processor 400 further includes: a second light source unit 13 that emits aiming light (second light flux) Le for indicating the measurement position; an optical path length adjustment unit 26 that adjusts the optical path length of the reference light L2; an optical fiber coupler 28 that splits the light La emitted from the first light source unit 12; detection units 30a and 30b that detect the return light (interference light) L4 and L5 combined by the optical fiber coupler 14; and an operation control unit 32 for inputting various conditions to the signal processing unit 22, changing settings, and the like.
  • the OCT probe 600 connected to the OCT processor 400 includes a rotation-side optical fiber FB1 that guides the measurement light L1, conveyed through the fixed-side optical fiber FB2, to the measurement target S and also conveys the return light L3 from the measurement target S, and an optical connector 18 that rotatably connects the rotation-side optical fiber FB1 to the fixed-side optical fiber FB2 and transmits the measurement light L1 and the return light L3.
  • Various optical fibers FB (FB3, FB4, FB5, FB6, FB7, FB8, etc.), including the rotation-side optical fiber FB1 and the fixed-side optical fiber FB2, are used as optical paths for guiding and transmitting the various light beams described above (the emitted light La, the aiming light Le, the measurement light L1, the reference light L2, the return light L3, etc.) between the components.
  • the first light source unit 12 emits light for OCT measurement (for example, wavelength-variable laser light in the infrared region or low-coherence light).
  • The first light source unit 12 of this example is a wavelength-swept light source that emits a laser beam La (for example, a laser beam centered on a wavelength of 1.3 μm) while sweeping the optical frequency (wavelength) at a constant period in the infrared wavelength region.
  • The first light source unit 12 includes a light source 12a that emits laser light or low-coherence light La, and a lens 12b that collects the light La emitted from the light source 12a. As will be described in detail later, the light La emitted from the first light source unit 12 is divided into the measurement light L1 and the reference light L2 by the optical fiber coupler 14 via the optical fibers FB4 and FB3, and the measurement light L1 is input to the optical connector 18.
  • The second light source unit 13 emits visible light as the aiming light Le to make the measurement site easy to confirm. For example, red semiconductor laser light with a wavelength of 660 nm, He-Ne laser light with a wavelength of 630 nm, blue semiconductor laser light with a wavelength of 405 nm, or the like can be used. The second light source unit 13 in this embodiment includes, for example, a semiconductor laser 13a that emits red, blue, or green laser light, and a lens 13b that condenses the aiming light Le emitted from the semiconductor laser 13a.
  • the aiming light Le emitted from the second light source unit 13 is input to the optical connector 18 through the optical fiber FB8.
  • the measurement light (first light beam) L1 and the aiming light (second light beam) Le are combined and guided to the rotation side optical fiber FB1 in the OCT probe 600.
  • The optical fiber coupler (branching/combining unit) 14 is composed of, for example, a 2 x 2 optical fiber coupler, and is optically connected to the fixed-side optical fiber FB2, the optical fiber FB3, the optical fiber FB5, and the optical fiber FB7.
  • The optical fiber coupler 14 divides the light La incident from the first light source unit 12 via the optical fibers FB4 and FB3 into the measurement light (first light flux) L1 and the reference light L2, making the measurement light L1 incident on the fixed-side optical fiber FB2 and the reference light L2 incident on the optical fiber FB5.
  • The optical fiber coupler 14 also combines the reference light L2, which enters the optical fiber FB5, is frequency-shifted and changed in optical path length by the optical path length adjustment unit 26 described later, and returns through the optical fiber FB5, with the return light L3 acquired by the OCT probe 600 described later and guided from the fixed-side optical fiber FB2, and emits the combined light to the optical fiber FB3 (FB6) and the optical fiber FB7.
  • The OCT probe 600 is connected to the fixed-side optical fiber FB2 via the optical connector 18, and the measurement light L1 combined with the aiming light Le enters the rotation-side optical fiber FB1 from the fixed-side optical fiber FB2 via the optical connector 18.
  • The incident measurement light L1, combined with the aiming light Le, is transmitted by the rotation-side optical fiber FB1 and irradiated onto the measurement target S. The return light L3 from the measurement target S is then acquired, transmitted by the rotation-side optical fiber FB1, and emitted to the fixed-side optical fiber FB2 via the optical connector 18.
  • The interference light detection unit 20 is connected to the optical fibers FB6 and FB7, and detects, as interference signals, the interference light L4 and L5 generated by combining the reference light L2 and the return light L3 in the optical fiber coupler 14.
  • A detector 30a that detects the light intensity of the interference light L4 is provided on the optical path of the optical fiber FB6 branched from the optical fiber coupler 28, and a detector 30b that detects the light intensity of the interference light L5 is provided on the optical path of the optical fiber FB7.
  • the interference light detection unit 20 generates an interference signal based on the detection results of the detectors 30a and 30b.
  • the signal processing unit 22 acquires a tomographic image from the interference signal detected by the interference light detection unit 20, and outputs the acquired tomographic image to the monitor device 500.
  • In the present embodiment, a blood vessel portion is extracted from the tomographic image to generate a three-dimensional blood vessel image, and an image showing the three-dimensional structure of the blood vessels is output to the monitor device 500.
  • a detailed configuration of the signal processing unit 22 for realizing this will be described later.
  • The optical path length adjustment unit 26 for changing the optical path length of the reference light L2 is disposed on the emission side of the reference light L2 of the optical fiber FB5 (that is, at the end of the optical fiber FB5 opposite to the optical fiber coupler 14).
  • The optical path length adjustment unit 26 includes a first optical lens 80 that converts the light emitted from the optical fiber FB5 into parallel light, a second optical lens 82 that condenses the light collimated by the first optical lens 80, a reflection mirror 84 that reflects the light condensed by the second optical lens 82, a base 86 that supports the second optical lens 82 and the reflection mirror 84, and a mirror moving mechanism 88 that moves the base 86 in a direction parallel to the optical axis. By changing the distance between the first optical lens 80 and the second optical lens 82, the optical path length of the reference light L2 is adjusted.
  • the first optical lens 80 converts the reference light L2 emitted from the core of the optical fiber FB5 into parallel light, and condenses the reference light L2 reflected by the reflection mirror 84 on the core of the optical fiber FB5.
  • the second optical lens 82 condenses the reference light L2 converted into parallel light by the first optical lens 80 on the reflection mirror 84, and makes the reference light L2 reflected by the reflection mirror 84 parallel light.
  • the first optical lens 80 and the second optical lens 82 form a confocal optical system.
  • the reflection mirror 84 is disposed at the focal point of the light collected by the second optical lens 82 and reflects the reference light L2 collected by the second optical lens 82.
  • the reference light L2 emitted from the optical fiber FB5 becomes parallel light by the first optical lens 80, and is condensed on the reflection mirror 84 by the second optical lens 82. Thereafter, the reference light L2 reflected by the reflection mirror 84 becomes parallel light by the second optical lens 82 and is condensed by the first optical lens 80 on the core of the optical fiber FB5.
  • The base 86 fixes the second optical lens 82 and the reflection mirror 84, and the mirror moving mechanism 88 moves the base 86 in the optical axis direction of the first optical lens 80 (the direction of arrow A in FIG. 2). Thereby, the distance between the first optical lens 80 and the second optical lens 82 can be changed, and the optical path length of the reference light L2 can be adjusted.
  • the operation control unit 32 has input means such as a keyboard and a mouse, and control means for managing various conditions based on the input information, and is connected to the signal processing unit 22.
  • the operation control unit 32 inputs, sets, and changes various processing conditions and the like in the signal processing unit 22 based on an operator instruction input from the input unit.
  • the operation control unit 32 may display the operation screen on the monitor device 500, or may provide a separate display unit to display the operation screen.
  • The operation control unit 32 may also control the operation of the first light source unit 12, the second light source unit 13, the optical connector 18, the interference light detection unit 20, the optical path length adjustment unit 26, and the detection units 30a and 30b, and may set various conditions for them.
  • FIG. 3 is a cross-sectional view of the OCT probe 600.
  • The distal end portion of the insertion portion 602 includes a probe outer cylinder (sheath) 620, a cap 622, the rotation-side optical fiber FB1, a spring 624, a fixing member 626, and an optical lens 628.
  • The probe outer cylinder 620 is a flexible cylindrical member made of a material that transmits the measurement light L1 combined with the aiming light Le in the optical connector 18 as well as the return light L3.
  • Only the tip-side part of the probe outer cylinder 620 through which the measurement light L1 (aiming light Le) and the return light L3 pass (the end on the side of the rotation-side optical fiber FB1 opposite to the optical connector 18; hereinafter referred to as the tip of the probe outer cylinder 620) needs to be made of a material that transmits light over its entire circumference (a transparent material); the parts other than the tip may be made of a material that does not transmit light.
  • the cap 622 is provided at the tip of the probe outer cylinder 620 and closes the tip of the probe outer cylinder 620.
  • the rotation side optical fiber FB1 is a linear member, and is accommodated in the probe outer cylinder 620 along the probe outer cylinder 620.
  • The rotation-side optical fiber FB1 guides the measurement light L1 combined with the aiming light Le in the optical connector 18 to the optical lens 628, irradiates the measurement target S with the measurement light L1 (aiming light Le), and guides the return light L3 from the measurement target S acquired by the optical lens 628 to the optical connector 18. The return light L3 then enters the fixed-side optical fiber FB2 via the optical connector 18.
  • the rotation-side optical fiber FB1 is disposed so as to be rotatable with respect to the probe outer cylinder 620 and movable in the axial direction of the probe outer cylinder 620.
  • the spring 624 is fixed to the outer periphery of the rotation side optical fiber FB1.
  • the rotation side optical fiber FB1 and the spring 624 are connected to the optical connector 18 together with the rotation cylinder 656.
  • the optical lens 628 is disposed at the measurement-side tip of the rotation-side optical fiber FB1 (tip of the rotation-side optical fiber FB1 opposite to the optical connector 18).
  • The distal end portion (light emission surface) of the optical lens 628 is formed in a substantially spherical shape for condensing the measurement light L1 (aiming light Le) emitted from the rotation-side optical fiber FB1 onto the measurement target S. The optical lens 628 irradiates the measurement target S with the measurement light L1 (aiming light Le) emitted from the rotation-side optical fiber FB1, condenses the return light L3 from the measurement target S, and makes it incident on the rotation-side optical fiber FB1.
  • the fixing member 626 is disposed on the outer periphery of the connection portion between the rotation side optical fiber FB1 and the optical lens 628, and fixes the optical lens 628 to the end portion of the rotation side optical fiber FB1.
  • The method of fixing the rotation-side optical fiber FB1 and the optical lens 628 with the fixing member 626 is not particularly limited: the fixing member 626, the rotation-side optical fiber FB1, and the optical lens 628 may be bonded with an adhesive, or fixed with a mechanical structure using bolts or the like. Any fixing member 626 may be used as long as it serves to fix, hold, or protect the optical fiber, such as a zirconia ferrule or a metal ferrule.
  • the rotation-side optical fiber FB1 and the spring 624 are connected to the rotation cylinder 656.
  • Thereby, the optical lens 628 is rotated in the direction of arrow R2 with respect to the probe outer cylinder 620.
  • the optical connector 18 includes a rotary encoder. Based on the signal from the rotary encoder, the irradiation position of the measuring light L1 is detected from the position information (angle information) of the optical lens 628. That is, the measurement position is detected by detecting the angle of the rotating optical lens 628 with respect to the reference position in the rotation direction.
  • The rotation-side optical fiber FB1, the spring 624, the fixing member 626, and the optical lens 628 are configured to be movable inside the probe outer cylinder 620, by a drive mechanism including a motor 660, in the directions indicated by arrow S1 (toward the forceps opening) and arrow S2 (toward the tip of the probe outer cylinder 620).
  • FIG. 3 shows, on its left side, a schematic configuration of the drive mechanism for the rotation-side optical fiber FB1 and related parts in the operation unit 604 of the OCT probe 600.
  • the probe outer cylinder 620 is fixed to the fixing member 670, while the proximal end portions of the rotation side optical fiber FB1 and the spring 624 are connected to the rotation cylinder 656.
  • the rotating cylinder 656 is configured to rotate via a gear 654 in accordance with the rotation of the motor 652.
  • the rotary cylinder 656 is connected to the optical connector 18, and the measurement light L1 and the return light L3 are transmitted between the rotation side optical fiber FB1 and the fixed side optical fiber FB2 via the optical connector 18.
  • The frame 650, which contains the rotating cylinder 656, the motor 652, the gear 654, and the optical connector 18, is mounted on a support member 662.
  • the support member 662 has a screw hole (not shown), and a ball screw 664 for advancing / retreating is engaged with the screw hole.
  • a motor 660 is connected to the ball screw 664 for advancing / retreating movement.
  • The OCT probe 600 is configured as described above. By rotating the rotation-side optical fiber FB1 and the spring 624 in the direction of arrow R2 in FIG. 3, the measurement light L1 (aiming light Le) emitted from the optical lens 628 is irradiated onto the measurement target S while being scanned in the direction of arrow R2 (the circumferential direction of the probe outer cylinder 620), and the return light L3 is acquired.
  • the aiming light Le is applied to the measuring object S as, for example, blue, red, or green spot light.
  • The reflected light of the aiming light Le (reflected from the measurement target S) is also displayed as a bright spot on the observation image displayed on the monitor device 500. Thereby, a desired part of the measurement target S can be accurately captured over the entire circumference of the probe outer cylinder 620, and the return light L3 reflected from the measurement target S can be acquired.
  • The rotation-side optical fiber FB1 and the optical lens 628 are first moved to the end of the movable range in the arrow S1 direction by the drive mechanism including the motor 660, then moved in the S2 direction by a predetermined amount while tomographic images are acquired, or moved to the other end of the movable range while the acquisition of a tomographic image and a predetermined amount of movement in the S2 direction are alternately repeated.
  • a plurality of tomographic images can be obtained in a desired range with respect to the measurement object S, and three-dimensional volume data can be obtained based on the acquired plurality of tomographic images.
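The pull-back acquisition described above yields a stack of equally spaced frames. As a rough illustration (not the patent's implementation), such frames can be assembled into a volume with NumPy; the frame count, frame size, and axis order here are assumptions of the sketch.

```python
import numpy as np

def build_volume(frames):
    """Stack equally sized 2D tomographic frames into a 3D volume.

    frames: a sequence of (depth x lateral) arrays acquired while the
    probe tip is stepped along the pull-back (S2) direction.
    """
    return np.stack(frames, axis=0)   # shape: (n_frames, depth, lateral)

# Example: 5 frames of 8 depth samples x 16 lateral positions
frames = [np.zeros((8, 16)) for _ in range(5)]
vol = build_volume(frames)
print(vol.shape)  # (5, 8, 16)
```

Downstream processing (the XY-plane reconstruction of step S120, for instance) then amounts to re-slicing this array along a different axis.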
  • FIG. 4 is a diagram showing the scan plane of a tomographic image when the optical scanning is a radial scan with respect to the measurement target S, and FIG. 5 is a diagram showing three-dimensional volume data constructed from the tomographic images of FIG. 4.
  • A tomographic image in the depth direction (first direction) of the measurement target S is acquired from the interference signal, and by scanning the measurement target S in the direction of arrow R2 in FIG. 3 (radial scan), a tomographic image on the scan plane composed of the first direction and a second direction orthogonal to it can be acquired. By repeating this while moving the probe tip, a plurality of tomographic images for generating three-dimensional volume data can be acquired, as shown in FIG. 5.
  • FIG. 6 is a diagram illustrating a state in which a tomographic image is obtained using the OCT probe 600 derived from the forceps port 156 of the endoscope 100.
  • the tomographic image is obtained by bringing the distal end portion of the insertion portion 602 of the OCT probe 600 close to a desired portion of the measurement target S.
  • the optical lens 628 may be moved within the probe outer cylinder 620 by the drive mechanism described above.
  • In the above description, the measurement light L1 (aiming light Le) is radially scanned over the measurement target S, but the present invention is not limited to this.
  • FIG. 7 is a diagram illustrating a configuration in which a tomographic image is acquired by sector scanning of the measurement target S, and FIG. 8 is a diagram illustrating three-dimensional volume data constructed from the tomographic images of FIG. 7.
  • the present invention can be applied to a configuration in which a galvano mirror 900 is used and a tomographic image is acquired by performing sector scanning from above the measurement target S.
  • By moving the scan plane (scanning in the X and Y directions), a plurality of tomographic images (frames 1, 2, 3, ...) for generating three-dimensional volume data can be acquired, as shown in FIG. 8.
  • FIG. 9 is a block diagram showing a configuration of the signal processing unit 22 of FIG.
  • the signal processing unit 22 is a processing unit that performs signal processing for generating an image output to the monitor device 500 from the interference signal input from the interference light detection unit 20.
  • the control unit 490 controls each unit of the signal processing unit 22 based on the operation signal from the operation control unit 32.
  • The interference light detection unit 20 receives the interference light obtained when the light emitted from the first light source unit 12, which serves as a wavelength-swept light source, is divided into measurement light and reference light, the measurement light is irradiated from the OCT probe 600 onto the measurement target S, and the resulting reflected light is combined with the reference light. The interference light detection unit 20 comprises an interference signal generation unit 20a that converts the input interference light (optical signal) into an interference signal (electric signal), and an AD conversion unit 20b that converts the interference signal generated by the interference signal generation unit 20a from an analog signal into a digital signal.
  • In the AD conversion unit 20b, for example, the conversion from an analog signal to a digital signal is performed with a resolution of about 14 bits at a sampling rate of about 80 MHz, but these values are not particularly limited.
  • the interference signal converted into a digital signal by the AD conversion unit 20 b is input to the Fourier transform unit 410 of the signal processing unit 22.
  • The Fourier transform unit 410 performs frequency analysis by FFT (Fast Fourier Transform) on the interference signal converted into a digital signal by the AD conversion unit 20b of the interference light detection unit 20, and generates the intensity of the reflected light (return light) L3 at each depth position of the measurement target S, that is, reflection intensity data (tomographic information) in the depth direction.
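As a hedged illustration of this Fourier-transform step (not the patent's implementation), the sketch below reconstructs one depth profile from a synthetic swept-source fringe by FFT. Real systems also resample to uniform wavenumber, apply windowing, and compensate dispersion, all omitted here; the sample count and reflector position are invented for the example.

```python
import numpy as np

def a_line(fringe):
    """Convert one swept-source interference fringe (sampled versus optical
    frequency) into a depth reflectivity profile via FFT.

    Illustrative sketch: the DC term is removed, and only the positive-depth
    half of the spectrum is kept as the reflection intensity profile.
    """
    fringe = fringe - fringe.mean()      # suppress the DC component
    spectrum = np.fft.fft(fringe)
    half = len(fringe) // 2              # keep positive depths only
    return np.abs(spectrum[:half])       # reflection intensity vs. depth

# Synthetic fringe: a single reflector producing 40 fringe cycles
# over a 1024-sample sweep, so it should appear at depth bin 40.
n = 1024
k = np.arange(n)
fringe = 1.0 + 0.5 * np.cos(2 * np.pi * 40 * k / n)
profile = a_line(fringe)
print(int(np.argmax(profile)))  # 40
```

The peak location in the FFT magnitude corresponds to the depth of the reflector, which is exactly the "reflection intensity data in the depth direction" described above.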
  • the data (tomographic information) Fourier-transformed by the Fourier transform unit 410 is logarithmically transformed by the logarithmic transform unit 420.
  • the logarithmically converted data is input to the tomographic image construction unit 450.
  • The tomographic image construction unit 450 builds a tomographic image by performing, on the data logarithmically converted by the logarithmic conversion unit 420, luminance conversion, contrast adjustment, resampling according to the display size, coordinate conversion according to the scanning method (radial scanning, sector scanning, and the like), and so on.
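For the radial-scan case, the coordinate conversion maps A-lines acquired versus rotation angle onto a Cartesian cross-section. The sketch below is a minimal nearest-neighbour version, assuming rows of the input are A-lines ordered by angle and columns are depth samples; a real implementation would interpolate and scale to the display size.

```python
import numpy as np

def radial_to_cartesian(polar_img, out_size=None):
    """Map a radial-scan tomogram (rows = A-lines vs. angle, cols = depth)
    onto a Cartesian XY image centred on the probe axis.

    Nearest-neighbour sketch for clarity; pixels outside the scanned
    disk are left at zero.
    """
    n_angles, n_depth = polar_img.shape
    size = out_size or 2 * n_depth
    out = np.zeros((size, size))
    cy = cx = size / 2.0
    ys, xs = np.mgrid[0:size, 0:size]
    dy, dx = ys - cy, xs - cx
    r = np.sqrt(dx ** 2 + dy ** 2)                       # radius = depth
    theta = np.mod(np.arctan2(dy, dx), 2 * np.pi)        # angle in [0, 2*pi)
    ri = np.clip(r.astype(int), 0, n_depth - 1)
    ai = np.clip((theta / (2 * np.pi) * n_angles).astype(int), 0, n_angles - 1)
    valid = r < n_depth
    out[valid] = polar_img[ai[valid], ri[valid]]
    return out

polar = np.ones((360, 50))        # uniform scattering: 360 A-lines, 50 depth bins
cart = radial_to_cartesian(polar)
print(cart.shape)                 # (100, 100)
```

A uniform polar input fills a disk of radius equal to the depth range, which matches the circumferential scan geometry of FIG. 4.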
  • the tomographic image construction unit 450 generates three-dimensional tomographic image data, and the three-dimensional tomographic image data is input to the three-dimensional blood vessel structure extraction processing unit 460.
  • the three-dimensional blood vessel structure extraction processing unit 460 extracts blood vessel structure information based on the three-dimensional tomographic image data constructed by the tomographic image construction unit 450, and generates a display image for the three-dimensional blood vessel image. Details of processing contents of the three-dimensional blood vessel structure extraction processing unit 460 will be described later.
  • the three-dimensional blood vessel image generated in this way is output to a monitor device 500 such as an LCD monitor.
  • the tomographic image constructed by the tomographic image construction unit 450 can be displayed on the monitor device 500 instead of the display output of the three-dimensional blood vessel image or together with the display of the three-dimensional blood vessel image.
  • FIG. 10 shows a cross section of a biological image obtained by OCT. Here, a case is described in which two blood vessels run in the tissue of the living body to be measured. Since the S/N of an OCT image is poor, extracting blood vessels from it is more difficult than from an X-ray projection image or the like.
  • Therefore, in the present embodiment, all image signals are first integrated in the depth direction (Z direction).
  • the calculation of “integration” is performed as integration (addition) of pixel values in digital signal processing.
  • An image obtained by this integration processing (referred to as the "integrated image") is shown in FIG. 10. From this integrated image, a blood vessel image included in the data group (three-dimensional data) of the three-dimensional tomographic image is extracted; furthermore, the S/N of the integrated image is improved by the integration operation. The integrated image thus obtained has the same meaning as an image obtained by projecting the three-dimensional blood vessel image onto one plane (here, the XY plane).
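The depth integration described above can be sketched in a few lines; this is an illustrative toy, with the axis order (Z, Y, X) and the voxel values being assumptions of the example, not values from the patent. Vessels scatter less than the surrounding tissue, so a vessel column integrates to a darker pixel in the projection.

```python
import numpy as np

def integrated_image(volume):
    """Sum a 3D OCT data set along the depth (Z) axis to obtain the
    "integrated image"; assumes axis order (Z, Y, X)."""
    return volume.sum(axis=0)

# Toy volume: uniform tissue with a low-signal vessel running along X at y = 2
vol = np.ones((4, 5, 6))
vol[:, 2, :] = 0.2                # vessel voxels return less signal
proj = integrated_image(vol)
print(proj.shape)                 # (5, 6)
print(bool(proj[2, 0] < proj[0, 0]))  # vessel row integrates darker: True
```

Summing N slices also averages uncorrelated noise, which is why the integration improves the S/N of the projection relative to any single slice.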
  • FIG. 11 shows a blood vessel image extracted from the integrated image of FIG. 10 (this corresponds to the "integrated extracted image"; hereinafter it is referred to as the "integrated blood vessel image").
  • The extraction of blood vessels may be performed by binarization using a threshold value; and since the integrated image is, as described above, the same type of image as a projection image, blood vessel extraction methods used for X-ray projection images and the like may also be used. A blood vessel region is extracted from the integrated image by, for example, extracting from the two-dimensional image a linear structure whose signal intensity is lower than that of its surroundings.
  • Here, the XY plane image to be processed is the integrated image.
  • various combinations of known techniques such as edge detection and template matching are possible.
  • A specific flowchart using edge detection is shown in FIG. 12. First, in step S301, pre-processing is performed on the input XY plane image; specifically, noise suppression processing such as averaging or low-pass filtering, or histogram enhancement for emphasizing the signal to be extracted, is performed. Next, in step S302, edge detection for extracting the blood vessel region is performed: a frequency filtering process matched to the signal frequency of the blood vessels to be extracted, such as a DOG (Difference of Gaussian) filter or a Top-Hat transform, is applied.
  • In step S303, the filtering result of step S302 is binarized to separate the blood vessel portion from the non-blood-vessel portion. The data obtained after step S303 may, however, still include non-blood-vessel portions, and there may be portions that are blood vessels but were not detected as such (non-detection portions). Therefore, in step S304, the blood vessel region is determined for each pixel based on the result of step S303: non-blood-vessel portions remaining after step S303 are removed using an evaluation function that evaluates the linearity and connectivity of blood vessels, and blood vessel portions that were not detected are complemented. The blood vessel region determination process thus includes a removal process for the remaining non-blood-vessel portions and a complementing process that compensates for the non-detection portions.
  • Through steps S301 to S304 described above, the integrated blood vessel image shown in FIG. 11 is obtained.
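Steps S301 to S303 can be sketched as below. This is a minimal pure-NumPy illustration, not the patent's implementation: the Gaussian sigmas, the threshold, and the toy image are all invented, and the region-determination step S304 (linearity/connectivity evaluation) is omitted. Because vessels are darker than their surroundings, the DOG response (coarse blur minus fine blur) is positive inside a vessel.

```python
import numpy as np

def gaussian_kernel(sigma, radius):
    x = np.arange(-radius, radius + 1)
    k = np.exp(-x ** 2 / (2 * sigma ** 2))
    return k / k.sum()

def blur(img, sigma):
    """Separable Gaussian blur with edge padding (pure NumPy)."""
    r = int(3 * sigma)
    k = gaussian_kernel(sigma, r)
    pad = np.pad(img, r, mode='edge')
    tmp = np.apply_along_axis(lambda m: np.convolve(m, k, mode='valid'), 1, pad)
    return np.apply_along_axis(lambda m: np.convolve(m, k, mode='valid'), 0, tmp)

def extract_vessels(img, s1=1.0, s2=3.0, thresh=0.02):
    """Sketch of steps S301-S303: smoothing, DOG filtering, binarization.

    Dark linear structures give a positive DOG response; the sigmas and
    threshold here are illustrative only.
    """
    pre = blur(img, 0.5)                  # S301: noise suppression
    dog = blur(pre, s2) - blur(pre, s1)   # S302: DOG band-pass filtering
    return dog > thresh                   # S303: vessel / non-vessel binarization

# Toy integrated image: bright tissue with one dark vessel column
img = np.ones((20, 20))
img[:, 8:11] = 0.3
mask = extract_vessels(img)
print(bool(mask[10, 9]), bool(mask[10, 0]))  # True False
```

In a real pipeline the binary mask would then go through step S304 to remove spurious blobs and bridge gaps along the vessel.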
  • FIG. 13 shows example images of the individual cross sections. For convenience of explanation, five layers are shown, but in practice there are about 100 to 1000 cross sections. No blood vessel appears in the cross-sectional image of the uppermost layer (first layer) shown in FIG. 13.
  • The cross-sectional image of the third layer shows a blood vessel region containing the blood vessels actually present in the third layer (reference numeral 810c) together with artifacts due to the reflection signals of blood vessels present in the upper layers (reference numeral 811c). The reflection signal of an upper-layer blood vessel enters all of the layers below it and affects them. Likewise, the image of the lowermost layer (fifth layer) in FIG. 13 shows a blood vessel region (reference numerals 810e and 811e) containing the blood vessels actually present in the fifth layer and artifacts due to the reflection signals of blood vessels present in the upper layers.
  • FIG. 14 shows a technique for extracting the blood vessel portion of each layer from these cross-sectional images. The leftmost column in FIG. 14 shows the cross-sectional images Pk (k = 1, 2, 3, 4, 5) of the individual layers described in FIG. 13, and I1 denotes the integrated blood vessel image described in FIG. 11.
  • Because a blood vessel is also reflected as an artifact in the layers below it, the following processing is performed.
  • First, the correlation between the first layer image P1 and the integrated blood vessel image I1 is computed, and an image in which the correlated portions are extracted (reference symbol C1) is obtained. This correlation-extracted image C1 is an image in which the blood vessels present in the first layer (the layer of interest) are extracted; hereinafter, the image C1 is referred to as the "first layer blood vessel image". In the example of FIG. 14 there are no blood vessels in the first layer, so the first layer blood vessel image C1 contains no blood vessel information. Similarly, the second layer blood vessel image C2 is an image in which the blood vessels present in the second layer are extracted. Next, the second layer blood vessel image C2 is removed from the image denoted by I2 to obtain the upper-layer-vessel-removed integrated blood vessel image I3, and the correlation between I3 and the third layer image P3 is computed to obtain the third layer blood vessel image C3. In this way, each time a blood vessel image is extracted for a layer, the extracted image is removed from the integrated blood vessel image to update the "integrated blood vessel image from which the upper-layer blood vessels have been removed", and the correlation between this updated image and the cross-sectional image of the next layer is computed to extract the blood vessels in that layer. Finally, the blood vessel images (C1 to C5) extracted for the individual layers are connected in the depth direction to generate a three-dimensional blood vessel extraction image, from which blood vessel running information can be obtained.
  • The blood vessel extraction may include a correction that takes the blood vessel diameter into account, treating the blood vessel width extracted in the processing layer of interest as the blood vessel diameter. This correction may be performed at the stage of processing each frame (the slice image of each layer), or the blood vessel images extracted from the layers may first be connected in the depth direction to form a three-dimensional image, which is then corrected according to the blood vessel diameter. The slice width need not be set significantly thinner (narrower) than the blood vessel diameter; in consideration of the blood vessel diameter, it is preferable to set the slice planes (slice widths) at intervals approximately equal to or slightly narrower than the blood vessel diameter.
  • FIG. 15 is a flowchart of the three-dimensional blood vessel structure extraction process in this embodiment.
  • This processing is performed by the three-dimensional blood vessel structure extraction processing unit 460 of FIG.
  • In this embodiment, the specific structure is a three-dimensional blood vessel structure. The specific structure image extraction unit in the tomographic image processing apparatus of the present invention and in the optical coherence tomographic image diagnosis apparatus of the present invention is realized mainly by the three-dimensional blood vessel structure extraction processing unit 460. Likewise, the specific structure image extraction processing step in the tomographic image processing method of the present invention is performed mainly by the three-dimensional blood vessel structure extraction processing unit 460.
  • In step S110 of FIG. 15, a three-dimensional optical coherence tomographic image is acquired; that is, the tomographic image data generated by the tomographic image construction unit 450 of FIG. 9 is input to the three-dimensional blood vessel structure extraction processing unit 460.
  • In step S120, the tomographic image data group acquired as XZ-plane frames is reconstructed into an XY-plane image series.
  • The image series on the XY plane here is a group of images whose cross sections are perpendicular to the incident direction (Z direction) of the measurement light L1. The main reason for reconstructing the XY-plane image series in this way is that the intramucosal blood vessels targeted by the OCT measurement in this example are distributed horizontally with respect to the mucosal surface (intramucosal blood vessels generally run along planes parallel to the mucosal surface), so it is advantageous to perform the blood vessel extraction processing on the XY plane.
  • In step S130, the XY plane images obtained in step S120 are integrated in the depth direction to generate an integrated image.
  • the integrated image described in FIG. 10 is an example of an image generated by the integration process in step S130.
  • In step S140, a blood vessel region is extracted from the integrated image (see FIG. 10) obtained in step S130, and the integrated blood vessel image described with reference to FIG. 11 is thereby generated. An example of a method for extracting the blood vessel region from the integrated image in step S140 is as described in FIG. 12.
  • In step S150, the blood vessel diameter is estimated based on the integrated blood vessel image obtained in step S140.
  • In step S160, the slice width is determined based on the blood vessel diameter estimated in step S150, and a plurality of slice images (cross-sectional images along the XY plane) are acquired from the three-dimensional tomographic image data.
  • In step S170, the blood vessels in each layer are extracted based on the correlation between the cross-sectional image of each layer obtained in step S160 and the integrated blood vessel image obtained in step S140, and a blood vessel image for each layer is generated.
  • The flow of processing in step S170 is shown in the flowchart of FIG. 16.
  • In step S401 of FIG. 16, the layer number j of the target layer is set to 1, making the first layer image P1 the target of the operation.
  • In step S402, the correlation between the first layer image P1 and the integrated blood vessel image I1 is taken, the blood vessel portion of the first layer is extracted, and a first-layer blood vessel image C1 is generated.
  • In step S403, the layer number j of the target layer is incremented (+1), and the cross-sectional image of the next layer is selected as the processing target.
  • In step S404, the blood vessel images extracted in the layers above the j-th layer are removed from the integrated blood vessel image.
  • In step S405, the correlation between the j-th layer image and the upper-layer-vessel-removed integrated blood vessel image Ij is taken to generate a j-th layer blood vessel image Cj.
  • In step S190, the display image created in step S180 is displayed on the monitor device (reference numeral 500 in FIG. 9).
  • The calculation functions of the processing steps described in FIG. 15 and FIG. 16 are realized by software (a program), a hardware circuit, or a combination thereof.
  • In this way, blood vessels can be clearly identified, and cancerous portions in which new blood vessels are densely distributed can be clearly identified.
  • FIG. 17 is a schematic diagram of an OCT image including three-dimensionally intersecting blood vessels.
  • In FIG. 17, the blood vessel 810 passes over the blood vessel 811, and the two vessels intersect.
  • The method according to the first embodiment cannot correctly extract the lower blood vessel when blood vessels intersect three-dimensionally as shown in FIG. 17. As a means of correcting this, in the second embodiment, the blood vessel image extracted for each layer is examined for disconnections; when a disconnection is determined to be due to the presence of an upper blood vessel, a process of connecting (joining) the disconnected parts is performed.
  • FIG. 18 is an integrated image in the depth direction obtained from the three-dimensional tomographic image data of the three-dimensionally intersecting blood vessels (FIG. 17), and FIG. 19 is an integrated blood vessel image (corresponding to the "integrated extracted image") obtained by extracting the blood vessel region from the integrated image of FIG. 18.
  • FIG. 20 is a drawing corresponding to FIG. 14 of the first embodiment. In FIG. 20, elements corresponding to those shown in FIG. 14 are denoted by the same reference numerals, and their description is omitted.
  • A blood vessel portion should originally exist where the blood vessels intersect three-dimensionally (the portion indicated by arrow E). However, because another blood vessel is extracted in the second layer above it, part of the blood vessel line extracted in the fourth layer is interrupted. Since this discontinuous portion is a portion that was mistakenly removed by the process of removing the reflection component (an "erroneous removal portion"), a restoration process as shown in FIG. 21 is performed.
  • Whether a portion is an erroneous removal portion, that is, whether a blood vessel part is truly disconnected, is judged, for example, by whether the gap is about the same size as the blood vessel diameter assumed in advance. For a portion determined to be an erroneous removal portion, a restoration process (erroneously removed blood vessel restoration process) is performed to smoothly join the separated parts.
  • A disconnected part may be connected automatically when the gap is about the same size as the blood vessel diameter expected in advance, and when a disconnected part is connected, a smooth connecting curve may be generated automatically.
  • <Modification 1> In the above description, an example was described in which the blood vessel images extracted from the individual layers are connected in the depth direction to display a three-dimensional blood vessel image, but the final output form of the blood vessel image information extracted from each layer is not limited to a three-dimensional display. For example, after the three-dimensional structure information of the blood vessels has been grasped, the blood vessel portions may be highlighted in each tomographic image (for example, superimposed in red).
  • In the above embodiments, an SS-OCT (Swept Source OCT) apparatus has been described as the OCT processor 400, but the present invention can also be applied to other types of OCT apparatus.
  • In the above embodiments, an optical coherence tomographic image diagnostic apparatus has been exemplified, but the scope of application of the present invention is not limited to this; the present invention can be widely applied to diagnostic imaging apparatuses using other tomographic measurement methods, such as an ultrasonic diagnostic imaging apparatus that uses ultrasonic waves as the measurement waves.
  • SYMBOLS: 10 ... Image diagnostic apparatus, 12 ... First light source unit, 20 ... Interference light detection unit, 20a ... Interference signal generation unit, 20b ... AD conversion unit, 22 ... Signal processing unit, 100 ... Endoscope, 200 ... Endoscope processor, 300 ... Light source device, 400 ... OCT processor, 410 ... Fourier transform unit, 420 ... Logarithmic transformation unit, 450 ... Tomographic image construction unit, 460 ... Three-dimensional blood vessel structure extraction processing unit, 490 ... Control unit, 500 ... Monitor device, 600 ... OCT probe, 800 ... Subject, 810, 811 ... Blood vessel, 860, 861 ... Reflected portion
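The integration and binarization steps (S120 to S140) described above can be sketched as follows. This is a minimal illustration, not the patented implementation: the function name, the (Z, Y, X) array layout, and the simple mean-based threshold are assumptions for the sketch, since the description leaves the extraction method open ("binarization or a similar technique").

```python
import numpy as np

def integrated_vessel_image(volume, threshold=None):
    """Sketch of steps S120-S140: integrate a 3-D OCT volume along the
    depth direction and binarize the result to extract a vessel region.

    volume: ndarray shaped (Z, Y, X), i.e. the XZ frames already
    restacked so that axis 0 is the incident (depth) direction.
    """
    # S130: integrate (sum) the XY-plane images along the depth direction
    integrated = volume.sum(axis=0)

    # S140: binarize the integrated image; a global mean threshold is an
    # assumed stand-in for whatever binarization technique is used
    if threshold is None:
        threshold = integrated.mean()
    vessel_mask = integrated > threshold
    return integrated, vessel_mask
```

A vessel running horizontally through the volume accumulates signal across all depths, so it stands out in the integrated image even when individual slices are noisy.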
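The per-layer extraction loop of FIG. 16 (steps S401 to S405) can be sketched with boolean masks. Treating "taking a correlation" as a per-pixel intersection of masks is an assumption of this sketch; the description does not fix the correlation operation.

```python
import numpy as np

def per_layer_vessel_images(layer_masks, integrated_mask):
    """Sketch of the FIG. 16 loop (steps S401-S405).

    layer_masks: per-layer binary images P_j (True = vessel candidate)
    integrated_mask: the integrated blood vessel image I_1
    Returns the per-layer vessel images C_j.
    """
    remaining = integrated_mask.copy()   # I_j with upper layers removed
    layer_vessels = []
    for p_j in layer_masks:              # S403: advance to the next layer
        c_j = p_j & remaining            # S402/S405: correlate P_j with I_j
        layer_vessels.append(c_j)
        remaining &= ~c_j                # S404: remove vessels found above
    return layer_vessels
```

Removing each layer's vessels from the integrated mask before processing the next layer is what prevents the reflection of an upper vessel from being re-extracted in a lower layer.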
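The second-embodiment restoration of erroneously removed vessel parts can be sketched in one dimension along a vessel's path. Representing the vessel as a boolean profile and the parameter name `max_gap` (the assumed vessel diameter) are illustrative assumptions; the actual determination and the smooth-curve joining are left open by the description.

```python
def restore_erroneous_removals(vessel_line, max_gap):
    """Reconnect interior breaks in a 1-D vessel profile when the gap is
    no wider than the expected vessel diameter (max_gap samples)."""
    line = list(vessel_line)
    n = len(line)
    i = 0
    while i < n:
        if not line[i]:
            # measure the run of missing samples
            j = i
            while j < n and not line[j]:
                j += 1
            # reconnect only interior gaps about the vessel diameter or less
            if 0 < i and j < n and (j - i) <= max_gap:
                for k in range(i, j):
                    line[k] = True
            i = j
        else:
            i += 1
    return line
```

Gaps at the ends of the profile are left untouched, since only a break bracketed by vessel on both sides can plausibly be an erroneous removal caused by an upper vessel.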
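The highlighted display of Modification 1 (superimposing the vessel region in red on a tomographic image) might look like the following sketch; the uint8 grayscale input and the pure-red overlay color are assumptions for illustration.

```python
import numpy as np

def highlight_vessels(gray_slice, vessel_mask):
    """Sketch of Modification 1: overlay the extracted vessel region in
    red on a grayscale tomographic slice (values assumed 0-255 uint8)."""
    # replicate the grayscale values into three channels
    rgb = np.stack([gray_slice] * 3, axis=-1).astype(np.uint8)
    # paint the vessel pixels red
    rgb[vessel_mask] = [255, 0, 0]
    return rgb
```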

Abstract

The invention relates to a tomographic image processing method comprising: integrating acquired tomographic image data in the depth direction (S130); extracting a blood vessel region from the integrated image using binarization or a similar technique, thereby obtaining an integrated blood vessel image (S140); generating, from the acquired tomographic image data, cross-sectional images of multiple layers located at different depths (S160), the cross sections being perpendicular to the incident direction of the measurement wave; and generating, based on the correlation between the cross-sectional images of the individual layers and the integrated blood vessel image, blood vessel images of the corresponding layers (S170), while performing processing to remove a glare component caused by a blood vessel extracted in a layer above the target layer.
PCT/JP2011/063641 2010-06-15 2011-06-15 Appareil et procédé de traitement d'image tomographique, et appareil de diagnostic d'image tomographique à cohérence optique WO2011158849A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2010136320A JP2012002598A (ja) 2010-06-15 2010-06-15 断層画像処理装置及び方法、並びに光干渉断層画像診断装置
JP2010-136320 2010-06-15

Publications (1)

Publication Number Publication Date
WO2011158849A1 true WO2011158849A1 (fr) 2011-12-22

Family

ID=45348246

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2011/063641 WO2011158849A1 (fr) 2010-06-15 2011-06-15 Appareil et procédé de traitement d'image tomographique, et appareil de diagnostic d'image tomographique à cohérence optique

Country Status (2)

Country Link
JP (1) JP2012002598A (fr)
WO (1) WO2011158849A1 (fr)


Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004502483A (ja) * 2000-07-11 2004-01-29 カール−ツアイス−スチフツング 眼疾患を診断および監視する装置
JP2007325831A (ja) * 2006-06-09 2007-12-20 Topcon Corp 眼底観察装置、眼科画像処理装置及び眼科画像処理プログラム
JP2009297231A (ja) * 2008-06-12 2009-12-24 Olympus Medical Systems Corp 被検体情報算出装置及び被検体情報算出方法
JP2010060332A (ja) * 2008-09-01 2010-03-18 Olympus Corp 散乱体内部観測装置及び散乱体内部観測方法
JP2010110556A (ja) * 2008-11-10 2010-05-20 Canon Inc 画像処理装置、画象処理方法、プログラム、及びプログラム記憶媒体


Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2485345A (en) * 2010-11-05 2012-05-16 Queen Mary & Westfield College Optical coherence tomography scanning to identify a region of interest in a sample
JP2013064645A (ja) * 2011-09-16 2013-04-11 Fujifilm Corp 光干渉断層画像処理方法及びその装置
EP3150108A4 (fr) * 2014-05-27 2018-01-24 Koh Young Technology Inc. Dispositif oct amovible
US10986996B2 (en) 2014-05-27 2021-04-27 Koh Young Technology Inc. Removable Optical Coherence Tomography (OCT) device
US10209177B2 (en) 2015-03-31 2019-02-19 Sony Corporation Signal processing apparatus for eliminating object reflection noise in optical tomographic measurement

Also Published As

Publication number Publication date
JP2012002598A (ja) 2012-01-05

Similar Documents

Publication Publication Date Title
US20120101372A1 (en) Diagnosis support apparatus, diagnosis support method, lesioned part detection apparatus, and lesioned part detection method
JP2010068865A (ja) 画像診断装置
US10942022B2 (en) Manual calibration of imaging system
JPWO2016047773A1 (ja) 断層像形成装置および制御方法
JP5653087B2 (ja) 光断層画像化装置及びその作動方法
WO2011158849A1 (fr) Appareil et procédé de traitement d'image tomographique, et appareil de diagnostic d'image tomographique à cohérence optique
JP2010179042A (ja) 光構造観察装置及びその構造情報処理方法、光構造観察装置を備えた内視鏡システム
JP2010043994A (ja) 光プローブ及び3次元画像取得装置
JP2011078447A (ja) 光構造観察装置、その構造情報処理方法及び光構造観察装置を備えた内視鏡装置
JP2011062301A (ja) 光構造像観察装置、その構造情報処理方法及び光構造像観察装置を備えた内視鏡装置
JP2006026015A (ja) 光断層画像取得システム
JP2012013432A (ja) 断層画像処理装置及び方法、並びに光干渉断層画像診断装置
JP5779461B2 (ja) 光干渉断層画像処理装置及びその作動方法
JP5748281B2 (ja) 光干渉断層画像処理方法及びその装置
JP2012010776A (ja) 断層画像処理装置及び方法、並びに光干渉断層画像診断装置
JP5405839B2 (ja) 光立体構造像観察装置、その作動方法及び光立体構造像観察装置を備えた内視鏡システム
JP2011206373A (ja) 光画像診断装置及びその表示制御方法
WO2011158848A1 (fr) Dispositif de tomographie optique et procédé de tomographie optique
JP5752538B2 (ja) 光干渉断層画像処理方法及びその装置
US8379945B2 (en) Optical apparatus for acquiring structure information and its processing method of optical interference signal
JP5696178B2 (ja) 光断層画像化装置及びその作動方法
JP5812785B2 (ja) 光断層画像処理装置及び光断層画像処理装置の作動方法
EP2600137B1 (fr) Système et procédé d'imagerie tomographique optique
JP2010179043A (ja) 光構造観察装置及びその構造情報処理方法
JP5405842B2 (ja) 光構造解析装置及びその作動方法

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 11795753

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 11795753

Country of ref document: EP

Kind code of ref document: A1