WO2022009478A1 - Image processing device, endoscope system, operation method for image processing device, and program for image processing device - Google Patents

Image processing device, endoscope system, operation method for image processing device, and program for image processing device

Info

Publication number
WO2022009478A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
display
support
light
observation target
Prior art date
Application number
PCT/JP2021/010863
Other languages
English (en)
Japanese (ja)
Inventor
康太郎 檜谷
Original Assignee
富士フイルム株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 富士フイルム株式会社
Priority to JP2022534904A (JPWO2022009478A1)
Publication of WO2022009478A1

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 1/00: Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B 1/04: Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor combined with photographic or television appliances
    • A61B 1/045: Control thereof

Definitions

  • the present invention relates to an image processing device, an endoscope system, an operation method of the image processing device, and a program for the image processing device.
  • Diagnosis using an endoscope system equipped with a light source device, an endoscope, and a processor device is widely performed.
  • In such diagnosis, an image obtained by photographing an observation target with an endoscope by a technique called image-enhanced endoscopy (IEE, also called image-enhanced observation) is used (hereinafter referred to as an "endoscopic image").
  • From the endoscopic image, a plurality of pieces of support information for diagnosing the observation target, such as the surface structure of the observation target, biological information on the surface layer of the mucous membrane, or the possibility of a lesion, are obtained.
  • There are various known types of IEE, such as a method of digitally processing an endoscopic image obtained by imaging the observation target and using the result, or a method of using an endoscopic image captured while illuminating the observation target with a specific illumination light.
  • By IEE, it is possible to obtain biological information such as a region where blood vessels are densely packed or a region having low oxygen saturation in the observation target, and to emphasize and display these regions.
  • These displays provide support information for the doctor to diagnose the observation target.
  • In some cases, the stage of a disease is determined by IEE or the like from the extent of a possibly lesioned area in the observation target or from the degree of inflammation, and the obtained determination result is provided as support information for the doctor's diagnosis.
  • Such diagnosis support is called CAD (Computer-Aided Diagnosis).
  • For example, there is known an endoscope system that determines the severity or progression of a disease, such as the stage of ulcerative colitis, with high accuracy by using endoscopic images obtained by IEE (Patent Document 1).
  • Depending on the type of IEE, the support information that can be obtained well may differ, and therefore the effect or application obtained may also differ.
  • For example, different IEEs may be used for the purpose of acquiring oxygen saturation and for the purpose of acquiring a polyp region as information on the observation target. Therefore, by using a plurality of IEEs, it is possible to obtain a plurality of types of diagnostic support information on the observation target, which is useful in screening, diagnosis of lesions, and the like.
  • An object of the present invention is to provide an image processing device, an endoscope system, an operation method of the image processing device, and a program for the image processing device that allow detailed information on an observation target to be easily recognized.
  • The present invention is an image processing device including a processor.
  • The processor acquires a plurality of image signals obtained by imaging an observation target using an endoscope, generates a display image to be displayed on a display using a display image signal obtained based on at least one of the image signals, determines, in each of the plurality of image signals, a specific area in which the observation target is in a specific state, generates support images indicating the specific areas, and performs control so that, when the display image is displayed on the display, the plurality of support images are superimposed on the display image in a manner that allows them to be distinguished from each other.
  • The processor displays each of the plurality of support images in a color different from the others.
  • The processor displays each of the plurality of support images with figures having shapes different from the others.
  • The processor performs control so that the support image is superimposed on the display image each time a support image is generated.
  • The processor performs enhancement processing on the image signal to obtain a support image signal, and determines the specific area based on the support image signal.
  • The processor performs control so that a legend display showing the association between the support images and the enhancement processing is displayed on the display.
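The display control described in the points above, where a plurality of support images are superimposed on the display image in mutually distinguishable colors together with a legend display, can be sketched as follows. This is an illustrative sketch only; the class and function names, the 50/50 blend, and the mask representation are assumptions, not taken from the patent.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class SupportImage:
    label: str                    # e.g. "dense superficial vessels (1st illumination)"
    mask: List[List[bool]]        # specific-area mask, same size as the display image
    color: Tuple[int, int, int]   # distinguishing RGB color for this support image

def overlay_support_images(display, supports):
    """Blend each support image's specific area into a copy of the display image."""
    out = [row[:] for row in display]
    for s in supports:
        for y, row in enumerate(s.mask):
            for x, inside in enumerate(row):
                if inside:
                    r, g, b = out[y][x]
                    sr, sg, sb = s.color
                    # 50/50 blend keeps the underlying tissue visible
                    out[y][x] = ((r + sr) // 2, (g + sg) // 2, (b + sb) // 2)
    # legend display: association between each support image and its color
    legend = [(s.label, s.color) for s in supports]
    return out, legend
```

Each support image carries its own color, so the specific areas found from different image signals remain distinguishable on the shared display image.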
  • the present invention is an endoscope system, and includes the image processing apparatus of the present invention and a light source unit that emits illumination light to irradiate an observation target.
  • the display image signal is preferably obtained by imaging an observation target illuminated by white illumination light emitted by a light source unit.
  • The processor generates the support images from images captured of the observation target illuminated by each of a plurality of support image illumination lights, emitted by the light source unit, having spectral spectra different from each other.
  • the plurality of support image illumination lights include narrow band light having a preset wavelength band.
  • the light source unit repeatedly emits white illumination light and each of a plurality of support image illumination lights in a preset order.
  • the display control unit controls to display a legend display indicating the association between the support image and the illumination light for the support image on the display.
  • The present invention is also a method of operating an image processing device, comprising: an image acquisition step of acquiring a plurality of image signals obtained by imaging an observation target using an endoscope; a display image generation step of generating, using a display image signal obtained based on at least one of the image signals, a display image to be displayed on a display; a support image generation step of generating, based on each of the plurality of image signals, a support image showing a specific area in which the observation target is in a specific state; and a display control step of performing control so that, when the display image is displayed on the display, the plurality of support images are superimposed on the display image in a manner distinguishable from each other.
  • The present invention is also a program for an image processing device, installed in an image processing device that performs image processing on image signals obtained by imaging an observation target using an endoscope, the program realizing: an image acquisition function of acquiring a plurality of the image signals; a display image generation function of generating a display image to be displayed on a display; a support image generation function of generating a support image indicating a specific area in which the observation target is in a specific state based on each of the plurality of image signals; and a display control function for the display image.
  • the endoscope system 10 includes an endoscope 12, a light source device 14, a processor device 16, a display 18, and a keyboard 19.
  • the endoscope 12 photographs the observation target.
  • the light source device 14 emits illumination light to irradiate the observation target.
  • the processor device 16 controls the system of the endoscope system 10.
  • the display 18 is a display unit that displays an endoscopic image or the like.
  • the keyboard 19 is an input device for inputting settings to the processor device 16 and the like.
  • the endoscope system 10 has three modes as an observation mode: a normal mode, a special mode, and a support mode.
  • In the normal mode, a normal image having a natural color, obtained by irradiating the observation target with normal light and taking an image, is displayed on the display 18 as the display image.
  • In the special mode, a special image emphasizing a specific state of the observation target, obtained by illuminating the observation target with special light having a wavelength band or spectral spectrum different from that of normal light and taking an image, is displayed on the display 18 as the display image.
  • In the support mode, when the normal image obtained by irradiating the observation target with normal light and taking an image is displayed on the display 18 as the display image, support images are superimposed on the display image.
  • the support image is an image showing a specific area in which the observation target is in a specific state.
  • The display image in the normal mode may also be a special image that has undergone enhancement processing or the like.
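The three observation modes above can be summarized as a small enumeration. The names and one-line summaries are illustrative only:

```python
from enum import Enum

class ObservationMode(Enum):
    NORMAL = "normal light; natural-color normal image shown as the display image"
    SPECIAL = "special light; special image emphasizing a specific state is shown"
    SUPPORT = "normal-light display image with support images superimposed"
```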
  • That the observation target is in a specific state means that various information, such as biological information of the observation target obtained by image analysis of an endoscopic image capturing the observation target, satisfies preset conditions; whether the state of the observation target is the specific state is determined on this basis.
  • The biological information of the observation target is a numerical value or the like representing a whole or partial characteristic of the observation target, such as oxygen saturation, blood concentration, blood vessel density, or the presence of a lesion or a lesion candidate (a target of biological tissue examination); when such information satisfies a preset condition, the observation target is determined to be in the specific state.
  • A plurality of pieces of biological information obtained from the endoscopic image may be used to determine the state of the observation target, or the state of the observation target may be determined by an image recognition technique based on machine learning using the endoscopic image.
  • For example, when the density of the superficial blood vessels of the observation target is equal to or higher than a specific value, the observation target is determined to be in a dense state in which the density of the superficial blood vessels is high.
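The determination described above, in which a region counts as being in the specific (dense) state when the superficial blood vessel density is at or above a preset value, reduces to a simple threshold test. A minimal sketch, assuming a per-pixel density map as input (both the map and the threshold are hypothetical):

```python
def dense_vessel_mask(density_map, threshold):
    """Return a boolean mask marking pixels whose vessel density >= threshold."""
    return [[d >= threshold for d in row] for row in density_map]
```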
  • In this case, the support image is an image showing the region in the dense state, in which the superficial blood vessels of the observation target are at high density. Since the support image is superimposed on the display image showing the observation target and presented on the display, it can indicate the specific area in which the observation target is in the specific state.
  • The display image showing the observation target may be arbitrarily selected from a normal image or a special image. In the present embodiment, the display image in the support mode is a normal image.
  • the support image shows a specific area.
  • the support image is preferably generated based on the support image signal.
  • The support image signal includes a support image signal obtained by imaging using various special lights as the illumination light irradiating the observation target, and a support image signal obtained by performing various image processing, such as enhancement processing, on an image signal.
  • The support image signal obtained by image processing includes, for example, a color-enhanced image obtained by performing color difference expansion processing, which expands the color differences among a plurality of regions in the observation target, on an image signal obtained by imaging, and a contrast-enhanced image with enhanced contrast.
  • Various special lights will be described later.
  • The endoscope 12 has an insertion portion 12a to be inserted into a subject having an observation target, an operation portion 12b provided at the base end portion of the insertion portion 12a, a bending portion 12c provided on the distal end side of the insertion portion 12a, and a tip portion 12d.
  • As shown in FIG. 2, the operation unit 12b is provided with an angle knob 12e, a treatment tool insertion port 12f, a scope button 12g, and a zoom operation unit 13.
  • the treatment tool insertion port 12f is an entrance for inserting a treatment tool such as a biopsy forceps, a snare, or an electric knife.
  • the treatment tool inserted into the treatment tool insertion port 12f protrudes from the tip portion 12d.
  • Various operations can be assigned to the scope button 12g, and in the present embodiment, it is used for the operation of switching the observation mode. By operating the zoom operation unit 13, the observation target can be enlarged or reduced for shooting.
  • the light source device 14 includes a light source unit 20 including a light source that emits illumination light, and a light source control unit 22 that controls the operation of the light source unit 20.
  • the light source unit 20 emits illumination light that illuminates the observation target.
  • The illumination light includes light, such as excitation light, used to emit the illumination light.
  • The light source unit 20 includes, for example, a laser diode, an LED (Light Emitting Diode), a xenon lamp, or a halogen lamp as a light source, and emits at least white illumination light or excitation light used to emit white illumination light.
  • the white color includes so-called pseudo-white color, which is substantially equivalent to white color in the imaging of the observation target using the endoscope 12.
  • The light source unit 20 includes, as necessary, a phosphor that emits light when irradiated with the excitation light, and an optical filter that adjusts the wavelength band, spectral spectrum, light amount, or the like of the illumination light or the excitation light.
  • the light source unit 20 can emit illumination light composed of at least narrow band light (hereinafter referred to as narrow band light).
  • the light source unit 20 can emit a plurality of illumination lights having different spectral spectra from each other.
  • the plurality of illumination lights may include narrow band light.
  • The light source unit 20 can also emit light having a specific wavelength band or spectral spectrum necessary for capturing an image used for calculating biological information, such as the oxygen saturation of hemoglobin contained in the observation target.
  • The "narrow band" means a substantially single wavelength band in relation to the characteristics of the observation target and/or the spectral characteristics of the color filters of the image sensor 45.
  • For example, when the wavelength band is about ±20 nm or less (preferably about ±10 nm or less), the light is narrow-band light.
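The narrow-band criterion above (a band of about ±20 nm or less, preferably about ±10 nm or less, around the center wavelength) can be expressed as a small predicate; the function name and the strict flag are illustrative:

```python
def is_narrow_band(half_width_nm, strict=False):
    """half_width_nm: half-width of the wavelength band around the center wavelength."""
    return half_width_nm <= (10 if strict else 20)
```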
  • the light source unit 20 has four color LEDs of V-LED20a, B-LED20b, G-LED20c, and R-LED20d.
  • the V-LED 20a emits purple light VL having a center wavelength of 405 nm and a wavelength band of 380 to 420 nm.
  • the B-LED 20b emits blue light BL having a center wavelength of 460 nm and a wavelength band of 420 to 500 nm.
  • the G-LED 20c emits green light GL having a wavelength band of 480 to 600 nm.
  • the R-LED 20d emits red light RL having a center wavelength of 620 to 630 nm and a wavelength band of 600 to 650 nm.
  • The center wavelengths of the V-LED 20a and the B-LED 20b have a width of about ±20 nm, preferably about ±5 nm to about ±10 nm.
  • The purple light VL is short-wavelength light used in the special mode or the support mode for detecting superficial blood vessel congestion, intramucosal hemorrhage, and extramucosal hemorrhage, and preferably has a center wavelength or peak wavelength of 410 nm. Further, the purple light VL is preferably narrow-band light.
  • The light source control unit 22 controls the timing of turning on, turning off, or shielding each light source constituting the light source unit 20, the amount of emitted light, and the like. As a result, the light source unit 20 can emit a plurality of types of illumination light having different spectral spectra, each for a preset period and with a preset emission amount. In the present embodiment, the light source control unit 22 adjusts the spectral spectrum of each illumination light by inputting an independent control signal to each of the LEDs 20a to 20d, setting the lighting and extinguishing of each LED, the amount of light emitted at the time of lighting, the insertion and removal of optical filters, and the like. As a result, the light source unit 20 emits white illumination light, a plurality of types of illumination light having different spectral spectra, or illumination light composed of at least narrow-band light.
  • the light source unit 20 emits white illumination light under the control of the light source control unit 22.
  • the white illumination light is normal light.
  • For normal light, the light source control unit 22 controls the LEDs 20a to 20d so that they emit light with a light intensity ratio of Vc:Bc:Gc:Rc among the purple light VL, the blue light BL, the green light GL, and the red light RL.
  • The light intensity ratio Vc:Bc:Gc:Rc corresponds to the light intensity condition of the white illumination light.
  • the light source unit 20 emits a plurality of special lights having different spectral spectra from each other under the control of the light source control unit 22. This is because, by imaging with a plurality of special lights, a plurality of support image signals can be obtained, and a plurality of support images different from each other can be obtained.
  • The number of special lights may be n, where n is an integer of two or more.
  • the plurality of special lights are three types of the first illumination light, the second illumination light, and the third illumination light.
  • As the first illumination light, the light source control unit 22 controls the LEDs 20a to 20d so that they emit special light with a light intensity ratio of Vs1:Bs1:Gs1:Rs1 among the purple light VL, the blue light BL, the green light GL, and the red light RL.
  • The light intensity ratio Vs1:Bs1:Gs1:Rs1 corresponds to the light intensity condition of the first illumination light.
  • The first illumination light preferably emphasizes the superficial blood vessels. Therefore, as shown in FIG. 5, the first illumination light consists only of the purple light VL, a short-wavelength narrow-band light, with a light intensity ratio of 1:0:0:0.
  • As the second illumination light, the light source control unit 22 controls the LEDs 20a to 20d so that they emit special light with a light intensity ratio of Vs2:Bs2:Gs2:Rs2 among the purple light VL, the blue light BL, the green light GL, and the red light RL.
  • The light intensity ratio Vs2:Bs2:Gs2:Rs2 corresponds to the light intensity condition of the second illumination light.
  • the second illumination light preferably emphasizes structures such as surface blood vessels and polyps. Therefore, in the second illumination light, it is preferable that the light intensity of the purple light VL is higher than the light intensity of the blue light BL.
  • The ratio of the light intensity Vs2 of the purple light VL to the light intensity Bs2 of the blue light BL is set to 4:1, for example.
  • As the third illumination light, the light source control unit 22 controls the LEDs 20a to 20d so that they emit special light with a light intensity ratio of Vs3:Bs3:Gs3:Rs3 among the purple light VL, the blue light BL, the green light GL, and the red light RL.
  • The light intensity ratio Vs3:Bs3:Gs3:Rs3 corresponds to the light intensity condition of the third illumination light.
  • The third illumination light preferably emphasizes deep blood vessels. Therefore, for the third illumination light, the light intensity of the blue light BL is preferably higher than that of the purple light VL. For example, as shown in FIG. 7, the ratio of the light intensity Vs3 of the purple light VL to the light intensity Bs3 of the blue light BL is set to 1:3.
  • the three types of the first illumination light, the second illumination light, and the third illumination light of the present embodiment are illumination lights including narrow band light.
  • The light intensity ratio includes cases where the ratio of at least one semiconductor light source is 0 (zero), and therefore includes cases where one or more of the semiconductor light sources are not lit. For example, even when only one of the semiconductor light sources is turned on and the other three are not lit, as when the light intensity ratio among the purple light VL, the blue light BL, the green light GL, and the red light RL is 1:0:0:0, the light source is regarded as having a light intensity ratio.
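The light intensity conditions above can be modeled as a small value type in which a ratio of 0 simply means that semiconductor light source is not lit. The class is illustrative; the first illumination light uses the 1:0:0:0 ratio from the text, while the green and red entries of the second and third illumination lights are set to 0 here only as an assumption, since the text specifies just the violet-to-blue ratios (4:1 and 1:3).

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class LightCondition:
    v: float  # purple light VL (V-LED)
    b: float  # blue light BL (B-LED)
    g: float  # green light GL (G-LED)
    r: float  # red light RL (R-LED)

    def lit_leds(self):
        """LEDs whose ratio is nonzero; a 0 entry means that LED stays off."""
        return [name for name, x in zip("VBGR", (self.v, self.b, self.g, self.r)) if x > 0]

FIRST_ILLUMINATION = LightCondition(1, 0, 0, 0)   # purple narrow-band light only
SECOND_ILLUMINATION = LightCondition(4, 1, 0, 0)  # Vs2:Bs2 = 4:1 (G, R assumed 0)
THIRD_ILLUMINATION = LightCondition(1, 3, 0, 0)   # Vs3:Bs3 = 1:3 (G, R assumed 0)
```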
  • the plurality of illumination lights are repeatedly emitted in a preset order.
  • In the present embodiment, the normal light and the special lights, namely the first illumination light, the second illumination light, and the third illumination light, are repeatedly emitted in a preset order.
  • The light source control unit 22 emits the normal light CL continuously for 5 frames (5FL) and then emits the first illumination light 1S for 1 frame (1FL); it then emits the normal light CL continuously for 5 frames (5FL) again, followed by the second illumination light 2S for 1 frame (1FL); it then emits the normal light CL for 5 frames (5FL) once more, followed by the third illumination light 3S for 1 frame (1FL). This light emission order constitutes one cycle (1CY) of the illumination pattern, and this cycle is repeated.
  • By this pattern, a normal image signal serving as a normal image is obtained with the normal light, a first image signal is obtained with the first illumination light, a second image signal is obtained with the second illumination light, and a third image signal is obtained with the third illumination light. Since the first illumination light, the second illumination light, and the third illumination light are special lights, the first image signal, the second image signal, and the third image signal are each special images.
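The illumination pattern described above (five frames of normal light CL, one frame of 1S, five more frames of CL, one frame of 2S, five more frames of CL, one frame of 3S, then repeat) can be sketched as a generator; the labels and function name are illustrative:

```python
from itertools import islice

def illumination_pattern(normal_frames=5):
    """Yield the illumination label for each frame, cycling indefinitely."""
    while True:
        for special in ("1S", "2S", "3S"):
            for _ in range(normal_frames):
                yield "CL"   # normal light frame
            yield special    # one frame of a support-image illumination light

# one full cycle (1CY) is 3 * (5 + 1) = 18 frames
cycle = list(islice(illumination_pattern(), 18))
```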
  • the tip portion 12d of the endoscope 12 is provided with an illumination optical system 30a and a photographing optical system 30b (see FIG. 3).
  • the illumination optical system 30a has an illumination lens 42, and the illumination light is emitted toward the observation target through the illumination lens 42.
  • the photographing optical system 30b has an objective lens 43, a zoom lens 44, and an image sensor 45.
  • The image sensor 45 photographs the observation target using light returning from the observation target through the objective lens 43 and the zoom lens 44, such as reflected illumination light (including, in addition to reflected light, scattered light, fluorescence emitted by the observation target, and fluorescence caused by a drug administered to the observation target).
  • the zoom lens 44 moves by operating the zoom operation unit 13, and enlarges or reduces the observation target image.
  • The image sensor 45 has, for each pixel, one of a plurality of types of color filters.
  • the image sensor 45 is a color sensor having a primary color system color filter.
  • the image sensor 45 includes an R pixel having a red color filter (R filter), a G pixel having a green color filter (G filter), and a B pixel having a blue color filter (B filter).
  • As the image sensor 45, a CCD (Charge Coupled Device) sensor or a CMOS (Complementary Metal Oxide Semiconductor) sensor can be used.
  • Although the image sensor 45 of the present embodiment is a primary color system color sensor, a complementary color system color sensor can also be used.
  • A complementary color sensor has, for example, a cyan pixel provided with a cyan color filter, a magenta pixel provided with a magenta color filter, a yellow pixel provided with a yellow color filter, and a green pixel provided with a green color filter.
  • When a complementary color sensor is used, the images obtained from the pixels of the above colors can be converted into the same images as those obtained with a primary color sensor by performing complementary-to-primary color conversion.
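The complementary-to-primary conversion mentioned above can be illustrated under the idealized assumptions Cy = G + B, Mg = R + B, and Ye = R + G; a real sensor would use a calibrated conversion matrix instead, so this sketch is only a conceptual example:

```python
def complementary_to_primary(cy, mg, ye):
    """Recover R, G, B from ideal cyan, magenta, and yellow samples."""
    r = (mg + ye - cy) / 2
    g = (cy + ye - mg) / 2
    b = (cy + mg - ye) / 2
    return r, g, b
```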
  • A primary color system or complementary color system sensor may have one or more types of pixels with characteristics other than the above, such as W pixels (white pixels that receive light in almost all wavelength bands).
  • Although the image sensor 45 of the present embodiment is a color sensor, a monochrome sensor having no color filters may also be used.
  • The processor device 16 incorporates programs (not shown) related to the processing performed by the control unit 51, the image acquisition unit 52, the image processing unit 56, the display control unit 57, and the like, which are described later.
  • The functions of the control unit 51, the image acquisition unit 52, the image processing unit 56, and the display control unit 57 are realized by running these programs on the control unit 51, which is composed of the processor included in the processor device 16 functioning as an image processing device.
  • the control unit 51 comprehensively controls the endoscope system 10 such as synchronous control of the irradiation timing of the illumination light and the shooting timing.
  • The control unit 51 inputs settings to each part of the endoscope system 10, such as the light source control unit 22, the image sensor 45, or the image processing unit 56.
  • the image acquisition unit 52 acquires an image of an observation target captured using pixels of each color, that is, a RAW image, from the image sensor 45.
  • The RAW image is an image (endoscopic image) before demosaic processing is performed. As long as demosaic processing has not yet been performed, the RAW image also includes images obtained by applying arbitrary processing, such as noise reduction processing, to the image acquired from the image sensor 45.
  • the image acquisition unit 52 includes a DSP (Digital Signal Processor) 53, a noise reduction unit 54, and a conversion unit 55 in order to perform various processing on the acquired RAW image as needed.
  • the DSP 53 includes, for example, an offset processing unit, a defect correction processing unit, a demosaic processing unit, a linear matrix processing unit, a YC conversion processing unit, and the like (none of which are shown).
  • the DSP 53 performs various processing on the RAW image or the image generated by using the RAW image using these.
  • the offset processing unit performs offset processing on the RAW image.
  • the offset process is a process of reducing the dark current component from the RAW image and setting an accurate zero level.
  • the offset process may be referred to as a clamp process.
  • the defect correction processing unit performs defect correction processing on the RAW image.
  • The defect correction process is a process of correcting or generating the pixel value of a RAW pixel corresponding to a defective pixel of the image sensor 45 when the image sensor 45 includes a pixel having a defect (defective pixel) due to the manufacturing process or change over time.
  • the demosaic processing unit performs demosaic processing on the RAW image of each color corresponding to the color filter of each color.
  • the demosaic process is a process of generating pixel values that are missing due to the arrangement of color filters in a RAW image by interpolation.
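The demosaic process above generates, by interpolation, the color samples each pixel is missing because of the mosaic color filter layout. A minimal sketch for one missing sample, averaging the available 4-neighbors of the same color plane (the averaging scheme is an assumption; the patent does not specify the interpolation method):

```python
def interpolate_missing(plane, x, y):
    """Average the available 4-neighbors of plane[y][x]; None marks a missing sample."""
    neighbors = []
    for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        ny, nx = y + dy, x + dx
        if 0 <= ny < len(plane) and 0 <= nx < len(plane[0]):
            v = plane[ny][nx]
            if v is not None:
                neighbors.append(v)
    return sum(neighbors) / len(neighbors)
```

For the green plane of a Bayer-like layout, every missing position has measured neighbors on all four sides, so the average above is always defined there.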
  • the linear matrix processing unit performs linear matrix processing on the endoscopic image generated by assigning one or a plurality of RAW images to channels of each RGB color.
  • the linear matrix processing is a processing for enhancing the color reproducibility of an endoscopic image.
  • The YC conversion processing is a process of converting an endoscopic image generated by assigning one or a plurality of RAW images to the channels of each RGB color into an endoscopic image having a brightness channel Y, a color difference channel Cb, and a color difference channel Cr.
  • the noise reduction unit 54 performs noise reduction processing on an endoscope image having a brightness channel Y, a color difference channel Cb, and a color difference channel Cr by using, for example, a moving average method or a median filter method.
  • the conversion unit 55 reconverts the luminance channel Y, the color difference channel Cb, and the color difference channel Cr after the noise reduction processing into an endoscopic image having channels of each color of BGR.
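The YC conversion and its inverse map an RGB endoscopic image to the brightness channel Y and color difference channels Cb and Cr, and back to per-color channels. The patent does not give the conversion matrix; the sketch below assumes the common ITU-R BT.601 coefficients:

```python
def rgb_to_ycbcr(r, g, b):
    """BT.601 luma plus scaled color differences (an assumed choice of matrix)."""
    y = 0.299 * r + 0.587 * g + 0.114 * b
    cb = 0.564 * (b - y)   # 0.564 = 0.5 / (1 - 0.114)
    cr = 0.713 * (r - y)   # 0.713 = 0.5 / (1 - 0.299)
    return y, cb, cr

def ycbcr_to_rgb(y, cb, cr):
    """Inverse conversion back to RGB, analogous to the reconversion step."""
    r = y + cr / 0.713
    b = y + cb / 0.564
    g = (y - 0.299 * r - 0.114 * b) / 0.587
    return r, g, b
```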
  • the image processing unit 56 performs necessary image processing or calculation on the endoscopic image output by the image acquisition unit 52.
  • the image processing unit 56 includes a display image generation unit 61, an enhancement processing unit 62, a support image signal acquisition unit 63, a specific area determination unit 64, and a support image generation unit 65.
  • the enhancement processing unit 62 includes a color enhancement unit 66 and a contrast enhancement unit 67.
  • the display image generation unit 61 generates a display image by performing image processing necessary for displaying the endoscope image output by the image acquisition unit 52 on the display 18.
  • the enhancement processing unit 62 performs enhancement processing on the endoscopic image output by the image acquisition unit 52.
  • the support image signal acquisition unit 63 acquires the support image signal.
  • the support image signal includes an endoscopic image that has undergone enhancement processing and an endoscopic image that is captured by using special light.
  • the specific area determination unit 64 performs a process of determining a specific area for each of the support image signals.
  • the support image generation unit 65 generates a support image showing a specific area.
  • the display image generation unit 61 generates a display image to be displayed on the display using a display image signal obtained based on at least one image signal.
  • the display image generation unit 61 generates a display image using an image signal obtained by normal light as a display image signal in the normal mode and the support mode, and normally has a natural hue.
  • the image is displayed on the display 18 as a display image 68.
  • In the special mode, the image signal obtained with the special light is used as the display image signal to generate the display image; for example, a special image in which a specific state of the observation target is emphasized is displayed on the display 18.
  • the light source control unit 22 repeats a specific illumination pattern in units of frames (see FIG. 8).
  • A frame is a unit of imaging; one imaging operation and one image signal acquisition are performed per frame.
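The frame-by-frame illumination pattern can be modelled as a repeating sequence. The exact pattern below (normal light alternating with the three special lights) is an assumption inferred from the order of steps ST110-ST230; the patent's FIG. 8 defines the actual pattern.

```python
from itertools import cycle, islice

# One cycle (1 CY) of the assumed illumination pattern:
# normal light CL interleaved with the three special lights.
PATTERN = ["CL", "1S", "CL", "2S", "CL", "3S"]

def illumination_for_frames(n_frames):
    """Return the illumination light used in each of the next n_frames frames."""
    return list(islice(cycle(PATTERN), n_frames))

frames = illumination_for_frames(8)
```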
  • In the support mode, the display image is generated from the image signal obtained with the normal light CL and displayed on the display. Within the illumination pattern, the first illumination light 1S, the second illumination light 2S, and the third illumination light 3S, which are illumination lights other than the normal light CL, are not used to generate the display image; the image signals obtained with these special lights are instead used to generate support images. During those frames, the display image generated from the image signal obtained with the immediately preceding normal light continues to be displayed, so every displayed image is based on an image signal obtained with normal light.
  • the enhancement processing unit 62 performs enhancement processing on the endoscopic image output by the image acquisition unit 52. Enhancement here means making information on a specific portion obtainable by distinguishing it from other tissues or structures; for example, changing the color or brightness of a portion having a specific feature relative to other portions (for example, normal mucous membrane) is an enhancement process.
  • the endoscopic image processed by the enhancement processing unit 62 may be a normal image or a special image.
  • the enhanced endoscopic image is used as a support image signal for generating a support image.
  • the support image signal is an image signal for generating a support image.
  • the enhancement processing unit 62 includes a color enhancement unit 66 and a contrast enhancement unit 67.
  • The third image signal obtained with the third illumination light is processed by the color enhancement unit 66 to obtain a support image signal. The color enhancement unit 66 performs enhancement processing on the acquired endoscopic image so that, for example, the boundary between a normal region and an abnormal region in the observation target is clearly represented by hue and saturation.
  • the color enhancement unit 66 performs color information conversion processing on the acquired endoscopic image.
  • the color information conversion process is a process of transferring each of a plurality of ranges distributed in the color space of the acquired endoscopic image to the range of the conversion destination associated with the range before conversion.
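The range-to-range transfer described above can be sketched as a table of source ranges and destination mappings. The ranges, shifts, and gains below are invented placeholders; the real table would be tuned so that abnormal mucosa moves away from normal mucosa in the color space.

```python
# (hue range, saturation range) -> (hue shift, saturation gain); values invented.
RANGE_MAP = [
    (((0, 30), (0.2, 0.6)), (20, 1.5)),  # assumed "abnormal mucosa" range
    (((0, 30), (0.0, 0.2)), (0, 1.0)),   # assumed "normal mucosa" range: unchanged
]

def convert_color(hue, sat):
    """Move a (hue, saturation) pixel into the destination range mapped
    to the source range it falls in; pixels outside all ranges pass through."""
    for ((h_lo, h_hi), (s_lo, s_hi)), (h_shift, s_gain) in RANGE_MAP:
        if h_lo <= hue < h_hi and s_lo <= sat < s_hi:
            return (hue + h_shift, min(1.0, sat * s_gain))
    return (hue, sat)

# An "abnormal" pixel is pushed apart from a nearby "normal" one:
abnormal = convert_color(10, 0.5)
normal = convert_color(10, 0.1)
```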
  • In the endoscopic image processed in this way, the boundary between the normal region and the abnormal region is clear, so the abnormal region can be determined as a specific region more easily and accurately.
  • The contrast enhancement unit 67 performs enhancement processing on the acquired endoscopic image so that, for example, the blood vessels in the observation target are emphasized. Specifically, the contrast enhancement unit 67 obtains a density histogram of the acquired endoscopic image, that is, a graph with the pixel value (luminance value) on the horizontal axis and the frequency on the vertical axis, and performs gradation correction using a gradation correction table stored in advance in a memory (not shown) of the image processing unit 56. The gradation correction table holds a gradation correction curve, with input values on the horizontal axis and output values on the vertical axis, showing the correspondence between input and output values. Gradation correction based on this curve widens the dynamic range of the acquired endoscopic image: densities become lower in low-density portions and higher in high-density portions, so that, for example, the density difference between a blood vessel region and a region without blood vessels spreads and the contrast of the blood vessels improves. In the endoscopic image processed by the contrast enhancement unit 67, the improved vessel contrast makes it easier and more accurate to determine, for example, a region of high blood vessel density as a specific region.
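The gradation correction above amounts to applying a lookup table built from a tone curve that pushes dark values darker and bright values brighter. A minimal sketch, assuming a piecewise-linear S-curve as a stand-in for the stored gradation correction table:

```python
def build_s_curve_lut(pivot=128, gain=1.5):
    """Build a 256-entry lookup table: values below `pivot` are lowered,
    values above it raised, widening the dynamic range around the pivot."""
    lut = []
    for v in range(256):
        out = pivot + (v - pivot) * gain
        lut.append(max(0, min(255, round(out))))
    return lut

def apply_lut(pixels, lut):
    return [lut[p] for p in pixels]

lut = build_s_curve_lut()
vessel, background = 96, 160                 # close densities before correction
stretched = apply_lut([vessel, background], lut)
contrast_before = background - vessel        # 64
contrast_after = stretched[1] - stretched[0] # 96: vessel contrast improved
```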
  • the support image signal acquisition unit 63 acquires the support image signal for which the specific area determination unit 64 determines the specific area.
  • the support image signal includes an endoscopic image that has undergone enhancement processing and an endoscopic image that is captured by using special light.
  • The support image signals are the first image signal obtained with the first illumination light, the second image signal obtained with the second illumination light, and the third image signal, obtained with the third illumination light and then subjected to enhancement processing.
  • the support image signal acquisition unit 63 acquires these support image signals from the image acquisition unit 52 or the enhancement processing unit 62.
  • The specific area determination unit 64 determines, in each of the plurality of support image signals, a specific region in which the observation target is in a specific state. That is, for each support image signal it determines whether a specific region exists and, if so, where that region is.
  • the determination of a specific area includes the case of determining that there is no specific area.
  • The specific area determination unit 64 includes a determination processing unit for each of the plurality of support image signals: the first determination processing unit 71, the second determination processing unit 72, the third determination processing unit 73, and so on up to the n-th determination processing unit 74. The determination processing units differ in which specific state of the observation target defines the specific region, according to the type of support image signal each one handles.
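The per-signal-type dispatch can be sketched as a mapping from signal type to determination routine, mirroring the first to n-th determination processing units. The routines and the threshold value are placeholders for the real determination logic.

```python
IRREGULARITY_THRESHOLD = 0.5  # assumed preset threshold

def determine_first(signal):
    """Like unit 71: indices whose vessel irregularity exceeds the threshold."""
    return [i for i, v in enumerate(signal) if v > IRREGULARITY_THRESHOLD]

def determine_second(signal):
    """Like unit 72: indices with a high luminance value (structures/polyps)."""
    return [i for i, v in enumerate(signal) if v > 200]

DETERMINERS = {"first": determine_first, "second": determine_second}

def determine_specific_area(kind, signal):
    """Route a support image signal to the determiner for its type.
    An empty list ("no specific area") is also a valid determination."""
    return DETERMINERS[kind](signal)

areas = determine_specific_area("first", [0.1, 0.7, 0.9, 0.2])
```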
  • the first determination processing unit 71 performs determination processing of the first image signal.
  • the surface blood vessels are emphasized by the first illumination light.
  • The first determination processing unit 71 determines a region where the irregularity of the superficial blood vessels is higher than a preset threshold value, and sets that region as the first specific region.
  • A region of high irregularity of the superficial blood vessels emphasized by the first illumination light serves, for example, as an index for determining the degree of inflammation of ulcerative colitis when the observation target is the large intestine. The first specific region in the first image signal is therefore a non-remission region of ulcerative colitis.
  • the second determination processing unit 72 performs determination processing of the second image signal.
  • the structures such as surface blood vessels and polyps are emphasized by the second illumination light.
  • the second determination processing unit 72 determines a region having a high luminance value indicating a structure such as a polyp, and sets the region as the second specific region.
  • A region with a high luminance value emphasized by the second illumination light indicates, for example, the presence of a polyp or the like where the surface layer of the observation target is irregular, and thus serves as an index for determining a lesion such as cancer. The second specific region in the second image signal is therefore a region where a lesion such as cancer accompanied by a polyp or the like may exist.
  • the third determination processing unit 73 performs determination processing of the third image signal that has been enhanced.
  • the boundary between the normal region and the abnormal region in the observation target is clearly represented by the color and saturation by the enhancement processing by the third illumination light and the color enhancing unit 66.
  • the third determination processing unit 73 sets the region where the mucous membrane is abnormal as the third specific region.
  • Through the third illumination light and the enhancement processing, the third image signal serves as an index for determining a lesion, such as cancer, whose color differs from its surroundings. The third specific region in the enhanced third image signal is therefore a region where a lesion such as cancer accompanied by redness or the like may exist.
  • the support image generation unit 65 generates a support image indicating a specific area in the observation image determined by each of the plurality of support image signals by the specific area determination unit 64.
  • the support image is an image in which a specific area on the observation image can be recognized.
  • The support image is displayed with a size corresponding to the area of the specific region; the size of the support image therefore changes with the size of the specific region.
  • the support image is, for example, an image in the form of a figure surrounding a portion including a specific area in an observation image. As shown in FIG. 12, the support image 81 indicates that the first specific area is within the area surrounded by the square figure which is the support image 81.
  • the support image 82 indicates that the second specific area is within the area surrounded by the square figure which is the support image 82.
  • the diagonal lines in the support image 82 indicate a specific color.
  • the support image 83 indicates that the third specific area is within the area surrounded by the square figure which is the support image 83.
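Superimposing a figure surrounding a specific area, in a per-type color as in FIG. 15, can be sketched on a toy 2-D grid of color codes. The palette and grid representation are invented for illustration.

```python
COLORS = {"first": "R", "second": "G", "third": "B"}  # assumed palette

def superimpose(image, box, kind):
    """Draw the outline of box = (x0, y0, x1, y1) in the type's color,
    on a copy so the underlying display image is left untouched."""
    out = [row[:] for row in image]
    x0, y0, x1, y1 = box
    c = COLORS[kind]
    for x in range(x0, x1 + 1):   # top and bottom edges
        out[y0][x] = c
        out[y1][x] = c
    for y in range(y0, y1 + 1):   # left and right edges
        out[y][x0] = c
        out[y][x1] = c
    return out

image = [["." for _ in range(5)] for _ in range(5)]
overlaid = superimpose(image, (1, 1, 3, 3), "first")
```

Because the box coordinates come from the determined specific area, the drawn figure grows and shrinks with the area it marks, matching the sizing behaviour described above.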
  • the display control unit 57 controls to superimpose and display a plurality of support images on the display image in a manner that allows them to be distinguished from each other.
  • The plurality of support images are superimposed on the display image in a manner in which each support image can be distinguished and recognized; for example, each support image can be displayed in a different color. As shown in FIG. 15, the display control unit performs control to superimpose the support image 81 showing the first specific area, the support image 82 showing the second specific area, and the support image 83 showing the third specific area on the display image 68 in colors different from one another.
  • the processor device 16 or the endoscope system 10 makes it possible to grasp a plurality of different types of determination results at a glance while displaying an endoscope image in a natural color by normal light on a display.
  • In the present embodiment, the first specific region is a non-remission region of ulcerative colitis, the second specific region is a region where a lesion such as cancer accompanied by a polyp or the like may exist, and the third specific region is a region where a lesion such as cancer accompanied by redness or the like may exist.
  • The display control unit 57 may perform control to superimpose the support image 81 showing the first specific area, the support image 82 showing the second specific area, and the support image 83 showing the third specific area on the display image with figures having shapes different from one another. As shown in FIG. 16, the support image 81 showing the first specific area can be shown by a quadrangle, the support image 82 showing the second specific area by a hexagon, and the support image 83 showing the third specific area by a circle. In this case, the colors of the support images may be the same or different. Further, instead of showing each specific area with a single figure, each specific area may be shown by a different pattern, such as being studded with smaller figures.
  • the display control unit 57 may control the support image to be superimposed and displayed on the display image each time the support image generation unit 65 generates the support image.
  • When the specific area determination unit 64 determines a specific area, the support image generation unit 65 generates a support image indicating that specific area. For example, as shown in FIG. 17, in the present embodiment, when the first image signal, which is a support image signal, is obtained by imaging the observation target illuminated by the first illumination light 1S, the support image generation unit 65 generates the support image 81 showing the first specific region.
  • the support image 81 is acquired once in one cycle (1 CY) of the illumination light (see FIG. 8).
  • the acquired support image 81 is continuously displayed on the display image 68 until the next support image 81 is acquired.
  • Similarly, the support image 82 or the support image 83 is acquired once in one cycle (1 CY) of the illumination light and remains superimposed on the display image 68 until the next support image 82 or support image 83 is acquired.
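The persistence behaviour above (each support image stays superimposed until the next one of its type arrives, while the display image refreshes with every normal-light frame) can be sketched with a small cache keyed by support-image type. Class and method names are invented.

```python
class SupportOverlay:
    """Keeps the most recent support image of each type for compositing."""

    def __init__(self):
        self.latest = {}  # support image type -> most recent support image

    def on_support_image(self, kind, support_image):
        # A new support image of a type replaces the previous one of that type.
        self.latest[kind] = support_image

    def compose(self, display_image):
        """Return the display image plus all currently held support images."""
        return (display_image, dict(self.latest))

overlay = SupportOverlay()
overlay.on_support_image("first", "support-81@cycle1")
frame1 = overlay.compose("normal-image-1")
overlay.on_support_image("first", "support-81@cycle2")  # next cycle (1 CY)
frame2 = overlay.compose("normal-image-2")
```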
  • the displayed image is updated and displayed as soon as the normal image is acquired.
  • the display control unit 57 controls to superimpose the support image on the display image, so that the support image is displayed in real time.
  • Since the frame rate, that is, the period of the frames, is sufficiently high, the support image follows the observation target even if the observation target moves somewhat, and the display is almost real-time.
  • the display control unit 57 may control to display a legend display indicating the association between the support image and the enhancement process or the illumination light for the support image on the display.
  • The support image is generated based on a support image signal, which is either an image signal obtained by imaging the observation target using one of the various special lights as illumination, or an image signal that has undergone enhancement processing. The support image 81 showing the first specific region is obtained with the first illumination light, the support image 82 showing the second specific region with the second illumination light, and the support image 83 showing the third specific region from the enhanced third image signal. Accordingly, "illumination light 1", indicating the IEE by the first illumination light, in the same color as the support image 81; "illumination light 2", indicating the IEE by the second illumination light, in the same color as the support image 82; and "color enhancement", indicating the IEE by the enhancement processing, in the same color as the support image 83, can be displayed as the legend display 91 of the support images at the lower right of the display image.
  • The legend display 91 is also useful in endoscopy as a display indicating which IEE is being performed at that time.
  • The support image 82 and the support image 83 may be displayed in an overlapping manner. Further, as shown in FIG. 20, even when the areas of a plurality of specific regions coincide in the observation target, the support image 82 and the support image 83 are displayed in a manner that allows them to be distinguished.
  • For example, if there is a finding that a location where two specific regions overlap, namely a specific region determined from a structure such as a polyp and a specific region determined as an area of high superficial blood vessel density, is highly likely to be a cancerous lesion, then the overlap of these two specific regions can be recognized easily and the determination can be made with better accuracy. Thus, if there is a finding that the observation target is a lesion in a specific state when a plurality of specific areas overlap, the support image signals used to determine the specific areas can be chosen according to the purpose, and a region in the desired specific state in the observation target is shown by the support images, which is preferable.
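Detecting that two specific regions overlap can be sketched with a simple intersection test, assuming (hypothetically) that each specific region is represented as an axis-aligned box:

```python
def boxes_overlap(a, b):
    """True if two (x0, y0, x1, y1) boxes intersect."""
    ax0, ay0, ax1, ay1 = a
    bx0, by0, bx1, by1 = b
    return ax0 < bx1 and bx0 < ax1 and ay0 < by1 and by0 < ay1

polyp_region = (10, 10, 50, 50)   # structure-based specific region (invented)
vessel_region = (40, 40, 80, 80)  # vessel-density specific region (invented)
combined_finding = boxes_overlap(polyp_region, vessel_region)
```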
  • One type of specific region may also be determined using two or more image signals. Determining one type of specific region from two or more image signals is preferable because it increases the number of types of specific states of the observation target that the specific regions can indicate.
  • the support image showing the specific area may have text information.
  • the character information is information in which the determination result of a specific area or the like is displayed in characters.
  • the support image 81 has the text information 92 of "UC severity: mild".
  • the text information 92 is information indicating in text that the specific state of the observation target shown by the support image 81 is determined to be mild of ulcerative colitis.
  • The UC severity indicates the intensity of inflammation divided into three categories: severe, moderate, and mild.
  • the support image 82 has the character information 93 of "polyp: ⁇ 2 mm".
  • The character information 93 indicates in characters that the specific state of the observation target shown by the support image 82 is a polyp and that the diameter of the polyp is determined to be 2 mm. Further, the support image 83 has the character information 94 of "cancer suspicion rate: 30%". The character information 94 indicates in characters that the specific state of the observation target shown by the support image 83 is a suspicion of cancer and that the suspicion rate, that is, its probability, is determined to be 30%. Since the support images carry character information, more detailed support information can be displayed.
  • the illumination light is emitted in a preset order according to a predetermined illumination light pattern (see FIG. 8).
  • the illumination light is normal light, and a normal image is acquired (step ST110).
  • the normal image by normal light is displayed on the display as a display image (step ST120).
  • the illumination light becomes the first illumination light, and the first image signal is acquired (step ST130).
  • the first specific area is determined, the support image indicating the first specific area is superimposed on the display image, and the display image with the support image is displayed (step ST140).
  • the illumination light becomes normal light, and a normal image is acquired (step ST150).
  • the normal image by normal light is displayed on the display as a display image (step ST160).
  • the illumination light becomes the second illumination light, and the second image signal is acquired (step ST170).
  • the second specific area is determined based on the second image signal, the support image indicating the second specific area is superimposed on the display image, and the display image with the support image is displayed (step ST180).
  • the illumination light becomes normal light, and a normal image is acquired (step ST190).
  • the normal image by normal light is displayed on the display as a display image (step ST200).
  • the illumination light becomes the third illumination light, and the third image signal is acquired (step ST210).
  • Emphasis processing is performed on the third image signal (step ST220).
  • Based on the enhanced third image signal, the third specific area is determined, the support image indicating the third specific area is superimposed on the display image, and the display image with the support image is displayed (step ST230).
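The loop in steps ST110-ST230 can be summarized as one pass over the illumination pattern: normal-light frames refresh the display image, special-light frames produce support images instead. The capture/determine callables are placeholders for the real imaging and determination pipeline.

```python
PATTERN = ["CL", "1S", "CL", "2S", "CL", "3S"]  # assumed one-cycle pattern

def run_one_cycle(capture, determine):
    """capture(light) -> image signal; determine(light, signal) -> support image."""
    display, supports = None, []
    for light in PATTERN:
        signal = capture(light)
        if light == "CL":
            display = signal  # ST120 / ST160 / ST200: refresh the display image
        else:
            # ST140 / ST180 / ST230: determine and collect a support image
            supports.append(determine(light, signal))
    return display, supports

display, supports = run_one_cycle(
    capture=lambda light: f"signal-{light}",
    determine=lambda light, sig: f"support-from-{light}",
)
```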
  • the processor device 16 functions as an image processing device, but an image processing device including an image processing unit 56 may be provided separately from the processor device 16.
  • For example, the image processing unit 56 can be provided in a diagnosis support device 911 that acquires images captured by the endoscope 12 directly from the endoscope system 10 or indirectly from a PACS (Picture Archiving and Communication Systems) 910.
  • Further, the image processing unit 56 can be provided in a medical service support device 930 connected, via a network 926, to various inspection devices including the endoscope system 10, such as a first inspection device 921, a second inspection device 922, ..., and a K-th inspection device 923.
  • In the above embodiments, the endoscope 12 is a so-called flexible endoscope having a flexible insertion portion 12a; however, the present invention is also suitable when a capsule endoscope that the observation target swallows, or a rigid endoscope (laparoscope) used for surgery or the like, is used.
  • The above-described embodiments and modifications include a method of operating an image processing device including a processor, the method comprising: an image acquisition step of acquiring a plurality of image signals obtained by imaging an observation target using an endoscope; a display image generation step of generating a display image to be displayed on a display using a display image signal obtained based on at least one of the image signals; a support image generation step of determining, based on each of the plurality of image signals, a specific region in which the observation target is in a specific state and generating a support image indicating that region; and a display control step of performing control, when the display image is displayed on the display, to superimpose and display the plurality of support images on the display image in a manner that allows them to be distinguished from one another.
  • The above-described embodiments and modifications also include a program for an image processing device for realizing: an image acquisition function of acquiring a plurality of image signals obtained by imaging an observation target using an endoscope; a display image generation function of generating a display image to be displayed on a display using a display image signal obtained based on at least one of the image signals; a support image generation function of determining, based on each of the plurality of image signals, a specific region in which the observation target is in a specific state and generating a support image indicating that region; and a display control function of performing control, when the display image is displayed on the display, to superimpose and display the plurality of support images on the display image in a manner that allows them to be distinguished from one another.
  • The hardware structure of the processing units that execute various processes, such as the control unit 51, the image acquisition unit 52, the DSP 53, the noise reduction unit 54, the conversion unit 55, the image processing unit 56, and the display control unit 57 included in the processor device 16, is any of the following various processors: a CPU (Central Processing Unit), which is a general-purpose processor that executes software (programs) and functions as the various processing units; a programmable logic device (PLD), such as an FPGA (Field Programmable Gate Array), which is a processor whose circuit configuration can be changed after manufacture; a dedicated electric circuit, which is a processor having a circuit configuration designed exclusively for executing specific processes; and the like.
  • One processing unit may be composed of one of these various processors, or of a combination of two or more processors of the same or different types (for example, a plurality of FPGAs, or a combination of a CPU and an FPGA). Further, a plurality of processing units may be configured by one processor. As examples of configuring a plurality of processing units with one processor, first, as represented by computers such as clients and servers, one processor may be configured by a combination of one or more CPUs and software, and this processor may function as a plurality of processing units. Second, as represented by a System On Chip (SoC), a processor that realizes the functions of an entire system including a plurality of processing units with a single IC (Integrated Circuit) chip may be used.
  • In this way, the various processing units are configured using one or more of the above-mentioned various processors as a hardware structure. More specifically, the hardware structure of these various processors is an electric circuit (circuitry) combining circuit elements such as semiconductor elements.
  • The present invention can also be used in systems or devices that acquire medical images (including moving images) other than endoscopic images. For example, the present invention can be applied to an ultrasonic inspection device, an X-ray imaging device (including a CT (Computed Tomography) inspection device, a mammography device, and the like), an MRI (magnetic resonance imaging) device, and the like.


Abstract

The invention relates to an image processing device with which detailed information about an observed subject can be easily recognized, an endoscope system, an operation method for an image processing device, and a program for an image processing device. Disclosed is an image processing device (16) that performs control to: acquire a plurality of image signals using an endoscope; generate a display image to be displayed on a display using a display image signal obtained based on at least one of the image signals; determine, in each of the plurality of image signals, a specific region in which an observed subject is in a specific state; generate a support image indicating the specific region; and, when the display image is displayed on the display, display a plurality of support images superimposed on the display image such that the support images are distinguishable from one another.
PCT/JP2021/010863 2020-07-07 2021-03-17 Dispositif de traitement d'image, système d'endoscope, procédé de fonctionnement pour dispositif de traitement d'image et programme pour dispositif de traitement d'image WO2022009478A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2022534904A JPWO2022009478A1 (fr) 2020-07-07 2021-03-17

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2020-117098 2020-07-07
JP2020117098 2020-07-07

Publications (1)

Publication Number Publication Date
WO2022009478A1 true WO2022009478A1 (fr) 2022-01-13

Family

ID=79553244

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/010863 WO2022009478A1 (fr) 2020-07-07 2021-03-17 Dispositif de traitement d'image, système d'endoscope, procédé de fonctionnement pour dispositif de traitement d'image et programme pour dispositif de traitement d'image

Country Status (2)

Country Link
JP (1) JPWO2022009478A1 (fr)
WO (1) WO2022009478A1 (fr)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2020012872A1 (fr) * 2018-07-09 2020-01-16 富士フイルム株式会社 Dispositif de traitement d'image médicale, système de traitement d'image médicale, procédé de traitement d'image médicale et programme
WO2020075578A1 (fr) * 2018-10-12 2020-04-16 富士フイルム株式会社 Dispositif de traitement d'image médicale, système d'endoscope et procédé de traitement d'image médicale


Also Published As

Publication number Publication date
JPWO2022009478A1 (fr) 2022-01-13

Similar Documents

Publication Publication Date Title
CN110325100B (zh) 内窥镜系统及其操作方法
JP6785948B2 (ja) 医療用画像処理装置及び内視鏡システム並びに医療用画像処理装置の作動方法
JP7335399B2 (ja) 医用画像処理装置及び内視鏡システム並びに医用画像処理装置の作動方法
JP7190597B2 (ja) 内視鏡システム
JPWO2020036121A1 (ja) 内視鏡システム
JP2020065685A (ja) 内視鏡システム
US20230027950A1 (en) Medical image processing apparatus, endoscope system, method of operating medical image processing apparatus, and non-transitory computer readable medium
JP6785990B2 (ja) 医療画像処理装置、及び、内視鏡装置
US11627864B2 (en) Medical image processing apparatus, endoscope system, and method for emphasizing region of interest
US20230141302A1 (en) Image analysis processing apparatus, endoscope system, operation method of image analysis processing apparatus, and non-transitory computer readable medium
US20230237659A1 (en) Image processing apparatus, endoscope system, operation method of image processing apparatus, and non-transitory computer readable medium
US12020350B2 (en) Image processing apparatus
WO2022009478A1 (fr) Dispositif de traitement d'image, système d'endoscope, procédé de fonctionnement pour dispositif de traitement d'image et programme pour dispositif de traitement d'image
JP7214886B2 (ja) 画像処理装置及びその作動方法
WO2021006121A1 (fr) Dispositif de traitement d'image, système d'endoscope, et procédé de fonctionnement de dispositif de traitement d'image
WO2021210331A1 (fr) Dispositif de traitement d'image et son procédé de fonctionnement
WO2022059233A1 (fr) Dispositif de traitement d'image, système d'endoscope, procédé de fonctionnement pour dispositif de traitement d'image et programme pour dispositif de traitement d'image
WO2022210508A1 (fr) Dispositif processeur, dispositif de traitement d'image médicale, système de traitement d'image médicale, et système endoscopique

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 21837972; Country of ref document: EP; Kind code of ref document: A1)
ENP Entry into the national phase (Ref document number: 2022534904; Country of ref document: JP; Kind code of ref document: A)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 21837972; Country of ref document: EP; Kind code of ref document: A1)