WO2022009478A1 - Image processing device, endoscope system, operation method for image processing device, and program for image processing device - Google Patents

Image processing device, endoscope system, operation method for image processing device, and program for image processing device Download PDF

Info

Publication number
WO2022009478A1
WO2022009478A1 PCT/JP2021/010863
Authority
WO
WIPO (PCT)
Prior art keywords
image
display
support
light
observation target
Prior art date
Application number
PCT/JP2021/010863
Other languages
French (fr)
Japanese (ja)
Inventor
康太郎 檜谷
Original Assignee
FUJIFILM Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by FUJIFILM Corporation
Priority to JP2022534904A priority Critical patent/JPWO2022009478A1/ja
Publication of WO2022009478A1 publication Critical patent/WO2022009478A1/en

Links

Images

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/04Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor combined with photographic or television appliances
    • A61B1/045Control thereof

Definitions

  • the present invention relates to an image processing device, an endoscope system, an operation method of the image processing device, and a program for the image processing device.
  • diagnosis using an endoscope system equipped with a light source device, an endoscope, and a processor device is widely performed.
  • In diagnosis using an endoscope system, an image obtained by photographing the observation target with an endoscope by a method called image-enhanced endoscopy or image-enhanced observation (hereinafter referred to as "IEE") may be used.
  • By IEE, a plurality of pieces of support information for diagnosing the observation target, such as the surface structure of the observation target, biological information on the surface layer of the mucous membrane, or the possibility of a lesion, can be obtained.
  • There are various known types of IEE, such as a method of digitally processing an endoscopic image obtained by imaging the observation target, and a method of using an endoscopic image captured while illuminating the observation target with a specific illumination light.
  • Using IEE, it is possible to obtain biological information such as a region where blood vessels are densely packed or a region of low oxygen saturation in the observation target, and to emphasize and display these regions. These displays provide support information for the doctor to diagnose the observation target.
  • In some cases, the stage of a disease is determined by IEE from the extent of a possibly lesioned area in the observation target or from the degree of inflammation, and the obtained determination result is provided as support information for the doctor's diagnosis. Such diagnostic support is called CAD (Computer-Aided Diagnosis). For example, an endoscope system is known that determines the severity or progression of a disease, such as the stage of ulcerative colitis, with high accuracy by using endoscopic images obtained by IEE (Patent Document 1).
  • Depending on the type of IEE, the support information that can be obtained well may differ, and therefore the effect or application obtained may also differ. For example, different IEEs may be used for the purpose of acquiring the oxygen saturation of the observation target and for the purpose of acquiring a polyp region. Therefore, by using a plurality of IEEs, it is possible to obtain a plurality of types of diagnostic support information about the observation target, which is useful in screening, diagnosis of lesions, and the like.
  • An object of the present invention is to provide an image processing device, an endoscope system, an operation method of the image processing device, and a program for the image processing device with which detailed information about the observation target can be easily recognized.
  • the present invention is an image processing device and includes a processor.
  • The processor acquires a plurality of image signals obtained by imaging an observation target using an endoscope; generates a display image to be displayed on a display, using a display image signal obtained based on at least one of the image signals; determines, in each of the plurality of image signals, a specific area in which the observation target is in a specific state; generates a support image indicating each specific area; and, when the display image is displayed on the display, performs control so that the plurality of support images are superimposed on the display image in a manner distinguishable from one another.
  • the processor displays each of the plurality of support images in different colors from each other.
  • the processor displays each of the plurality of support images with figures having different shapes from each other.
  • the processor controls to superimpose and display the support image on the display image each time the support image is generated.
  • the processor enhances the image signal to obtain a support image signal and determines a specific area based on the support image signal.
  • the processor controls to display a legend display showing the association between the support image and the enhancement process on the display.
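Taken together, the behaviors above (distinct colors, superimposition, and a legend) can be sketched as follows. This is a minimal Python illustration; the mask names, RGB colors, and blend factor are illustrative assumptions, not anything specified by the patent.

```python
import numpy as np

# Hypothetical sketch of the display control described above: each support
# image (a binary mask) is blended into the display image with its own
# legend color so the support images remain distinguishable. The mask
# names, RGB colors, and blend factor are illustrative assumptions.
SUPPORT_COLORS = {
    "dense_vessels": (255, 0, 0),
    "low_oxygen": (0, 255, 0),
    "polyp_region": (0, 0, 255),
}

def overlay_support_images(display_rgb, masks, alpha=0.5):
    """Superimpose each binary mask on the display image in its legend color."""
    out = display_rgb.astype(np.float32)
    for name, mask in masks.items():
        color = np.array(SUPPORT_COLORS[name], dtype=np.float32)
        m = np.asarray(mask, dtype=bool)
        out[m] = (1 - alpha) * out[m] + alpha * color
    return out.astype(np.uint8)
```

The SUPPORT_COLORS table doubles as the legend: displaying its name-to-color mapping next to the image corresponds to the legend display mentioned above.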
  • the present invention is an endoscope system, and includes the image processing apparatus of the present invention and a light source unit that emits illumination light to irradiate an observation target.
  • the display image signal is preferably obtained by imaging an observation target illuminated by white illumination light emitted by a light source unit.
  • the processor generates a support image by capturing an observation target illuminated by each of a plurality of support image illumination lights having different spectral spectra from each other emitted by the light source unit.
  • the plurality of support image illumination lights include narrow band light having a preset wavelength band.
  • the light source unit repeatedly emits white illumination light and each of a plurality of support image illumination lights in a preset order.
  • the display control unit controls to display a legend display indicating the association between the support image and the illumination light for the support image on the display.
  • Further, the present invention is a method of operating an image processing device, comprising: an image acquisition step of acquiring a plurality of image signals obtained by imaging an observation target using an endoscope; a display image generation step of generating a display image to be displayed on a display, using a display image signal obtained based on at least one of the image signals; a support image generation step of generating, based on each of the plurality of image signals, a support image showing a specific area in which the observation target is in a specific state; and a display control step of performing control so that, when the display image is displayed on the display, the plurality of support images are superimposed on the display image in a manner distinguishable from one another.
  • Further, the present invention is a program for an image processing device, installed in an image processing device that performs image processing on image signals obtained by imaging an observation target using an endoscope. The program causes a computer to realize: a display image generation function that generates a display image to be displayed on the display, using a display image signal obtained based on at least one of the image signals; a support image generation function that generates a support image indicating a specific area in which the observation target is in a specific state, based on each of the plurality of image signals; and a display control function that performs control so that, when the display image is displayed, the plurality of support images are superimposed on the display image in a manner distinguishable from one another.
  • the endoscope system 10 includes an endoscope 12, a light source device 14, a processor device 16, a display 18, and a keyboard 19.
  • the endoscope 12 photographs the observation target.
  • the light source device 14 emits illumination light to irradiate the observation target.
  • the processor device 16 controls the system of the endoscope system 10.
  • the display 18 is a display unit that displays an endoscopic image or the like.
  • the keyboard 19 is an input device for inputting settings to the processor device 16 and the like.
  • the endoscope system 10 has three modes as an observation mode: a normal mode, a special mode, and a support mode.
  • In the normal mode, a normal image having natural colors is displayed on the display 18 as the display image by irradiating the observation target with normal light and taking an image.
  • In the special mode, a special image emphasizing a specific state of the observation target is displayed on the display 18 as the display image by illuminating the observation target with special light having a wavelength band or spectral spectrum different from that of normal light and taking an image.
  • In the support mode, when the normal image obtained by irradiating the observation target with normal light and taking an image is displayed on the display 18 as the display image, the support image is superimposed on the display image.
  • the support image is an image showing a specific area in which the observation target is in a specific state.
  • The display image in the normal mode may also be a special image that has undergone enhancement processing or the like, as long as it has a natural hue.
  • That the observation target is in a specific state means that various information, such as biological information of the observation target obtained by image analysis of an endoscopic image capturing the observation target, satisfies preset conditions; whether the state of the observation target is specific is determined on this basis.
  • The biological information of the observation target is a numerical value or the like representing whole or partial characteristics of the observation target, for example, oxygen saturation, blood concentration, blood vessel density, or the presence of a lesion or lesion candidate (a subject for biopsy). When such information satisfies a preset condition, the observation target is in a specific state.
  • A plurality of pieces of biological information obtained from the endoscopic image may be used to determine the state of the observation target, or the state of the observation target may be determined by image recognition using machine learning on the endoscopic image.
  • For example, when the density of the superficial blood vessels of the observation target is equal to or higher than a specific value, it is determined that the observation target is in a dense state in which the density of the superficial blood vessels is high.
  • In this case, the support image is an image showing the region in the dense state, in which the superficial blood vessels of the observation target are at high density. Since the support image is superimposed on the display image showing the observation target and displayed on the display, it can indicate a specific area in which the observation target is in a specific state.
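The dense-state determination described above amounts to thresholding a vessel-density map. A minimal sketch, where the threshold value and the density-map format are assumptions:

```python
import numpy as np

# Minimal sketch of the specific-area determination described above: a
# pixel-wise superficial-vessel-density map is thresholded to mark the
# "dense" state. The threshold value and map format are assumptions.
def dense_vessel_area(density_map, threshold=0.6):
    """Return a binary support mask where vessel density is at or above
    the specific value (threshold)."""
    return np.asarray(density_map) >= threshold
```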
  • the display image displaying the observation target may be arbitrarily selected from a normal light image or a special image. In the present embodiment, the display image in the support mode is a normal image.
  • the support image shows a specific area.
  • the support image is preferably generated based on the support image signal.
  • The support image signal includes a support image signal obtained by imaging using various special lights as the illumination light irradiating the observation target, and a support image signal obtained by performing various image processing, such as enhancement processing, on an image signal. The support image signal obtained by image processing includes, for example, a color-enhanced image obtained by performing color-difference expansion processing, in which the color difference between a plurality of regions in the observation target is expanded, on an image signal obtained by imaging, and a contrast-enhanced image with enhanced contrast.
  • Various special lights will be described later.
  • The endoscope 12 has an insertion portion 12a to be inserted into a subject containing the observation target, an operation portion 12b provided at the base end of the insertion portion 12a, and a bending portion 12c and a tip portion 12d provided on the distal end side of the insertion portion 12a. The bending portion 12c bends by operating the angle knob 12e (see FIG. 2). The operation portion 12b is provided with the angle knob 12e, a treatment tool insertion port 12f, a scope button 12g, and a zoom operation unit 13.
  • the treatment tool insertion port 12f is an entrance for inserting a treatment tool such as a biopsy forceps, a snare, or an electric knife.
  • the treatment tool inserted into the treatment tool insertion port 12f protrudes from the tip portion 12d.
  • Various operations can be assigned to the scope button 12g, and in the present embodiment, it is used for the operation of switching the observation mode. By operating the zoom operation unit 13, the observation target can be enlarged or reduced for shooting.
  • the light source device 14 includes a light source unit 20 including a light source that emits illumination light, and a light source control unit 22 that controls the operation of the light source unit 20.
  • the light source unit 20 emits illumination light that illuminates the observation target.
  • Here, the illumination light includes light, such as excitation light, used to produce the illumination light.
  • The light source unit 20 includes, for example, a laser diode, an LED (Light Emitting Diode), a xenon lamp, or a halogen lamp as a light source, and emits at least white illumination light or excitation light used to produce white illumination light.
  • the white color includes so-called pseudo-white color, which is substantially equivalent to white color in the imaging of the observation target using the endoscope 12.
  • the light source unit 20 includes, if necessary, a phosphor that emits light when irradiated with excitation light, an optical filter that adjusts the wavelength band, spectral spectrum, light amount, etc. of the illumination light or excitation light.
  • the light source unit 20 can emit illumination light composed of at least narrow band light (hereinafter referred to as narrow band light).
  • the light source unit 20 can emit a plurality of illumination lights having different spectral spectra from each other.
  • the plurality of illumination lights may include narrow band light.
  • The light source unit 20 can also emit, for example, light having a specific wavelength band or spectral spectrum necessary for capturing an image used to calculate biological information, such as the oxygen saturation of hemoglobin contained in the observation target.
  • the “narrow band” means a substantially single wavelength band in relation to the characteristics of the observation target and / or the spectral characteristics of the color filter of the image sensor 45.
  • For example, when the wavelength band is about ±20 nm or less (preferably about ±10 nm or less), the light is narrow-band light.
  • the light source unit 20 has four color LEDs of V-LED20a, B-LED20b, G-LED20c, and R-LED20d.
  • the V-LED 20a emits purple light VL having a center wavelength of 405 nm and a wavelength band of 380 to 420 nm.
  • the B-LED 20b emits blue light BL having a center wavelength of 460 nm and a wavelength band of 420 to 500 nm.
  • the G-LED 20c emits green light GL having a wavelength band of 480 to 600 nm.
  • the R-LED 20d emits red light RL having a center wavelength of 620 to 630 nm and a wavelength band of 600 to 650 nm.
  • the center wavelengths of the V-LED 20a and the B-LED 20b have a width of about ⁇ 20 nm, preferably about ⁇ 5 nm to about ⁇ 10 nm.
  • The purple light VL is short-wavelength light used for detecting superficial blood vessel congestion, intramucosal hemorrhage, and extramucosal hemorrhage in the special mode or the support mode, and preferably has a center wavelength or peak wavelength of 410 nm. Further, the purple light VL is preferably narrow-band light.
  • The light source control unit 22 controls the timing of turning on, turning off, or shielding each light source constituting the light source unit 20, the amount of emitted light, and the like. As a result, the light source unit 20 can emit a plurality of types of illumination light having different spectral spectra, each for a preset period and with a preset emission amount. In the present embodiment, the light source control unit 22 adjusts the spectral spectrum of each illumination light by inputting independent control signals that set, for each of the LEDs 20a to 20d, lighting and extinguishing, the amount of light emitted when lit, insertion and removal of the optical filter, and the like. As a result, the light source unit 20 emits white illumination light, a plurality of types of illumination light having different spectral spectra, or illumination light composed of at least narrow-band light.
  • the light source unit 20 emits white illumination light under the control of the light source control unit 22.
  • the white illumination light is normal light.
  • The light source control unit 22 controls the LEDs 20a to 20d so that, as normal light, they emit light with a light intensity ratio of Vc:Bc:Gc:Rc among the purple light VL, blue light BL, green light GL, and red light RL. The light intensity ratio Vc:Bc:Gc:Rc corresponds to the light intensity condition of the white illumination light.
  • the light source unit 20 emits a plurality of special lights having different spectral spectra from each other under the control of the light source control unit 22. This is because, by imaging with a plurality of special lights, a plurality of support image signals can be obtained, and a plurality of support images different from each other can be obtained.
  • The plurality of special lights may be n types, where n is an integer of two or more.
  • the plurality of special lights are three types of the first illumination light, the second illumination light, and the third illumination light.
  • The light source control unit 22 controls the LEDs 20a to 20d so that, as the first illumination light, they emit special light with a light intensity ratio of Vs1:Bs1:Gs1:Rs1 among the purple light VL, blue light BL, green light GL, and red light RL. The light intensity ratio Vs1:Bs1:Gs1:Rs1 corresponds to the light intensity condition of the first illumination light.
  • The first illumination light preferably emphasizes superficial blood vessels. Therefore, as shown in FIG. 5, the first illumination light consists only of the purple light VL, a short-wavelength narrow-band light, with a light intensity ratio of 1:0:0:0.
  • The light source control unit 22 controls the LEDs 20a to 20d so that, as the second illumination light, they emit special light with a light intensity ratio of Vs2:Bs2:Gs2:Rs2 among the purple light VL, blue light BL, green light GL, and red light RL. The light intensity ratio Vs2:Bs2:Gs2:Rs2 corresponds to the light intensity condition of the second illumination light.
  • the second illumination light preferably emphasizes structures such as surface blood vessels and polyps. Therefore, in the second illumination light, it is preferable that the light intensity of the purple light VL is higher than the light intensity of the blue light BL.
  • For example, the ratio of the light intensity Vs2 of the purple light VL to the light intensity Bs2 of the blue light BL is set to 4:1.
  • The light source control unit 22 controls the LEDs 20a to 20d so that, as the third illumination light, they emit special light with a light intensity ratio of Vs3:Bs3:Gs3:Rs3 among the purple light VL, blue light BL, green light GL, and red light RL. The light intensity ratio Vs3:Bs3:Gs3:Rs3 corresponds to the light intensity condition of the third illumination light.
  • the third illumination light preferably emphasizes deep blood vessels. Therefore, it is preferable that the light intensity of the blue light BL is higher than the light intensity of the purple light VL for the third illumination light. For example, as shown in FIG. 7, the ratio of the light intensity Vs3 of the purple light VL to the light intensity Bs3 of the blue light BL is set to “1: 3”.
  • the three types of the first illumination light, the second illumination light, and the third illumination light of the present embodiment are illumination lights including narrow band light.
  • The light intensity ratio includes cases where the ratio of at least one semiconductor light source is 0 (zero); that is, cases where one or more of the semiconductor light sources are not lit are included. For example, even when only one semiconductor light source is lit and the other three are not, as when the light intensity ratio among the purple light VL, blue light BL, green light GL, and red light RL is 1:0:0:0, the light is regarded as having a light intensity ratio.
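The light intensity ratios described above, including ratios with zero entries for unlit sources, can be represented simply as tuples. Only the stated relations come from the text; the remaining entries are illustrative assumptions:

```python
# Sketch of the light intensity ratios described above, expressed as
# (V, B, G, R) tuples. Only the stated relations (1:0:0:0 for the first
# light, V:B = 4:1 for the second, V:B = 1:3 for the third) come from the
# text; the remaining entries are illustrative assumptions.
LIGHT_RATIOS = {
    "first": (1, 0, 0, 0),
    "second": (4, 1, 0, 0),
    "third": (1, 3, 0, 0),
}

def is_lit(ratio):
    """A source whose ratio entry is 0 is simply unlit; the tuple still
    counts as a light intensity ratio."""
    return [r > 0 for r in ratio]
```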
  • the plurality of illumination lights are repeatedly emitted in a preset order.
  • In the present embodiment, the normal light and the special lights, namely the first illumination light, the second illumination light, and the third illumination light, are repeatedly emitted in a preset order.
  • Specifically, as shown in FIG. 8, the light source control unit 22 uses an illumination pattern in which the normal light CL is emitted continuously for 5 frames (5FL) and then the first illumination light 1S is emitted for 1 frame (1FL); the normal light CL is again emitted for 5 frames (5FL) and then the second illumination light 2S is emitted for 1 frame (1FL); and the normal light CL is once more emitted for 5 frames (5FL) and then the third illumination light 3S is emitted for 1 frame (1FL). This light emission order constitutes one cycle (1CY), and the cycle is repeated.
  • By imaging with the normal light, a normal image signal serving as a normal image is obtained; the first image signal is obtained with the first illumination light, the second image signal with the second illumination light, and the third image signal with the third illumination light. Since the first, second, and third illumination lights are special lights, the first, second, and third image signals are each special images.
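The illumination pattern just described, five normal-light frames followed by one special-light frame, cycling through the three special lights, can be sketched as a frame sequence (labels CL, 1S, 2S, 3S follow the text):

```python
# Sketch of the repeating illumination pattern described above: five frames
# of normal light (CL) followed by one frame of each special light in turn
# (1S, 2S, 3S), forming one cycle that is then repeated.
def illumination_pattern(cycles=1, normal_frames=5):
    pattern = []
    for _ in range(cycles):
        for special in ("1S", "2S", "3S"):
            pattern += ["CL"] * normal_frames + [special]
    return pattern
```

One cycle is 18 frames: frames 0-4 are CL, frame 5 is 1S, frames 6-10 are CL, frame 11 is 2S, and so on.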
  • the tip portion 12d of the endoscope 12 is provided with an illumination optical system 30a and a photographing optical system 30b (see FIG. 3).
  • the illumination optical system 30a has an illumination lens 42, and the illumination light is emitted toward the observation target through the illumination lens 42.
  • the photographing optical system 30b has an objective lens 43, a zoom lens 44, and an image sensor 45.
  • The image sensor 45 photographs the observation target using light returning from the observation target via the objective lens 43 and the zoom lens 44, such as reflected light of the illumination light (including, in addition to reflected light, scattered light, fluorescence emitted by the observation target, or fluorescence caused by a drug administered to the observation target).
  • the zoom lens 44 moves by operating the zoom operation unit 13, and enlarges or reduces the observation target image.
  • the image sensor 45 has a color filter of one of a plurality of color filters for each pixel.
  • the image sensor 45 is a color sensor having a primary color system color filter.
  • the image sensor 45 includes an R pixel having a red color filter (R filter), a G pixel having a green color filter (G filter), and a B pixel having a blue color filter (B filter).
  • a CCD (Charge Coupled Device) sensor or a CMOS (Complementary Metal Oxide Semiconductor) sensor can be used.
  • Although the image sensor 45 of the present embodiment is a primary color sensor, a complementary color sensor can also be used.
  • A complementary color sensor has, for example, cyan pixels provided with cyan color filters, magenta pixels provided with magenta color filters, yellow pixels provided with yellow color filters, and green pixels provided with green color filters.
  • When a complementary color sensor is used, the images obtained from the pixels of the above colors can be converted into the same images as those obtained with a primary color sensor by performing complementary-to-primary color conversion.
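The complementary-to-primary conversion can be illustrated under the idealized assumption that Cy = G + B, Mg = R + B, and Ye = R + G; real sensors use calibrated conversion matrices, so the coefficients below are only a sketch:

```python
# Sketch of complementary-to-primary color conversion under the idealized
# assumption Cy = G + B, Mg = R + B, Ye = R + G. The green pixel of a
# complementary color sensor is ignored in this simplified sketch; real
# sensors use calibrated matrices, so these coefficients are illustrative.
def cmyg_to_rgb(cy, mg, ye):
    r = (mg + ye - cy) / 2
    g = (ye + cy - mg) / 2
    b = (cy + mg - ye) / 2
    return r, g, b
```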
  • The primary color or complementary color sensor may also have one or more types of pixels with characteristics other than the above, such as W pixels (white pixels that receive light in almost all wavelength bands).
  • the image sensor 45 of the present embodiment is a color sensor, a monochrome sensor having no color filter may be used.
  • The processor device 16 incorporates a program (not shown) related to processing performed by the control unit 51, the image acquisition unit 52, the image processing unit 56, the display control unit 57, and the like, described later. By running the program on the control unit 51, which is composed of a processor included in the processor device 16 functioning as an image processing device, the functions of the control unit 51, the image acquisition unit 52, the image processing unit 56, and the display control unit 57 are realized.
  • the control unit 51 comprehensively controls the endoscope system 10 such as synchronous control of the irradiation timing of the illumination light and the shooting timing.
  • The control unit 51 also inputs settings into each part of the endoscope system 10, such as the light source control unit 22, the image sensor 45, or the image processing unit 56.
  • the image acquisition unit 52 acquires an image of an observation target captured using pixels of each color, that is, a RAW image, from the image sensor 45.
  • the RAW image is an image (endoscopic image) before the demosaic processing is performed. If the image is an image before the demosaic processing is performed, the RAW image also includes an image obtained by performing arbitrary processing such as noise reduction processing on the image acquired from the image sensor 45.
  • the image acquisition unit 52 includes a DSP (Digital Signal Processor) 53, a noise reduction unit 54, and a conversion unit 55 in order to perform various processing on the acquired RAW image as needed.
  • the DSP 53 includes, for example, an offset processing unit, a defect correction processing unit, a demosaic processing unit, a linear matrix processing unit, a YC conversion processing unit, and the like (none of which are shown).
  • the DSP 53 performs various processing on the RAW image or the image generated by using the RAW image using these.
  • the offset processing unit performs offset processing on the RAW image.
  • the offset process is a process of reducing the dark current component from the RAW image and setting an accurate zero level.
  • the offset process may be referred to as a clamp process.
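The offset (clamp) process described above can be sketched as subtracting the dark level and clamping negative results to zero. The dark level here is an illustrative constant; real sensors estimate it from optically black pixels:

```python
import numpy as np

# Minimal sketch of the offset (clamp) process: subtract the dark-current
# level from the RAW signal and clamp negative results to zero. The dark
# level is an illustrative constant; real sensors estimate it from
# optically black pixels.
def offset_process(raw, dark_level=64):
    return np.clip(raw.astype(np.int32) - dark_level, 0, None)
```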
  • the defect correction processing unit performs defect correction processing on the RAW image.
  • The defect correction process is a process of correcting or generating the pixel value of a RAW image pixel corresponding to a defective pixel of the image sensor 45, when the image sensor 45 includes a pixel having a defect due to the manufacturing process or change over time (a defective pixel).
  • the demosaic processing unit performs demosaic processing on the RAW image of each color corresponding to the color filter of each color.
  • the demosaic process is a process of generating pixel values that are missing due to the arrangement of color filters in a RAW image by interpolation.
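The demosaic process can be illustrated with a deliberately simple scheme: for each color channel, missing values are filled by averaging that channel's samples within a 3x3 neighborhood. An RGGB Bayer layout is assumed here; production demosaicing uses more sophisticated interpolation:

```python
import numpy as np

# Deliberately simple demosaic sketch, assuming an RGGB Bayer layout: for
# each color channel, missing pixel values are filled by averaging that
# channel's samples within a 3x3 neighborhood. Production demosaicing uses
# more sophisticated (e.g. gradient-aware) interpolation.
def demosaic_rggb(raw):
    h, w = raw.shape
    masks = np.zeros((3, h, w), dtype=bool)
    masks[0, 0::2, 0::2] = True  # R samples
    masks[1, 0::2, 1::2] = True  # G samples on R rows
    masks[1, 1::2, 0::2] = True  # G samples on B rows
    masks[2, 1::2, 1::2] = True  # B samples
    padded = np.pad(raw.astype(np.float64), 1)
    out = np.zeros((h, w, 3))
    for c in range(3):
        pm = np.pad(masks[c], 1)
        acc = np.zeros((h, w))
        cnt = np.zeros((h, w))
        for dy in (-1, 0, 1):
            for dx in (-1, 0, 1):
                acc += (padded * pm)[1 + dy:h + 1 + dy, 1 + dx:w + 1 + dx]
                cnt += pm[1 + dy:h + 1 + dy, 1 + dx:w + 1 + dx]
        out[..., c] = acc / np.maximum(cnt, 1)
    return out
```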
  • the linear matrix processing unit performs linear matrix processing on the endoscopic image generated by assigning one or a plurality of RAW images to channels of each RGB color.
  • the linear matrix processing is a processing for enhancing the color reproducibility of an endoscopic image.
  • The YC conversion processing is a process of converting an endoscopic image generated by assigning one or more RAW images to the R, G, and B channels into an endoscopic image having a luminance channel Y, a color difference channel Cb, and a color difference channel Cr.
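The YC conversion can be illustrated with the widely used BT.601 coefficients; the patent does not specify the coefficients, so this is only one plausible choice:

```python
# Sketch of the RGB to luminance/color-difference conversion using the
# common BT.601 coefficients; the patent does not specify coefficients,
# so this is one plausible choice.
def rgb_to_ycbcr(r, g, b):
    y = 0.299 * r + 0.587 * g + 0.114 * b
    cb = -0.168736 * r - 0.331264 * g + 0.5 * b
    cr = 0.5 * r - 0.418688 * g - 0.081312 * b
    return y, cb, cr
```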
  • the noise reduction unit 54 performs noise reduction processing on an endoscope image having a brightness channel Y, a color difference channel Cb, and a color difference channel Cr by using, for example, a moving average method or a median filter method.
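The moving-average variant of the noise reduction mentioned above can be sketched for a single channel (for example, the luminance channel Y); the kernel size is an illustrative assumption:

```python
import numpy as np

# Sketch of the moving-average noise reduction mentioned above, applied to
# a single channel (e.g. the luminance channel Y); the kernel size k is an
# illustrative assumption.
def moving_average(channel, k=3):
    a = np.asarray(channel, dtype=np.float64)
    pad = k // 2
    p = np.pad(a, pad, mode="edge")  # replicate edges to preserve borders
    h, w = a.shape
    out = np.zeros((h, w))
    for dy in range(k):
        for dx in range(k):
            out += p[dy:dy + h, dx:dx + w]
    return out / (k * k)
```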
  • the conversion unit 55 reconverts the luminance channel Y, the color difference channel Cb, and the color difference channel Cr after the noise reduction processing into an endoscopic image having channels of each color of BGR.
  • the image processing unit 56 performs necessary image processing or calculation on the endoscopic image output by the image acquisition unit 52.
  • the image processing unit 56 includes a display image generation unit 61, an enhancement processing unit 62, a support image signal acquisition unit 63, a specific area determination unit 64, and a support image generation unit 65.
  • the enhancement processing unit 62 includes a color enhancement unit 66 and a contrast enhancement unit 67.
  • the display image generation unit 61 generates a display image by performing image processing necessary for displaying the endoscope image output by the image acquisition unit 52 on the display 18.
  • the enhancement processing unit 62 performs enhancement processing on the endoscopic image output by the image acquisition unit 52.
  • the support image signal acquisition unit 63 acquires the support image signal.
  • the support image signal includes an endoscopic image that has undergone enhancement processing and an endoscopic image that is captured by using special light.
  • the specific area determination unit 64 performs a process of determining a specific area for each of the support image signals.
  • the support image generation unit 65 generates a support image showing a specific area.
  • the display image generation unit 61 generates a display image to be displayed on the display using a display image signal obtained based on at least one image signal.
  • In the normal mode and the support mode, the display image generation unit 61 generates a display image using an image signal obtained with normal light as the display image signal, and a normal image having a natural hue is displayed on the display 18 as the display image 68. In the special mode, an image signal obtained with special light is used as the display image signal to generate a display image; for example, a special image in which a specific state of the observation target is emphasized is displayed on the display 18.
  • the light source control unit 22 repeats a specific illumination pattern in units of frames (see FIG. 8).
  • a frame is a unit of imaging. Therefore, one imaging and image signal acquisition are performed in one frame.
  • Among the illumination pattern, the display image is generated from image signals obtained with the normal light CL and displayed on the display. No display image is generated from the image signals obtained with the special lights, namely the first illumination light 1S, the second illumination light 2S, and the third illumination light 3S; these are used as image signals for generating support images.
  • While a special light is emitted, the display image generated from the image signal obtained with the immediately preceding normal light continues to be displayed. Therefore, all displayed images are based on image signals obtained with normal light.
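As a rough illustration (not part of the embodiment), the frame-by-frame handling described above, where normal-light frames refresh the display image and special-light frames are held out for support-image generation while the previous display image stays on screen, could be sketched as follows. The pattern sequence and data structures are assumptions:

```python
# Hypothetical sketch of the repeating illumination pattern (see FIG. 8).
# Light names "CL", "1S", "2S", "3S" follow the text; routing is illustrative.
from itertools import cycle

ILLUMINATION_PATTERN = ["CL", "1S", "CL", "2S", "CL", "3S"]  # one cycle (1CY)

def run_frames(frames, pattern=ILLUMINATION_PATTERN):
    """For each captured frame, return (display_image, support_signals)."""
    display_image = None          # last image obtained with normal light CL
    support_signals = {}          # support image signals keyed by light type
    results = []
    for frame, light in zip(frames, cycle(pattern)):
        if light == "CL":
            display_image = frame              # update the display image
        else:
            support_signals[light] = frame     # used only for support images
        # special-light frames keep the immediately preceding CL image
        results.append((display_image, dict(support_signals)))
    return results
```

A six-frame run then yields three display updates and one support signal per special light, matching the cycle described above.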
  • The enhancement processing unit 62 performs enhancement processing on the endoscopic image output by the image acquisition unit 52. Enhancement means making information on a specific portion obtainable by distinguishing it from other tissues or structures. For example, a process that changes the color or brightness of a portion having a specific feature relative to another portion (for example, a normal mucous membrane) is an enhancement process.
  • the endoscopic image processed by the enhancement processing unit 62 may be a normal image or a special image.
  • the enhanced endoscopic image is used as a support image signal for generating a support image.
  • the support image signal is an image signal for generating a support image.
  • the enhancement processing unit 62 includes a color enhancement unit 66 and a contrast enhancement unit 67.
  • the third image signal obtained by the third illumination light is processed by the color enhancing unit 66 to obtain a support image signal.
  • The color enhancement unit 66 performs enhancement processing on the acquired endoscopic image so that, for example, the boundary between the normal region and the abnormal region in the observation target is clearly represented by hue and saturation.
  • the color enhancement unit 66 performs color information conversion processing on the acquired endoscopic image.
  • the color information conversion process is a process of transferring each of a plurality of ranges distributed in the color space of the acquired endoscopic image to the range of the conversion destination associated with the range before conversion.
  • In the endoscopic image processed by the color enhancement unit 66, the boundary between the normal region and the abnormal region is clear, so it is an image in which the abnormal region can be determined as a specific region more easily and accurately.
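The color information conversion process above moves ranges in color space to associated destination ranges. A minimal sketch of the idea, with invented range boundaries and shift amounts (the embodiment does not specify them), might look like this:

```python
# Illustrative sketch of color information conversion: each pixel is assigned
# to a range in color space and moved to the destination range associated
# with its range before conversion, pushing normal and abnormal hues apart.
# The hue boundaries and gain factors below are invented for illustration.

def convert_color_info(pixels):
    """pixels: list of (hue, saturation) tuples; hue in degrees [0, 360)."""
    out = []
    for hue, sat in pixels:
        if 0 <= hue < 30:            # range assumed to hold abnormal mucosa
            out.append(((hue - 15) % 360, min(1.0, sat * 1.5)))
        else:                        # range assumed to hold normal mucosa
            out.append((hue, sat * 0.9))
    return out
```

Separating the destination ranges in this way is what makes the normal/abnormal boundary stand out in hue and saturation.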
  • the contrast enhancement unit 67 performs enhancement processing on the acquired endoscopic image so that the endoscopic image is represented by emphasizing the blood vessels in the observation target, for example.
  • The contrast enhancement unit 67 obtains, for the acquired endoscopic image, a density histogram, which is a graph with the pixel value (luminance value) on the horizontal axis and the frequency on the vertical axis, and performs gradation correction using a gradation correction table stored in advance in a memory (not shown) of the image processing unit 56.
  • The gradation correction table has a gradation correction curve, with the horizontal axis representing input values and the vertical axis representing output values, that shows the correspondence between input and output values.
  • Gradation correction based on this correction curve widens the dynamic range of the acquired endoscopic image: the density becomes lower in low-density portions and higher in high-density portions. As a result, for example, the density difference between a blood vessel region and a region where no blood vessel exists spreads, and the contrast of the blood vessels improves.
  • Therefore, in the endoscopic image processed by the contrast enhancement unit 67, since the contrast of the blood vessels is improved, it is an image in which, for example, a region where the density of blood vessels is high can be determined as a specific region more easily and accurately.
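The gradation correction above can be sketched as a lookup table built from a correction curve. The curve used here (a simple linear stretch around a midpoint) is an assumption; the embodiment only states that the curve lowers low densities and raises high ones:

```python
# Hedged sketch of gradation correction via a 256-entry lookup table.
# The midpoint/gain parameters are illustrative, not from the embodiment.

def build_gradation_table(midpoint=128, gain=1.5):
    """Build a lookup table that steepens contrast around the midpoint."""
    table = []
    for x in range(256):
        y = midpoint + (x - midpoint) * gain   # correction curve: input -> output
        table.append(max(0, min(255, round(y))))
    return table

def apply_gradation(pixels, table):
    """Apply the stored correction table to a flat list of pixel values."""
    return [table[p] for p in pixels]
```

Dark (vessel) pixels are pushed darker and bright pixels brighter, which is exactly the widening of the density difference the text describes.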
  • the support image signal acquisition unit 63 acquires the support image signal for which the specific area determination unit 64 determines the specific area.
  • the support image signal includes an endoscopic image that has undergone enhancement processing and an endoscopic image that is captured by using special light.
  • The support image signals are the first image signal obtained using the first illumination light, the second image signal obtained using the second illumination light, and the third image signal obtained using the third illumination light and subjected to enhancement processing.
  • the support image signal acquisition unit 63 acquires these support image signals from the image acquisition unit 52 or the enhancement processing unit 62.
  • The specific area determination unit 64 determines, in each of the plurality of support image signals, a specific area in which the observation target is in a specific state. Therefore, for each support image signal, it is determined whether or not a specific area exists and, if so, where that area is.
  • the determination of a specific area includes the case of determining that there is no specific area.
  • The specific area determination unit 64 includes a determination processing unit corresponding to each of the plurality of support image signals: a first determination processing unit 71, a second determination processing unit 72, a third determination processing unit 73, and so on up to an nth determination processing unit 74. Each determination processing unit differs, according to the type of its support image signal, in what specific state of the observation target it determines the specific region to be in.
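One way to picture the per-signal determination units is a dispatch table mapping each support image signal type to its own determination function. The criteria follow the text (vessel irregularity, high luminance, abnormal mucosa), but the thresholds and region representation are invented for illustration:

```python
# Illustrative dispatch of support image signals to determination processing
# units 71-73. Region dicts and threshold values are assumptions.

def determine_first(signal, threshold=0.7):
    """First specific region: surface-vessel irregularity above a threshold."""
    return [r for r in signal if r["vessel_irregularity"] > threshold]

def determine_second(signal, threshold=200):
    """Second specific region: high-luminance structures such as polyps."""
    return [r for r in signal if r["luminance"] > threshold]

def determine_third(signal):
    """Third specific region: mucosa judged abnormal after color enhancement."""
    return [r for r in signal if r["mucosa_abnormal"]]

DETERMINERS = {"1S": determine_first, "2S": determine_second, "3S": determine_third}

def determine_specific_areas(support_signals):
    """Run the matching determination unit on each support image signal."""
    return {light: DETERMINERS[light](sig) for light, sig in support_signals.items()}
```

Determining "no specific area" simply corresponds to an empty result list for that signal.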
  • the first determination processing unit 71 performs determination processing of the first image signal.
  • the surface blood vessels are emphasized by the first illumination light.
  • Specifically, the first determination processing unit 71 determines a region where the irregularity of the surface blood vessels is higher than a preset threshold value, and sets that region as the first specific region.
  • the region with high irregularity of the superficial blood vessels emphasized by the first illumination light is, for example, an index for determining the degree of inflammation of ulcerative colitis when the observation target is the large intestine. Therefore, the first specific region in the first image signal is a non-remission region of ulcerative colitis.
  • the second determination processing unit 72 performs determination processing of the second image signal.
  • the structures such as surface blood vessels and polyps are emphasized by the second illumination light.
  • the second determination processing unit 72 determines a region having a high luminance value indicating a structure such as a polyp, and sets the region as the second specific region.
  • the region with a high luminance value emphasized by the second illumination light indicates, for example, the existence of a polyp or the like whose surface layer to be observed is irregular, and thus serves as an index for determining a lesion such as cancer. Therefore, the second specific region in the second image signal is a region where there is a possibility of a lesion such as cancer having a polyp or the like.
  • The third determination processing unit 73 performs determination processing on the enhanced third image signal.
  • By the third illumination light and the enhancement processing of the color enhancement unit 66, the boundary between the normal region and the abnormal region in the observation target is clearly represented by hue and saturation.
  • The third determination processing unit 73 sets the region where the mucous membrane is abnormal as the third specific region.
  • With the third illumination light and the enhancement processing, the third image signal serves as an index for determining, for example, a lesion such as cancer whose color differs from its surroundings. Therefore, the third specific region in the enhanced third image signal is a region where there is a possibility of a lesion such as cancer accompanied by redness or the like.
  • the support image generation unit 65 generates a support image indicating a specific area in the observation image determined by each of the plurality of support image signals by the specific area determination unit 64.
  • the support image is an image in which a specific area on the observation image can be recognized.
  • the support image is displayed by the area corresponding to the area of the specific area. Therefore, the size of the support image can be changed according to the size of the specific area.
  • the support image is, for example, an image in the form of a figure surrounding a portion including a specific area in an observation image. As shown in FIG. 12, the support image 81 indicates that the first specific area is within the area surrounded by the square figure which is the support image 81.
  • the support image 82 indicates that the second specific area is within the area surrounded by the square figure which is the support image 82.
  • the diagonal lines in the support image 82 indicate a specific color.
  • the support image 83 indicates that the third specific area is within the area surrounded by the square figure which is the support image 83.
  • the display control unit 57 controls to superimpose and display a plurality of support images on the display image in a manner that allows them to be distinguished from each other.
  • The plurality of support images are superimposed on the display image in a manner in which each support image can be distinguished and recognized. For example, each of the support images can be displayed in a different color: as shown in FIG. 15, the display control unit 57 performs control to superimpose the support image 81 showing the first specific area, the support image 82 showing the second specific area, and the support image 83 showing the third specific area on the display image 68 in mutually different colors.
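The superimposition in distinguishable styles could be sketched as follows. The specific colors and the draw-command representation are invented; the embodiment only requires that the styles differ per support image:

```python
# Sketch of superimposing support images 81-83 in mutually distinguishable
# styles, as the display control unit 57 does. Colors/shapes are arbitrary.

SUPPORT_STYLES = {
    81: {"color": "yellow",  "shape": "square"},   # first specific area
    82: {"color": "cyan",    "shape": "square"},   # second specific area
    83: {"color": "magenta", "shape": "square"},   # third specific area
}

def overlay_support_images(display_image, support_areas):
    """Return the display image plus draw commands for each support image.

    support_areas maps a support image id (81-83) to a bounding box.
    """
    commands = []
    for image_id, box in support_areas.items():
        style = SUPPORT_STYLES[image_id]
        commands.append({"op": "rect", "bbox": box, **style})
    return {"base": display_image, "overlays": commands}
```

Switching the `shape` entries (square, hexagon, circle) would model the variation shown in FIG. 16 instead.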
  • The processor device 16 or the endoscope system 10 thus makes it possible to grasp a plurality of different types of determination results at a glance while displaying an endoscopic image in natural colors by normal light on the display: for example, that the first specific region is a non-remission region of ulcerative colitis, that the second specific region is a region where a lesion such as cancer with a polyp or the like may exist, and that the third specific region is a region where a lesion such as cancer with redness or the like may exist.
  • The display control unit 57 may perform control to superimpose, on the display image, the support image 81 showing the first specific area, the support image 82 showing the second specific area, and the support image 83 showing the third specific area as figures having mutually different shapes. As shown in FIG. 16, the support image 81 showing the first specific area can be shown by a quadrangle, the support image 82 showing the second specific area by a hexagon, and the support image 83 showing the third specific area by a circle. In this case, the colors of the support images may be the same or different. Further, instead of showing each specific area with a single figure, each specific area may be shown by a different pattern, such as being studded with smaller figures.
  • the display control unit 57 may control the support image to be superimposed and displayed on the display image each time the support image generation unit 65 generates the support image.
  • Each time the specific area determination unit 64 determines a specific area, the support image generation unit 65 generates a support image indicating that specific area. Therefore, for example, as shown in FIG. 17, in the present embodiment, when the first image signal, which is a support image signal, is obtained by imaging the observation target illuminated by the first illumination light 1S, the support image generation unit 65 generates the support image 81 showing the first specific region.
  • the support image 81 is acquired once in one cycle (1 CY) of the illumination light (see FIG. 8).
  • the acquired support image 81 is continuously displayed on the display image 68 until the next support image 81 is acquired.
  • Similarly, the support image 82 or the support image 83 is acquired once in one cycle (1CY) of the illumination light, and continues to be superimposed on the display image 68 until the next support image 82 or support image 83 is acquired.
  • the displayed image is updated and displayed as soon as the normal image is acquired.
  • the display control unit 57 controls to superimpose the support image on the display image, so that the support image is displayed in real time.
  • Since the frame period is short, the support image follows the observation target even if the observation target moves somewhat, and the display is almost real-time.
  • the display control unit 57 may control to display a legend display indicating the association between the support image and the enhancement process or the illumination light for the support image on the display.
  • The support images are generated based on the support image signals, which are obtained by imaging the observation target illuminated with the various special lights, or by enhancement processing of such image signals.
  • The support image 81 showing the first specific region is obtained with the first illumination light, the support image 82 showing the second specific region with the second illumination light, and the support image 83 showing the third specific region from the enhanced third image signal. Therefore, "illumination light 1" indicating the IEE by the first illumination light, in the same color as the support image 81, "illumination light 2" indicating the IEE by the second illumination light, in the same color as the support image 82, and "color enhancement" indicating the IEE by the enhancement processing, in the same color as the support image 83, can be displayed as the legend display 91 of the support images at the lower right of the display image.
  • The legend display 91 is also useful in endoscopy as a display indicating which IEE is being performed at that time.
  • the support image 82 and the support image 83 may be displayed in an overlapping manner. Further, as shown in FIG. 20, when the areas of a plurality of specific regions are the same in the observation target, the support image 82 and the support image 83 are displayed in a distinguishable manner.
  • For example, suppose there is a finding that a location where two specific regions overlap, namely a specific region determined from a structure such as a polyp and a specific region determined from a high superficial blood vessel density, is highly likely to be a cancerous lesion. In that case, the overlap of these two specific regions can be easily recognized, and the determination can be made with better accuracy. Therefore, when there is a finding that the observation target is a lesion in a specific state where a plurality of specific areas overlap, the support image signals used for determining the specific areas can be set according to that purpose, which is preferable because a region in the desired specific state in the observation target is then shown by the support images.
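Recognizing that two specific regions overlap, as discussed above, reduces to a simple geometric test when regions are modeled as bounding boxes. The box representation is an assumption made for illustration:

```python
# Sketch of an overlap check between two specific regions, each modeled as
# an axis-aligned box (x1, y1, x2, y2). A true result could raise confidence
# in a combined finding (e.g. polyp structure plus high vessel density).

def regions_overlap(a, b):
    ax1, ay1, ax2, ay2 = a
    bx1, by1, bx2, by2 = b
    # boxes overlap iff they intersect on both axes
    return ax1 < bx2 and bx1 < ax2 and ay1 < by2 and by1 < ay2
```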
  • Two or more image signals may be used to determine one type of specific area; that is, the determination of a single specific area may be made using two or more image signals. This is preferable because the number of types of specific state of the observation target that the specific regions can indicate increases.
  • the support image showing the specific area may have text information.
  • the character information is information in which the determination result of a specific area or the like is displayed in characters.
  • the support image 81 has the text information 92 of "UC severity: mild".
  • the text information 92 is information indicating in text that the specific state of the observation target shown by the support image 81 is determined to be mild of ulcerative colitis.
  • the UC severity is shown by dividing the state of the intensity of inflammation into three categories: severe, moderate, and mild.
  • The support image 82 has the character information 93 of "polyp: φ2 mm".
  • The character information 93 is information indicating in text that the specific state of the observation target shown by the support image 82 is a polyp and that the diameter of the polyp is determined to be 2 mm. Further, the support image 83 has the character information 94 of "cancer suspicion rate: 30%".
  • The character information 94 is information indicating in text that the specific state of the observation target shown by the support image 83 is suspected to be cancer and that the suspicion rate of cancer, which is its probability, is determined to be 30%. Since the support images have character information, more detailed support information can be displayed.
  • the illumination light is emitted in a preset order according to a predetermined illumination light pattern (see FIG. 8).
  • the illumination light is normal light, and a normal image is acquired (step ST110).
  • the normal image by normal light is displayed on the display as a display image (step ST120).
  • the illumination light becomes the first illumination light, and the first image signal is acquired (step ST130).
  • the first specific area is determined, the support image indicating the first specific area is superimposed on the display image, and the display image with the support image is displayed (step ST140).
  • the illumination light becomes normal light, and a normal image is acquired (step ST150).
  • the normal image by normal light is displayed on the display as a display image (step ST160).
  • the illumination light becomes the second illumination light, and the second image signal is acquired (step ST170).
  • the second specific area is determined based on the second image signal, the support image indicating the second specific area is superimposed on the display image, and the display image with the support image is displayed (step ST180).
  • the illumination light becomes normal light, and a normal image is acquired (step ST190).
  • the normal image by normal light is displayed on the display as a display image (step ST200).
  • Next, the illumination light becomes the third illumination light, and the third image signal is acquired (step ST210).
  • Emphasis processing is performed on the third image signal (step ST220).
  • The third specific area is determined, the support image indicating the third specific area is superimposed on the display image, and the display image with the support image is displayed (step ST230).
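The step sequence ST110 to ST230 above can be sketched as one cycle of a support-mode loop. The function parameters stand in for the embodiment's processing units and are placeholders, not the actual implementation:

```python
# Hedged sketch of one support-mode cycle (steps ST110-ST230): normal-light
# frames refresh the display image; each special-light frame yields a
# specific-area determination, with enhancement applied to the third signal.

def support_mode_cycle(capture, determine, enhance):
    """Run one illumination cycle; return the display image and overlays."""
    display, overlays = None, {}
    for light in ("CL", "1S", "CL", "2S", "CL", "3S"):
        signal = capture(light)
        if light == "CL":
            display = signal                     # ST110-120, ST150-160, ST190-200
        else:
            if light == "3S":
                signal = enhance(signal)         # ST220: enhancement processing
            overlays[light] = determine(light, signal)   # ST140, ST180, ST230
    return display, overlays
```

Repeating this cycle reproduces the preset illumination pattern and the continuous superimposition of support images between updates.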
  • the processor device 16 functions as an image processing device, but an image processing device including an image processing unit 56 may be provided separately from the processor device 16.
  • For example, the image processing unit 56 can be provided in a diagnosis support device 911 that acquires RAW images captured with the endoscope 12, directly from the endoscope system 10 or indirectly from a PACS (Picture Archiving and Communication Systems) 910.
  • Further, the image processing unit 56 can be provided in a medical service support device 930 connected, via a network 926, to various inspection devices including the endoscope system 10, such as a first inspection device 921, a second inspection device 922, ..., and a Kth inspection device 923.
  • In the above embodiment, the endoscope 12 is a so-called flexible endoscope having a flexible insertion portion 12a, but the present invention is also suitable when a capsule endoscope that is swallowed by the observation target, or a rigid endoscope (laparoscope) used for surgery or the like, is used.
  • The above-described embodiments and modifications include an operation method of an image processing device including a processor, the method comprising: an image acquisition step of acquiring a plurality of image signals obtained by imaging an observation target using an endoscope; a display image generation step of generating a display image to be displayed on a display using a display image signal obtained based on at least one of the image signals; a support image generation step of determining, based on each of the plurality of image signals, a specific area in which the observation target is in a specific state and generating a support image indicating the specific area; and a display control step of performing control to display, when the display image is displayed on the display, a plurality of the support images superimposed on the display image in a manner distinguishable from one another.
  • The above-described embodiments and modifications also include a program for an image processing device for realizing: an image acquisition function of acquiring a plurality of image signals obtained by imaging an observation target using an endoscope; a display image generation function of generating a display image to be displayed on a display using a display image signal obtained based on at least one of the image signals; a support image generation function of generating a support image that indicates a specific area, determined based on each of the plurality of image signals, in which the observation target is in a specific state; and a display control function of performing control to display a plurality of the support images superimposed on the display image, in a manner distinguishable from one another, when the display image is displayed on the display.
  • The hardware structure of the processing units that execute various processes, such as the control unit 51, the image acquisition unit 52, the DSP 53, the noise reduction unit 54, the conversion unit 55, the image processing unit 56, and the display control unit 57 included in the processor device 16, is the following various processors.
  • The various processors include a CPU (Central Processing Unit), which is a general-purpose processor that executes software (programs) and functions as various processing units; a programmable logic device (PLD), which is a processor whose circuit configuration can be changed after manufacture, such as an FPGA (Field Programmable Gate Array); and a dedicated electric circuit, which is a processor having a circuit configuration specially designed for executing various processes.
  • One processing unit may be composed of one of these various processors, or may be composed of a combination of two or more processors of the same type or different types (for example, a plurality of FPGAs or a combination of a CPU and an FPGA). May be done. Further, a plurality of processing units may be configured by one processor. As an example of configuring a plurality of processing units with one processor, first, as represented by a computer such as a client or a server, one processor is configured by a combination of one or more CPUs and software. There is a form in which this processor functions as a plurality of processing units.
  • Second, as typified by a System On Chip (SoC) or the like, there is a form of using a processor that realizes the functions of an entire system including a plurality of processing units with a single IC (Integrated Circuit) chip.
  • As described above, the various processing units are configured by using one or more of the above-mentioned various processors as a hardware structure.
  • More specifically, the hardware structure of these various processors is an electric circuit (circuitry) in which circuit elements such as semiconductor elements are combined.
  • The present invention can also be used in systems or devices that acquire medical images (including moving images) other than endoscopic images.
  • the present invention can be applied to an ultrasonic inspection device, an X-ray imaging device (including a CT (Computed Tomography) inspection device, a mammography device, etc.), an MRI (magnetic resonance imaging) device, and the like.

Abstract

Provided are an image processing device which can easily recognize detailed information concerning a subject being observed, an endoscope system, an operation method for an image processing device, and a program for an image processing device. An image processing device (16) performs control to acquire a plurality of image signals using an endoscope, to generate a display image to be displayed on a display using a display image signal obtained on the basis of at least one of the image signals, to determine a specific region in which a subject being observed is in a specific state in each of the plurality of image signals, to generate an assistance image which indicates the specific region, and when displaying the display image on the display, to display same by superimposing a plurality of the assistance images on the display image such that the assistance images are distinguishable from one another.

Description

Image processing device, endoscope system, operation method of image processing device, and program for image processing device
The present invention relates to an image processing device, an endoscope system, an operation method of the image processing device, and a program for the image processing device.
In the medical field, diagnosis using an endoscope system equipped with a light source device, an endoscope, and a processor device is widely performed. In diagnosis using an endoscope system, an image obtained by photographing an observation target with an endoscope (hereinafter referred to as an endoscopic image) by a method called image-enhanced endoscopy or image-enhanced observation (IEE, image enhanced endoscopy) is used to obtain a plurality of pieces of support information for diagnosing the observation target, such as the surface structure of the observation target, biological information on the surface layer of the mucous membrane, or the possibility of a lesion.
There are various types of IEE, such as a method of digitally processing an endoscopic image obtained by imaging an observation target, or a method of using an endoscopic image captured while illuminating the observation target with specific illumination light. By these methods, for example, it is possible to determine biological information such as a region where blood vessels are dense or a region with low oxygen saturation in the observation target, and to display these regions with emphasis. Such displays provide support information for a doctor to diagnose the observation target.
In addition, CAD (Computer-Aided Diagnosis) technology has also been developed that determines the stage of a disease or the like from the extent of a possibly lesioned region in the observation target, the degree of inflammation, or the like obtained by IEE, and provides the obtained determination result as support information for a doctor's diagnosis. For example, an endoscope system is known that determines the severity or progression of a disease, such as the stage of ulcerative colitis, with high accuracy using endoscopic images obtained by IEE (Patent Document 1).
Japanese Unexamined Patent Publication No. 2020-65685
With IEE by various methods, the support information that can be obtained well may differ from method to method. Therefore, the obtained effect or application may differ depending on the type of IEE. For example, different IEEs may be used for the purpose of acquiring oxygen saturation and for the purpose of acquiring a polyp region as information on the observation target. Therefore, by using a plurality of IEEs, a plurality of types of diagnostic support information regarding the observation target can be obtained, which is useful in screening, diagnosis of lesions, and the like.
However, for example, when a doctor who is a user of an endoscope is observing an observation target with the usual white illumination light and a plurality of lesions or the like are indicated by a plurality of IEEs, the doctor can recognize that some abnormality or a deviation from a threshold value is indicated, but it may be difficult to identify each of these multiple lesions without time and effort.
An object of the present invention is to provide an image processing device, an endoscope system, an operation method of an image processing device, and a program for an image processing device that allow detailed information on an observation target to be easily recognized.
The present invention is an image processing device comprising a processor. The processor acquires a plurality of image signals obtained by imaging an observation target using an endoscope, generates a display image to be displayed on a display using a display image signal obtained based on at least one of the image signals, determines, in each of the plurality of image signals, a specific area in which the observation target is in a specific state, generates support images indicating the specific areas, and, when displaying the display image on the display, performs control to display the plurality of support images superimposed on the display image in a manner distinguishable from one another.
 複数の支援画像は、同一又は相似の図形により表示することが好ましい。 It is preferable to display a plurality of support images with the same or similar figures.
 支援画像は、特定領域の面積に対応した面積により表示することが好ましい。 It is preferable to display the support image by the area corresponding to the area of the specific area.
 プロセッサは、複数の支援画像のそれぞれを互いに異なる色により表示することが好ましい。 It is preferable that the processor displays each of the plurality of support images in different colors from each other.
 プロセッサは、複数の支援画像のそれぞれを互いに異なる形状の図形により表示することが好ましい。 It is preferable that the processor displays each of the plurality of support images with figures having different shapes from each other.
 プロセッサは、支援画像を生成する毎に、支援画像を表示画像に重畳して表示する制御を行うことが好ましい。 It is preferable that the processor controls to superimpose and display the support image on the display image each time the support image is generated.
 プロセッサは、画像信号に対して強調処理を行い支援用画像信号とし、支援用画像信号に基づいて、特定領域を判定することが好ましい。 It is preferable that the processor enhances the image signal to obtain a support image signal and determines a specific area based on the support image signal.
The processor preferably performs control to display, on the display, a legend indicating the association between each support image and its enhancement processing.
The present invention is also an endoscope system comprising the image processing device of the present invention and a light source unit that emits illumination light for irradiating the observation target.
The display image signal is preferably obtained by imaging the observation target illuminated with white illumination light emitted by the light source unit.
The processor preferably generates the support images by imaging the observation target illuminated with each of a plurality of support-image illumination lights emitted by the light source unit, the illumination lights having spectra that differ from one another.
Each of the plurality of support-image illumination lights preferably includes narrowband light of a preset wavelength band.
The light source unit preferably emits the white illumination light and each of the plurality of support-image illumination lights repeatedly in a preset order.
The display control unit preferably performs control to display, on the display, a legend indicating the association between each support image and its support-image illumination light.
The present invention is also an operation method for an image processing device, comprising: an image acquisition step of acquiring a plurality of image signals obtained by imaging an observation target with an endoscope; a display image generation step of generating a display image to be shown on a display using a display image signal obtained from at least one of the image signals; a support image generation step of generating, based on each of the plurality of image signals, a support image indicating a specific region in which the observation target is in a specific state; and a display control step of performing control to superimpose the plurality of support images on the display image, in a manner that allows them to be distinguished from one another, when the display image is shown on the display.
The present invention is also a program for an image processing device, installed in an image processing device that performs image processing on image signals obtained by imaging an observation target with an endoscope, the program causing a computer to realize: an image acquisition function of acquiring a plurality of image signals obtained by imaging an observation target with an endoscope; a display image generation function of generating a display image to be shown on a display using a display image signal obtained from at least one of the image signals; a support image generation function of generating, based on each of the plurality of image signals, a support image indicating a specific region in which the observation target is in a specific state; and a display control function of performing control to superimpose the plurality of support images on the display image, in a manner that allows them to be distinguished from one another, when the display image is shown on the display.
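The four functions enumerated in the program claim (image acquisition, display image generation, support image generation, display control) can be pictured as a simple pipeline. The sketch below is purely illustrative: the function and class names, the toy 1-D "signals", and the color palette are assumptions for this example and are not part of the disclosure.

```python
from dataclasses import dataclass
from typing import List


@dataclass
class SupportImage:
    region_mask: List[bool]  # marks the specific region in the signal
    color: str               # attribute making support images distinguishable


def run_pipeline(image_signals, make_display, find_region):
    # Display image generation: build the display image from at least
    # one of the image signals (here, the first one).
    display_image = make_display(image_signals[0])
    # Support image generation: one support image per image signal,
    # each indicating a specific region of the observation target.
    palette = ["yellow", "red", "blue"]  # mutually distinguishable colors
    support_images = [
        SupportImage(find_region(sig), palette[i % len(palette)])
        for i, sig in enumerate(image_signals)
    ]
    # Display control: the caller superimposes support_images onto
    # display_image when rendering.
    return display_image, support_images


# Toy usage: signals are 1-D lists; the "specific state" is value > 0.
signals = [[0, 1], [1, 1], [0, 0]]
display_image, support_images = run_pipeline(
    signals, lambda s: s, lambda s: [v > 0 for v in s])
```

Because each support image carries its own distinguishing attribute, the renderer can overlay any number of them on one display image without ambiguity.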
According to the present invention, detailed information about the observation target can be recognized easily.
External view of the endoscope system.
External view of the operation section of the endoscope.
Block diagram showing the functions of the endoscope system.
Graph showing the spectrum of normal light.
Graph showing the spectrum of the first illumination light.
Graph showing the spectrum of the second illumination light.
Graph showing the spectrum of the third illumination light.
Explanatory diagram of the illumination light pattern.
Block diagram showing the functions of the image processing unit.
Image diagram of a display image.
Block diagram showing the functions of the specific region determination unit.
Image diagram of a display image on which a support image indicating a first specific region is superimposed.
Image diagram of a display image on which a support image indicating a second specific region is superimposed.
Image diagram of a display image on which a support image indicating a third specific region is superimposed.
Image diagram of a display image on which a plurality of support images are superimposed.
Image diagram of a display image on which a plurality of support images displayed as figures of different shapes are superimposed.
Explanatory diagram of support image updating.
Image diagram of a display image showing a legend indicating the types of specific regions.
Image diagram of a display image in which a plurality of specific regions partially overlap.
Image diagram of a display image in which a plurality of specific regions completely overlap.
Image diagram of a display image in which a support image includes text information.
Flowchart explaining the sequence of processing performed by the image processing device.
Explanatory diagram showing a diagnosis support device.
Explanatory diagram showing a medical service support device.
As shown in FIG. 1, the endoscope system 10 includes an endoscope 12, a light source device 14, a processor device 16, a display 18, and a keyboard 19. The endoscope 12 images the observation target. The light source device 14 emits illumination light for irradiating the observation target. The processor device 16 performs system control of the endoscope system 10. The display 18 is a display unit that shows endoscopic images and the like. The keyboard 19 is an input device for entering settings into the processor device 16 and the like.
The endoscope system 10 has three observation modes: a normal mode, a special mode, and a support mode. In the normal mode, a normal image with natural colors, obtained by irradiating the observation target with normal light and imaging it, is shown on the display 18 as the display image. In the special mode, a special image emphasizing a specific state of the observation target, obtained by illuminating the observation target with special light whose wavelength band or spectrum differs from that of normal light and imaging it, is shown on the display 18 as the display image. In the support mode, when a normal image obtained by irradiating the observation target with normal light and imaging it is shown on the display 18 as the display image, support images are superimposed on the display image. A support image is an image indicating a specific region in which the observation target is in a specific state. Note that in the normal mode, a special image that has undergone enhancement processing or the like may also be used as the display image, provided it offers good visibility.
The observation target being in a specific state means that various information obtained by image analysis of an endoscopic image of the observation target, such as biological information, satisfies preset conditions; this is how it is determined whether the observation target is in a particular state. The biological information of the observation target comprises numerical values or the like representing overall or partial characteristics of the observation target, for example oxygen saturation, blood concentration, blood vessel density, or the likelihood of having a specific form such as a lesion or a lesion candidate (including targets for biopsy). Specifically, the observation target is judged to be in a specific state when, for example, information obtained from the endoscopic image falls within a preset range: the presence or measured size of blood vessels, ducts, folds, polyps, or specific mucosa of the observation target; colors such as redness; fluorescence; lesions such as cancer; or low oxygen saturation. Multiple items of biological information obtained from the endoscopic image may be used to determine the state of the observation target, or the state may be determined by image recognition technology based on machine learning using endoscopic images.
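One concrete reading of "satisfies preset conditions" is a per-pixel threshold test on a biological quantity. The minimal sketch below assumes a hypothetical per-pixel surface-vessel density map and an illustrative threshold; neither the function name nor the numbers come from the disclosure.

```python
def specific_region(density_map, threshold):
    """Boolean mask of pixels where the observation target is judged to be
    in the specific state (density at or above the preset threshold)."""
    return [[v >= threshold for v in row] for row in density_map]


# Hypothetical 2x3 map of surface-vessel density values per pixel.
density = [[0.2, 0.8, 0.9],
           [0.1, 0.4, 0.7]]
mask = specific_region(density, threshold=0.7)
```

The resulting mask is the specific region; a support image would then be drawn over the pixels where the mask is true.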
Specifically, for example, when the information obtained from the endoscopic image indicates that the density of surface blood vessels of the observation target is at or above a specific value, the observation target is determined to be in a dense state with a high concentration of surface blood vessels. In this case, the region of the observation target that is in this dense state is the specific region, so the support image is an image indicating the region of the observation target where the surface blood vessels are densely concentrated. Because the support image is superimposed on the display image showing the observation target, it can indicate a specific region in which the observation target is in a specific state. In the support mode, the display image showing the observation target may be selected arbitrarily from the normal image or a special image. In the present embodiment, the display image in the support mode is the normal image.
The support image indicates a specific region. The support image is preferably generated based on a support image signal. There are multiple types of support image signals: those obtained by imaging with various kinds of special light as the illumination light irradiating the observation target, and those obtained by applying various kinds of image processing, such as enhancement processing, to an image signal. The support image signals obtained through image processing include, for example, a color-enhanced image obtained by applying color difference expansion processing, which expands the color differences between multiple regions of the observation target, to a captured image signal, and a contrast-enhanced image with enhanced contrast. The various kinds of special light are described later.
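As an illustration of what "enhancement processing" before region determination can look like, the sketch below applies a generic linear contrast stretch around the mean. This is a stand-in example, not the color difference expansion or contrast enhancement algorithm of the disclosure, whose details are not given in this section.

```python
def contrast_enhance(signal, gain=2.0):
    """Linearly stretch values around their mean so differences between
    regions stand out; output is clipped to the [0, 1] range."""
    mean = sum(signal) / len(signal)
    return [min(1.0, max(0.0, mean + gain * (v - mean))) for v in signal]


# Values near the mean spread apart, making region boundaries easier to
# threshold in the subsequent specific-region determination.
enhanced = contrast_enhance([0.4, 0.5, 0.6])
```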
The endoscope 12 has an insertion section 12a to be inserted into a subject containing the observation target, an operation section 12b provided at the proximal end of the insertion section 12a, and a bending section 12c and a distal end section 12d provided on the distal side of the insertion section 12a. Operating the angle knob 12e (see FIG. 2) of the operation section 12b bends the bending section 12c; as a result, the distal end section 12d is directed in the desired direction. As shown in FIG. 2, the operation section 12b is provided with, in addition to the angle knob 12e, a treatment tool insertion port 12f, a scope button 12g, and a zoom operation section 13. The treatment tool insertion port 12f is the entrance for inserting a treatment tool such as biopsy forceps, a snare, or an electrosurgical knife; a treatment tool inserted into the treatment tool insertion port 12f protrudes from the distal end section 12d. Various operations can be assigned to the scope button 12g; in the present embodiment it is used to switch the observation mode. By operating the zoom operation section 13, the observation target can be imaged enlarged or reduced.
As shown in FIG. 3, the light source device 14 includes a light source unit 20 having light sources that emit illumination light, and a light source control unit 22 that controls the operation of the light source unit 20. The light source unit 20 emits illumination light that illuminates the observation target. The illumination light includes light emission such as excitation light used to produce the illumination light. The light source unit 20 includes, for example, a laser diode, an LED (Light Emitting Diode), a xenon lamp, or a halogen lamp as a light source, and emits at least white illumination light or the excitation light used to produce white illumination light. "White" includes so-called pseudo-white, which is substantially equivalent to white when imaging the observation target with the endoscope 12.
The light source unit 20 includes, as needed, a phosphor that emits light when irradiated with the excitation light, or optical filters that adjust the wavelength band, spectrum, light quantity, or the like of the illumination light or excitation light. In addition, the light source unit 20 can emit illumination light consisting at least of narrowband light (hereinafter referred to as narrowband light). The light source unit 20 can also emit a plurality of illumination lights with spectra that differ from one another, and these may include narrowband light. Furthermore, the light source unit 20 can emit light of the specific wavelength band or spectrum needed to capture images used for calculating biological information, such as the oxygen saturation of hemoglobin contained in the observation target.
"Narrowband" means a substantially single wavelength band in relation to the characteristics of the observation target and/or the spectral characteristics of the color filters of the image sensor 45. For example, light whose wavelength band is about ±20 nm or less (preferably about ±10 nm or less) is narrowband.
In the present embodiment, the light source unit 20 has LEDs of four colors: a V-LED 20a, a B-LED 20b, a G-LED 20c, and an R-LED 20d. The V-LED 20a emits violet light VL with a center wavelength of 405 nm and a wavelength band of 380–420 nm. The B-LED 20b emits blue light BL with a center wavelength of 460 nm and a wavelength band of 420–500 nm. The G-LED 20c emits green light GL with a wavelength band extending from 480 to 600 nm. The R-LED 20d emits red light RL with a center wavelength of 620–630 nm and a wavelength band extending from 600 to 650 nm. The center wavelengths of the V-LED 20a and the B-LED 20b have a width of about ±20 nm, preferably about ±5 nm to ±10 nm. The violet light VL is short-wavelength light used in the special mode or support mode to detect dense surface blood vessels, intramucosal hemorrhage, and extramucosal hemorrhage, and its center or peak wavelength preferably includes 410 nm. The violet light VL is preferably narrowband light.
The light source control unit 22 controls the timing of turning on, turning off, or blocking each light source constituting the light source unit 20, the emission amount, and so on. As a result, the light source unit 20 can emit multiple types of illumination light with different spectra, each for a preset period and with a preset emission amount. In the present embodiment, the light source control unit 22 adjusts the spectrum of the illumination light by inputting an independent control signal to each of the LEDs 20a–20d, setting whether each is lit, its emission amount when lit, the insertion and removal of optical filters, and so on. The light source unit 20 thereby emits white illumination light, multiple types of illumination light with different spectra, or illumination light consisting at least of narrowband light.
In the present embodiment, as shown in FIG. 4, the light source unit 20 emits white illumination light under the control of the light source control unit 22. The white illumination light is the normal light. When the normal mode or the support mode is set, the light source control unit 22 controls the LEDs 20a–20d so as to emit normal light in which the light intensity ratio between the violet light VL, blue light BL, green light GL, and red light RL is Vc:Bc:Gc:Rc. The light intensity ratio Vc:Bc:Gc:Rc corresponds to the light quantity condition of the white illumination light.
The light source unit 20 preferably emits, under the control of the light source control unit 22, a plurality of special lights with spectra that differ from one another. This is because imaging with a plurality of special lights yields a plurality of support image signals, and thus a plurality of mutually different support images. There may be n types of special light, where n is an integer of two or more. In the present embodiment, there are three special lights: a first illumination light, a second illumination light, and a third illumination light.
When the special mode or the support mode is set, the light source control unit 22 controls the LEDs 20a–20d so as to emit, as the first illumination light, special light in which the light intensity ratio between the violet light VL, blue light BL, green light GL, and red light RL is Vs1:Bs1:Gs1:Rs1. The light intensity ratio Vs1:Bs1:Gs1:Rs1 corresponds to the light quantity condition of the first illumination light. The first illumination light preferably emphasizes surface blood vessels. Therefore, as shown in FIG. 5, the first illumination light has a light intensity ratio of 1:0:0:0, emitting only the violet light VL as short-wavelength narrowband light.
When the special mode or the support mode is set, the light source control unit 22 controls the LEDs 20a–20d so as to emit, as the second illumination light, special light in which the light intensity ratio between the violet light VL, blue light BL, green light GL, and red light RL is Vs2:Bs2:Gs2:Rs2. The light intensity ratio Vs2:Bs2:Gs2:Rs2 corresponds to the light quantity condition of the second illumination light. The second illumination light preferably emphasizes structures such as surface blood vessels and polyps. Therefore, for the second illumination light, the light intensity of the violet light VL is preferably greater than that of the blue light BL; for example, as shown in FIG. 6, the ratio of the light intensity Vs2 of the violet light VL to the light intensity Bs2 of the blue light BL is set to 4:1.
When the special mode or the support mode is set, the light source control unit 22 controls the LEDs 20a–20d so as to emit, as the third illumination light, special light in which the light intensity ratio between the violet light VL, blue light BL, green light GL, and red light RL is Vs3:Bs3:Gs3:Rs3. The light intensity ratio Vs3:Bs3:Gs3:Rs3 corresponds to the light quantity condition of the third illumination light. The third illumination light preferably emphasizes deep blood vessels. Therefore, for the third illumination light, the light intensity of the blue light BL is preferably greater than that of the violet light VL; for example, as shown in FIG. 7, the ratio of the light intensity Vs3 of the violet light VL to the light intensity Bs3 of the blue light BL is set to 1:3.
All three of the first, second, and third illumination lights of the present embodiment are illumination lights that include narrowband light. In this specification, the light intensity ratio includes cases in which the ratio of at least one semiconductor light source is 0 (zero), that is, cases in which one or more of the semiconductor light sources are not lit. For example, even when only one semiconductor light source is lit and the other three are not, as when the light intensity ratio between the violet light VL, blue light BL, green light GL, and red light RL is 1:0:0:0, the light is regarded as having a light intensity ratio.
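The light quantity conditions above can be represented as ratio tuples in which a zero entry denotes an unlit LED, consistent with the convention just stated. In this sketch the GL and RL entries for the second and third illumination lights are set to 0 purely for illustration; the text specifies only the VL:BL ratios for those lights.

```python
# Light intensity ratios (VL, BL, GL, RL) for each special illumination.
# A zero entry means that LED is not lit; per the convention above, the
# tuple still counts as a light intensity ratio.
ILLUMINATION_RATIOS = {
    "first":  (1, 0, 0, 0),  # violet narrowband light only
    "second": (4, 1, 0, 0),  # VL:BL = 4:1 (GL, RL assumed 0 here)
    "third":  (1, 3, 0, 0),  # VL:BL = 1:3 (GL, RL assumed 0 here)
}


def lit_leds(ratio):
    """Indices of the LEDs that are actually lit (nonzero ratio)."""
    return [i for i, v in enumerate(ratio) if v > 0]
```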
The plurality of illumination lights are preferably emitted repeatedly in a preset order. In the present embodiment, the normal light and the special lights (the first, second, and third illumination lights) are emitted repeatedly in a preset order. For example, as shown in FIG. 8, the light source control unit 22 emits the normal light CL for five consecutive frames (5FL), then the first illumination light 1S for one frame (1FL), then the normal light CL again for five consecutive frames (5FL), then the second illumination light 2S for one frame (1FL), then the normal light CL again for five consecutive frames (5FL), and finally the third illumination light 3S for one frame (1FL). This emission sequence constitutes one cycle (1CY) of the illumination pattern, and the cycle is repeated.
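The one-cycle pattern of FIG. 8 can be written out as an 18-frame sequence that repeats indefinitely. A minimal sketch; the generator name and frame labels (CL, 1S, 2S, 3S, matching the text) are used only for illustration.

```python
from itertools import islice


def illumination_pattern():
    """Yield the illumination type for each frame, repeating the one-cycle
    pattern: 5xCL, 1S, 5xCL, 2S, 5xCL, 3S (18 frames per cycle)."""
    cycle = (["CL"] * 5 + ["1S"]
             + ["CL"] * 5 + ["2S"]
             + ["CL"] * 5 + ["3S"])
    while True:
        for frame in cycle:
            yield frame


# First 20 frames: one full cycle plus the start of the next.
frames = list(islice(illumination_pattern(), 20))
```

With this interleaving, display images (from CL frames) stay smooth while each special light still contributes one support image signal per cycle.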
Imaging the observation target illuminated with the normal light yields a normal image signal, which becomes the normal image; the first illumination light yields a first image signal, the second illumination light a second image signal, and the third illumination light a third image signal. Since the first, second, and third illumination lights are special lights, the first, second, and third image signals are each special images.
The distal end section 12d of the endoscope 12 is provided with an illumination optical system 30a and an imaging optical system 30b (see FIG. 3). The illumination optical system 30a has an illumination lens 42, through which the illumination light is emitted toward the observation target.
The imaging optical system 30b has an objective lens 43, a zoom lens 44, and an image sensor 45. The image sensor 45 images the observation target through the objective lens 43 and the zoom lens 44, using light returning from the observation target, such as reflected illumination light (including, besides reflected light, scattered light, fluorescence emitted by the observation target, and fluorescence caused by a drug administered to the observation target). The zoom lens 44 moves when the zoom operation section 13 is operated, enlarging or reducing the image of the observation target.
The image sensor 45 has, for each pixel, a color filter of one color among multiple color filters. In the present embodiment, the image sensor 45 is a color sensor with primary color filters. Specifically, the image sensor 45 has R pixels with a red color filter (R filter), G pixels with a green color filter (G filter), and B pixels with a blue color filter (B filter).
A CCD (Charge Coupled Device) sensor or a CMOS (Complementary Metal Oxide Semiconductor) sensor can be used as the image sensor 45. Although the image sensor 45 of the present embodiment is a primary color sensor, a complementary color sensor may also be used. A complementary color sensor has, for example, cyan pixels with a cyan color filter, magenta pixels with a magenta color filter, yellow pixels with a yellow color filter, and green pixels with a green color filter. When a complementary color sensor is used, the images obtained from these pixels can be converted into images equivalent to those obtained with a primary color sensor through complementary-to-primary color conversion. The same applies when a primary or complementary color sensor has one or more types of pixels with characteristics other than the above, such as W pixels (white pixels that receive light in almost the entire wavelength band). Although the image sensor 45 of the present embodiment is a color sensor, a monochrome sensor without color filters may also be used.
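One textbook approximation for the complementary-to-primary conversion mentioned above models each complementary response as a sum of two primaries (Cy = G + B, Mg = R + B, Ye = R + G) and inverts that linear system. This is a generic illustration, not the conversion actually used in the device, and it ignores sensor-specific spectral overlap.

```python
def cmy_to_rgb(cy, mg, ye):
    """Invert Cy = G + B, Mg = R + B, Ye = R + G to recover (R, G, B).
    Assumes idealized, perfectly additive complementary responses."""
    r = (mg + ye - cy) / 2
    g = (ye + cy - mg) / 2
    b = (mg + cy - ye) / 2
    return r, g, b


# Example: responses synthesized from R=0.8, G=0.5, B=0.2.
r, g, b = cmy_to_rgb(cy=0.7, mg=1.0, ye=1.3)
```

A real pipeline would apply such a conversion per pixel after demosaicing, typically with a calibrated 3x3 (or 3x4, including green) matrix rather than this idealized inverse.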
 A memory (not shown) in the processor device 16 stores programs related to the processing performed by the control unit 51, the image acquisition unit 52, the image processing unit 56, the display control unit 57, and the like, which are described later. The functions of the control unit 51, the image acquisition unit 52, the image processing unit 56, and the display control unit 57 are realized by running these programs on the control unit 51, which is constituted by the processor of the processor device 16 functioning as an image processing device.
 The control unit 51 performs overall control of the endoscope system 10, such as synchronizing the illumination timing with the imaging timing. When various settings are entered using the keyboard 19 or the like, the control unit 51 supplies those settings to the relevant parts of the endoscope system 10, such as the light source control unit 22, the image sensor 45, or the image processing unit 56.
 The image acquisition unit 52 acquires from the image sensor 45 an image of the observation target captured using the pixels of each color, that is, a RAW image. A RAW image is an image (endoscopic image) before demosaic processing is performed. As long as demosaic processing has not yet been performed, an image obtained by applying arbitrary processing, such as noise reduction, to the image acquired from the image sensor 45 is also regarded as a RAW image.
 The image acquisition unit 52 includes a DSP (Digital Signal Processor) 53, a noise reduction unit 54, and a conversion unit 55 in order to apply various kinds of processing to the acquired RAW image as needed.
 The DSP 53 includes, for example, an offset processing unit, a defect correction processing unit, a demosaic processing unit, a linear matrix processing unit, and a YC conversion processing unit (none of which are shown). Using these, the DSP 53 applies various kinds of processing to the RAW image or to images generated from the RAW image.
 The offset processing unit applies offset processing to the RAW image. Offset processing reduces the dark-current component of the RAW image and sets an accurate zero level; it is sometimes called clamp processing. The defect correction processing unit applies defect correction processing to the RAW image. Defect correction processing corrects or generates the pixel values of RAW pixels corresponding to defective pixels when the image sensor 45 includes pixels having defects caused by the manufacturing process or by aging (defective pixels).
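The offset (clamp) processing described above can be sketched minimally as follows, assuming the dark-current component appears as a constant black level; the value 16 used in the example is hypothetical, not a level stated in this document.

```python
def clamp_offset(raw_pixels, black_level):
    """Subtract the dark-current black level from each RAW pixel value and
    clamp negative results to zero, so the image has an accurate zero level."""
    return [max(p - black_level, 0) for p in raw_pixels]

print(clamp_offset([15, 16, 20, 300], black_level=16))  # → [0, 0, 4, 284]
```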
 The demosaic processing unit applies demosaic processing to the RAW image of each color corresponding to each color filter. Demosaic processing generates, by interpolation, the pixel values that are missing in the RAW image due to the arrangement of the color filters. The linear matrix processing unit performs linear matrix processing on an endoscopic image generated by assigning one or more RAW images to the R, G, and B channels; linear matrix processing enhances the color reproducibility of the endoscopic image. The YC conversion processing performed by the YC conversion processing unit converts an endoscopic image generated by assigning one or more RAW images to the R, G, and B channels into an endoscopic image having a luminance channel Y, a color difference channel Cb, and a color difference channel Cr.
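The YC conversion described above can be sketched per pixel as follows, assuming the widely used ITU-R BT.601 coefficients for 8-bit full-range values; the coefficients actually used by the DSP 53 are not specified in this document.

```python
def rgb_to_ycbcr(r, g, b):
    """Convert one 8-bit RGB pixel into a luminance channel Y and
    color-difference channels Cb and Cr (BT.601-style, full range)."""
    y = 0.299 * r + 0.587 * g + 0.114 * b
    cb = 0.564 * (b - y) + 128  # blue-difference, offset to keep it non-negative
    cr = 0.713 * (r - y) + 128  # red-difference
    return y, cb, cr

print(rgb_to_ycbcr(255, 255, 255))  # white → (255.0, 128.0, 128.0)
```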
 The noise reduction unit 54 applies noise reduction processing, using, for example, a moving average method or a median filter method, to the endoscopic image having the luminance channel Y, the color difference channel Cb, and the color difference channel Cr. The conversion unit 55 converts the noise-reduced luminance channel Y, color difference channel Cb, and color difference channel Cr back into an endoscopic image having B, G, and R color channels.
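The median filter method mentioned above can be sketched in one dimension as follows; the 3-sample window is a hypothetical choice, and an actual implementation would typically apply the filter two-dimensionally to each channel.

```python
def median_filter_1d(signal):
    """Replace each interior sample with the median of its 3-sample window,
    suppressing isolated impulse noise while preserving edges."""
    out = list(signal)
    for i in range(1, len(signal) - 1):
        window = sorted(signal[i - 1:i + 2])
        out[i] = window[1]  # median of three
    return out

print(median_filter_1d([10, 10, 90, 10, 10]))  # impulse removed → [10, 10, 10, 10, 10]
```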
 The image processing unit 56 performs the necessary image processing or calculations on the endoscopic image output by the image acquisition unit 52. As shown in FIG. 9, the image processing unit 56 includes a display image generation unit 61, an enhancement processing unit 62, a support image signal acquisition unit 63, a specific area determination unit 64, and a support image generation unit 65. The enhancement processing unit 62 includes a color enhancement unit 66 and a contrast enhancement unit 67.
 The display image generation unit 61 generates a display image by applying, to the endoscopic image output by the image acquisition unit 52, the image processing necessary for display on the display 18. The enhancement processing unit 62 applies enhancement processing to the endoscopic image output by the image acquisition unit 52. The support image signal acquisition unit 63 acquires support image signals. The support image signals include endoscopic images that have undergone enhancement processing and endoscopic images captured using special light. The specific area determination unit 64 performs processing for determining a specific area in each of the support image signals. The support image generation unit 65 generates support images indicating the specific areas.
 The display image generation unit 61 generates a display image to be shown on the display using a display image signal obtained from at least one image signal. In the present embodiment, as shown in FIG. 10, in the normal mode and the support mode, the display image generation unit 61 generates the display image using the image signal obtained with normal light as the display image signal, and displays a normal image with natural hues on the display 18 as the display image 68. In the special mode and the support mode, the display image is generated using the image signal obtained with special light as the display image signal; for example, a special image in which a specific state of the observation target is emphasized is displayed on the display 18.
 As described above, in the present embodiment, the light source control unit 22 repeats a specific illumination pattern in units of frames (see FIG. 8). A frame is the unit of imaging; one image capture and one image signal acquisition are performed per frame. In the present embodiment, since the display image is generated from the image signal obtained with normal light and shown on the display, no display image is generated from the image signals obtained with the special light of the illumination pattern other than the normal light CL, namely the first illumination light 1S, the second illumination light 2S, and the third illumination light 3S; these image signals are instead used for generating support images. During the frames in which the special light of the first illumination light 1S, the second illumination light 2S, or the third illumination light 3S is emitted, the display image based on the image signal obtained with the immediately preceding normal light continues to be displayed. Therefore, all display images are based on image signals obtained with normal light.
 The enhancement processing unit 62 applies enhancement processing to the endoscopic image output by the image acquisition unit 52. Enhancement means making it possible to obtain information on a specific portion by distinguishing it from other tissues or structures. For example, changing the color or brightness of a portion having a specific feature relative to other portions (for example, normal mucosa) is enhancement processing. The endoscopic image processed by the enhancement processing unit 62 may be a normal image or a special image. An endoscopic image that has undergone enhancement processing is used as a support image signal, that is, an image signal for generating a support image. The enhancement processing unit 62 includes a color enhancement unit 66 and a contrast enhancement unit 67. In the present embodiment, the third image signal obtained with the third illumination light is processed by the color enhancement unit 66 to produce a support image signal.
 The color enhancement unit 66 applies enhancement processing to the acquired endoscopic image so that, for example, the boundary between normal and abnormal regions of the observation target is clearly expressed by hue and saturation. The color enhancement unit 66 performs color information conversion processing on the acquired endoscopic image. Color information conversion processing moves each of a plurality of ranges distributed in the color space of the acquired endoscopic image to a destination range associated with the range before conversion. In an endoscopic image processed by the color enhancement unit 66, the boundary between normal and abnormal regions is clear, so the abnormal region can be determined as a specific area more easily and more accurately.
 The contrast enhancement unit 67 applies enhancement processing to the acquired endoscopic image so that, for example, the blood vessels of the observation target are displayed with emphasis. The contrast enhancement unit 67 obtains, for the acquired endoscopic image, a density histogram, that is, a graph with pixel value (luminance value) on the horizontal axis and frequency on the vertical axis, and performs gradation correction using a gradation correction table stored in advance in a memory (not shown) of the image processing unit 56. The gradation correction table has a gradation correction curve, with input values on the horizontal axis and output values on the vertical axis, that represents the correspondence between input and output values; gradation correction based on, for example, a roughly S-shaped gradation correction curve widens the dynamic range of the acquired endoscopic image. As a result, low-density portions of the original image become lower in density and high-density portions become higher, so that, for example, the density difference between blood vessel regions and regions without blood vessels widens and the contrast of the blood vessels improves. Therefore, in an endoscopic image processed by the contrast enhancement unit 67, the contrast of blood vessels is improved, so that, for example, a region with a high density of blood vessels can be determined as a specific area more easily and more accurately.
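Gradation correction with a roughly S-shaped curve, as described above, can be sketched as follows. The particular smoothstep-style curve is a hypothetical stand-in for the gradation correction table stored in the memory of the image processing unit 56.

```python
def s_curve(pixel_value):
    """Apply a roughly S-shaped gradation correction to an 8-bit pixel value:
    dark pixels are pushed darker and bright pixels brighter, widening the
    dynamic range and increasing e.g. vessel-to-mucosa contrast."""
    t = pixel_value / 255.0
    return 255.0 * (3 * t * t - 2 * t * t * t)  # smoothstep-style curve

print(s_curve(64) < 64, s_curve(192) > 192)  # → True True
```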
 The support image signal acquisition unit 63 acquires the support image signals in which the specific area determination unit 64 determines specific areas. The support image signals include endoscopic images that have undergone enhancement processing and endoscopic images captured using special light. In the present embodiment, there are three types of support image signals: the first image signal obtained with the first illumination light, the second image signal obtained with the second illumination light, and the enhanced third image signal obtained by applying the color enhancement processing of the color enhancement unit 66 to the third image signal obtained with the third illumination light. The support image signal acquisition unit 63 acquires these support image signals from the image acquisition unit 52 or the enhancement processing unit 62.
 The specific area determination unit 64 determines, in each of the plurality of support image signals, a specific area in which the observation target is in a specific state. That is, for each support image signal, it determines whether a specific area exists and, if so, where that area is. Determining a specific area includes determining that no specific area exists.
 As shown in FIG. 11, the specific area determination unit 64 includes a determination processing unit for each of the plurality of support image signals, including a first determination processing unit 71, a second determination processing unit 72, and a third determination processing unit 73. The determination processing units differ, according to the type of support image signal, in which specific state of the observation target they determine as a specific area. The specific area determination unit 64 therefore includes the first determination processing unit 71 through an nth determination processing unit 74.
 In the present embodiment, there are three types of support image signals: the first image signal, the second image signal, and the enhanced third image signal. The first determination processing unit 71 performs determination processing on the first image signal, in which the surface blood vessels are emphasized by the first illumination light. In the present embodiment, when the support image signal is the first image signal, the first determination processing unit 71 determines a region in which the irregularity of the surface blood vessels exceeds a preset threshold and sets it as the first specific area. A region in which the surface blood vessels emphasized by the first illumination light are highly irregular serves, for example, as an index for determining the degree of inflammation of ulcerative colitis when the observation target is the large intestine. Therefore, the first specific area in the first image signal is a non-remission region of ulcerative colitis.
 The second determination processing unit 72 performs determination processing on the second image signal, in which structures such as surface blood vessels and polyps are emphasized by the second illumination light. In the present embodiment, when the support image signal is the second image signal, the second determination processing unit 72 determines a region with high luminance values indicating a structure such as a polyp and sets it as the second specific area. A region with high luminance values emphasized by the second illumination light indicates, for example, the presence of a polyp or the like with an irregular surface on the observation target, and thus serves as an index for determining lesions such as cancer. Therefore, the second specific area in the second image signal is a region that may contain a lesion, such as a cancer, having a polyp or the like.
 The third determination processing unit 73 performs determination processing on the enhanced third image signal. In the enhanced third image signal, the boundary between normal and abnormal regions of the observation target is clearly expressed by hue and saturation owing to the third illumination light and the enhancement processing by the color enhancement unit 66. In the present embodiment, when the support image signal is the enhanced third image signal, the third determination processing unit 73 sets a region in which the mucosa is abnormal as the third specific area. Owing to the third illumination light and the enhancement processing, the third image signal serves, for example, as an index for determining lesions, such as cancers, whose color differs from the surroundings. Therefore, the third specific area in the enhanced third image signal is a region that may contain a lesion, such as a cancer, exhibiting redness or the like.
 The support image generation unit 65 generates support images indicating the specific areas in the observation image determined by the specific area determination unit 64 in each of the plurality of support image signals. A support image is an image in a form that allows the specific area on the observation image to be recognized.
 The plurality of support images are preferably displayed as identical or similar figures. A support image is also preferably displayed with an area corresponding to the area of the specific area; the size of the support image can therefore change according to the size of the specific area. Specifically, a support image is, for example, a figure surrounding the portion of the observation image that contains the specific area. As shown in FIG. 12, the support image 81 indicates that the first specific area lies within the region enclosed by the square figure that is the support image 81.
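A minimal sketch of generating a square support figure whose size tracks the specific area is shown below, assuming the specific area is given as a list of (x, y) pixel coordinates; the 2-pixel margin is a hypothetical choice.

```python
def support_square(region_pixels, margin=2):
    """Return (x0, y0, side) of a square enclosing the specific area, so that
    the figure's size follows the size of the region it marks."""
    xs = [p[0] for p in region_pixels]
    ys = [p[1] for p in region_pixels]
    side = max(max(xs) - min(xs), max(ys) - min(ys)) + 2 * margin
    return min(xs) - margin, min(ys) - margin, side

print(support_square([(10, 10), (14, 12)]))  # → (8, 8, 8)
```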
 As shown in FIG. 13, the support image 82 indicates that the second specific area lies within the region enclosed by the square figure that is the support image 82. The diagonal hatching of the support image 82 represents a specific color. Similarly, as shown in FIG. 14, the support image 83 indicates that the third specific area lies within the region enclosed by the square figure that is the support image 83.
 When showing the display image on the display, the display control unit 57 performs control to superimpose the plurality of support images on the display image in a form that allows them to be distinguished from one another. The support images are superimposed on the display image in a form in which each can be individually recognized; for example, each support image can be displayed in a different color. As shown in FIG. 15, the display control unit performs control to superimpose the support image 81 indicating the first specific area, the support image 82 indicating the second specific area, and the support image 83 indicating the third specific area on the display image 68 in different colors.
 As described above, the processor device 16 or the endoscope system 10 makes it possible to grasp a plurality of different types of determination results at a glance while displaying an endoscopic image with the natural colors of normal light. In the present embodiment, the first specific area is a non-remission region of ulcerative colitis, the second specific area is a region that may contain a lesion, such as a cancer, having a polyp or the like, and the third specific area is a region that may contain a lesion, such as a cancer, exhibiting redness or the like; however, which IEE each specific area corresponds to can be set in advance according to the site or purpose of the endoscopy. For example, in screening, by setting the plurality of specific areas so as to detect various kinds of abnormalities, such as abnormalities of color or shape, overlooking of various lesions can be prevented in a single screening, which not only avoids missed lesions but also shortens the examination time. In determining the degree of inflammation of ulcerative colitis, setting the plurality of specific areas so as to better discriminate degrees of inflammation such as mild, moderate, and severe allows a more detailed distribution of lesions or disease state to be grasped in a single endoscopy, so that recovery or deterioration can be judged more accurately. The processor device 16 or the endoscope system 10 thus makes it easy to recognize detailed information about the observation target.
 Note that the display control unit 57 may perform control to superimpose the support image 81 indicating the first specific area, the support image 82 indicating the second specific area, and the support image 83 indicating the third specific area on the display image 68 as figures of different shapes. As shown in FIG. 16, the support image 81 indicating the first specific area can be shown as a quadrangle, the support image 82 indicating the second specific area as a hexagon, and the support image 83 indicating the third specific area as a circle. In this case, the colors of the support images may be the same or different. Instead of indicating each specific area with a single figure, each specific area may also be indicated by a different pattern, such as a scattering of smaller figures.
 The display control unit 57 may also perform control to superimpose a support image on the display image each time the support image generation unit 65 generates one. In this case, each time the support image signal acquisition unit 63 acquires a support image signal, the specific area determination unit 64 determines the specific area and the support image generation unit 65 generates a support image indicating it. For example, as shown in FIG. 17, in the present embodiment, when the first image signal, which is a support image signal, is obtained by imaging the observation target illuminated by the first illumination light 1S, the support image generation unit 65 generates the support image 81 indicating the first specific area. The support image 81 is acquired once per cycle (1CY) of the illumination light (see FIG. 8). The acquired support image 81 continues to be superimposed on the display image 68 until the next support image 81 is acquired. Likewise, the support image 82 or the support image 83 is acquired once per cycle (1CY) of the illumination light and continues to be superimposed on the display image 68 until the next support image 82 or 83 is acquired. The display image itself is updated and displayed as soon as a new normal image is acquired.
 Since the display control unit 57 superimposes each support image on the display image as soon as the support image generation unit 65 generates it, the support images are displayed in real time. By making the frame period sufficiently short relative to human visual perception, the support images follow the observation target almost in real time even if the observation target moves somewhat. Specifically, for example, when three kinds of support images from three kinds of IEE are superimposed and a support image signal is acquired every 6 frames, one cycle (1CY) of the illumination light is 18 frames, and the positions and other attributes of the support images indicating the specific areas of the observation target are updated every 18 frames. The support images are therefore displayed almost in real time.
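The update-period arithmetic above can be expressed as a small sketch; the 60 fps frame rate in the usage example is hypothetical and not stated in this document.

```python
def support_update_period(num_iee_types, frames_per_signal):
    """Frames in one illumination cycle (1CY): each IEE type contributes one
    support image signal every `frames_per_signal` frames."""
    return num_iee_types * frames_per_signal

cycle_frames = support_update_period(num_iee_types=3, frames_per_signal=6)
print(cycle_frames)          # → 18 frames per cycle
print(cycle_frames / 60.0)   # update interval at a hypothetical 60 fps → 0.3 s
```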
 The display control unit 57 may also perform control to display on the display a legend indicating the association between each support image and the enhancement processing or the support-image illumination light. Support images are generated from support image signals, which include signals obtained by imaging the observation target under various kinds of special light and signals obtained by applying various kinds of image processing, such as enhancement processing, to an image signal. It is therefore preferable that one can easily tell whether a support image derives from a support image signal captured with special light or from a support image signal that has undergone enhancement processing or the like.
 For example, as shown in FIG. 18, in the present embodiment, the support image 81 indicating the first specific area is obtained with the first illumination light, the support image 82 indicating the second specific area is obtained with the second illumination light, and the support image 83 indicating the third specific area is obtained from the enhanced third image signal. Accordingly, "illumination light 1," indicating IEE by the first illumination light, in the same color as the support image 81; "illumination light 2," indicating IEE by the second illumination light, in the same color as the support image 82; and "color enhancement," indicating IEE by enhancement processing, in the same color as the support image 83, can be displayed at the lower right of the display image as the legend display 91 for the support images.
 By displaying as a legend the kind of IEE from which each support image was obtained, the viewer can easily grasp, when a support image is displayed, what kind of lesion or the like was detected. Moreover, even when the display image is a normal image because no lesion or the like exists in the observation target and no support image is displayed, the legend display 91 shows what kind of IEE is currently being performed, that is, for example, what kinds of special-light illumination are used in one cycle (1CY) of the illumination pattern. The legend display 91 is therefore also useful in endoscopy as an indication of which IEE is in progress at any given moment.
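The legend display 91 described above can be thought of as a small table mapping each support-image source to the color used to draw that support image. The following is a minimal sketch of such a mapping; the concrete colors, labels, and data structure are illustrative assumptions, not taken from the patent.

```python
# Hypothetical sketch: build the legend entries shown at the lower right of
# the display image. Each entry pairs the IEE source of a support image with
# the color used to draw that support image, so the viewer can tell whether
# a region came from special-light imaging or from enhancement processing.

# Illustrative configuration: support image number -> (IEE source label, color)
SUPPORT_IMAGE_SOURCES = {
    81: ("illumination light 1", "yellow"),   # first illumination light
    82: ("illumination light 2", "green"),    # second illumination light
    83: ("color enhancement", "magenta"),     # enhancement processing
}

def build_legend(sources):
    """Return one legend line per support-image source, color first."""
    return [f"{color}: {label}" for label, color in sources.values()]

legend = build_legend(SUPPORT_IMAGE_SOURCES)
```

Because the legend is derived from the same configuration that assigns colors to the support images, legend and overlay colors cannot drift apart.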
 Further, as shown in FIG. 19, when the observation target contains an area where a plurality of specific regions overlap, the support image 82 and the support image 83 may be displayed overlapping each other. Also, as shown in FIG. 20, when a plurality of specific regions in the observation target have the same area, the support image 82 and the support image 83 are displayed in a mutually distinguishable manner.
 When a plurality of specific regions overlap, or have the same area, displaying the support images in a distinguishable manner yields more detailed information about the observation target. For example, if there is a finding that a location where two specific regions overlap, one in which a structure such as a polyp has been determined and one in which a region of high superficial blood-vessel density has been determined, is likely to be a cancerous lesion, the overlap of these two specific regions can be recognized easily and the determination can be made with better accuracy. Therefore, when there is a finding that the observation target is a lesion in a specific state where a plurality of specific regions overlap, it is preferable to set the support image signals used to determine the specific regions according to the purpose, so that a region of the desired specific state in the observation target is indicated by the support images.
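The overlap check described above can be sketched with binary masks over the image grid. The clinical rule (overlap of a polyp region and a dense-vessel region suggesting a lesion) is the patent's own example; the mask representation and the example data below are assumptions for illustration.

```python
# Hypothetical sketch: detect where two specific regions overlap, each
# region represented as a binary mask (list of rows of 0/1 values).

def overlap_mask(mask_a, mask_b):
    """Elementwise AND of two binary masks of the same shape."""
    return [[a & b for a, b in zip(row_a, row_b)]
            for row_a, row_b in zip(mask_a, mask_b)]

def overlap_area(mask):
    """Number of pixels set in a binary mask."""
    return sum(sum(row) for row in mask)

# Illustrative 3x3 masks for two determined specific regions.
polyp_region  = [[0, 1, 1],
                 [0, 1, 1],
                 [0, 0, 0]]
vessel_region = [[0, 0, 1],
                 [0, 1, 1],
                 [0, 1, 0]]

shared = overlap_mask(polyp_region, vessel_region)
# If the shared area is non-empty, both support images would be drawn
# over this location, in a mutually distinguishable manner.
regions_overlap = overlap_area(shared) > 0
```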
 Note that determining one type of specific region is not limited to using one image signal; two or more image signals may be used to determine one type of specific region. For example, when a region of low oxygen saturation in the observation target is taken as a specific region, the determination may be made using two or more image signals. Using two or more image signals to determine one type of specific region is preferable because it increases the variety of specific states of the observation target that a specific region can indicate.
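A determination that combines two image signals, as in the low-oxygen-saturation example above, can be sketched as a per-pixel ratio test. Real oxygen-saturation imaging relies on calibrated spectral measurements; the ratio formula and the threshold below are placeholders for illustration only.

```python
# Hypothetical sketch: determine one type of specific region (low oxygen
# saturation) from two image signals, pixel by pixel.

def low_saturation_region(signal_a, signal_b, threshold=0.6):
    """Mark pixels whose a/b ratio falls below an assumed threshold."""
    region = []
    for row_a, row_b in zip(signal_a, signal_b):
        region.append([1 if (a / b) < threshold else 0
                       for a, b in zip(row_a, row_b)])
    return region

# Illustrative 2x2 signals captured under two different illuminations.
signal_a = [[50, 90], [30, 80]]
signal_b = [[100, 100], [100, 100]]
mask = low_saturation_region(signal_a, signal_b)
```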
 The support image indicating a specific region may also carry text information. The text information displays, in text, the determination result for the specific region and the like. For example, as shown in FIG. 21, the support image 81 carries the text information 92 "UC severity: mild". The text information 92 indicates in text that the specific state of the observation target shown by the support image 81 has been determined to be mild ulcerative colitis. Here, UC severity classifies the intensity of inflammation into three levels: severe, moderate, and mild. The support image 82 carries the text information 93 "polyp: Φ2 mm", indicating in text that the specific state shown by the support image 82 has been determined to be a polyp 2 mm in diameter. The support image 83 carries the text information 94 "cancer suspicion rate: 30%", indicating in text that the specific state shown by the support image 83 has been determined to be suspected cancer with a suspicion probability of 30%. Because the support images carry text information, more detailed support information can be displayed.
 Next, a series of processing performed by the processor device 16, which is the image processing device, or by the endoscope system 10 will be described with reference to the flowchart shown in FIG. 22. When observation starts, illumination light is emitted in a preset order according to a predetermined illumination light pattern (see FIG. 8). First, the illumination light is normal light and a normal image is acquired (step ST110). The normal image by normal light is displayed on the display as the display image (step ST120). Next, the illumination light becomes the first illumination light and the first image signal is acquired (step ST130). Based on the first image signal, the first specific region is determined, a support image indicating the first specific region is superimposed on the display image, and the display image with the support image is displayed (step ST140). The illumination light then returns to normal light and a normal image is acquired (step ST150), which is displayed on the display as the display image (step ST160). Next, the illumination light becomes the second illumination light and the second image signal is acquired (step ST170). Based on the second image signal, the second specific region is determined, a support image indicating the second specific region is superimposed on the display image, and the display image with the support image is displayed (step ST180). The illumination light then returns to normal light and a normal image is acquired (step ST190), which is displayed on the display as the display image (step ST200).
 Next, the illumination light becomes the third illumination light and the third image signal is acquired (step ST210). Enhancement processing is applied to the third image signal (step ST220). Based on the enhanced third image signal, the third specific region is determined, a support image indicating the third specific region is superimposed on the display image, and the display image with the support image is displayed (step ST230).
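One cycle of the flowchart (steps ST110 to ST230) can be sketched as a loop over the illumination pattern, in which normal-light frames update the display image and special-light frames each yield a superimposed support image. The capture, determination, and enhancement steps are stubbed out below; their interfaces are assumptions for illustration, not the patent's implementation.

```python
# Hypothetical sketch of one illumination cycle (ST110-ST230): normal light
# alternates with three special lights, and only the third special light
# passes through enhancement processing before region determination.

ILLUMINATION_PATTERN = [
    "normal", "light1", "normal", "light2", "normal", "light3",
]

def run_cycle(capture, determine_region, enhance):
    """Return the list of display actions for one illumination cycle."""
    actions = []
    for light in ILLUMINATION_PATTERN:
        signal = capture(light)                  # ST110/130/150/170/190/210
        if light == "normal":
            actions.append(("display", signal))  # ST120/160/200
        else:
            if light == "light3":
                signal = enhance(signal)         # ST220 enhancement
            region = determine_region(light, signal)
            actions.append(("overlay", region))  # ST140/180/230
    return actions

# Stub implementations, for illustration only.
actions = run_cycle(
    capture=lambda light: f"{light}-signal",
    determine_region=lambda light, signal: f"region-from-{signal}",
    enhance=lambda signal: f"enhanced-{signal}",
)
```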
 In the embodiments and modifications above, the processor device 16 functions as the image processing device, but an image processing device including the image processing unit 56 may be provided separately from the processor device 16. In addition, as shown in FIG. 23, the image processing unit 56 can be provided in a diagnosis support device 911 that acquires RAW images captured with the endoscope 12, either directly from the endoscope system 10 or indirectly via a PACS (Picture Archiving and Communication Systems) 910. Further, as shown in FIG. 24, the image processing unit 56 can be provided in a medical service support device 930 connected, via a network 926, to various examination devices including the endoscope system 10, such as a first examination device 921, a second examination device 922, ..., and a K-th examination device 923.
 The embodiments and modifications above can be practiced in any combination of some or all of them. Also, although the endoscope 12 in the embodiments and modifications above is a so-called flexible endoscope having a flexible insertion portion 12a, the present invention is also suitable when using a capsule endoscope, which the subject swallows, or a rigid endoscope (laparoscope) used in surgery and the like.
 The embodiments and modifications above include an operation method for an image processing device comprising a processor, the method comprising: an image acquisition step of acquiring a plurality of image signals obtained by imaging an observation target with an endoscope; a display image generation step of generating, using a display image signal obtained based on at least one of the image signals, a display image to be shown on a display; a support image generation step of generating, based on each of the plurality of image signals, support images that delineate specific regions in which the observation target is in a specific state; and a display control step of performing control, when the display image is shown on the display, to display the plurality of support images superimposed on the display image in a mutually distinguishable manner.
 The embodiments and modifications above also include a program for an image processing device, installed in an image processing device that applies image processing to image signals obtained by imaging an observation target with an endoscope, the program causing a computer to realize: an image acquisition function of acquiring a plurality of image signals obtained by imaging an observation target with an endoscope; a display image generation function of generating, using a display image signal obtained based on at least one of the image signals, a display image to be shown on a display; a support image generation function of generating, based on each of the plurality of image signals, support images that delineate specific regions in which the observation target is in a specific state; and a display control function of performing control, when the display image is shown on the display, to display the plurality of support images superimposed on the display image in a mutually distinguishable manner.
 In the embodiment above, the hardware structure of the processing units that execute various kinds of processing, such as the control unit 51, the image acquisition unit 52, the DSP 53, the noise reduction unit 54, the conversion unit 55, the image processing unit 56, and the display control unit 57 included in the processor device 16, is any of the following various processors. The various processors include a CPU (Central Processing Unit), which is a general-purpose processor that executes software (programs) to function as various processing units; a programmable logic device (PLD), such as an FPGA (Field Programmable Gate Array), which is a processor whose circuit configuration can be changed after manufacture; and a dedicated electric circuit, which is a processor having a circuit configuration designed exclusively for executing various kinds of processing.
 One processing unit may be composed of one of these various processors, or of a combination of two or more processors of the same or different types (for example, a plurality of FPGAs, or a combination of a CPU and an FPGA). A plurality of processing units may also be composed of one processor. As a first example of composing a plurality of processing units with one processor, one processor may be composed of a combination of one or more CPUs and software, as typified by computers such as clients and servers, with this processor functioning as the plurality of processing units. As a second example, as typified by a system on chip (SoC), a processor may be used that realizes the functions of an entire system including the plurality of processing units with one IC (Integrated Circuit) chip. In this way, the various processing units are configured, as a hardware structure, using one or more of the various processors above.
 Furthermore, the hardware structure of these various processors is, more specifically, electric circuitry in which circuit elements such as semiconductor elements are combined.
 The present invention can be used not only in endoscope systems that acquire endoscopic images, processor devices, and other related devices, but also in systems or devices that acquire medical images (including moving images) other than endoscopic images. For example, the present invention is applicable to ultrasonic examination devices, X-ray imaging devices (including CT (Computed Tomography) examination devices and mammography devices), MRI (magnetic resonance imaging) devices, and the like.
 10 Endoscope system
 12 Endoscope
 12a Insertion portion
 12b Operation portion
 12c Bending portion
 12d Distal end portion
 12e Angle knob
 12f Treatment tool insertion port
 12g Scope button
 13 Zoom operation portion
 14 Light source device
 16 Processor device
 18 Display
 19 Keyboard
 20 Light source unit
 20a V-LED
 20b B-LED
 20c G-LED
 20d R-LED
 22 Light source control unit
 30a Illumination optical system
 30b Imaging optical system
 41 Light guide
 42 Illumination lens
 43 Objective lens
 44 Zoom lens
 45 Image sensor
 51 Control unit
 52 Image acquisition unit
 53 DSP
 54 Noise reduction unit
 55 Conversion unit
 56 Image processing unit
 57 Display control unit
 61 Display image generation unit
 62 Enhancement processing unit
 63 Support image signal acquisition unit
 64 Specific region determination unit
 65 Support image generation unit
 66 Color enhancement unit
 67 Contrast enhancement unit
 68 Display image
 71 First determination processing unit
 72 Second determination processing unit
 73 Third determination processing unit
 74 n-th determination processing unit
 81, 82, 83 Support image
 91 Legend display
 92, 93, 94 Text information
 910 PACS
 911 Diagnosis support device
 921 First examination device
 922 Second examination device
 923 K-th examination device
 926 Network
 930 Medical service support device
 ST110 to ST230 Steps

Claims (16)

  1.  An image processing device comprising a processor, wherein
     the processor is configured to:
     acquire a plurality of image signals obtained by imaging an observation target with an endoscope;
     generate, using a display image signal obtained based on at least one of the image signals, a display image to be shown on a display;
     determine, in each of the plurality of image signals, a specific region in which the observation target is in a specific state;
     generate support images indicating the specific regions; and
     perform control, when the display image is shown on the display, to display the plurality of support images superimposed on the display image in a mutually distinguishable manner.
  2.  The image processing device according to claim 1, wherein the plurality of support images are displayed as identical or similar figures.
  3.  The image processing device according to claim 1 or 2, wherein each support image is displayed with an area corresponding to the area of the specific region.
  4.  The image processing device according to any one of claims 1 to 3, wherein the processor displays each of the plurality of support images in a color different from the others.
  5.  The image processing device according to claim 1, wherein the processor displays each of the plurality of support images as a figure of a shape different from the others.
  6.  The image processing device according to any one of claims 1 to 5, wherein, each time a support image is generated, the processor performs control to display the support image superimposed on the display image.
  7.  The image processing device according to any one of claims 1 to 6, wherein enhancement processing is applied to the image signal to obtain a support image signal, and the specific region is determined based on the support image signal.
  8.  The image processing device according to claim 7, wherein the processor performs control to display on the display a legend indicating the association between the support images and the enhancement processing.
  9.  An endoscope system comprising: the image processing device according to any one of claims 1 to 8; and a light source unit that emits illumination light to irradiate the observation target.
  10.  The endoscope system according to claim 9, wherein the display image signal is obtained by imaging the observation target illuminated with white illumination light emitted by the light source unit.
  11.  The endoscope system according to claim 10, wherein the processor generates the support images by imaging the observation target illuminated with each of a plurality of support-image illumination lights emitted by the light source unit and having mutually different spectra.
  12.  The endoscope system according to claim 11, wherein each of the plurality of support-image illumination lights includes narrow-band light of a preset wavelength band.
  13.  The endoscope system according to claim 11 or 12, wherein the light source unit repeatedly emits the white illumination light and each of the plurality of support-image illumination lights in a preset order.
  14.  The endoscope system according to any one of claims 11 to 13, wherein the processor performs control to display on the display a legend indicating the association between the support images and the support-image illumination lights.
  15.  An operation method for an image processing device, comprising:
     an image acquisition step of acquiring a plurality of image signals obtained by imaging an observation target with an endoscope;
     a display image generation step of generating, using a display image signal obtained based on at least one of the image signals, a display image to be shown on a display;
     a support image generation step of generating, based on each of the plurality of image signals, support images that delineate specific regions in which the observation target is in a specific state; and
     a display control step of performing control, when the display image is shown on the display, to display the plurality of support images superimposed on the display image in a mutually distinguishable manner.
  16.  A program for an image processing device, installed in an image processing device that applies image processing to image signals obtained by imaging an observation target with an endoscope, the program causing a computer to realize:
     an image acquisition function of acquiring a plurality of image signals obtained by imaging an observation target with an endoscope;
     a display image generation function of generating, using a display image signal obtained based on at least one of the image signals, a display image to be shown on a display;
     a support image generation function of generating, based on each of the plurality of image signals, support images that delineate specific regions in which the observation target is in a specific state; and
     a display control function of performing control, when the display image is shown on the display, to display the plurality of support images superimposed on the display image in a mutually distinguishable manner.

PCT/JP2021/010863 2020-07-07 2021-03-17 Image processing device, endoscope system, operation method for image processing device, and program for image processing device WO2022009478A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2022534904A JPWO2022009478A1 (en) 2020-07-07 2021-03-17

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2020117098 2020-07-07
JP2020-117098 2020-07-07

Publications (1)

Publication Number Publication Date
WO2022009478A1 (en) 2022-01-13

Family

ID=79553244

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/010863 WO2022009478A1 (en) 2020-07-07 2021-03-17 Image processing device, endoscope system, operation method for image processing device, and program for image processing device

Country Status (2)

Country Link
JP (1) JPWO2022009478A1 (en)
WO (1) WO2022009478A1 (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2020012872A1 (en) * 2018-07-09 2020-01-16 富士フイルム株式会社 Medical image processing device, medical image processing system, medical image processing method, and program
WO2020075578A1 (en) * 2018-10-12 2020-04-16 富士フイルム株式会社 Medical image processing device, endoscope system, and medical image processing method


Also Published As

Publication number Publication date
JPWO2022009478A1 (en) 2022-01-13

Similar Documents

Publication Publication Date Title
CN110325100B (en) Endoscope system and method of operating the same
JP6785948B2 (en) How to operate medical image processing equipment, endoscopic system, and medical image processing equipment
JP7335399B2 (en) MEDICAL IMAGE PROCESSING APPARATUS, ENDOSCOPE SYSTEM, AND METHOD OF OPERATION OF MEDICAL IMAGE PROCESSING APPARATUS
JP7190597B2 (en) endoscope system
WO2020036121A1 (en) Endoscope system
JP2020065685A (en) Endoscope system
US20230027950A1 (en) Medical image processing apparatus, endoscope system, method of operating medical image processing apparatus, and non-transitory computer readable medium
JP6785990B2 (en) Medical image processing equipment and endoscopic equipment
US11627864B2 (en) Medical image processing apparatus, endoscope system, and method for emphasizing region of interest
US20230141302A1 (en) Image analysis processing apparatus, endoscope system, operation method of image analysis processing apparatus, and non-transitory computer readable medium
US20230237659A1 (en) Image processing apparatus, endoscope system, operation method of image processing apparatus, and non-transitory computer readable medium
US20220076458A1 (en) Image processing apparatus
WO2022009478A1 (en) Image processing device, endoscope system, operation method for image processing device, and program for image processing device
JP7214886B2 (en) Image processing device and its operating method
WO2021006121A1 (en) Image processing device, endoscope system, and operation method for image processing device
WO2021210331A1 (en) Image processing device and operating method therefor
WO2022059233A1 (en) Image processing device, endoscope system, operation method for image processing device, and program for image processing device
WO2022210508A1 (en) Processor device, medical image processing device, medical image processing system, and endoscopic system
JP6866497B2 (en) Medical image processing equipment and endoscopic equipment

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21837972

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2022534904

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21837972

Country of ref document: EP

Kind code of ref document: A1