WO2022004056A1 - Endoscope system and method for operating same - Google Patents

Endoscope system and method for operating same

Info

Publication number
WO2022004056A1
WO2022004056A1 (PCT/JP2021/008740)
Authority
WO
WIPO (PCT)
Prior art keywords
image
support information
illumination
observation target
movement
Prior art date
Application number
PCT/JP2021/008740
Other languages
French (fr)
Japanese (ja)
Inventor
康太郎 檜谷
Original Assignee
FUJIFILM Corporation
Priority date
Filing date
Publication date
Application filed by FUJIFILM Corporation
Priority to JP2022533677A (JPWO2022004056A1)
Publication of WO2022004056A1

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 1/00: Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B 1/04: Instruments according to A61B 1/00 combined with photographic or television appliances
    • A61B 1/045: Control thereof
    • A61B 1/06: Instruments according to A61B 1/00 with illuminating arrangements

Definitions

  • the present invention relates to an endoscope system that displays diagnosis support information regarding a lesion candidate region, such as the suspicion rate of a lesion, and to a method for operating the same.
  • an endoscope system including a light source device, an endoscope, and a processor device.
  • an observation object is irradiated with illumination light, and the observation object illuminated by the illumination light is imaged by an image pickup sensor to acquire an endoscope image as a medical image.
  • the endoscopic image is displayed on the monitor and used for diagnosis.
  • lesion candidate regions are detected from images obtained by imaging an observation target.
  • the user is notified that the lesion candidate area has been detected by displaying the area around the lesion candidate area with diagnostic support information such as a bounding box.
  • if the notification to the user reduces the visibility of the observation target, it may hinder effective diagnosis by the user.
  • in Patent Document 1, when the number of lesion candidate regions is larger than a predetermined threshold value, or their size is larger than a predetermined threshold value, the alert image used for notifying the user is hidden so that the visibility of the observation target is not reduced.
  • the present invention aims to provide an endoscope system, and a method for operating the same, in which diagnosis support information regarding a lesion candidate region does not interfere with the visibility of the observation target when the lesion candidate region is detected, even in a situation where the movement of the observation target in the image changes.
  • the endoscope system of the present invention comprises a light source unit that emits first illumination light and second illumination light having mutually different emission spectra; a light source processor that, when automatically switching between a first illumination period in which the first illumination light is emitted and a second illumination period in which the second illumination light is emitted, emits the first illumination light in a first emission pattern and the second illumination light in a second emission pattern; an image pickup sensor that outputs a first image signal obtained by imaging the observation target illuminated by the first illumination light and a second image signal obtained by imaging the observation target illuminated by the second illumination light; and an image control processor. The image control processor performs recognition processing on the second image signal, acquires the movement of the observation target in the image, and, when a lesion candidate region is recognized by the recognition processing, generates a diagnosis support information image in which diagnosis support information regarding the lesion candidate region is displayed on an observation image based on the first image signal, and controls, in the diagnosis support information image, display or non-display of at least first diagnosis support information of the diagnosis support information according to the movement of the observation target.
  • it is preferable that the image control processor performs first display control in which the diagnosis support information is displayed when the movement of the observation target is less than a first threshold value and is hidden when the movement of the observation target is equal to or greater than the first threshold value.
  • it is preferable that the image control processor performs second display control in which the diagnosis support information is hidden when the movement of the observation target is less than a second threshold value and is displayed when the movement of the observation target is equal to or greater than the second threshold value.
  • when the image control processor is capable of performing either the first display control (displaying the diagnosis support information when the movement of the observation target is less than the first threshold value and hiding it when the movement is equal to or greater than the first threshold value) or the second display control (hiding the diagnosis support information when the movement of the observation target is less than the second threshold value and displaying it when the movement is equal to or greater than the second threshold value), it is preferable to have a user interface for manually switching between the first display control and the second display control.
  • when the image control processor is capable of performing either the first display control or the second display control described above, it is preferable that the image control processor automatically switches between the first display control and the second display control.
  • the image control processor switches to the first display control when the movement of the observation target is less than the switching threshold value, and switches to the second display control when the movement of the observation target is equal to or more than the switching threshold value.
  • when the emission spectrum of the second illumination light differs in each second illumination period and a plurality of lesion candidate regions are recognized by the recognition processing based on the second image signals corresponding to the respective second illumination lights, it is preferable that the display of the second diagnosis support information, which is the diagnosis support information whose display is maintained regardless of the movement of the observation target, differs depending on the emission spectrum of the second illumination light.
  • the first emission pattern is preferably either a first A emission pattern, in which the number of frames in the first illumination period is the same in each first illumination period, or a first B emission pattern, in which the number of frames in the first illumination period differs in each first illumination period.
  • the second emission pattern is preferably one of a second A emission pattern, in which the number of frames in the second illumination period is the same in each second illumination period and the emission spectrum of the second illumination light is the same in each second illumination period; a second B emission pattern, in which the number of frames is the same in each second illumination period and the emission spectrum differs in each second illumination period; a second C emission pattern, in which the number of frames differs in each second illumination period and the emission spectrum is the same in each second illumination period; and a second D emission pattern, in which the number of frames differs in each second illumination period and the emission spectrum differs in each second illumination period.
  • the present invention also provides a method for operating an endoscope system that comprises a light source unit that emits first illumination light and second illumination light having mutually different emission spectra; a light source processor that, when automatically switching between a first illumination period in which the first illumination light is emitted and a second illumination period in which the second illumination light is emitted, emits the first illumination light in a first emission pattern and the second illumination light in a second emission pattern; an image pickup sensor that outputs a first image signal obtained by imaging the observation target illuminated by the first illumination light and a second image signal obtained by imaging the observation target illuminated by the second illumination light; and an image control processor. In the method, the image control processor performs recognition processing on the second image signal, acquires the movement of the observation target in the image, and, when a lesion candidate region is recognized by the recognition processing, generates a diagnosis support information image in which diagnosis support information regarding the lesion candidate region is displayed on an observation image based on the first image signal, and controls, in the diagnosis support information image, display or non-display of at least the first diagnosis support information of the diagnosis support information according to the movement of the observation target.
  • according to the present invention, even in a situation where the movement of the observation target in the image changes, the diagnosis support information regarding a detected lesion candidate region can be prevented from interfering with the visibility of the observation target.
  • the endoscope system 10 has an endoscope 12, a light source device 14, a processor device 16, a display 18, and a user interface 19.
  • the endoscope 12 is optically connected to the light source device 14 and electrically connected to the processor device 16.
  • the endoscope 12 has an insertion portion 12a to be inserted into the body of the observation target, an operation unit 12b provided at the proximal end of the insertion portion 12a, and a bending portion 12c and a tip portion 12d provided on the tip end side of the insertion portion 12a.
  • the bending portion 12c bends by operating the angle knob 12e of the operation unit 12b.
  • the tip portion 12d is directed in a desired direction by the bending motion of the bending portion 12c.
  • the operation unit 12b is provided with a mode switching switch 12f used for the mode switching operation, a still image acquisition instruction unit 12g used for instructing acquisition of a still image of the observation target, and a zoom operation unit 12h used for operating the zoom lens 43 (see FIG. 2).
  • the endoscope system 10 has three modes: a normal observation mode, a special observation mode, and a diagnostic support mode.
  • in the normal observation mode, the observation target is illuminated with normal light such as white light and imaged, whereby a normal observation image having natural hues is displayed on the display 18.
  • in the special observation mode, the observation target is illuminated with special light having an emission spectrum different from that of the normal light and imaged, whereby a special observation image emphasizing a specific structure is displayed on the display 18.
  • in the diagnosis support mode, the first illumination light and the second illumination light having different emission spectra are emitted while being switched, and when a lesion candidate region or the like is recognized by recognition processing (AI (Artificial Intelligence) or the like), a diagnosis support information image displaying diagnosis support information regarding the lesion candidate region is displayed on the display 18 over an observation image based on the first illumination light.
  • by operating the still image acquisition instruction unit 12g, a signal related to the still image acquisition instruction is sent to the endoscope 12, the light source device 14, and the processor device 16.
  • when the still image acquisition instruction is given, the still image of the observation target is saved in the still image storage memory 69 of the processor device 16.
  • the processor device 16 is electrically connected to the display 18 and the user interface 19.
  • the display 18 outputs and displays an image to be observed, information incidental to the image to be observed, and the like.
  • the user interface 19 has a keyboard, a mouse, a touch pad, and the like, and has a function of accepting input operations such as function settings.
  • An external recording unit (not shown) for recording an image, image information, or the like may be connected to the processor device 16.
  • the light source device 14 includes a light source unit 20 and a light source processor 21 that controls the light source unit 20.
  • the light source unit 20 has, for example, a plurality of semiconductor light sources, each of which is turned on or off, and when the light source unit 20 is turned on, the light emission amount of each semiconductor light source is controlled to emit illumination light for illuminating the observation target.
  • the light source unit 20 has LEDs of four colors: a V-LED (Violet Light Emitting Diode) 20a, a B-LED (Blue Light Emitting Diode) 20b, a G-LED (Green Light Emitting Diode) 20c, and an R-LED (Red Light Emitting Diode) 20d.
  • the V-LED 20a generates purple light V having a center wavelength of 405 ⁇ 10 nm and a wavelength range of 380 to 420 nm.
  • the B-LED 20b generates blue light B having a center wavelength of 450 ⁇ 10 nm and a wavelength range of 420 to 500 nm.
  • the G-LED 20c generates green light G having a wavelength range of 480 to 600 nm.
  • the R-LED 20d generates red light R having a center wavelength of 620 to 630 nm and a wavelength range of 600 to 650 nm.
  • the light source processor 21 controls the V-LED 20a, B-LED 20b, G-LED 20c, and R-LED 20d. By controlling each of the LEDs 20a to 20d independently, the light source processor 21 can emit the purple light V, blue light B, green light G, or red light R while independently changing the amount of each light. In the normal observation mode, the light source processor 21 controls each of the LEDs 20a to 20d so as to emit white light in which the light amount ratio among the purple light V, blue light B, green light G, and red light R is Vc:Bc:Gc:Rc, where Vc, Bc, Gc, Rc > 0.
  • in the special observation mode, the light source processor 21 controls each of the LEDs 20a to 20d so as to emit special light, as short-wavelength narrow-band light, in which the light amount ratio among the purple light V, blue light B, green light G, and red light R is Vs:Bs:Gs:Rs.
  • the light amount ratio Vs:Bs:Gs:Rs differs from the light amount ratio Vc:Bc:Gc:Rc used in the normal observation mode and is determined appropriately according to the purpose of observation. For example, when emphasizing superficial blood vessels, it is preferable to make Vs larger than Bs, Gs, and Rs, and when emphasizing middle-layer blood vessels, it is preferable to make Gs larger than Vs, Bs, and Rs.
  • in the diagnosis support mode, when the first illumination period in which the first illumination light is emitted and the second illumination period in which the second illumination light is emitted are switched automatically, the light source processor 21 emits the first illumination light in the first emission pattern and emits the second illumination light in the second emission pattern.
  • the first emission pattern is preferably either the first A emission pattern, in which the number of frames in the first illumination period is the same in each first illumination period as shown in FIG. 4, or the first B emission pattern, in which the number of frames in the first illumination period differs in each first illumination period as shown in FIG. 5.
  • the second emission pattern is preferably one of the second A emission pattern (FIG. 4), in which the number of frames in the second illumination period is the same in each second illumination period and the emission spectrum of the second illumination light is the same in each second illumination period; the second B emission pattern (FIG. 6), in which the number of frames is the same in each second illumination period and the emission spectrum differs in each second illumination period; the second C emission pattern (FIG. 7), in which the number of frames differs in each second illumination period and the emission spectrum is the same in each second illumination period; and the second D emission pattern (FIG. 8), in which the number of frames differs in each second illumination period and the emission spectrum differs in each second illumination period.
  • the emission spectrum of the first illumination light may be the same or different in each first illumination period.
  • the first lighting period is preferably longer than the second lighting period, and the first lighting period is preferably two frames or more.
  • the first lighting period is set to 2 frames, and the second lighting period is set to 1 frame. Since the first illumination light is used to generate an observation image to be displayed on the display 18, it is preferable to obtain a bright image by illuminating the observation target with the first illumination light.
  • the first illumination light is preferably white light.
  • since the second illumination light is used for the recognition processing, it is preferable to obtain an image suitable for the recognition processing by illuminating the observation target with the second illumination light.
  • when the recognition processing is performed based on superficial blood vessels, it is preferable that the second illumination light is the purple light V.
  • the first and second light emission patterns which are switching patterns between the first lighting period and the second lighting period, will be described later because they are determined based on the image pickup control of the image pickup sensor 44 by the image pickup processor 45.
  • the frame refers to a unit of a period including at least a period from a specific timing to the completion of signal reading in the image pickup sensor 44.
  • the light amount ratio includes the case where the ratio of at least one of the semiconductor light sources is 0 (zero). Therefore, it includes the case where any one or more of the semiconductor light sources are not lit. For example, even when only one of the semiconductor light sources is lit and the other three are not, as in the case where the light amount ratio among the purple light V, blue light B, green light G, and red light R is 1:0:0:0, the light source unit is regarded as having a light amount ratio.
  • the light emitted by each of the LEDs 20a to 20d is incident on the light guide 25 via the optical path coupling portion 23 composed of a mirror, a lens, or the like.
  • the light guide 25 is built in the endoscope 12 and a universal cord (a cord connecting the endoscope 12, the light source device 14 and the processor device 16).
  • the light guide 25 propagates the light from the optical path coupling portion 23 to the tip portion 12d of the endoscope 12.
  • An illumination optical system 30a and an image pickup optical system 30b are provided at the tip end portion 12d of the endoscope 12.
  • the illumination optical system 30a has an illumination lens 32, and the illumination light propagated by the light guide 25 is applied to the observation target through the illumination lens 32.
  • the image pickup optical system 30b has an objective lens 42 and an image pickup sensor 44. The light from the observation target due to the irradiation of the illumination light is incident on the image pickup sensor 44 via the objective lens 42 and the zoom lens 43. As a result, an image to be observed is formed on the image pickup sensor 44.
  • the zoom lens 43 is a lens for enlarging the observation target, and moves between the telephoto end and the wide end by operating the zoom operation unit 12h.
  • the image pickup sensor 44 is a primary-color sensor provided with three types of pixels: B pixels (blue pixels) having a blue color filter, G pixels (green pixels) having a green color filter, and R pixels (red pixels) having a red color filter.
  • the blue color filter BF mainly transmits light in the blue band, specifically, light in the wavelength band of 380 to 560 nm.
  • the transmittance of the blue color filter BF peaks in the vicinity of the wavelength of 460 to 470 nm.
  • the green color filter GF mainly transmits light in the green band, specifically, light in the wavelength band of 460 to 620 nm.
  • the red color filter RF mainly transmits light in the red band, specifically, light in the wavelength band of 580 to 760 nm.
  • the image sensor 44 is preferably a CCD (Charge-Coupled Device) or a CMOS (Complementary Metal Oxide Semiconductor).
  • the image pickup processor 45 controls the image pickup sensor 44. Specifically, an image signal is output from the image pickup sensor 44 by the image pickup processor 45 reading out the signal of the image pickup sensor 44. In the normal observation mode, the image pickup processor 45 reads out the signal while the image pickup sensor 44 is exposed to the normal light, so that a Bc image signal is output from the B pixels, a Gc image signal is output from the G pixels, and an Rc image signal is output from the R pixels of the image pickup sensor 44.
  • in the special observation mode, the image pickup processor 45 reads out the signal while the image pickup sensor 44 is exposed to the special light, so that a Bs image signal is output from the B pixels, a Gs image signal is output from the G pixels, and an Rs image signal is output from the R pixels of the image pickup sensor 44.
  • in the diagnosis support mode, the image pickup processor 45 reads out the signal while the image pickup sensor 44 is exposed to the first illumination light during the first illumination period, so that the image pickup sensor 44 outputs a first image signal.
  • the period for outputting the first image signal is defined as the first imaging period.
  • the first image signal includes a B1 image signal output from the B pixel, a G1 image signal output from the G pixel, and an R1 image signal output from the R pixel.
  • the image pickup processor 45 outputs a second image signal from the image pickup sensor 44 by performing signal readout in a state where the image pickup sensor 44 is exposed to the second illumination light during the second illumination period.
  • the period for outputting the second image signal is defined as the second imaging period.
  • the second image signal includes a B2 image signal output from the B pixel, a G2 image signal output from the G pixel, and an R2 image signal output from the R pixel.
  • the CDS/AGC (Correlated Double Sampling / Automatic Gain Control) circuit 46 performs correlated double sampling (CDS) and automatic gain control (AGC) on the analog image signal obtained from the image pickup sensor 44.
  • the image signal that has passed through the CDS / AGC circuit 46 is converted into a digital image signal by the A / D (Analog / Digital) converter 48.
  • the digital image signal after A / D conversion is input to the processor device 16.
  • the processor device 16 stores programs related to various processes in a program memory (not shown).
  • the central control unit 68, which is configured by the image control processor, runs the programs in the program memory, whereby the functions of the image acquisition unit 50, the DSP (Digital Signal Processor) 52, the noise reduction unit 54, the image processing switching unit 56, the image processing unit 58, and the display control unit 60 are realized. Further, with the realization of the functions of the image processing unit 58, the functions of the normal observation image generation unit 62, the special observation image generation unit 64, and the diagnosis support information processing unit 66 are realized.
  • the image control processor performs image processing based on the first image signal or the second image signal, and controls the display 18.
  • the image acquisition unit 50 acquires a color image input from the endoscope 12.
  • the color image includes a blue signal (B image signal), a green signal (G image signal), and a red signal (R image signal) output from the B pixel, G pixel, and R pixel of the image pickup sensor 44.
  • the acquired color image is transmitted to the DSP 52.
  • the DSP 52 performs various signal processing such as defect correction processing, offset processing, gain correction processing, matrix processing, gamma conversion processing, demosaic processing, and YC conversion processing on the received color image.
  • the noise reduction unit 54 performs noise reduction processing by, for example, a moving average method or a median filter method on the color image that has been demosaiced by the DSP 52.
  • the color image with reduced noise is input to the image processing switching unit 56.
  • the image processing switching unit 56 switches the destination of the image signal from the noise reduction unit 54 to one of the normal observation image generation unit 62, the special observation image generation unit 64, and the diagnosis support information processing unit 66. Specifically, when the normal observation mode is set, the image signal from the noise reduction unit 54 is input to the normal observation image generation unit 62. When the special observation mode is set, the image signal from the noise reduction unit 54 is input to the special observation image generation unit 64. When the diagnosis support mode is set, the image signal from the noise reduction unit 54 is input to the diagnosis support information processing unit 66.
  • the normal observation image generation unit 62 performs image processing for a normal observation image on the input Rc image signal, Gc image signal, and Bc image signal for one frame.
  • image processing for the normal observation image includes 3×3 matrix processing, gradation conversion processing, color conversion processing such as three-dimensional LUT (Look Up Table) processing, color enhancement processing, and structure enhancement processing such as spatial frequency enhancement.
  • the Rc image signal, Gc image signal, and Bc image signal that have been subjected to image processing for a normal observation image are input to the display control unit 60 as normal observation images.
  • the special observation image generation unit 64 performs image processing for special observation images on the input Rs image signal, Gs image signal, and Bs image signal for one frame.
  • image processing for the special observation image includes 3×3 matrix processing, gradation conversion processing, color conversion processing such as three-dimensional LUT (Look Up Table) processing, color enhancement processing, and structure enhancement processing such as spatial frequency enhancement.
  • the Rs image signal, Gs image signal, and Bs image signal that have been subjected to image processing for special observation images are input to the display control unit 60 as special observation images.
  • the diagnosis support information processing unit 66 performs the same image processing for normal observation images as described above on the input R1 image signal, G1 image signal, and B1 image signal for one frame.
  • the R1 image signal, the G1 image signal, and the B1 image signal that have undergone image processing for a normal observation image signal are used as observation images.
  • the diagnosis support information processing unit 66 performs recognition processing on the input R2 image signal, G2 image signal, and B2 image signal for a specific frame.
  • a diagnosis support information image displaying the diagnosis support information regarding the lesion candidate area is generated.
  • the display control unit 60 displays the observation image or the diagnosis support information image on the display 18.
  • the display control unit 60 controls to display the image output from the image processing unit 58 on the display 18. Specifically, the display control unit 60 converts a normal observation image, a special observation image, an observation image, or a diagnosis support information image into a video signal that can be displayed in full color on the display 18. The converted video signal is input to the display 18. As a result, the display 18 displays a normal observation image, a special observation image, an observation image, or a diagnosis support information image.
  • the display control unit 60 performs the following display control.
  • when the first emission pattern is the first A emission pattern and the second emission pattern is the second B emission pattern (the number of frames in the second illumination period is the same, and the emission spectrum of the second illumination light differs), and the observation target is illuminated with the first illumination light for two frames and with the second illumination light for one frame at a time, an observation image is obtained, as shown in FIG. 11, by performing image processing for the normal observation image on the first image signal obtained under the illumination of the first illumination light.
  • the observation image is displayed on the display 18 as it is.
  • the presence or absence of the lesion candidate region DR is detected by performing recognition processing on the second image signal obtained by the emission of the second illumination light.
  • the lesion candidate area is, for example, a lesion such as cancer, or a benign polyp.
  • the observation image of the immediately preceding frame is displayed on the display 18 when the second illumination light is emitted.
  • during the period in which the lesion candidate region is detected, both when the first illumination light is emitted and when the second illumination light is emitted, a diagnosis support information image in which the diagnosis support information is displayed on the observation image is displayed on the display 18. The details of the display control of the diagnosis support information image will be described later.
  • the diagnosis support image is generated in the diagnosis support information processing unit 66.
  • the diagnosis support information processing unit 66 includes a recognition processing unit 70, a motion acquisition unit 72, and a diagnosis support information image generation unit 74.
  • the recognition processing unit 70 performs recognition processing based on the second image signal obtained when the second illumination light is emitted.
  • the recognition process is preferably a process for recognizing and detecting a lesion candidate region.
  • the recognition processing unit 70 uses a learning model obtained by machine learning on lesion candidate regions, and when a second image signal is input to the learning model, the learning model outputs the presence or absence of a lesion candidate region.
  • the motion acquisition unit 72 acquires the motion of the observation target in the image.
  • the movement of the observation target is used to control the display or non-display of the first diagnosis support information, as will be described later.
  • the movement of the observation target includes the movement of the observation target caused by the movement of the actual observation target such as the body movement, and the movement of the observation target caused by the movement of the tip portion 12d of the endoscope.
  • the motion acquisition unit 72 calculates the motion of the observation target based on the first image signal or the second image signal of a plurality of frames obtained within a certain period of time.
  • it is preferable that a movement vector of the observation target is calculated from the first or second image signals of a plurality of frames obtained within a certain time, and the movement of the observation target is calculated based on the calculated movement vector (an illustrative sketch of such a calculation is given after this section).
  • when the lesion candidate region is recognized by the recognition processing, the diagnosis support information image generation unit 74 generates a diagnosis support information image in which the diagnosis support information regarding the lesion candidate region is displayed on the observation image.
  • the diagnosis support information includes first diagnosis support information, which is displayed or hidden on the display 18 depending on the movement of the observation target, and second diagnosis support information, whose display on the display 18 is maintained regardless of the movement of the observation target.
  • the first diagnosis support information 76 is preferably character information about the lesion candidate region, such as the suspicion rate of a lesion, that is, information that readily affects the user's visibility of the observation image.
  • the second diagnosis support information 78 is preferably a bounding box or the like indicating the position information of the lesion candidate region DR, and is information that does not easily affect the user's visibility of the observed image.
  • the display control unit 60 controls display or non-display of at least the first diagnosis support information of the diagnosis support information in the diagnosis support information image according to the movement of the observation target. Specifically, it is preferable that the display control unit 60 distinguishes between the first display control used when performing detailed observation (differential diagnosis or the like) of the lesion candidate region and the second display control used when performing screening for lesion candidate regions. Display or non-display of the second diagnosis support information may also be controlled.
  • the first display control displays the first diagnosis support information 76 when the movement of the observation target is less than the first threshold value, and hides the first diagnosis support information 76 when the movement of the observation target is equal to or greater than the first threshold value.
  • in detailed observation, the tip portion 12d of the endoscope having the image pickup sensor 44 is temporarily stopped, so the movement of the observation target is relatively small. Therefore, when the movement of the observation target is small, such as less than the first threshold value, which is suited to detailed observation, the first diagnosis support information 76 is displayed.
  • on the other hand, when the movement of the observation target is large, such as equal to or greater than the first threshold value, the display of the first diagnosis support information 76 may hinder visibility, so the first diagnosis support information 76 is hidden. Even when the first diagnosis support information 76 is hidden in the first display control, it is preferable to maintain the display of the second diagnosis support information.
  • the second display control displays the first diagnosis support information 76 when the movement of the observation target is equal to or greater than the second threshold value, and hides the first diagnosis support information 76 when the movement of the observation target is less than the second threshold value.
  • in screening, the movement of the observation target is relatively large, so when the movement of the observation target is large, such as equal to or greater than the second threshold value, the first diagnosis support information 76 is displayed.
  • on the other hand, when the movement of the observation target is small, such as less than the second threshold value, the display of the first diagnosis support information 76 may hinder visibility, so the first diagnosis support information 76 is hidden. Even when the first diagnosis support information 76 is hidden in the second display control, it is preferable to maintain the display of the second diagnosis support information.
  • switching between the first display control and the second display control may be performed manually using the user interface 19. Alternatively, the switching between the first display control and the second display control may be performed automatically by the display control switching unit 75 (see FIG. 12). As shown in FIG. 16, when the movement of the observation target is less than the switching threshold value, the display control switching unit 75 determines that detailed observation with small movement of the observation target is being performed and switches to the first display control. On the other hand, when the movement of the observation target is equal to or greater than the switching threshold value, it determines that screening with large movement of the observation target is being performed and switches to the second display control. The display control unit 60 controls the display of the diagnosis support information according to the automatically switched first display control or second display control.
  • when the emission spectrum of the second illumination light differs in each second illumination period, as in the case where the second emission pattern of the second illumination light is the second B emission pattern or the second D emission pattern, it is preferable that the display control unit 60 makes the display of the second diagnosis support information, whose display is maintained regardless of the movement of the observation target, differ depending on the emission spectrum of the second illumination light.
  • when there are a lesion candidate region DRA detected with the second illumination light of an emission spectrum A and a lesion candidate region DRB detected with the second illumination light of an emission spectrum B different from the emission spectrum A, the second diagnosis support information for the lesion candidate region DRA is displayed as a "blue" bounding box, and the second diagnosis support information 78b for the lesion candidate region DRB is displayed as a "green" bounding box. That is, "blue" indicates detection with the second illumination light of the emission spectrum A, and "green" indicates detection with the second illumination light of the emission spectrum B. This makes it possible to visually grasp which of the lesion candidate regions DRA and DRB was detected with the second illumination light of which emission spectrum.
  • when the mode switching switch 12f is operated to switch to the diagnosis support mode, the first illumination light and the second illumination light having different emission spectra are automatically switched and emitted.
  • the image pickup sensor 44 takes an image of the observation target illuminated by the first illumination light and outputs the first image signal. Further, the observation target illuminated by the second illumination light is imaged, and the second image signal is output.
  • the recognition processing unit 70 performs recognition processing on the second image signal.
  • the motion acquisition unit 72 acquires the motion of the observation target in the image.
  • when the lesion candidate region is recognized by the recognition processing, the diagnosis support information image generation unit 74 generates a diagnosis support information image that displays the diagnosis support information regarding the lesion candidate region on the observation image based on the first image signal.
  • the display control unit 60 controls the display or non-display of at least the first diagnosis support information among the diagnosis support information according to the movement of the observation target in the diagnosis support information image. The above series of processes is repeated while the diagnosis support mode continues.
  • the hardware structure of the processing units that execute various kinds of processing, such as the special observation image generation unit 64, the diagnosis support information processing unit 66, the central control unit 68, the recognition processing unit 70, the motion acquisition unit 72, the diagnosis support information image generation unit 74, and the display control switching unit 75, is the various processors described below.
  • the various processors include a CPU (Central Processing Unit), which is a general-purpose processor that executes software (programs) to function as various processing units; a programmable logic device (PLD) such as an FPGA (Field Programmable Gate Array), which is a processor whose circuit configuration can be changed after manufacture; and a dedicated electric circuit, which is a processor having a circuit configuration designed exclusively for executing various kinds of processing.
  • one processing unit may be composed of one of these various processors, or may be composed of a combination of two or more processors of the same type or different types (for example, a plurality of FPGAs, or a combination of a CPU and an FPGA). A plurality of processing units may also be configured by one processor. As a first example of configuring a plurality of processing units with one processor, as represented by a computer such as a client or a server, one processor is configured by a combination of one or more CPUs and software, and this processor functions as a plurality of processing units.
  • as a second example, as represented by a System On Chip (SoC) or the like, a processor that realizes the functions of the entire system including a plurality of processing units with a single IC (Integrated Circuit) chip may be used.
  • in this way, the various processing units are configured by using one or more of the above-mentioned various processors as a hardware structure.
  • the hardware structure of these various processors is, more specifically, an electric circuit (circuitry) in which circuit elements such as semiconductor elements are combined.
  • the hardware structure of the storage unit is a storage device such as an HDD (hard disk drive) or SSD (solid state drive).
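
As a concrete illustration of the movement-vector calculation performed by the motion acquisition unit 72 (referenced above), the following is a minimal Python sketch. The function name estimate_motion and the use of OpenCV's Farneback dense optical flow are assumptions made for illustration only; the patent does not specify a particular movement-vector algorithm.

```python
import cv2
import numpy as np

def estimate_motion(prev_frame: np.ndarray, curr_frame: np.ndarray) -> float:
    """Return a scalar 'movement of the observation target', computed as the
    mean magnitude of a dense movement-vector field between two frames."""
    prev_gray = cv2.cvtColor(prev_frame, cv2.COLOR_BGR2GRAY)
    curr_gray = cv2.cvtColor(curr_frame, cv2.COLOR_BGR2GRAY)
    # Farneback dense optical flow (pyr_scale, levels, winsize, iterations,
    # poly_n, poly_sigma, flags); it stands in for whatever movement-vector
    # calculation the motion acquisition unit 72 actually performs.
    flow = cv2.calcOpticalFlowFarneback(prev_gray, curr_gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    magnitude = np.linalg.norm(flow, axis=2)  # per-pixel vector length
    return float(magnitude.mean())
```

In practice the value would be averaged over the first or second image signals of several frames obtained within a certain time, as the embodiment describes.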

Abstract

Provided are an endoscope system and a method for operating the same which, when a lesion candidate region is detected, prevent diagnosis assistance information relating to the lesion candidate region from obstructing the visibility of a subject being observed even in situations where the movement of the subject being observed changes in an image. When a lesion candidate region (DR) is recognized by a recognition process, a diagnosis assistance information image is generated that displays diagnosis assistance information relating to the lesion candidate region (DR) on an observation image based on a first image signal. Whether to display or not display at least first diagnosis assistance information (76) among pieces of the diagnosis assistance information is controlled by the movement of the subject being observed in the diagnosis assistance information image.

Description

Endoscope system and method for operating the same
 The present invention relates to an endoscope system that displays diagnosis support information regarding a lesion candidate region, such as the suspicion rate of a lesion, and to a method for operating the same.
 In the medical field, diagnosis using medical images is widely practiced. For example, as a device that uses medical images, there is an endoscope system including a light source device, an endoscope, and a processor device. In the endoscope system, an observation target is irradiated with illumination light, and the observation target illuminated by the illumination light is imaged by an image pickup sensor to acquire an endoscopic image as a medical image. The endoscopic image is displayed on a monitor and used for diagnosis.
 Further, in recent endoscope systems, lesion candidate regions are detected from images obtained by imaging the observation target. When a lesion candidate region is detected, the user is notified of the detection by displaying diagnosis support information, such as a bounding box, around the lesion candidate region. However, if the notification to the user reduces the visibility of the observation target, it may hinder effective diagnosis by the user. In this regard, in Patent Document 1, when the number of lesion candidate regions is larger than a predetermined threshold value, or their size is larger than a predetermined threshold value, the alert image used for notifying the user is hidden so that the visibility of the observation target is not reduced.
Japanese Unexamined Patent Publication No. 2011-255006
 In endoscopic diagnosis, screening, in which the tip portion of the endoscope having the image pickup sensor is moved at a constant speed or higher, and detailed observation, in which the movement of the tip portion of the endoscope is stopped to perform differential diagnosis of lesions and the like, are performed. Therefore, during endoscopic diagnosis, the movement of the observation target in the endoscopic image changes. Even in such a situation where the movement of the observation target in the image changes, it has been required that, when a lesion candidate region is detected, the diagnosis support information regarding the lesion candidate region does not interfere with the visibility of the observation target.
 The present invention aims to provide an endoscope system, and a method for operating the same, in which diagnosis support information regarding a lesion candidate region does not interfere with the visibility of the observation target when the lesion candidate region is detected, even in a situation where the movement of the observation target in the image changes.
 The endoscope system of the present invention comprises: a light source unit that emits first illumination light and second illumination light having mutually different emission spectra; a light source processor that, when automatically switching between a first illumination period in which the first illumination light is emitted and a second illumination period in which the second illumination light is emitted, emits the first illumination light in a first emission pattern and the second illumination light in a second emission pattern; an image pickup sensor that outputs a first image signal obtained by imaging the observation target illuminated by the first illumination light and a second image signal obtained by imaging the observation target illuminated by the second illumination light; and an image control processor. The image control processor performs recognition processing on the second image signal, acquires the movement of the observation target in the image, and, when a lesion candidate region is recognized by the recognition processing, generates a diagnosis support information image in which diagnosis support information regarding the lesion candidate region is displayed on an observation image based on the first image signal, and controls, in the diagnosis support information image, display or non-display of at least first diagnosis support information of the diagnosis support information according to the movement of the observation target.
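
To make the flow of the claimed processing easier to follow, here is a hedged Python sketch of one cycle of the diagnosis support mode. All names (DiagnosisSupportImage, run_recognition, and the injected callables) are illustrative assumptions rather than any real product API; recognition is assumed to return zero or more lesion candidate objects with a box and a label.

```python
from dataclasses import dataclass, field

@dataclass
class DiagnosisSupportImage:
    observation_image: object                            # image based on the first image signal
    bounding_boxes: list = field(default_factory=list)   # second diagnosis support information
    text_labels: list = field(default_factory=list)      # first diagnosis support information
    show_text_labels: bool = True                        # toggled by the movement of the observation target

def diagnosis_support_cycle(first_image_signal, second_image_signal,
                            run_recognition, estimate_motion_over_frames,
                            decide_show_first_info) -> DiagnosisSupportImage:
    # 1. Recognition processing is performed on the second image signal.
    lesion_candidates = run_recognition(second_image_signal)
    # 2. The movement of the observation target in the image is acquired.
    motion = estimate_motion_over_frames()
    # 3. The observation image is generated from the first image signal and,
    #    when candidates were recognized, diagnosis support information is attached.
    image = DiagnosisSupportImage(observation_image=first_image_signal)
    for candidate in lesion_candidates:
        image.bounding_boxes.append(candidate.box)   # kept regardless of motion
        image.text_labels.append(candidate.label)    # e.g. suspicion rate of a lesion
    # 4. At least the first diagnosis support information is shown or hidden
    #    according to the movement of the observation target.
    image.show_text_labels = decide_show_first_info(motion)
    return image
```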
 It is preferable that the image control processor performs first display control in which the diagnosis support information is displayed when the movement of the observation target is less than a first threshold value and is hidden when the movement of the observation target is equal to or greater than the first threshold value. It is also preferable that the image control processor performs second display control in which the diagnosis support information is hidden when the movement of the observation target is less than a second threshold value and is displayed when the movement of the observation target is equal to or greater than the second threshold value.
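
Both display controls reduce to simple threshold comparisons. A minimal sketch follows; the function names and threshold parameters are illustrative assumptions.

```python
def first_display_control(motion: float, first_threshold: float) -> bool:
    """Detailed observation: show the first diagnosis support information
    only while the observation target is nearly still."""
    return motion < first_threshold

def second_display_control(motion: float, second_threshold: float) -> bool:
    """Screening: show the first diagnosis support information
    only while the observation target is moving."""
    return motion >= second_threshold
```

In either control, the second diagnosis support information (for example, the bounding box) preferably remains displayed regardless of the returned value.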
 When the image control processor is capable of performing the first display control, in which the diagnosis support information is displayed when the movement of the observation target is less than the first threshold value and is hidden when the movement is equal to or greater than the first threshold value, or the second display control, in which the diagnosis support information is hidden when the movement of the observation target is less than the second threshold value and is displayed when the movement is equal to or greater than the second threshold value, it is preferable to have a user interface for manually switching between the first display control and the second display control.
 When the image control processor is capable of performing the first display control or the second display control described above, it is preferable that the image control processor automatically switches between the first display control and the second display control.
 It is preferable that the image control processor switches to the first display control when the movement of the observation target is less than a switching threshold value, and switches to the second display control when the movement of the observation target is equal to or greater than the switching threshold value. When the emission spectrum of the second illumination light differs in each second illumination period, and a plurality of lesion candidate regions are recognized by the recognition processing based on the second image signals corresponding to the respective second illumination lights, it is preferable that the display of the second diagnosis support information, which is the diagnosis support information whose display is maintained regardless of the movement of the observation target, differs depending on the emission spectrum of the second illumination light.
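
The automatic switching by the switching threshold, and the spectrum-dependent display of the second diagnosis support information, can be sketched as follows. The function name, threshold parameters, and the color mapping are assumptions; the blue/green assignment mirrors the example embodiment (blue for emission spectrum A, green for emission spectrum B).

```python
def show_first_support_info(motion: float, switching_threshold: float,
                            first_threshold: float, second_threshold: float) -> bool:
    """Automatic switching between the two display controls by the switching
    threshold: small motion is treated as detailed observation, large motion
    as screening."""
    if motion < switching_threshold:
        # First display control: show only while the target is nearly still.
        return motion < first_threshold
    # Second display control: show only while the target is moving.
    return motion >= second_threshold

# Second diagnosis support information (bounding boxes) drawn in a different
# color per emission spectrum of the second illumination light.
BOUNDING_BOX_COLOR_BY_SPECTRUM = {
    "emission_spectrum_A": "blue",
    "emission_spectrum_B": "green",
}
```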
 The first emission pattern is preferably either a first A emission pattern, in which the number of frames in the first illumination period is the same in each first illumination period, or a first B emission pattern, in which the number of frames in the first illumination period differs in each first illumination period.
 The second emission pattern is preferably one of a second A emission pattern, in which the number of frames in the second illumination period is the same in each second illumination period and the emission spectrum of the second illumination light is the same in each second illumination period; a second B emission pattern, in which the number of frames is the same in each second illumination period and the emission spectrum differs in each second illumination period; a second C emission pattern, in which the number of frames differs in each second illumination period and the emission spectrum is the same in each second illumination period; and a second D emission pattern, in which the number of frames differs in each second illumination period and the emission spectrum differs in each second illumination period.
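
The four second emission patterns differ only in whether the frame count and the emission spectrum change between second illumination periods. A small sketch that classifies a planned sequence of second illumination periods into one of the patterns is given below; the pattern names 2A to 2D follow the text, while the data layout is an assumption for illustration.

```python
def classify_second_emission_pattern(periods: list[tuple[int, str]]) -> str:
    """periods: one (frame_count, emission_spectrum_id) tuple per second
    illumination period. Returns '2A', '2B', '2C', or '2D'."""
    frame_counts = {frames for frames, _ in periods}
    spectra = {spectrum for _, spectrum in periods}
    same_frames = len(frame_counts) == 1
    same_spectrum = len(spectra) == 1
    if same_frames and same_spectrum:
        return "2A"
    if same_frames:
        return "2B"   # frame count fixed, spectrum varies
    if same_spectrum:
        return "2C"   # frame count varies, spectrum fixed
    return "2D"       # both vary

# Example: one frame per second illumination period, alternating spectra -> '2B'
assert classify_second_emission_pattern([(1, "A"), (1, "B"), (1, "A")]) == "2B"
```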
 The present invention also provides a method for operating an endoscope system that comprises: a light source unit that emits first illumination light and second illumination light having mutually different emission spectra; a light source processor that, when automatically switching between a first illumination period in which the first illumination light is emitted and a second illumination period in which the second illumination light is emitted, emits the first illumination light in a first emission pattern and the second illumination light in a second emission pattern; an image pickup sensor that outputs a first image signal obtained by imaging the observation target illuminated by the first illumination light and a second image signal obtained by imaging the observation target illuminated by the second illumination light; and an image control processor. In the method, the image control processor performs recognition processing on the second image signal, acquires the movement of the observation target in the image, and, when a lesion candidate region is recognized by the recognition processing, generates a diagnosis support information image in which diagnosis support information regarding the lesion candidate region is displayed on an observation image based on the first image signal, and controls, in the diagnosis support information image, display or non-display of at least the first diagnosis support information of the diagnosis support information according to the movement of the observation target.
 According to the present invention, even in a situation where the movement of the observation target in the image changes, the diagnostic support information regarding a detected lesion candidate region can be prevented from impairing the visibility of the observation target.
FIG. 1 is an external view of the endoscope system.
FIG. 2 is a block diagram showing the functions of the endoscope system.
FIG. 3 is a graph showing the spectra of purple light V, blue light B, green light G, and red light R.
FIG. 4 is an explanatory diagram showing the first A emission pattern or the second A emission pattern in the diagnostic support mode.
FIG. 5 is an explanatory diagram showing the first B emission pattern in the diagnostic support mode.
FIG. 6 is an explanatory diagram showing the second B emission pattern in the diagnostic support mode.
FIG. 7 is an explanatory diagram showing the second C emission pattern in the diagnostic support mode.
FIG. 8 is an explanatory diagram showing the second D emission pattern in the diagnostic support mode.
FIG. 9 is a graph showing the spectral transmittance of each color filter of the imaging sensor.
FIG. 10 is an explanatory diagram showing the first imaging period and the second imaging period.
FIG. 11 is an explanatory diagram showing illumination control, analysis processing, and image display in the diagnostic support mode in chronological order.
FIG. 12 is a block diagram showing the functions of the diagnostic support information processing unit.
FIG. 13 is an image diagram showing the first diagnostic support information and the second diagnostic support information.
FIG. 14 is an explanatory diagram showing the first display control.
FIG. 15 is an explanatory diagram showing the second display control.
FIG. 16 is an explanatory diagram showing the switching threshold used for automatic switching of the display control.
FIG. 17 is an explanatory diagram showing a method of displaying the second diagnostic support information when a plurality of second illumination lights having different emission spectra are used.
FIG. 18 is a flowchart showing the sequence of the diagnostic support mode.
 In FIG. 1, the endoscope system 10 includes an endoscope 12, a light source device 14, a processor device 16, a display 18, and a user interface 19. The endoscope 12 is optically connected to the light source device 14 and electrically connected to the processor device 16. The endoscope 12 has an insertion portion 12a that is inserted into the body containing the observation target, an operation portion 12b provided at the proximal end of the insertion portion 12a, and a bending portion 12c and a tip portion 12d provided on the distal end side of the insertion portion 12a. The bending portion 12c bends when the angle knob 12e of the operation portion 12b is operated. The tip portion 12d is directed in a desired direction by the bending motion of the bending portion 12c.
 In addition to the angle knob 12e, the operation portion 12b is provided with a mode changeover switch (mode switching SW) 12f used for mode switching, a still image acquisition instruction unit 12g used for instructing acquisition of a still image of the observation target, and a zoom operation unit 12h used for operating the zoom lens 43 (see FIG. 2).
 The endoscope system 10 has three modes: a normal observation mode, a special observation mode, and a diagnostic support mode. In the normal observation mode, the observation target is illuminated with normal light such as white light and imaged, and a normal observation image with natural colors is displayed on the display 18. In the special observation mode, the observation target is illuminated with special light having an emission spectrum different from that of the normal light and imaged, and a special observation image emphasizing specific structures is displayed on the display 18. In the diagnostic support mode, the first illumination light and the second illumination light, which have different emission spectra, are emitted alternately, and when a lesion candidate region or the like is recognized by recognition processing (AI (Artificial Intelligence) or the like), a diagnostic support information image in which diagnostic support information regarding the lesion candidate region is displayed on the observation image based on the first illumination light is displayed on the display 18.
 When the user operates the still image acquisition instruction unit 12g, a signal related to the still image acquisition instruction is sent to the endoscope 12, the light source device 14, and the processor device 16. When the still image acquisition instruction is given, the processor device 16 saves a still image of the observation target in the still image storage memory 69 of the processor device 16.
 The processor device 16 is electrically connected to the display 18 and the user interface 19. The display 18 outputs and displays the image of the observation target, information accompanying the image, and the like. The user interface 19 includes a keyboard, a mouse, a touch pad, and the like, and accepts input operations such as function settings. An external recording unit (not shown) for recording images, image information, and the like may be connected to the processor device 16.
 In FIG. 2, the light source device 14 includes a light source unit 20 and a light source processor 21 that controls the light source unit 20. The light source unit 20 has, for example, a plurality of semiconductor light sources; by turning each of them on or off and, when on, controlling the light emission amount of each semiconductor light source, it emits illumination light that illuminates the observation target. In the present embodiment, the light source unit 20 has LEDs of four colors: a V-LED (Violet Light Emitting Diode) 20a, a B-LED (Blue Light Emitting Diode) 20b, a G-LED (Green Light Emitting Diode) 20c, and an R-LED (Red Light Emitting Diode) 20d.
 As shown in FIG. 3, the V-LED 20a generates purple light V having a center wavelength of 405 ± 10 nm and a wavelength range of 380 to 420 nm. The B-LED 20b generates blue light B having a center wavelength of 450 ± 10 nm and a wavelength range of 420 to 500 nm. The G-LED 20c generates green light G having a wavelength range of 480 to 600 nm. The R-LED 20d generates red light R having a center wavelength of 620 to 630 nm and a wavelength range of 600 to 650 nm.
 The light source processor 21 controls the V-LED 20a, the B-LED 20b, the G-LED 20c, and the R-LED 20d. By controlling the LEDs 20a to 20d independently, the light source processor 21 can emit purple light V, blue light B, green light G, or red light R while varying the light amount of each independently. In the normal observation mode, the light source processor 21 controls the LEDs 20a to 20d so as to emit white light in which the light amount ratio among purple light V, blue light B, green light G, and red light R is Vc:Bc:Gc:Rc. Note that Vc, Bc, Gc, Rc > 0.
 In the special observation mode, the light source processor 21 controls the LEDs 20a to 20d so as to emit special light in which the light amount ratio among purple light V as short-wavelength narrow-band light, blue light B, green light G, and red light R is Vs:Bs:Gs:Rs. The light amount ratio Vs:Bs:Gs:Rs differs from the light amount ratio Vc:Bc:Gc:Rc used in the normal observation mode and is set appropriately according to the observation purpose. For example, when emphasizing superficial blood vessels, Vs is preferably made larger than the other components Bs, Gs, and Rs, and when emphasizing middle- and deep-layer blood vessels, Gs is preferably made larger than the other components Vs, Bs, and Rs.
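 As an illustration of this light-amount-ratio control, the following is a minimal Python sketch. The helper name led_outputs and the example ratios are assumptions introduced here for illustration; the patent does not specify the programming interface of the light source processor 21.

```python
def led_outputs(ratio, total_power=1.0):
    """ratio: (V, B, G, R) light-amount ratio such as Vc:Bc:Gc:Rc; an entry of 0
    means that LED stays off. Returns per-LED drive levels summing to total_power."""
    labels = ("V-LED", "B-LED", "G-LED", "R-LED")
    s = sum(ratio)
    if s == 0:
        raise ValueError("at least one LED must be lit")
    return {name: total_power * part / s for name, part in zip(labels, ratio)}

# Normal observation mode: all four LEDs lit in some white-light ratio Vc:Bc:Gc:Rc.
print(led_outputs((1, 2, 3, 2)))
# Special observation mode emphasizing superficial vessels: Vs made largest.
print(led_outputs((4, 1, 1, 1)))
```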
 In the diagnostic support mode, when the first illumination light and the second illumination light are automatically switched and emitted, the light source processor 21 emits the first illumination light in the first emission pattern and the second illumination light in the second emission pattern. Specifically, the first emission pattern is preferably either a first A emission pattern, in which the number of frames in the first illumination period is the same in every first illumination period as shown in FIG. 4, or a first B emission pattern, in which the number of frames in the first illumination period differs between first illumination periods as shown in FIG. 5.
 The second emission pattern is preferably one of: a second A emission pattern (FIG. 4), in which the number of frames in the second illumination period is the same in every second illumination period and the emission spectrum of the second illumination light is the same in every second illumination period; a second B emission pattern (FIG. 6), in which the number of frames in the second illumination period is the same in every second illumination period and the emission spectrum of the second illumination light differs between second illumination periods; a second C emission pattern (FIG. 7), in which the number of frames in the second illumination period differs between second illumination periods and the emission spectrum of the second illumination light is the same in every second illumination period; and a second D emission pattern (FIG. 8), in which the number of frames in the second illumination period differs between second illumination periods and the emission spectrum of the second illumination light differs between second illumination periods. The emission spectrum of the first illumination light may be the same or different in each first illumination period.
 Here, the first illumination period is preferably longer than the second illumination period, and the first illumination period is preferably two frames or more. For example, FIG. 4 shows a case where the first emission pattern is the first A emission pattern and the second emission pattern is the second A emission pattern (the number of frames in the second illumination period is the same, and the emission spectrum of the second illumination light is the same); the first illumination period is two frames and the second illumination period is one frame. Since the first illumination light is used to generate the observation image displayed on the display 18, a bright image is preferably obtained by illuminating the observation target with the first illumination light.
 For example, the first illumination light is preferably white light. On the other hand, since the second illumination light is used for the recognition processing, an image suitable for the recognition processing is preferably obtained by illuminating the observation target with the second illumination light. For example, when the recognition processing is performed based on superficial blood vessels, the second illumination light is preferably purple light V. The first and second emission patterns, which are the switching patterns between the first illumination period and the second illumination period, are determined based on the imaging control of the imaging sensor 44 by the imaging processor 45 and will therefore be described later. A frame refers to a unit of time that includes at least the period from a specific timing to the completion of signal readout in the imaging sensor 44.
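 The alternation between the first and second illumination periods can be pictured with the short sketch below, which reproduces the FIG. 4 example (two first-illumination frames followed by one second-illumination frame). The function emission_schedule is hypothetical and stands in for the pattern control performed by the light source processor 21 together with the imaging processor 45.

```python
from itertools import cycle

def emission_schedule(first_frames=2, second_frames=1, num_frames=9):
    """Frame-by-frame illumination labels for the FIG. 4 example (first A / second A
    pattern): each first illumination period lasts first_frames frames, each second
    illumination period lasts second_frames frames, alternating automatically."""
    period = ["first"] * first_frames + ["second"] * second_frames
    src = cycle(period)
    return [next(src) for _ in range(num_frames)]

print(emission_schedule())
# ['first', 'first', 'second', 'first', 'first', 'second', 'first', 'first', 'second']
```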
 In this specification, the light amount ratio includes the case where the ratio of at least one semiconductor light source is 0 (zero). Therefore, it includes the case where any one or more of the semiconductor light sources are not lit. For example, even when only one of the semiconductor light sources is lit and the other three are not, as in the case where the light amount ratio among purple light V, blue light B, green light G, and red light R is 1:0:0:0, a light amount ratio is considered to exist.
 As shown in FIG. 2, the light emitted by each of the LEDs 20a to 20d enters a light guide 25 via an optical path coupling unit 23 composed of mirrors, lenses, and the like. The light guide 25 is built into the endoscope 12 and the universal cord (the cord connecting the endoscope 12 to the light source device 14 and the processor device 16). The light guide 25 propagates the light from the optical path coupling unit 23 to the tip portion 12d of the endoscope 12.
 The tip portion 12d of the endoscope 12 is provided with an illumination optical system 30a and an imaging optical system 30b. The illumination optical system 30a has an illumination lens 32, and the illumination light propagated by the light guide 25 is applied to the observation target through the illumination lens 32. The imaging optical system 30b has an objective lens 42 and an imaging sensor 44. Light from the observation target resulting from the irradiation with the illumination light enters the imaging sensor 44 via the objective lens 42 and the zoom lens 43, whereby an image of the observation target is formed on the imaging sensor 44. The zoom lens 43 is a lens for magnifying the observation target and moves between the telephoto end and the wide end when the zoom operation unit 12h is operated.
 The imaging sensor 44 is a primary-color sensor and includes three types of pixels: B pixels (blue pixels) having a blue color filter, G pixels (green pixels) having a green color filter, and R pixels (red pixels) having a red color filter. As shown in FIG. 9, the blue color filter BF mainly transmits light in the blue band, specifically light in the wavelength band of 380 to 560 nm. The transmittance of the blue color filter BF peaks near a wavelength of 460 to 470 nm. The green color filter GF mainly transmits light in the green band, specifically light in the wavelength band of 460 to 620 nm. The red color filter RF mainly transmits light in the red band, specifically light in the wavelength band of 580 to 760 nm.
 The imaging sensor 44 is preferably a CCD (Charge-Coupled Device) or a CMOS (Complementary Metal Oxide Semiconductor) sensor. The imaging processor 45 controls the imaging sensor 44. Specifically, an image signal is output from the imaging sensor 44 when the imaging processor 45 reads out the signal of the imaging sensor 44. In the normal observation mode, the imaging processor 45 performs signal readout while the imaging sensor 44 is exposed to the normal light, so that a Bc image signal is output from the B pixels, a Gc image signal is output from the G pixels, and an Rc image signal is output from the R pixels of the imaging sensor 44. In the special observation mode, the imaging processor 45 performs signal readout while the imaging sensor 44 is exposed to the special light, so that a Bs image signal is output from the B pixels, a Gs image signal is output from the G pixels, and an Rs image signal is output from the R pixels of the imaging sensor 44.
 In the diagnostic support mode, as shown in FIG. 10, the imaging processor 45 performs signal readout while the imaging sensor 44 is exposed to the first illumination light during the first illumination period, causing the imaging sensor 44 to output a first image signal. The period during which the first image signal is output is referred to as the first imaging period. The first image signal includes a B1 image signal output from the B pixels, a G1 image signal output from the G pixels, and an R1 image signal output from the R pixels. The imaging processor 45 also performs signal readout while the imaging sensor 44 is exposed to the second illumination light during the second illumination period, causing the imaging sensor 44 to output a second image signal. The period during which the second image signal is output is referred to as the second imaging period. The second image signal includes a B2 image signal output from the B pixels, a G2 image signal output from the G pixels, and an R2 image signal output from the R pixels.
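 The separation of the read-out frames into the first and second image signals can be sketched as follows; split_image_signals is a hypothetical helper, and the per-frame labels are assumed to come from a schedule such as the emission_schedule sketch above.

```python
def split_image_signals(frames, schedule):
    """frames: per-frame (B, G, R) readouts from the imaging sensor; schedule:
    per-frame illumination labels (e.g. from emission_schedule()).
    Frames read out during the first imaging period form the first image signal
    (B1/G1/R1); frames read out during the second imaging period form the second
    image signal (B2/G2/R2)."""
    first_signal, second_signal = [], []
    for frame, light in zip(frames, schedule):
        (first_signal if light == "first" else second_signal).append(frame)
    return first_signal, second_signal
```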
 As shown in FIG. 2, a CDS/AGC (Correlated Double Sampling/Automatic Gain Control) circuit 46 performs correlated double sampling (CDS) and automatic gain control (AGC) on the analog image signal obtained from the imaging sensor 44. The image signal that has passed through the CDS/AGC circuit 46 is converted into a digital image signal by an A/D (Analog/Digital) converter 48. The digital image signal after A/D conversion is input to the processor device 16.
 In the processor device 16, programs related to various kinds of processing are stored in a program memory (not shown). When the programs in the program memory are run by a central control unit 68 configured by the image control processor, the functions of an image acquisition unit 50, a DSP (Digital Signal Processor) 52, a noise reduction unit 54, an image processing switching unit 56, an image processing unit 58, and a display control unit 60 are realized. With the realization of the functions of the image processing unit 58, the functions of a normal observation image generation unit 62, a special observation image generation unit 64, and a diagnostic support information processing unit 66 are also realized. In the diagnostic support mode, the image control processor performs image processing based on the first image signal or the second image signal and controls the display 18.
 The image acquisition unit 50 acquires the color image input from the endoscope 12. The color image includes a blue signal (B image signal), a green signal (G image signal), and a red signal (R image signal) output from the B pixels, G pixels, and R pixels of the imaging sensor 44. The acquired color image is sent to the DSP 52. The DSP 52 performs various kinds of signal processing on the received color image, such as defect correction processing, offset processing, gain correction processing, matrix processing, gamma conversion processing, demosaic processing, and YC conversion processing. The noise reduction unit 54 performs noise reduction processing, for example by a moving-average method or a median filter method, on the color image that has been demosaiced and otherwise processed by the DSP 52. The noise-reduced color image is input to the image processing switching unit 56.
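 A rough sketch of this signal-processing chain is given below. It uses placeholder arithmetic and assumed parameter values rather than the actual DSP 52 implementation, and several of the listed steps are omitted for brevity.

```python
import numpy as np

def dsp_and_noise_reduction(raw, offset=64.0, gain=1.2, gamma=2.2):
    """Placeholder arithmetic only: offset processing, gain correction, gamma
    conversion, and a 3x3 moving-average noise reduction, in the order the text
    lists them; defect correction, matrix, demosaic, and YC conversion are omitted."""
    img = raw.astype(np.float32)
    img = np.clip(img - offset, 0.0, None)                   # offset processing
    img = img * gain                                         # gain correction
    img = np.clip(img / 255.0, 0.0, 1.0) ** (1.0 / gamma)    # gamma conversion
    padded = np.pad(img, 1, mode="edge")                     # 3x3 moving-average noise reduction
    h, w = img.shape
    out = sum(padded[dy:dy + h, dx:dx + w] for dy in range(3) for dx in range(3)) / 9.0
    return out
```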
 Depending on the set mode, the image processing switching unit 56 switches the destination of the image signal from the noise reduction unit 54 to one of the normal observation image generation unit 62, the special observation image generation unit 64, and the diagnostic support information processing unit 66. Specifically, when the normal observation mode is set, the image signal from the noise reduction unit 54 is input to the normal observation image generation unit 62. When the special observation mode is set, the image signal from the noise reduction unit 54 is input to the special observation image generation unit 64. When the diagnostic support mode is set, the image signal from the noise reduction unit 54 is input to the diagnostic support information processing unit 66.
 The normal observation image generation unit 62 applies image processing for normal observation images to the input Rc image signal, Gc image signal, and Bc image signal for one frame. The image processing for normal observation images includes 3×3 matrix processing, gradation conversion processing, color conversion processing such as three-dimensional LUT (Look Up Table) processing, color enhancement processing, and structure enhancement processing such as spatial frequency enhancement. The Rc image signal, Gc image signal, and Bc image signal that have undergone the image processing for normal observation images are input to the display control unit 60 as a normal observation image.
 The special observation image generation unit 64 applies image processing for special observation images to the input Rs image signal, Gs image signal, and Bs image signal for one frame. The image processing for special observation images includes 3×3 matrix processing, gradation conversion processing, color conversion processing such as three-dimensional LUT (Look Up Table) processing, color enhancement processing, and structure enhancement processing such as spatial frequency enhancement. The Rs image signal, Gs image signal, and Bs image signal that have undergone the image processing for special observation images are input to the display control unit 60 as a special observation image.
 The diagnostic support information processing unit 66 applies the same image processing for normal observation images as described above to the input R1 image signal, G1 image signal, and B1 image signal for one frame. The R1 image signal, G1 image signal, and B1 image signal that have undergone the image processing for normal observation images are used as the observation image. The diagnostic support information processing unit 66 also performs recognition processing on the input R2 image signal, G2 image signal, and B2 image signal for specific frames. When a lesion candidate region is recognized by the recognition processing, a diagnostic support information image displaying diagnostic support information regarding the lesion candidate region is generated. The display control unit 60 displays the observation image or the diagnostic support information image on the display 18.
 The display control unit 60 performs control for displaying the image output from the image processing unit 58 on the display 18. Specifically, the display control unit 60 converts the normal observation image, the special observation image, the observation image, or the diagnostic support information image into a video signal that can be displayed in full color on the display 18. The converted video signal is input to the display 18, so that the display 18 displays the normal observation image, the special observation image, the observation image, or the diagnostic support information image.
 In the diagnostic support mode, the display control unit 60 performs the following display control. Consider the case where the first emission pattern is the first A emission pattern and the second emission pattern is the second B emission pattern (the number of frames in the second illumination period is the same, and the emission spectrum of the second illumination light differs), and the observation target is illuminated with the first illumination light for two frames and with the second illumination light for one frame between emissions of the first illumination light. As shown in FIG. 11, the observation image is obtained by applying the image processing for normal observation images to the first image signal obtained by illumination with the first illumination light. While the first illumination light is emitted, the observation image is displayed on the display 18 as it is.
 Meanwhile, the presence or absence of a lesion candidate region DR is detected by performing the recognition processing on the second image signal obtained by emission of the second illumination light. A lesion candidate region is, for example, a lesion such as cancer, or a benign polyp or the like. When no lesion candidate region is detected, the observation image of the immediately preceding frame is displayed on the display 18 while the second illumination light is emitted. In contrast, when a lesion candidate region DR is detected, a diagnostic support information image in which the diagnostic support information is displayed on the observation image is displayed on the display 18 during both the emission of the first illumination light and the emission of the second illumination light for as long as the lesion candidate region is being detected. Details of the display control of the diagnostic support information image are described later.
 The details of the display control of the diagnostic support information image are described below. The diagnostic support information image is generated in the diagnostic support information processing unit 66. As shown in FIG. 12, the diagnostic support information processing unit 66 includes a recognition processing unit 70, a motion acquisition unit 72, and a diagnostic support information image generation unit 74. The recognition processing unit 70 performs the recognition processing based on the second image signal obtained while the second illumination light is emitted. The recognition processing is preferably processing for recognizing and detecting a lesion candidate region. Specifically, the recognition processing unit 70 preferably uses a learning model that has undergone machine learning on lesion candidate regions, and outputs the presence or absence of a lesion candidate region when the second image signal is input to the learning model.
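 One way to picture this recognition step is the hypothetical sketch below, in which the learned model is assumed to expose a predict() method returning bounding boxes with lesion probabilities; the actual model interface is not specified in this document.

```python
def run_recognition(second_image, model, score_threshold=0.5):
    """`model` stands in for the machine-learned lesion detector; it is assumed
    to expose predict(image) -> list of (bounding_box, lesion_probability).
    Returns the lesion candidate regions whose suspicion rate clears the threshold."""
    detections = model.predict(second_image)
    return [(box, prob) for box, prob in detections if prob >= score_threshold]
```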
 The motion acquisition unit 72 acquires the movement of the observation target in the image. The movement of the observation target is used to control the display or non-display of the first diagnostic support information, as described later. The movement of the observation target includes movement caused by actual motion of the observation target, such as body motion, and movement caused by motion of the tip portion 12d of the endoscope or the like. The motion acquisition unit 72 preferably calculates the movement of the observation target based on the first image signals or the second image signals of a plurality of frames obtained within a fixed time. As the calculation method, it is preferable to calculate movement vectors of the observation target from the first or second image signals of a plurality of frames obtained within the fixed time and to calculate the movement of the observation target based on the calculated movement vectors.
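 A concrete, though not prescribed, way to realize the movement-vector calculation is dense optical flow between consecutive frames, as in the sketch below; OpenCV's Farneback optical flow is used here purely as an example of one possible implementation.

```python
import cv2
import numpy as np

def observation_motion(prev_gray, curr_gray):
    """Dense optical flow between two consecutive frames, summarized as the mean
    movement-vector magnitude in pixels per frame. Both inputs are single-channel
    8-bit images taken within the fixed time window mentioned in the text."""
    flow = cv2.calcOpticalFlowFarneback(prev_gray, curr_gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    return float(np.linalg.norm(flow, axis=2).mean())
```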
 When a lesion candidate region is recognized by the recognition processing, the diagnostic support information image generation unit 74 generates a diagnostic support information image in which diagnostic support information regarding the lesion candidate region is displayed on the observation image. The diagnostic support information includes first diagnostic support information, which is displayed or hidden on the display 18 according to the movement of the observation target, and second diagnostic support information, whose display on the display 18 is maintained regardless of the movement of the observation target.
 Specifically, as shown in FIG. 13, the first diagnostic support information 76 is preferably text information about the lesion candidate region, such as the suspicion rate of a lesion, which tends to affect the user's visibility of the observation image. The second diagnostic support information 78, on the other hand, is preferably information such as a bounding box indicating the position of the lesion candidate region DR, which is unlikely to affect the user's visibility of the observation image.
 The display control unit 60 controls, according to the movement of the observation target, the display or non-display of at least the first diagnostic support information among the diagnostic support information in the diagnostic support information image. Specifically, the first display control, used when performing detailed observation (differentiation and the like) of a lesion candidate region, preferably differs from the second display control, used when screening for lesion candidate regions. The display or non-display of the second diagnostic support information may also be controlled.
 In the first display control, as shown in FIG. 14, the first diagnostic support information 76 is displayed when the movement of the observation target is less than a first threshold, and the first diagnostic support information 76 is hidden when the movement of the observation target is equal to or greater than the first threshold. Detailed observation such as differentiation is performed while the tip portion 12d of the endoscope having the imaging sensor 44 is temporarily held still, so the movement of the observation target is preferably relatively small. Therefore, the first diagnostic support information 76 is displayed while the movement of the observation target is small, for example less than the first threshold, which is suitable for detailed observation. On the other hand, when the movement of the observation target becomes large, for example equal to or greater than the first threshold, and the situation becomes unsuitable for detailed observation, the display of the first diagnostic support information 76 may impair visibility, so the first diagnostic support information 76 is hidden. Even when the first diagnostic support information 76 is hidden in the first display control, the display of the second diagnostic support information is preferably maintained.
 In the second display control, as shown in FIG. 15, the first diagnostic support information 76 is displayed when the movement of the observation target is equal to or greater than a second threshold, and the first diagnostic support information 76 is hidden when the movement of the observation target is less than the second threshold. During screening, the tip portion 12d of the endoscope is moved at or above a certain speed, so the movement of the observation target is relatively large. Therefore, in order to make lesion candidate regions easier to detect during screening, the first diagnostic support information 76 is displayed while the movement of the observation target is large, for example equal to or greater than the second threshold. On the other hand, when the movement of the observation target becomes small, for example less than the second threshold, and the situation becomes unsuitable for screening, the display of the first diagnostic support information 76 may impair visibility, so the first diagnostic support information 76 is hidden. Even when the first diagnostic support information 76 is hidden in the second display control, the display of the second diagnostic support information is preferably maintained.
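 The two display controls reduce to simple threshold comparisons, as in the following sketch; the function names and the interpretation of the return value are assumptions introduced for illustration.

```python
def first_display_control(motion, first_threshold):
    """Detailed-observation behavior (FIG. 14): show the first diagnostic support
    information only while the observation target is nearly still."""
    return motion < first_threshold      # True = display, False = hide

def second_display_control(motion, second_threshold):
    """Screening behavior (FIG. 15): show the first diagnostic support information
    only while the observation target is moving."""
    return motion >= second_threshold

# In either control, the second diagnostic support information (e.g. the bounding
# box) remains displayed regardless of the returned value.
```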
 Switching between the first display control and the second display control may be performed manually using the user interface 19. Switching between the first display control and the second display control may also be performed automatically by a display control switching unit 75 (see FIG. 12). As shown in FIG. 16, when the movement of the observation target is less than a switching threshold, the display control switching unit 75 determines that detailed observation with little movement of the observation target is being performed and switches to the first display control. On the other hand, when the movement of the observation target is equal to or greater than the switching threshold, it determines that screening with large movement of the observation target is being performed and switches to the second display control. The display control unit 60 controls the display of the diagnostic support information according to the automatically selected first display control or second display control.
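 Building on the two control functions sketched above, the automatic switching can be expressed as follows; the switching threshold and the other threshold values are illustrative parameters, not values taken from the patent.

```python
def select_display_control(motion, switching_threshold, first_threshold, second_threshold):
    """Automatic switching (FIG. 16): small motion is treated as detailed observation
    and handled by the first display control; large motion is treated as screening
    and handled by the second display control."""
    if motion < switching_threshold:
        return first_display_control(motion, first_threshold)
    return second_display_control(motion, second_threshold)
```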
 When the second emission pattern of the second illumination light is the second B emission pattern or the second D emission pattern, in which the emission spectrum of the second illumination light differs between second illumination periods, and a plurality of lesion candidate regions are recognized by the recognition processing based on the second image signals corresponding to the respective second illumination lights, the display control unit 60 preferably makes the display of the second diagnostic support information, whose display is maintained regardless of the movement of the observation target, differ according to the emission spectrum of the second illumination light.
 For example, as shown in FIG. 17, when the diagnostic support image contains a lesion candidate region DRA detected with the second illumination light of emission spectrum A and a lesion candidate region DRB detected with the second illumination light of emission spectrum B, which differs from emission spectrum A, first diagnostic support information 76a and second diagnostic support information 78a regarding the lesion candidate region DRA and first diagnostic support information 76b and second diagnostic support information 78b regarding the lesion candidate region DRB are displayed.
 In this case, for the second diagnostic support information 78a and 78b, whose display is maintained regardless of the movement of the observation target, the second diagnostic support information 78a is displayed as a "blue" bounding box and the second diagnostic support information 78b is displayed as a "green" bounding box. That is, "blue" indicates detection with the second illumination light of emission spectrum A, and "green" indicates detection with the second illumination light of emission spectrum B. This makes it possible to visually grasp with which emission spectrum of the second illumination light each of the lesion candidate regions DRA and DRB was detected.
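 A minimal sketch of this spectrum-dependent display, assuming BGR color tuples and hypothetical spectrum labels, is shown below.

```python
# Illustrative color assignment only; the example in the text uses blue for
# detections made under emission spectrum A and green for emission spectrum B.
SPECTRUM_BOX_COLOR = {
    "spectrum_A": (255, 0, 0),   # blue in BGR order
    "spectrum_B": (0, 255, 0),   # green in BGR order
}

def bounding_box_color(detecting_spectrum):
    """Fall back to yellow for any spectrum not covered by the example mapping."""
    return SPECTRUM_BOX_COLOR.get(detecting_spectrum, (0, 255, 255))
```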
 Next, the sequence of the diagnostic support mode is described with reference to the flowchart of FIG. 18. When the mode changeover switch 12f is operated to switch to the diagnostic support mode, the first illumination light and the second illumination light, which have different emission spectra, are automatically switched and emitted. The imaging sensor 44 images the observation target illuminated by the first illumination light and outputs the first image signal, and images the observation target illuminated by the second illumination light and outputs the second image signal.
 The recognition processing unit 70 performs the recognition processing on the second image signal. The motion acquisition unit 72 acquires the movement of the observation target in the image. When a lesion candidate region is recognized by the recognition processing, the diagnostic support information image generation unit 74 generates a diagnostic support information image that displays diagnostic support information regarding the lesion candidate region on the observation image based on the first image signal. The display control unit 60 controls, in the diagnostic support information image, the display or non-display of at least the first diagnostic support information among the diagnostic support information according to the movement of the observation target. This series of processing is repeated while the diagnostic support mode continues.
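 Tying the preceding sketches together, one iteration of this flow might look like the following; all names and threshold values are illustrative assumptions rather than the patent's implementation.

```python
def diagnostic_support_frame(observation_image, second_image, motion, model,
                             first_thr=2.0, second_thr=5.0, switch_thr=3.5):
    """One pass through the diagnostic support mode, combining the earlier sketches."""
    lesions = run_recognition(second_image, model)
    if not lesions:
        # No lesion candidate region: the plain observation image is shown.
        return {"image": observation_image, "boxes": [], "suspicion_rates": []}
    show_first_info = select_display_control(motion, switch_thr, first_thr, second_thr)
    return {
        "image": observation_image,                    # based on the first image signal
        "boxes": [box for box, _ in lesions],          # second diagnostic support info, always kept
        "suspicion_rates": [p for _, p in lesions] if show_first_info else [],
    }
```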
 In the above embodiment, the hardware structure of the processing units that execute various kinds of processing, such as the light source processor 21, the imaging processor 45, the image acquisition unit 50, the DSP 52, the noise reduction unit 54, the image processing switching unit 56, the normal observation image generation unit 62, the special observation image generation unit 64, and the diagnostic support information processing unit 66 included in the image processing unit 58, the central control unit 68, the recognition processing unit 70, the motion acquisition unit 72, the diagnostic support information image generation unit 74, and the display control switching unit 75, is any of the following various processors. The various processors include a CPU (Central Processing Unit), which is a general-purpose processor that executes software (programs) to function as various processing units; a programmable logic device (PLD), which is a processor whose circuit configuration can be changed after manufacture, such as an FPGA (Field Programmable Gate Array); and a dedicated electric circuit, which is a processor having a circuit configuration designed exclusively for executing various kinds of processing.
 One processing unit may be composed of one of these various processors, or may be composed of a combination of two or more processors of the same type or different types (for example, a plurality of FPGAs or a combination of a CPU and an FPGA). A plurality of processing units may also be configured by one processor. As a first example of configuring a plurality of processing units with one processor, as represented by computers such as clients and servers, one processor is configured by a combination of one or more CPUs and software, and this processor functions as a plurality of processing units. As a second example, as represented by a system on chip (SoC), a processor that realizes the functions of the entire system including a plurality of processing units with a single IC (Integrated Circuit) chip is used. In this way, the various processing units are configured as a hardware structure using one or more of the above-mentioned various processors.
 More specifically, the hardware structure of these various processors is electric circuitry in which circuit elements such as semiconductor elements are combined. The hardware structure of the storage unit is a storage device such as an HDD (hard disc drive) or an SSD (solid state drive).
10 Endoscope system
12 Endoscope
12a Insertion portion
12b Operation portion
12c Bending portion
12d Tip portion
12e Angle knob
12f Mode changeover switch
12g Still image acquisition instruction unit
12h Zoom operation unit
14 Light source device
16 Processor device
18 Display
19 User interface
20 Light source unit
20a V-LED
20b B-LED
20c G-LED
20d R-LED
21 Light source processor
23 Optical path coupling unit
25 Light guide
30a Illumination optical system
30b Imaging optical system
32 Illumination lens
42 Objective lens
43 Zoom lens
44 Imaging sensor
45 Imaging processor
46 CDS/AGC circuit
48 A/D converter
50 Image acquisition unit
52 DSP
54 Noise reduction unit
56 Image processing switching unit
58 Image processing unit
60 Display control unit
62 Normal observation image generation unit
64 Special observation image generation unit
66 Diagnostic support information processing unit
68 Central control unit
69 Still image storage memory
70 Recognition processing unit
72 Motion acquisition unit
74 Diagnostic support information image generation unit
75 Display control switching unit
76, 76a, 76b First diagnostic support information
78, 78a, 78b Second diagnostic support information
DR, DRA, DRB Lesion candidate region

Claims (10)

  1.  An endoscope system comprising:
     a light source unit that emits a first illumination light and a second illumination light having emission spectra different from each other;
     a light source processor that, when automatically switching between a first illumination period in which the first illumination light is emitted and a second illumination period in which the second illumination light is emitted, causes the first illumination light to be emitted in a first emission pattern and the second illumination light to be emitted in a second emission pattern;
     an imaging sensor that outputs a first image signal obtained by imaging an observation target illuminated by the first illumination light and a second image signal obtained by imaging the observation target illuminated by the second illumination light; and
     an image control processor,
     wherein the image control processor:
     performs recognition processing on the second image signal;
     acquires a movement of the observation target in an image;
     generates, when a lesion candidate region is recognized by the recognition processing, a diagnostic support information image that displays diagnostic support information regarding the lesion candidate region on an observation image based on the first image signal; and
     controls, in the diagnostic support information image, display or non-display of at least first diagnostic support information among the diagnostic support information in accordance with the movement of the observation target.
  2.  The endoscope system according to claim 1, wherein the image control processor performs first display control in which the diagnostic support information is displayed when the movement of the observation target is less than a first threshold and the diagnostic support information is hidden when the movement of the observation target is equal to or greater than the first threshold.
  3.  The endoscope system according to claim 1 or 2, wherein the image control processor performs second display control in which the diagnostic support information is hidden when the movement of the observation target is less than a second threshold and the diagnostic support information is displayed when the movement of the observation target is equal to or greater than the second threshold.
  4.  The endoscope system according to claim 1, wherein, in a case where the image control processor is capable of performing first display control, in which the diagnostic support information is displayed when the movement of the observation target is less than a first threshold and the diagnostic support information is hidden when the movement of the observation target is equal to or greater than the first threshold, or second display control, in which the diagnostic support information is hidden when the movement of the observation target is less than a second threshold and the diagnostic support information is displayed when the movement of the observation target is equal to or greater than the second threshold,
     the endoscope system has a user interface for manually switching between the first display control and the second display control.
  5.  The endoscope system according to claim 1, wherein, in a case where the image control processor is capable of performing first display control, in which the diagnostic support information is displayed when the movement of the observation target is less than a first threshold and the diagnostic support information is hidden when the movement of the observation target is equal to or greater than the first threshold, or second display control, in which the diagnostic support information is hidden when the movement of the observation target is less than a second threshold and the diagnostic support information is displayed when the movement of the observation target is equal to or greater than the second threshold,
     the image control processor automatically switches between the first display control and the second display control.
  6.  The endoscope system according to claim 5, wherein the image control processor switches to the first display control in a case where the movement of the observation target is less than a switching threshold value, and switches to the second display control in a case where the movement of the observation target is equal to or greater than the switching threshold value.
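The threshold logic of claims 2, 3, and 6 lends itself to a compact illustration. The following Python sketch is a non-normative reading of that logic; the names (`DisplayControl`, `motion`, `select_control`, `show_first_support_info`) and the use of a single scalar motion score are assumptions for illustration, not the patented implementation.

```python
from enum import Enum

class DisplayControl(Enum):
    FIRST = 1   # first display control: show support info while motion is small (claim 2)
    SECOND = 2  # second display control: show support info while motion is large (claim 3)

def select_control(motion: float, switching_threshold: float) -> DisplayControl:
    """Automatic switching of claim 6: motion below the switching threshold selects
    the first display control, motion at or above it selects the second."""
    return DisplayControl.FIRST if motion < switching_threshold else DisplayControl.SECOND

def show_first_support_info(motion: float, control: DisplayControl,
                            first_threshold: float, second_threshold: float) -> bool:
    """Decide whether the first diagnostic support information is displayed."""
    if control is DisplayControl.FIRST:
        # Claim 2: display below the first threshold, hide at or above it.
        return motion < first_threshold
    # Claim 3: hide below the second threshold, display at or above it.
    return motion >= second_threshold
```

Read this way, the first display control suppresses the overlay while the observation target is moving quickly, so that the support information does not impair visibility; the second display control does the opposite; and the switching threshold of claim 6 chooses the active policy from the same motion measure. Claims 4 and 5 then correspond to exposing that choice to the user or making it automatic.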
  7.  The endoscope system according to any one of claims 1 to 6, wherein, in a case where the emission spectrum of the second illumination light differs in each of the second illumination periods and a plurality of lesion candidate images are recognized by recognition processing based on the second image signals corresponding to the respective second illumination lights, display of second diagnostic support information, which is diagnostic support information whose display is maintained regardless of the movement of the observation target, differs depending on the emission spectrum of the second illumination light.
  8.  The endoscope system according to any one of claims 1 to 7, wherein the first emission pattern is either a first A emission pattern in which the number of frames in the first illumination period is the same in each of the first illumination periods, or a first B emission pattern in which the number of frames in the first illumination period differs in each of the first illumination periods.
  9.  The endoscope system according to any one of claims 1 to 8, wherein the second emission pattern is any one of:
     a second A emission pattern in which the number of frames in the second illumination period is the same in each of the second illumination periods and the emission spectrum of the second illumination light is the same in each of the second illumination periods;
     a second B emission pattern in which the number of frames in the second illumination period is the same in each of the second illumination periods and the emission spectrum of the second illumination light differs in each of the second illumination periods;
     a second C emission pattern in which the number of frames in the second illumination period differs in each of the second illumination periods and the emission spectrum of the second illumination light is the same in each of the second illumination periods; and
     a second D emission pattern in which the number of frames in the second illumination period differs in each of the second illumination periods and the emission spectrum of the second illumination light differs in each of the second illumination periods.
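Claims 8 and 9 enumerate the admissible emission patterns by two independent properties per illumination period: whether the number of frames is constant across periods and whether the emission spectrum is constant. A minimal sketch of the four-way classification of claim 9, with hypothetical type and constant names, could look as follows.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class SecondEmissionPattern:
    constant_frame_count: bool  # same number of frames in every second illumination period?
    constant_spectrum: bool     # same emission spectrum in every second illumination period?

# The four alternatives of claim 9 (second A to second D emission patterns).
PATTERN_2A = SecondEmissionPattern(constant_frame_count=True,  constant_spectrum=True)
PATTERN_2B = SecondEmissionPattern(constant_frame_count=True,  constant_spectrum=False)
PATTERN_2C = SecondEmissionPattern(constant_frame_count=False, constant_spectrum=True)
PATTERN_2D = SecondEmissionPattern(constant_frame_count=False, constant_spectrum=False)
```

Claim 8 admits the analogous classification for the first emission pattern, but only along the frame-count axis (first A: constant, first B: varying).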
  10.  A method of operating an endoscope system comprising: a light source unit that emits first illumination light and second illumination light having mutually different emission spectra; a light source processor that, in a case of automatically switching between a first illumination period in which the first illumination light is emitted and a second illumination period in which the second illumination light is emitted, causes the first illumination light to be emitted in a first emission pattern and the second illumination light to be emitted in a second emission pattern; an imaging sensor that outputs a first image signal obtained by imaging an observation target illuminated by the first illumination light and a second image signal obtained by imaging the observation target illuminated by the second illumination light; and an image control processor, wherein the image control processor:
     performs recognition processing on the second image signal;
     acquires movement of the observation target in an image;
     when a lesion candidate region is recognized by the recognition processing, generates a diagnostic support information image that displays diagnostic support information regarding the lesion candidate region for the observation image based on the first image signal; and
     in the diagnostic support information image, controls display or non-display of at least first diagnostic support information among the diagnostic support information in accordance with the movement of the observation target.
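The operation method of claim 10 can be read as a per-frame loop: recognition on the second image signal, a measure of the observation target's movement in the image, and generation of the diagnostic support information image on the observation image derived from the first image signal. The sketch below illustrates one such reading; the frame-difference motion metric, the bounding-box overlay, and all function names are assumptions for illustration and are not taken from the specification.

```python
import numpy as np

def motion_score(prev_frame: np.ndarray, curr_frame: np.ndarray) -> float:
    """Crude motion estimate: mean absolute difference between consecutive frames.
    The claim only requires that some measure of the observation target's movement
    in the image be acquired; this particular metric is an assumption."""
    diff = np.abs(curr_frame.astype(np.float32) - prev_frame.astype(np.float32))
    return float(diff.mean())

def build_support_image(observation_image: np.ndarray,
                        lesion_boxes: list,
                        show_first_info: bool) -> np.ndarray:
    """Generate the diagnostic support information image: the observation image with
    bounding boxes drawn around recognized lesion candidate regions, drawn only when
    the display decision (see the sketch after claim 6) allows it."""
    out = observation_image.copy()
    if show_first_info:
        for x0, y0, x1, y1 in lesion_boxes:
            out[y0, x0:x1] = 255      # top edge
            out[y1 - 1, x0:x1] = 255  # bottom edge
            out[y0:y1, x0] = 255      # left edge
            out[y0:y1, x1 - 1] = 255  # right edge
    return out
```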


PCT/JP2021/008740 2020-07-03 2021-03-05 Endoscope system and method for operating same WO2022004056A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2022533677A JPWO2022004056A1 (en) 2020-07-03 2021-03-05

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2020115695 2020-07-03
JP2020-115695 2020-07-03

Publications (1)

Publication Number Publication Date
WO2022004056A1 (en)

Family

ID=79315699

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/008740 WO2022004056A1 (en) 2020-07-03 2021-03-05 Endoscope system and method for operating same

Country Status (2)

Country Link
JP (1) JPWO2022004056A1 (en)
WO (1) WO2022004056A1 (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2016199273A1 (en) * 2015-06-11 2016-12-15 オリンパス株式会社 Endoscope device and operation method for endoscope device
WO2020012563A1 (en) * 2018-07-10 2020-01-16 オリンパス株式会社 Endoscope device, processing device and processing method


Also Published As

Publication number Publication date
JPWO2022004056A1 (en) 2022-01-06

Similar Documents

Publication Publication Date Title
JP7335399B2 (en) MEDICAL IMAGE PROCESSING APPARATUS, ENDOSCOPE SYSTEM, AND METHOD OF OPERATION OF MEDICAL IMAGE PROCESSING APPARATUS
JPWO2019163540A1 (en) Endoscope system
US20230027950A1 (en) Medical image processing apparatus, endoscope system, method of operating medical image processing apparatus, and non-transitory computer readable medium
JP7130043B2 (en) MEDICAL IMAGE PROCESSING APPARATUS, ENDOSCOPE SYSTEM, AND METHOD OF OPERATION OF MEDICAL IMAGE PROCESSING APPARATUS
WO2022014077A1 (en) Endoscope system and method for operating same
JP7047122B2 (en) How to operate a medical image processing device, an endoscope system, and a medical image processing device
US20230101620A1 (en) Medical image processing apparatus, endoscope system, method of operating medical image processing apparatus, and non-transitory computer readable medium
US20230029239A1 (en) Medical image processing system and method for operating medical image processing system
WO2020054255A1 (en) Endoscope device, endoscope processor, and endoscope device operation method
WO2022004056A1 (en) Endoscope system and method for operating same
WO2021006121A1 (en) Image processing device, endoscope system, and operation method for image processing device
JP7214886B2 (en) Image processing device and its operating method
WO2022018894A1 (en) Endoscope system and method for operating same
WO2021205777A1 (en) Processor device and operation method for same
JP7411515B2 (en) Endoscope system and its operating method
WO2021149357A1 (en) Endoscope system and method for operating same
WO2022209390A1 (en) Endoscope system and operation method of same
WO2021106452A1 (en) Endoscope system and method for operating same
WO2023007896A1 (en) Endoscope system, processor device, and operation method therefor
WO2021229900A1 (en) Endoscope system and method for operating same
US11969152B2 (en) Medical image processing system
JP7015384B2 (en) Medical image processing system
WO2020217883A1 (en) Image processing device and method for operating same
JP7076535B2 (en) Endoscope device, how to operate the endoscope device, and program
CN115956870A (en) Endoscope system and working method thereof

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21832766

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2022533677

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21832766

Country of ref document: EP

Kind code of ref document: A1