WO2022004056A1 - Endoscope system and method for operating such a system - Google Patents

Endoscope system and method for operating such a system

Info

Publication number
WO2022004056A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
support information
illumination
observation target
movement
Application number
PCT/JP2021/008740
Other languages
English (en)
Japanese (ja)
Inventor
康太郎 檜谷
Original Assignee
FUJIFILM Corporation
Application filed by FUJIFILM Corporation
Priority to JP2022533677A
Publication of WO2022004056A1

Classifications

    • A — HUMAN NECESSITIES
    • A61 — MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B — DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 1/00 — Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; illuminating arrangements therefor
    • A61B 1/04 — Instruments as above combined with photographic or television appliances
    • A61B 1/045 — Control thereof
    • A61B 1/06 — Instruments as above with illuminating arrangements

Definitions

  • The present invention relates to an endoscope system that displays diagnostic support information about a lesion candidate region, such as the probability that the region is a lesion, and to a method for operating such a system.
  • Diagnosis is typically performed with an endoscope system including a light source device, an endoscope, and a processor device.
  • The observation target is irradiated with illumination light, and the target illuminated by that light is imaged by an imaging sensor to acquire an endoscopic image as a medical image.
  • The endoscopic image is displayed on a monitor and used for diagnosis.
  • lesion candidate regions are detected from images obtained by imaging an observation target.
  • The user is notified that a lesion candidate region has been detected by displaying diagnostic support information, such as a bounding box, around the lesion candidate region.
  • If the notification to the user reduces the visibility of the observation target, it may hinder effective diagnosis by the user.
  • In Patent Document 1, when the number of lesion candidate regions is larger than a predetermined threshold value, or their size is larger than a predetermined threshold value, the alert image used for notifying the user is hidden so that the visibility of the observation target is not reduced.
  • An object of the present invention is to provide an endoscope system, and a method for operating it, in which diagnostic support information about a detected lesion candidate region is prevented from interfering with the visibility of the observation target even in situations where the movement of the observation target in the image changes.
  • The endoscope system of the present invention comprises: a light source unit that emits first illumination light and second illumination light having different emission spectra; a light source processor that switches between a first illumination period, in which the first illumination light is emitted in a first emission pattern, and a second illumination period, in which the second illumination light is emitted in a second emission pattern; an imaging sensor that outputs a first image signal obtained by imaging the observation target illuminated by the first illumination light and a second image signal obtained by imaging the observation target illuminated by the second illumination light; and an image control processor.
  • The image control processor performs recognition processing on the second image signal and acquires the movement of the observation target in the image. When a lesion candidate region is recognized by the recognition processing, the processor generates a diagnostic support information image in which diagnostic support information about the lesion candidate region is displayed on the observation image based on the first image signal, and controls whether at least the first diagnostic support information among the diagnostic support information is displayed or hidden in that image depending on the movement of the observation target.
  • It is preferable that the image control processor performs first display control, in which the diagnostic support information is displayed when the movement of the observation target is less than a first threshold value and hidden when the movement is equal to or greater than the first threshold value.
  • It is preferable that the image control processor performs second display control, in which the diagnostic support information is hidden when the movement of the observation target is less than a second threshold value and displayed when the movement is equal to or greater than the second threshold value.
  • It is preferable that the image control processor performs either the first display control, in which the diagnostic support information is displayed below the first threshold value and hidden at or above it, or the second display control, in which the diagnostic support information is hidden below the second threshold value and displayed at or above it.
  • It is preferable that the image control processor automatically switches between the first display control and the second display control.
  • It is preferable that the image control processor switches to the first display control when the movement of the observation target is less than a switching threshold value, and to the second display control when the movement is equal to or greater than the switching threshold value.
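The threshold logic above can be sketched as follows. This is a minimal illustration, not the patent's implementation; the function names and the concrete threshold values are assumptions.

```python
# Illustrative thresholds (arbitrary values; the patent only defines their roles).
FIRST_THRESHOLD = 0.2   # first display control: show info below this motion level
SECOND_THRESHOLD = 0.2  # second display control: show info at/above this motion level
SWITCH_THRESHOLD = 0.5  # automatic switching between the two control modes

def first_display_control(motion: float) -> bool:
    """Show diagnostic support info only while the target is nearly still."""
    return motion < FIRST_THRESHOLD

def second_display_control(motion: float) -> bool:
    """Show diagnostic support info only while the target is moving."""
    return motion >= SECOND_THRESHOLD

def show_support_info(motion: float) -> bool:
    """Automatic switching: pick the control mode from the motion magnitude."""
    if motion < SWITCH_THRESHOLD:
        return first_display_control(motion)
    return second_display_control(motion)
```

With these illustrative values, small motion shows the information (first control), motion near the switching point hides it, and large motion shows it again (second control).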
  • It is preferable that the emission spectrum of the second illumination light differs in each second illumination period, that a plurality of lesion candidate regions are recognized by recognition processing based on the second image signal corresponding to each second illumination light, and that the display of the second diagnostic support information, which remains displayed regardless of the movement of the observation target, differs depending on the emission spectrum of the second illumination light.
  • It is preferable that the first emission pattern is either a first-A emission pattern, in which the number of frames in the first illumination period is the same in each first illumination period, or a first-B emission pattern, in which the number of frames in the first illumination period differs in each first illumination period.
  • It is preferable that the second emission pattern is one of the following: a second-A emission pattern, in which the number of frames in the second illumination period is the same in each second illumination period and the emission spectrum of the second illumination light is the same in each second illumination period; a second-B emission pattern, in which the number of frames is the same in each second illumination period and the emission spectrum differs in each second illumination period; a second-C emission pattern, in which the number of frames differs in each second illumination period and the emission spectrum is the same in each second illumination period; or a second-D emission pattern, in which the number of frames differs in each second illumination period and the emission spectrum differs in each second illumination period.
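The four second emission patterns are distinguished only by whether the frame count and the emission spectrum repeat across the second illumination periods, which can be captured in a small classifier. The function name and pattern labels below are illustrative shorthand for the patterns described above.

```python
def classify_second_pattern(frame_counts, spectra):
    """Classify a sequence of second illumination periods into pattern 2A-2D.

    frame_counts: number of frames in each second illumination period.
    spectra: emission spectrum label of each second illumination period.
    """
    same_frames = len(set(frame_counts)) == 1
    same_spectra = len(set(spectra)) == 1
    if same_frames and same_spectra:
        return "2A"  # frames same, spectrum same
    if same_frames:
        return "2B"  # frames same, spectrum differs
    if same_spectra:
        return "2C"  # frames differ, spectrum same
    return "2D"      # frames differ, spectrum differs
```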
  • The present invention also provides a method for operating an endoscope system that includes: a light source unit that emits first illumination light and second illumination light having different emission spectra; a light source processor that automatically switches between a first illumination period, in which the first illumination light is emitted in a first emission pattern, and a second illumination period, in which the second illumination light is emitted in a second emission pattern; an imaging sensor that outputs a first image signal obtained by imaging the observation target illuminated by the first illumination light and a second image signal obtained by imaging the observation target illuminated by the second illumination light; and an image control processor.
  • In this method, the image control processor performs recognition processing on the second image signal, acquires the movement of the observation target in the image, and, when a lesion candidate region is recognized by the recognition processing, generates a diagnostic support information image in which diagnostic support information about the lesion candidate region is displayed on the observation image based on the first image signal, controlling whether at least the first diagnostic support information among the diagnostic support information is displayed or hidden depending on the movement of the observation target.
  • According to the present invention, even in a situation where the movement of the observation target in the image changes, the diagnostic support information about the lesion candidate region can be prevented from interfering with the visibility of the observation target when a lesion candidate region is detected.
  • the endoscope system 10 has an endoscope 12, a light source device 14, a processor device 16, a display 18, and a user interface 19.
  • the endoscope 12 is optically connected to the light source device 14 and electrically connected to the processor device 16.
  • The endoscope 12 has an insertion portion 12a to be inserted into the body of the subject, an operation portion 12b provided at the proximal end of the insertion portion 12a, and a bending portion 12c and a tip portion 12d provided on the distal end side of the insertion portion 12a.
  • The bending portion 12c bends by operating the angle knob 12e of the operation portion 12b.
  • the tip portion 12d is directed in a desired direction by the bending motion of the bending portion 12c.
  • The operation unit 12b includes a mode switching switch 12f used for the mode switching operation, and a still image acquisition instruction unit 12g used for instructing acquisition of a still image of the observation target.
  • a zoom operation unit 12h used for operating the zoom lens 43 (see FIG. 2) is provided.
  • the endoscope system 10 has three modes: a normal observation mode, a special observation mode, and a diagnostic support mode.
  • In the normal observation mode, a normal observation image with a natural hue is displayed on the display 18 by illuminating the observation target with normal light, such as white light, and imaging it.
  • In the special observation mode, a special observation image emphasizing a specific structure is displayed on the display 18 by illuminating the observation target with special light having an emission spectrum different from that of normal light and imaging it.
  • In the diagnostic support mode, the first illumination light and the second illumination light, which have different emission spectra, are emitted by switching between them; when a lesion candidate region or the like is recognized by recognition processing (AI (Artificial Intelligence) or the like), a diagnostic support information image in which diagnostic support information about the lesion candidate region is displayed on the observation image based on the first illumination light is shown on the display 18.
  • When the still image acquisition instruction unit 12g is operated, a signal related to the still image acquisition instruction is sent to the endoscope 12, the light source device 14, and the processor device 16.
  • When the still image acquisition instruction is given, the still image of the observation target is saved in the still image storage memory 69 of the processor device 16.
  • the processor device 16 is electrically connected to the display 18 and the user interface 19.
  • the display 18 outputs and displays an image to be observed, information incidental to the image to be observed, and the like.
  • the user interface 19 has a keyboard, a mouse, a touch pad, and the like, and has a function of accepting input operations such as function settings.
  • An external recording unit (not shown) for recording an image, image information, or the like may be connected to the processor device 16.
  • the light source device 14 includes a light source unit 20 and a light source processor 21 that controls the light source unit 20.
  • the light source unit 20 has, for example, a plurality of semiconductor light sources, each of which is turned on or off, and when the light source unit 20 is turned on, the light emission amount of each semiconductor light source is controlled to emit illumination light for illuminating the observation target.
  • The light source unit 20 has LEDs of four colors: a V-LED (Violet Light Emitting Diode) 20a, a B-LED (Blue Light Emitting Diode) 20b, a G-LED (Green Light Emitting Diode) 20c, and an R-LED (Red Light Emitting Diode) 20d.
  • the V-LED 20a generates purple light V having a center wavelength of 405 ⁇ 10 nm and a wavelength range of 380 to 420 nm.
  • the B-LED 20b generates blue light B having a center wavelength of 450 ⁇ 10 nm and a wavelength range of 420 to 500 nm.
  • the G-LED 20c generates green light G having a wavelength range of 480 to 600 nm.
  • the R-LED 20d generates red light R having a center wavelength of 620 to 630 nm and a wavelength range of 600 to 650 nm.
  • The light source processor 21 controls the V-LED 20a, B-LED 20b, G-LED 20c, and R-LED 20d. By controlling each of the LEDs 20a to 20d independently, the light source processor 21 can emit purple light V, blue light B, green light G, or red light R while independently varying the light amount of each. In the normal observation mode, the light source processor 21 controls the LEDs 20a to 20d so as to emit white light in which the light amount ratio among purple light V, blue light B, green light G, and red light R is Vc:Bc:Gc:Rc, where Vc, Bc, Gc, Rc > 0.
  • In the special observation mode, the light source processor 21 controls the LEDs 20a to 20d so as to emit special light in which the light amount ratio among purple light V, blue light B, green light G, and red light R, as short-wavelength narrow-band light, is Vs:Bs:Gs:Rs.
  • The light amount ratio Vs:Bs:Gs:Rs differs from the ratio Vc:Bc:Gc:Rc used in the normal observation mode and is determined appropriately according to the observation purpose. For example, when emphasizing superficial blood vessels, it is preferable to make Vs larger than Bs, Gs, and Rs; when emphasizing medium-depth blood vessels, it is preferable to make Gs larger than Vs, Bs, and Rs.
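A light amount ratio is simply a proportional split of a total light budget across the four LEDs. The helper below is an illustrative sketch (function name and values are assumptions, not from the patent); a ratio entry of 0 corresponds to an LED that stays off, which still counts as a valid light amount ratio.

```python
def led_outputs(ratio, total):
    """Split a total light amount across the V, B, G, R LEDs by a ratio.

    ratio: tuple such as (Vs, Bs, Gs, Rs); an entry of 0 leaves that LED off.
    total: total light amount to distribute (arbitrary units).
    """
    s = sum(ratio)
    return tuple(total * r / s for r in ratio)

# Emphasizing superficial vessels: Vs larger than Bs, Gs, Rs (values illustrative).
v, b, g, r = led_outputs((4, 2, 1, 1), total=80.0)
```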
  • In the diagnostic support mode, the first illumination light is emitted in the first emission pattern and the second illumination light is emitted in the second emission pattern.
  • The first emission pattern is preferably either the first-A emission pattern, in which the number of frames in the first illumination period is the same in each first illumination period, or, as shown in FIG., the first-B emission pattern, in which the number of frames in the first illumination period differs in each first illumination period.
  • The second emission pattern is preferably one of the following: the second-A emission pattern, in which the number of frames in the second illumination period is the same in each second illumination period and the emission spectrum of the second illumination light is the same in each second illumination period; the second-B emission pattern, in which the number of frames is the same in each second illumination period and the emission spectrum differs in each second illumination period; as shown in FIG. 7, the second-C emission pattern, in which the number of frames differs in each second illumination period and the emission spectrum is the same in each second illumination period; or, as shown in FIG. 8, the second-D emission pattern, in which the number of frames differs in each second illumination period and the emission spectrum differs in each second illumination period.
  • the emission spectrum of the first illumination light may be the same or different in each first illumination period.
  • the first lighting period is preferably longer than the second lighting period, and the first lighting period is preferably two frames or more.
  • The first illumination period is set to two frames and the second illumination period to one frame. Since the first illumination light is used to generate the observation image displayed on the display 18, it is preferable to obtain a bright image by illuminating the observation target with the first illumination light.
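With those settings, the illumination alternates in a fixed cycle of two first-illumination frames followed by one second-illumination frame. A minimal sketch of such a schedule (names and labels are illustrative):

```python
def illumination_schedule(n_frames, first=2, second=1):
    """Frame-by-frame illumination labels for a repeating cycle.

    first: frames of first illumination light ("L1") per cycle.
    second: frames of second illumination light ("L2") per cycle.
    """
    cycle = ["L1"] * first + ["L2"] * second
    return [cycle[i % len(cycle)] for i in range(n_frames)]
```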
  • the first illumination light is preferably white light.
  • the second illumination light is used for the recognition process, it is preferable to illuminate the observation target with the second illumination light to obtain an image suitable for the recognition process.
  • the recognition process is performed based on the surface blood vessels, it is preferable that the second illumination light is purple light V.
  • The first and second emission patterns, which are the switching patterns between the first illumination period and the second illumination period, are determined based on the imaging control of the image pickup sensor 44 by the image pickup processor 45, and are therefore described later.
  • A frame refers to a unit period that includes at least the interval from a specific timing until the completion of signal readout in the image pickup sensor 44.
  • The light amount ratio includes the case where the ratio of at least one semiconductor light source is 0 (zero); that is, it includes the case where one or more of the semiconductor light sources are not lit. For example, even when only one of the semiconductor light sources is lit and the other three are not, as when the light amount ratio among purple light V, blue light B, green light G, and red light R is 1:0:0:0, the light sources are regarded as having a light amount ratio.
  • the light emitted by each of the LEDs 20a to 20d is incident on the light guide 25 via the optical path coupling portion 23 composed of a mirror, a lens, or the like.
  • the light guide 25 is built in the endoscope 12 and a universal cord (a cord connecting the endoscope 12, the light source device 14 and the processor device 16).
  • the light guide 25 propagates the light from the optical path coupling portion 23 to the tip portion 12d of the endoscope 12.
  • An illumination optical system 30a and an image pickup optical system 30b are provided at the tip end portion 12d of the endoscope 12.
  • the illumination optical system 30a has an illumination lens 32, and the illumination light propagated by the light guide 25 is applied to the observation target through the illumination lens 32.
  • the image pickup optical system 30b has an objective lens 42 and an image pickup sensor 44. The light from the observation target due to the irradiation of the illumination light is incident on the image pickup sensor 44 via the objective lens 42 and the zoom lens 43. As a result, an image to be observed is formed on the image pickup sensor 44.
  • the zoom lens 43 is a lens for enlarging the observation target, and moves between the telephoto end and the wide end by operating the zoom operation unit 12h.
  • The image pickup sensor 44 is a primary color sensor and includes three types of pixels: B pixels (blue pixels) with a blue color filter, G pixels (green pixels) with a green color filter, and R pixels (red pixels) with a red color filter.
  • The blue color filter BF mainly transmits light in the blue band, specifically light in the wavelength band of 380 to 560 nm.
  • the transmittance of the blue color filter BF peaks in the vicinity of the wavelength of 460 to 470 nm.
  • The green color filter GF mainly transmits light in the green band, specifically light in the wavelength band of 460 to 620 nm.
  • the red color filter RF mainly transmits light in the red band, specifically, light in the wavelength band of 580 to 760 nm.
  • the image sensor 44 is preferably a CCD (Charge-Coupled Device) or a CMOS (Complementary Metal Oxide Semiconductor).
  • The image pickup processor 45 controls the image pickup sensor 44. Specifically, an image signal is output from the image pickup sensor 44 when the image pickup processor 45 reads out the sensor's signal. In the normal observation mode, the image pickup processor 45 performs signal readout while the image pickup sensor 44 is exposed to normal light, so that a Bc image signal is output from the B pixels, a Gc image signal from the G pixels, and an Rc image signal from the R pixels.
  • In the special observation mode, the image pickup processor 45 performs signal readout while the image pickup sensor 44 is exposed to special light, so that a Bs image signal is output from the B pixels, a Gs image signal from the G pixels, and an Rs image signal from the R pixels.
  • In the diagnostic support mode, the image pickup processor 45 outputs a first image signal from the image pickup sensor 44 by performing signal readout while the sensor is exposed to the first illumination light during the first illumination period.
  • the period for outputting the first image signal is defined as the first imaging period.
  • the first image signal includes a B1 image signal output from the B pixel, a G1 image signal output from the G pixel, and an R1 image signal output from the R pixel.
  • the image pickup processor 45 outputs a second image signal from the image pickup sensor 44 by performing signal readout in a state where the image pickup sensor 44 is exposed to the second illumination light during the second illumination period.
  • The period for outputting the second image signal is defined as the second imaging period.
  • the second image signal includes a B2 image signal output from the B pixel, a G2 image signal output from the G pixel, and an R2 image signal output from the R pixel.
  • The CDS/AGC (Correlated Double Sampling / Automatic Gain Control) circuit 46 performs correlated double sampling (CDS) and automatic gain control (AGC) on the analog image signal obtained from the image pickup sensor 44.
  • the image signal that has passed through the CDS / AGC circuit 46 is converted into a digital image signal by the A / D (Analog / Digital) converter 48.
  • the digital image signal after A / D conversion is input to the processor device 16.
  • the processor device 16 stores programs related to various processes in a program memory (not shown).
  • The central control unit 68, constituted by the image control processor, runs the programs in the program memory to realize the functions of the image acquisition unit 50, the DSP (Digital Signal Processor) 52, the noise reduction unit 54, the image processing switching unit 56, the image processing unit 58, and the display control unit 60. With the realization of the functions of the image processing unit 58, the functions of the normal observation image generation unit 62, the special observation image generation unit 64, and the diagnosis support information processing unit 66 are also realized.
  • the image control processor performs image processing based on the first image signal or the second image signal, and controls the display 18.
  • the image acquisition unit 50 acquires a color image input from the endoscope 12.
  • the color image includes a blue signal (B image signal), a green signal (G image signal), and a red signal (R image signal) output from the B pixel, G pixel, and R pixel of the image pickup sensor 44.
  • the acquired color image is transmitted to the DSP 52.
  • the DSP 52 performs various signal processing such as defect correction processing, offset processing, gain correction processing, matrix processing, gamma conversion processing, demosaic processing, and YC conversion processing on the received color image.
  • The noise reduction unit 54 performs noise reduction processing, for example by a moving average method or a median filter method, on the color image that has been demosaiced by the DSP 52.
  • the color image with reduced noise is input to the image processing switching unit 56.
  • The image processing switching unit 56 switches the destination of the image signal from the noise reduction unit 54 among the normal observation image generation unit 62, the special observation image generation unit 64, and the diagnosis support information processing unit 66. Specifically, when the normal observation mode is set, the image signal from the noise reduction unit 54 is input to the normal observation image generation unit 62; when the special observation mode is set, it is input to the special observation image generation unit 64; and when the diagnosis support mode is set, it is input to the diagnosis support information processing unit 66.
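The mode-dependent routing can be expressed as a simple dispatch table. The mode keys and destination strings below are illustrative stand-ins for the units described above, not names from the patent.

```python
# Illustrative dispatch table for the image processing switching unit 56.
DESTINATIONS = {
    "normal_observation": "normal observation image generation unit 62",
    "special_observation": "special observation image generation unit 64",
    "diagnosis_support": "diagnosis support information processing unit 66",
}

def route_image_signal(mode: str) -> str:
    """Return the destination unit for the currently set observation mode."""
    if mode not in DESTINATIONS:
        raise ValueError(f"unknown mode: {mode}")
    return DESTINATIONS[mode]
```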
  • the normal observation image generation unit 62 performs image processing for a normal observation image on the input Rc image signal, Gc image signal, and Bc image signal for one frame.
  • Image processing for the normal observation image includes 3×3 matrix processing, gradation conversion processing, color conversion processing such as three-dimensional LUT (Look Up Table) processing, color enhancement processing, and structure enhancement processing such as spatial frequency enhancement.
  • the Rc image signal, Gc image signal, and Bc image signal that have been subjected to image processing for a normal observation image are input to the display control unit 60 as normal observation images.
  • the special observation image generation unit 64 performs image processing for special observation images on the input Rs image signal, Gs image signal, and Bs image signal for one frame.
  • Image processing for the special observation image includes 3×3 matrix processing, gradation conversion processing, color conversion processing such as three-dimensional LUT (Look Up Table) processing, color enhancement processing, and structure enhancement processing such as spatial frequency enhancement.
  • the Rs image signal, Gs image signal, and Bs image signal that have been subjected to image processing for special observation images are input to the display control unit 60 as special observation images.
  • the diagnosis support information processing unit 66 performs the same image processing for normal observation images as described above on the input R1 image signal, G1 image signal, and B1 image signal for one frame.
  • The R1 image signal, G1 image signal, and B1 image signal that have undergone image processing for the normal observation image are used as the observation image.
  • The diagnosis support information processing unit 66 performs recognition processing on the input R2 image signal, G2 image signal, and B2 image signal for a specific frame. When a lesion candidate region is recognized, a diagnosis support information image displaying the diagnostic support information about the lesion candidate region is generated.
  • the display control unit 60 displays the observation image or the diagnosis support information image on the display 18.
  • the display control unit 60 controls to display the image output from the image processing unit 58 on the display 18. Specifically, the display control unit 60 converts a normal observation image, a special observation image, an observation image, or a diagnosis support information image into a video signal that can be displayed in full color on the display 18. The converted video signal is input to the display 18. As a result, the display 18 displays a normal observation image, a special observation image, an observation image, or a diagnosis support information image.
  • the display control unit 60 performs the following display control.
  • the first emission pattern is the first A emission pattern and the second emission pattern is the second B emission pattern (the number of frames in the second illumination period: the same, the emission spectrum of the second illumination light: different)
  • When the observation target is illuminated with the first illumination light for two frames and with the second illumination light for one frame, as shown in FIG. 11, an observation image is obtained by performing image processing for the normal observation image on the first image signal obtained under the illumination of the first illumination light.
  • the observation image is displayed on the display 18 as it is.
  • the presence or absence of the lesion candidate region DR is detected by performing recognition processing on the second image signal obtained by the emission of the second illumination light.
  • the lesion candidate area is, for example, a lesion such as cancer, or a benign polyp.
  • the observation image of the immediately preceding frame is displayed on the display 18 when the second illumination light is emitted.
  • during the period in which a lesion candidate region is detected, a diagnosis support information image, in which the diagnosis support information is displayed on the observation image, is shown on the display 18 both while the first illumination light is emitted and while the second illumination light is emitted. The details of the display control of the diagnosis support information image will be described later.
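As a rough illustration of the frame scheduling described above, the sketch below (Python; all names are ours, the patent prescribes no code) holds the most recent first-illumination observation image on screen while second-illumination frames, which feed only the recognition processing, are being captured:

```python
def frames_to_display(illumination_pattern, captured_images):
    """For each captured frame, decide which image the display shows.

    illumination_pattern: sequence of 1 (first illumination) or 2 (second).
    captured_images: the image captured under each pattern entry.
    During second-illumination frames the observation image of the
    immediately preceding first-illumination frame is repeated."""
    shown, last_observation = [], None
    for light, image in zip(illumination_pattern, captured_images):
        if light == 1:                  # first illumination: show this frame
            last_observation = image
        shown.append(last_observation)  # second illumination: hold last one
    return shown
```

With the first-A/second-B pattern (two first-illumination frames, then one second-illumination frame), each second-illumination slot simply repeats the previous observation image.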
  • the diagnosis support information image is generated in the diagnosis support information processing unit 66.
  • the diagnosis support information processing unit 66 includes a recognition processing unit 70, a motion acquisition unit 72, and a diagnosis support information image generation unit 74.
  • the recognition processing unit 70 performs recognition processing based on the second image signal obtained when the second illumination light is emitted.
  • the recognition process is preferably a process for recognizing and detecting a lesion candidate region.
  • the recognition processing unit 70 uses a learning model that has been machine-trained on lesion candidate regions; when a second image signal is input to the learning model, the model outputs the presence or absence of a lesion candidate region.
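The interface of such a recognition step can be sketched as follows (Python/NumPy; the function names and the toy threshold "model" are hypothetical stand-ins, not the patent's trained model):

```python
import numpy as np

def recognize_lesion_candidates(second_image, model):
    """Run a learned model on a second-illumination image signal and return
    (present, regions): a presence flag plus any candidate bounding boxes."""
    regions = model(second_image)        # list of (x, y, w, h) boxes
    return (len(regions) > 0, regions)

def toy_model(image, thresh=200):
    """Stand-in 'learning model': boxes any pixels above a brightness threshold."""
    ys, xs = np.where(image >= thresh)
    if len(xs) == 0:
        return []
    return [(xs.min(), ys.min(), xs.max() - xs.min() + 1, ys.max() - ys.min() + 1)]
```

In practice the model would be a trained detector; only the presence output and the candidate regions matter to the downstream display control.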
  • the motion acquisition unit 72 acquires the motion of the observation target in the image.
  • the movement of the observation target is used to control the display or non-display of the first diagnosis support information, as will be described later.
  • the movement of the observation target includes movement caused by motion of the observation target itself, such as body movement, and movement caused by motion of the tip portion 12d of the endoscope.
  • the motion acquisition unit 72 calculates the motion of the observation target based on the first image signal or the second image signal of a plurality of frames obtained within a certain period of time.
  • it is preferable that a movement vector of the observation target is calculated from the first or second image signals of a plurality of frames obtained within a certain time, and that the movement of the observation target is calculated based on the calculated movement vector.
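One conventional way to obtain such a movement vector, consistent with but not prescribed by the text, is global phase correlation between consecutive frames; the magnitude of the estimated shift then serves as the "movement of the observation target". A minimal NumPy sketch (hypothetical function names):

```python
import numpy as np

def estimate_shift(prev_frame, curr_frame):
    """Estimate the global (dy, dx) shift between two frames by phase
    correlation. Sign convention: a circular shift of the scene appears
    as a peak at minus that shift, so only the magnitude is used below."""
    f0 = np.fft.fft2(prev_frame)
    f1 = np.fft.fft2(curr_frame)
    cross = f0 * np.conj(f1)
    cross /= np.abs(cross) + 1e-9          # normalize to unit magnitude
    corr = np.fft.ifft2(cross).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    h, w = corr.shape
    if dy > h // 2:                        # wrap to signed offsets
        dy -= h
    if dx > w // 2:
        dx -= w
    return dy, dx

def movement_magnitude(frames):
    """Average motion-vector magnitude over consecutive frames in a window,
    usable as the movement measure acquired by a unit like unit 72."""
    mags = [np.hypot(*estimate_shift(a, b)) for a, b in zip(frames, frames[1:])]
    return float(np.mean(mags))
```

Block matching or dense optical flow would serve equally well; the display control only needs a scalar movement measure to compare against thresholds.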
  • when a lesion candidate area is recognized by the recognition process, the diagnosis support information image generation unit 74 generates a diagnosis support information image that displays the diagnosis support information regarding the lesion candidate area on the observation image.
  • the diagnosis support information includes first diagnosis support information, which is displayed or hidden on the display 18 depending on the movement of the observation target, and second diagnosis support information, which remains displayed on the display 18 regardless of the movement of the observation target.
  • the first diagnosis support information 76 is preferably character information about the lesion candidate region, such as the probability that it is a lesion, i.e., information that can easily affect the user's visibility of the observation image.
  • the second diagnosis support information 78 is preferably a bounding box or the like indicating the position information of the lesion candidate region DR, and is information that does not easily affect the user's visibility of the observed image.
  • the display control unit 60 controls the display or non-display of at least the first diagnosis support information among the diagnosis support information in the diagnosis support information image according to the movement of the observation target. Specifically, it is preferable that the display control unit 60 performs different control in the first display control, used for detailed observation (differentiation, etc.) of the lesion candidate area, and in the second display control, used for screening for lesion candidate areas. The display or non-display of the second diagnosis support information may also be controlled.
  • the first display control displays the first diagnosis support information 76 when the movement of the observation target is less than the first threshold value, and hides the first diagnosis support information 76 when the movement of the observation target is equal to or more than the first threshold value.
  • during detailed observation, the tip portion 12d of the endoscope, which has the image pickup sensor 44, is temporarily stopped, so the movement of the observation target is relatively small. Therefore, when the movement of the observation target is small enough for detailed observation, i.e., less than the first threshold value, the first diagnosis support information 76 is displayed.
  • conversely, when the movement of the observation target is equal to or more than the first threshold value, the display of the first diagnosis support information 76 may hinder visibility, so the first diagnosis support information 76 is hidden. Even when the first diagnosis support information 76 is hidden in the first display control, it is preferable to maintain the display of the second diagnosis support information.
  • the second display control displays the first diagnosis support information 76 when the movement of the observation target is equal to or more than the second threshold value, and hides the first diagnosis support information 76 when the movement of the observation target is less than the second threshold value.
  • during screening, the movement of the observation target is large, so the first diagnosis support information 76 is displayed when the movement is equal to or more than the second threshold value.
  • conversely, when the movement of the observation target is less than the second threshold value, the display of the first diagnosis support information 76 may hinder visibility, so the first diagnosis support information 76 is hidden. Even when the first diagnosis support information 76 is hidden in the second display control, it is preferable to maintain the display of the second diagnosis support information.
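The two symmetric rules reduce to a pair of predicates. A minimal Python sketch (function and parameter names are ours, not the patent's) returning whether the first diagnosis support information 76 should be shown; the second diagnosis support information stays visible in both modes:

```python
def first_display_control(movement, first_threshold):
    """Detailed observation: show the first diagnosis support information
    only while the observation target is nearly still."""
    return movement < first_threshold

def second_display_control(movement, second_threshold):
    """Screening: show the first diagnosis support information only while
    the observation target is moving quickly."""
    return movement >= second_threshold
```

Note the two controls invert the comparison: what is distracting during a still, detailed look is harmless (and useful) while sweeping quickly through the organ.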
  • switching between the first display control and the second display control may be performed manually using the user interface 19. Alternatively, the switching between the first display control and the second display control may be performed automatically by the display control switching unit 75 (see FIG. 12). As shown in FIG. 16, when the movement of the observation target is less than the switching threshold value, the display control switching unit 75 determines that detailed observation, in which the movement of the observation target is small, is being performed, and switches to the first display control. On the other hand, when the movement of the observation target is equal to or more than the switching threshold value, it determines that screening, in which the movement of the observation target is large, is being performed, and switches to the second display control. The display control unit 60 then controls the display of the diagnosis support information according to the automatically selected first display control or second display control.
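The automatic selection step itself is a single comparison against the switching threshold; a sketch (Python, hypothetical names, standing in for a unit like the display control switching unit 75):

```python
def select_display_control(movement, switching_threshold):
    """Small movement implies detailed observation -> first display control;
    large movement implies screening -> second display control."""
    return "first" if movement < switching_threshold else "second"
```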
  • when the emission spectrum of the second illumination light differs in each second illumination period, as when the second emission pattern of the second illumination light is the second-B emission pattern or the second-D emission pattern, it is preferable that the display control unit 60 varies the display of the second diagnosis support information, whose display is maintained regardless of the movement of the observation target, according to the emission spectrum of the second illumination light.
  • for example, for the lesion candidate region DRA detected with the second illumination light of emission spectrum A and the lesion candidate region DRB detected with the second illumination light of emission spectrum B, which differs from emission spectrum A, the second diagnosis support information 78a is displayed as a "blue" bounding box and the second diagnosis support information 78b as a "green" bounding box. That is, "blue" indicates detection with the second illumination light of emission spectrum A, and "green" indicates detection with the second illumination light of emission spectrum B. This makes it possible to visually grasp which lesion candidate region, DRA or DRB, was detected with the second illumination light of which emission spectrum.
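This color coding amounts to a lookup from spectrum label to box color; a Python sketch (the table and names are hypothetical illustrations of the blue/green example above):

```python
# Hypothetical spectrum-to-color table for the second diagnosis support
# information: blue boxes for spectrum A detections, green for spectrum B.
SPECTRUM_BOX_COLOR = {"A": "blue", "B": "green"}

def color_coded_boxes(detections):
    """detections: iterable of (bounding_box, spectrum_label) pairs.
    Returns each box paired with the display color of its spectrum."""
    return [(box, SPECTRUM_BOX_COLOR[spectrum]) for box, spectrum in detections]
```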
  • when the mode switching SW12f is operated to switch to the diagnostic support mode, the first illumination light and the second illumination light, which have different emission spectra, are automatically switched and emitted.
  • the image pickup sensor 44 takes an image of the observation target illuminated by the first illumination light and outputs the first image signal. Further, the observation target illuminated by the second illumination light is imaged, and the second image signal is output.
  • the recognition processing unit 70 performs recognition processing on the second image signal.
  • the motion acquisition unit 72 acquires the motion of the observation target in the image.
  • when the lesion candidate region is recognized by the recognition process, the diagnosis support information image generation unit 74 generates a diagnosis support information image that displays the diagnosis support information regarding the lesion candidate region on the observation image based on the first image signal.
  • the display control unit 60 controls the display or non-display of at least the first diagnosis support information among the diagnosis support information according to the movement of the observation target in the diagnosis support information image. The above series of processes is repeated while the diagnosis support mode continues.
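Putting the pieces together, one pass of the repeated loop described above might look like the following Python sketch (all names are hypothetical; the model callable and the movement value stand in for the recognition processing unit 70 and the motion acquisition unit 72, under first display control):

```python
def diagnosis_support_step(first_signal, second_signal, movement, model,
                           first_threshold):
    """One iteration of the diagnostic support mode: recognition on the
    second image signal, then a display decision driven by the acquired
    movement of the observation target."""
    regions = model(second_signal)              # recognition processing
    return {
        "observation_image": first_signal,      # image shown on the display
        "second_support_info": regions,         # bounding boxes: always shown
        "show_first_support_info": bool(regions) and movement < first_threshold,
    }
```

The same skeleton accommodates the second display control or automatic switching by replacing the comparison in the last line.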
  • in the above embodiment, the hardware structure of the processing units that execute various processes, such as the generation unit 64, the diagnosis support information processing unit 66, the central control unit 68, the recognition processing unit 70, the motion acquisition unit 72, the diagnosis support information image generation unit 74, and the display control switching unit 75, is any of the following various processors.
  • the various processors include a CPU (Central Processing Unit), which is a general-purpose processor that executes software (programs) to function as various processing units; a programmable logic device (PLD) such as an FPGA (Field Programmable Gate Array), which is a processor whose circuit configuration can be changed after manufacture; and a dedicated electric circuit, which is a processor having a circuit configuration designed exclusively for executing specific processes.
  • one processing unit may be composed of one of these various processors, or may be composed of a combination of two or more processors of the same type or different types (for example, a plurality of FPGAs, or a combination of a CPU and an FPGA). Further, a plurality of processing units may be configured by one processor. As a first example of configuring a plurality of processing units with one processor, as typified by computers such as clients and servers, one processor may be configured by a combination of one or more CPUs and software, and this processor may function as a plurality of processing units.
  • as a second example, as typified by a system on chip (SoC), a processor that realizes the functions of the entire system, including the plurality of processing units, with a single IC chip may be used.
  • in this way, the various processing units are configured by using one or more of the above-mentioned various processors as a hardware structure.
  • more specifically, the hardware structure of these various processors is an electric circuit (circuitry) in which circuit elements such as semiconductor elements are combined.
  • the hardware structure of the storage unit is a storage device such as an HDD (hard disk drive) or SSD (solid state drive).

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Surgery (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Biomedical Technology (AREA)
  • Optics & Photonics (AREA)
  • Pathology (AREA)
  • Radiology & Medical Imaging (AREA)
  • Biophysics (AREA)
  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Endoscopes (AREA)

Abstract

The invention relates to an endoscope system and an operating method therefor that, when a lesion candidate region is detected, prevent diagnosis support information concerning the lesion candidate region from obstructing the visibility of the observation target, even in situations where the movement of the observation target in the image changes. When a lesion candidate region (DR) is recognized by recognition processing, a diagnosis support information image is generated that displays diagnosis support information concerning the lesion candidate region (DR) on an observation image based on a first image signal. Display or non-display of at least first diagnosis support information (76) among the diagnosis support information is controlled according to the movement of the observation target in the diagnosis support information image.
PCT/JP2021/008740 2020-07-03 2021-03-05 Endoscope system and method for operating such a system WO2022004056A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2022533677A JPWO2022004056A1 (fr) 2020-07-03 2021-03-05

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2020-115695 2020-07-03
JP2020115695 2020-07-03

Publications (1)

Publication Number Publication Date
WO2022004056A1 true WO2022004056A1 (fr) 2022-01-06

Family

ID=79315699

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/008740 WO2022004056A1 (fr) 2020-07-03 2021-03-05 Endoscope system and method for operating such a system

Country Status (2)

Country Link
JP (1) JPWO2022004056A1 (fr)
WO (1) WO2022004056A1 (fr)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2016199273A1 * 2015-06-11 2016-12-15 オリンパス株式会社 Endoscope device and method for operating endoscope device
WO2020012563A1 * 2018-07-10 2020-01-16 オリンパス株式会社 Endoscope device, processing device, and processing method


Also Published As

Publication number Publication date
JPWO2022004056A1 (fr) 2022-01-06

Similar Documents

Publication Publication Date Title
JP7335399B2 (ja) Medical image processing device, endoscope system, and operation method of medical image processing device
JPWO2019163540A1 (ja) Endoscope system
US20230027950A1 (en) Medical image processing apparatus, endoscope system, method of operating medical image processing apparatus, and non-transitory computer readable medium
WO2020039929A1 (fr) Medical image processing device, endoscope system, and method for operating a medical image processing device
WO2020054255A1 (fr) Endoscope device, endoscope processor, and method for operating endoscope device
WO2022014077A1 (fr) Endoscope system and operating method therefor
WO2022018894A1 (fr) Endoscope system and method for operating such a system
JP7047122B2 (ja) Medical image processing device, endoscope system, and operation method of medical image processing device
US20230101620A1 (en) Medical image processing apparatus, endoscope system, method of operating medical image processing apparatus, and non-transitory computer readable medium
US20230029239A1 (en) Medical image processing system and method for operating medical image processing system
WO2022004056A1 (fr) Endoscope system and method for operating such a system
WO2021006121A1 (fr) Image processing device, endoscope system, and method for operating image processing device
JP7214886B2 (ja) Image processing device and operation method thereof
US11969152B2 (en) Medical image processing system
WO2020217883A1 (fr) Image processing device and associated operating method
WO2021205777A1 (fr) Processor device and method for operating same
JP7411515B2 (ja) Endoscope system and operation method thereof
WO2021149357A1 (fr) Endoscope system and operating method therefor
WO2022209390A1 (fr) Endoscopy system and method for operating such a system
WO2021106452A1 (fr) Endoscope system and associated operating method
WO2023007896A1 (fr) Endoscope system, processing device, and associated operating method
WO2021229900A1 (fr) Endoscope system and operating method therefor
JP7076535B2 (ja) Endoscope device, operating method of endoscope device, and program
CN115956870A (zh) Endoscope system and working method thereof
JPWO2020171012A1 (ja) Endoscope system

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21832766

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2022533677

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21832766

Country of ref document: EP

Kind code of ref document: A1