WO2023007896A1 - Endoscope system, processor device, and method of operating same - Google Patents

Endoscope system, processor device, and method of operating same

Info

Publication number
WO2023007896A1
WO2023007896A1 (PCT application PCT/JP2022/019386)
Authority
WO
WIPO (PCT)
Prior art keywords: image, interest, section, frame rate, display
Application number
PCT/JP2022/019386
Other languages: English (en), Japanese (ja)
Inventor
Koji Shimomura (下村 浩司)
Original Assignee
FUJIFILM Corporation (富士フイルム株式会社)
Application filed by FUJIFILM Corporation (富士フイルム株式会社)
Publication of WO2023007896A1

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 1/00: Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; illuminating arrangements therefor
    • A61B 1/04: Instruments combined with photographic or television appliances
    • A61B 1/045: Control thereof
    • A61B 1/06: Instruments with illuminating arrangements

Definitions

  • the present invention relates to an endoscope system, a processor device, and an operating method thereof that acquire multiple types of images and display the different image types in separate sections of a display image.
  • in endoscopy, the subject is illuminated with multiple illumination lights having different wavelength ranges depending on the purpose of diagnosis.
  • for example, normal light, with which the mucous membrane of the digestive tract can be observed in natural colors; short-wavelength light, which penetrates only to the superficial layer and is used to obtain surface-layer information such as superficial blood vessels; and medium-wavelength light, which has a depth of penetration reaching the deep layer and is used to obtain deep-layer information such as deep blood vessels, are switched for illumination. A normal light image is acquired under the normal-light illumination, and, as special light images, a surface-layer image is acquired under the short-wavelength illumination and a deep-layer image under the medium-wavelength illumination.
  • there is also known a technique in which a first illumination light that emphasizes superficial blood vessels and a second illumination light that emphasizes deep blood vessels are switched, and a first special observation image obtained under the first illumination light and a second special observation image obtained under the second illumination light are displayed by switching between them sequentially (Patent Document 1).
  • the present invention provides an endoscope system, a processor device, and a method of operating the same that enable smooth observation of an image of particular interest when a plurality of lights are switched for illumination and the plurality of images obtained under each light are displayed.
  • the processor device of the present invention includes a light source processor and an image processing processor.
  • the light source processor performs control to automatically switch between a first illumination period, during which the subject is imaged while illuminated with the first illumination light, and a second illumination period, during which the subject is imaged while illuminated with a second illumination light having a spectrum different from that of the first illumination light.
  • the image processing processor acquires a first illumination light image as a medical image during the first illumination period and a second illumination light image as a medical image during the second illumination period, and generates a display image for displaying the first illumination light image and/or the second illumination light image.
  • the display image has a section of interest and a semi-interest section. The section-of-interest display frame rate, which is the number of medical images displayed in the section of interest per unit time, and the semi-interest-section display frame rate, which is the number of medical images displayed in the semi-interest section per unit time, are both within the range of the overall display frame rate, which is the number of display images displayed per unit time.
  • the section-of-interest display frame rate is higher than the semi-interest-section display frame rate, and the medical image displayed in the section of interest is the first illumination light image or the second illumination light image.
  • preferably, the image processing processor acquires medical images of interest, which are the medical images to be displayed in the section of interest, at a section-of-interest imaging frame rate, which is the number of medical images of interest acquired per unit time, and acquires semi-interest medical images, which are the medical images to be displayed in the semi-interest section, at a semi-interest-section imaging frame rate, which is the number of semi-interest medical images acquired per unit time.
  • preferably, when the section-of-interest display frame rate is higher than the section-of-interest imaging frame rate, the image processing processor uses the medical images of interest to generate interpolated frame images for display in the section of interest. Likewise, when the semi-interest-section display frame rate is higher than the semi-interest-section imaging frame rate, the image processing processor preferably uses the semi-interest medical images to generate interpolated frame images for display in the semi-interest section.
  • the interpolated frame image is preferably generated by an arithmetic mean method using at least two medical images of interest or semi-interest medical images.
  • alternatively, the interpolated frame image is preferably generated by a motion vector method using at least two medical images of interest or semi-interest medical images.
  • alternatively, the interpolated frame image is preferably generated by duplicating a medical image of interest or a semi-interest medical image.
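  The arithmetic mean and simple copy methods above can be sketched in a few lines. A minimal illustration (the patent specifies no implementation; the flat-list frame representation and function names are hypothetical):

```python
def arithmetic_mean_frame(frame_a, frame_b):
    """Arithmetic mean method: pixel-wise average of two captured medical images.

    Frames are modeled as flat lists of pixel intensities for simplicity;
    a real system would operate on full 2-D color images.
    """
    return [(a + b) / 2 for a, b in zip(frame_a, frame_b)]


def duplicate_frame(frame):
    """Simple copy method: reuse a captured image unchanged as the interpolated frame."""
    return list(frame)
```

  The motion vector method would instead estimate per-block displacement between the two frames and synthesize an intermediate frame along those vectors, which is substantially more involved.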
  • preferably, when the section-of-interest display frame rate is lower than the section-of-interest imaging frame rate, the image processing processor selects which medical images of interest to display in the section of interest.
  • likewise, when the semi-interest-section display frame rate is lower than the semi-interest-section imaging frame rate, the image processing processor preferably selects which semi-interest medical images to display in the semi-interest section.
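  One plausible way to perform this selection is to step through the captured sequence at the ratio of the two rates, keeping an evenly spaced subset. A sketch (the even-stepping strategy is an assumption, not taken from the patent):

```python
def select_frames(captured, capture_fps, display_fps):
    """When the display frame rate is lower than the imaging frame rate,
    pick an evenly spaced subset of the captured images to display."""
    if display_fps >= capture_fps:
        return list(captured)  # nothing needs to be dropped
    step = capture_fps / display_fps          # captured frames per displayed frame
    n_out = int(len(captured) * display_fps / capture_fps)
    return [captured[int(i * step)] for i in range(n_out)]
```

  For example, at a 60 fps imaging rate and a 15 fps section display rate, every fourth captured frame is kept.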
  • preferably, the image processing processor detects a region of interest, which is a region to which the user should pay attention, from the medical image of interest and/or the semi-interest medical image, and/or discriminates the detected region of interest.
  • the image processing processor preferably controls the display of the detection result, which is the result of detecting the region of interest, and/or the discrimination result, which is the result of discriminating the region of interest, in the section of interest and/or the sub-section of interest.
  • preferably, the image processing processor performs control to switch the semi-interest section between display and non-display, and performs control to change the sizes of the section of interest and the semi-interest section in the display image.
  • preferably, the endoscope system has a mono-emission mode, in which a medical image is obtained by imaging the subject illuminated with any one of the first illumination light and the at least one type of second illumination light, and a multi-emission mode, in which medical images are obtained by imaging the subject while automatically switching between the first illumination period and the second illumination period.
  • a method of operating a processor device according to the present invention comprises: performing control to automatically switch between a first illumination period for imaging a subject illuminated with first illumination light and a second illumination period for imaging the subject illuminated with second illumination light having a spectrum different from that of the first illumination light; acquiring a first illumination light image as a medical image during the first illumination period; acquiring a second illumination light image as a medical image during the second illumination period; and generating a display image for displaying the first illumination light image and/or the second illumination light image.
  • the display image has a section of interest and a semi-interest section; the section-of-interest display frame rate, which is the number of medical images displayed in the section of interest per unit time, and the semi-interest-section display frame rate, which is the number of medical images displayed in the semi-interest section per unit time, are within the range of the overall display frame rate, and the section-of-interest display frame rate is higher than the semi-interest-section display frame rate.
  • the medical image displayed in the section of interest is the first illumination light image or the second illumination light image.
  • An endoscope system of the present invention includes the above-described processor device, light source device, and endoscope.
  • according to the present invention, an endoscope system, a processor device, and a method of operating the same can be provided that allow smooth observation of an image of particular interest when a plurality of lights are switched for illumination and the plurality of images obtained under each light are switched and displayed.
  • FIG. 1 is an explanatory diagram of a configuration of an endoscope system;
  • FIG. 2 is a block diagram showing functions of the endoscope system;
  • FIG. 3 is a graph showing the spectrum of the first illumination light;
  • FIG. 4 is a graph showing the spectrum SP of the second illumination first spectrum light;
  • FIG. 5 is a graph showing the spectrum SQ of the second illumination second spectrum light;
  • FIG. 6 is a graph showing the spectrum SR of the second illumination third spectrum light;
  • FIG. 7 is a graph showing the spectrum SS of the second illumination fourth spectrum light;
  • FIG. 8 is an explanatory diagram showing a first light emission pattern in the multi-emission mode;
  • FIG. 9 is an explanatory diagram showing a second light emission pattern in the multi-emission mode;
  • FIG. 10 is an explanatory diagram showing a third light emission pattern in the multi-emission mode;
  • FIG. 11 is an image diagram showing an example of a display image;
  • FIG. 12 is a block diagram showing functions of a display control unit;
  • FIG. 13 is an image diagram showing an example of a display image setting screen;
  • FIG. 14 is an image diagram showing a display image in which the section of interest and the semi-interest section have the same size;
  • FIG. 15 is an image diagram showing an example of a section display frame rate setting screen;
  • FIG. 16 is an image diagram showing a display image in which the first illumination light image is displayed in the section of interest and the second illumination light image is displayed in the semi-interest section;
  • FIG. 17 is an image diagram showing an example of a section imaging frame rate setting screen;
  • FIG. 18 is an explanatory diagram showing an arithmetic mean method;
  • FIG. 19 is an explanatory diagram showing a motion vector method;
  • FIG. 20 is an explanatory diagram showing a simple copy method;
  • FIG. 21 is an explanatory diagram showing selection of a medical image of interest or a semi-interest medical image to be displayed;
  • FIG. 22 is a block diagram showing functions of a lesion recognition unit;
  • FIG. 23 is an explanatory diagram showing that a detection result is output when a medical image is input to the classification for detection;
  • FIG. 24 is an explanatory diagram showing that a discrimination result is output when a medical image is input to the classification for discrimination;
  • FIG. 25 is an image diagram showing a display image when a detection result is displayed;
  • FIG. 26 is an image diagram showing a display image when a discrimination result is displayed;
  • FIG. 27 is an image diagram showing a display image when a still image and character information are displayed;
  • FIG. 28 is an explanatory diagram of an example of displaying a display image when an extended processor device is provided;
  • the endoscope system 10 includes an endoscope 12, a light source device 13, a processor device 14, a display 15 and a user interface 16.
  • the endoscope 12 is optically connected to the light source device 13 and electrically connected to the processor device 14 .
  • the endoscope 12 has an insertion section 12a, an operation section 12b, a bending section 12c and a distal end section 12d.
  • the insertion portion 12a is inserted into the body of the subject.
  • the operation portion 12b is provided at the proximal end portion of the insertion portion 12a.
  • the curved portion 12c and the distal end portion 12d are provided on the distal end side of the insertion portion 12a.
  • the bending portion 12c is bent by operating the angle knob 12e of the operation portion 12b.
  • the distal end portion 12d is directed in a desired direction by the bending motion of the bending portion 12c.
  • a forceps channel (not shown) for inserting a treatment tool or the like is provided from the insertion portion 12a to the distal end portion 12d.
  • the treatment instrument is inserted into the forceps channel from the forceps port 12j.
  • An optical system for forming a subject image and an optical system for illuminating the subject with illumination light are provided inside the endoscope 12 .
  • the operation unit 12b is provided with an angle knob 12e, a mode changeover switch 12f, a still image acquisition instruction switch 12h, and a zoom operation unit 12i.
  • the mode changeover switch 12f is used for an observation mode changeover operation.
  • a still image acquisition instruction switch 12h is used to instruct acquisition of a still image of a subject.
  • a zoom operation unit 12i is used for operating the zoom lens 42.
  • the light source device 13 generates illumination light.
  • the display 15 displays medical images.
  • the medical images include a first illumination light image and a second illumination light image that use different illumination light for imaging, and a target medical image and a sub-target medical image that differ in display sections.
  • the user interface 16 has a keyboard, mouse, microphone, tablet, touch pen, and the like, and receives input operations such as function settings.
  • the processor device 14 performs system control of the endoscope system 10 and further performs image processing and the like on image signals transmitted from the endoscope 12 .
  • the light source device 13 includes a light source unit 20 , a light source processor 21 that controls the light source unit 20 , and an optical path coupling unit 22 .
  • the light source unit 20 has a plurality of semiconductor light sources, each of which is turned on or off. When the semiconductor light sources are turned on, the amount of light emitted from each is controlled so as to emit the illumination light for illuminating the subject.
  • the light source unit 20 includes a V-LED (Violet Light Emitting Diode) 20a, a B-LED (Blue Light Emitting Diode) 20b, a G-LED (Green Light Emitting Diode) 20c, and an R-LED (Red Light Emitting Diode) 20d. It has four color LEDs.
  • the light source unit 20 or the light source processor 21 may be built in the endoscope 12 .
  • the light source processor 21 may be incorporated in the processor device 14 .
  • the endoscope system 10 has a mono-emission mode and a multi-emission mode, and the modes are switched via the central control unit 50 by operating the mode switching switch 12f.
  • the mono light emission mode is a mode in which the illumination light of the same spectrum is continuously emitted to illuminate the object to be observed.
  • the multi-light emission mode is a mode in which a plurality of illumination lights with different spectra are emitted while being switched according to a specific pattern to illuminate the subject.
  • the illumination light includes first illumination light and second illumination light having a spectrum different from that of the first illumination light.
  • the first illumination light is normal light (broadband light such as white light) used for screening observation by giving brightness to the entire subject.
  • the second illumination light is a plurality of types of special light used for emphasizing specific structures such as ducts and blood vessels of mucous membranes that are subjects.
  • when the first illumination light is emitted, as shown in FIG. 3, the V-LED 20a emits violet light V with a central wavelength of 405 ± 10 nm and a wavelength range of 380-420 nm.
  • the B-LED 20b generates blue light B with a central wavelength of 450 ⁇ 10 nm and a wavelength range of 420-500 nm.
  • the G-LED 20c generates green light G with a wavelength range of 480-600 nm.
  • the R-LED 20d emits red light R with a central wavelength of 620-630 nm and a wavelength range of 600-650 nm.
  • the second illumination light includes, for example: second illumination first spectrum light, which emphasizes superficial blood vessels; second illumination second spectrum light, which emphasizes extreme-surface blood vessels shallower than the superficial blood vessels; second illumination third spectrum light, for generating an oxygen saturation image using the difference between the absorption coefficients of oxyhemoglobin and deoxyhemoglobin; and second illumination fourth spectrum light, for generating a color-difference-expanded image in which the color differences between a plurality of subject ranges are expanded.
  • when the second illumination first spectrum light is emitted, light of spectrum SP is emitted, in which the violet light V has a higher peak intensity than the blue light B, green light G, and red light R of the other colors.
  • when the second illumination second spectrum light is emitted, light of spectrum SQ is emitted, as shown in FIG. 5.
  • when the second illumination third spectrum light is emitted, light of spectrum SR is emitted, which includes light in a wavelength range where the absorption coefficients of oxyhemoglobin and deoxyhemoglobin differ (for example, blue-green light with a peak wavelength of 470 to 480 nm).
  • when the second illumination fourth spectrum light is emitted, as shown in FIG. 7, light of spectrum SS is emitted, in which the peak intensities of the violet light V and the blue light B are higher than those of the green light G and the red light R.
  • the types of second illumination light are not limited to these.
  • the light source processor 21 independently controls the light amounts of the four colors of violet light V, blue light B, green light G, and red light R, and emits the first illumination light or the second illumination light (for example, the second illumination first spectrum light, second illumination second spectrum light, second illumination third spectrum light, or second illumination fourth spectrum light) by changing these light amounts.
  • illumination light with the same spectrum is continuously emitted for each frame.
  • the display 15 displays the first illumination light image with natural colors.
  • the second illumination light image emphasizing a specific structure is displayed on the display 15 .
  • the term “frame” refers to a unit of period including at least the period from the timing of light emission to the completion of readout of the image signal by the imaging sensor 43 .
  • control is performed to change the light amounts of the violet light V, the blue light B, the green light G, and the red light R for each specific frame F according to a specific light emission pattern.
  • emission patterns are given below.
  • in the first light emission pattern, a pattern is repeated in which two frames of the first illumination light L1 are emitted during the first illumination period Pe1, in which the subject is illuminated with the first illumination light L1, and then one frame of the second illumination light L2 (the second illumination first spectrum light L2SP) is emitted during the second illumination period Pe2, in which the subject is illuminated with the second illumination light L2.
  • arrows indicate the direction in which time advances.
  • in the second light emission pattern, as shown in FIG. 9, two frames of the first illumination light L1 are emitted during the first illumination period Pe1 and one frame of the second illumination light L2 (the second illumination first spectrum light L2SP) is emitted during the second illumination period Pe2; then two frames of the first illumination light L1 are again emitted during the first illumination period Pe1, and one frame of the second illumination light L2 (the second illumination second spectrum light L2SQ) is emitted during the second illumination period Pe2.
  • in other words, the spectrum of the second illumination light emitted after the first illumination light is changed each time.
  • in the third light emission pattern, a pattern is repeated in which one frame of the first illumination light L1 is emitted during the first illumination period Pe1 and four frames of the second illumination light L2 are emitted during the second illumination period Pe2.
  • here, the second illumination light L2 is automatically switched among the second illumination first spectrum light L2SP, the second illumination second spectrum light L2SQ, the second illumination third spectrum light L2SR, and the second illumination fourth spectrum light L2SS, so that second illumination light L2 with a different spectrum is emitted in each frame.
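  The three example emission patterns can be expressed as repeating frame sequences. A sketch (labels such as "L2SP" are shorthand for the second illumination light with spectrum SP, not notation from the patent):

```python
from itertools import cycle, islice

# Repeating units of the three example light emission patterns.
# "L1" = first illumination light; "L2SP".."L2SS" = second illumination
# light with spectra SP, SQ, SR, and SS respectively.
EMISSION_PATTERNS = {
    1: ["L1", "L1", "L2SP"],                      # 2 frames L1, then 1 frame L2SP
    2: ["L1", "L1", "L2SP", "L1", "L1", "L2SQ"],  # L2 spectrum changes each cycle
    3: ["L1", "L2SP", "L2SQ", "L2SR", "L2SS"],    # 1 frame L1, then 4 frames of L2
}


def emission_sequence(pattern, n_frames):
    """Return the illumination label for each of the first n_frames frames."""
    return list(islice(cycle(EMISSION_PATTERNS[pattern]), n_frames))
```

  For instance, `emission_sequence(1, 6)` yields two repetitions of the first pattern's L1, L1, L2SP cycle.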
  • the light source processor 21 adjusts the light amount of each light source and controls light emission in accordance with the imaging frame rate setting and the light emission pattern setting, which will be described later.
  • the light emission pattern is not limited to the above three patterns, and can be arbitrarily set as described later.
  • in either the mono-emission mode or the multi-emission mode, when the user wants to acquire a medical image as a still image, operating the still image acquisition instruction switch 12h sends a signal relating to the still image acquisition instruction to the endoscope 12, the light source device 13, and the processor device 14.
  • the light emitted by each of the LEDs 20a to 20d (see FIG. 2) is incident on the light guide 23 via the optical path coupling section 22, which is composed of mirrors, lenses, and the like.
  • the light guide 23 propagates the light from the optical path coupling portion 22 to the distal end portion 12 d of the endoscope 12 .
  • the distal end portion 12d of the endoscope 12 is provided with an illumination optical system 30a and an imaging optical system 30b.
  • the illumination optical system 30 a has an illumination lens 31 , and the illumination light propagated by the light guide 23 is applied to the subject through the illumination lens 31 .
  • the imaging optical system 30 b has an objective lens 41 and an imaging sensor 43 . Light from a subject irradiated with illumination light enters an imaging sensor 43 via an objective lens 41 and a zoom lens 42 . As a result, an image of the subject is formed on the imaging sensor 43 .
  • the zoom lens 42 is a lens for enlarging a subject, and is moved between the tele end and the wide end by operating the zoom operation section 12i.
  • the imaging sensor 43 is a primary color sensor and includes three types of pixels: B pixels (blue pixels) having blue color filters, G pixels (green pixels) having green color filters, and R pixels (red pixels) having red color filters.
  • the imaging sensor 43 is preferably a CCD (Charge-Coupled Device) or a CMOS (Complementary Metal Oxide Semiconductor).
  • the imaging processor 44 controls the imaging sensor 43 . Specifically, an image signal is output from the imaging sensor 43 by reading the signal of the imaging sensor 43 by the imaging processor 44 . The output image signal is transmitted to the medical image acquisition unit 60 of the processor device 14 .
  • the medical image acquisition unit 60 performs various signal processing on the received color image, such as defect correction processing, offset processing, demosaicing processing, matrix processing, white balance adjustment, gamma conversion processing, and YC conversion processing. It then performs image processing including 3×3 matrix processing, gradation conversion processing, color conversion processing such as three-dimensional LUT (Look Up Table) processing, color enhancement processing, and structure enhancement processing such as spatial frequency enhancement, thereby acquiring a first illumination light image for the first illumination light and a second illumination light image for the second illumination light.
  • image processing including 3 ⁇ 3 matrix processing, gradation conversion processing, color conversion processing such as three-dimensional LUT (Look Up Table) processing, color enhancement processing, and structural enhancement processing such as spatial frequency enhancement.
  • when the second illumination light is the second illumination third spectrum light, the medical image acquisition unit 60 generates an oxygen saturation image as the second illumination light image based on the image signal ratios from the imaging sensor 43. Furthermore, when the second illumination light is the second illumination fourth spectrum light, the medical image acquisition unit 60 generates, as the second illumination light image, a color-difference-expanded image that has undergone color difference expansion processing based on the image signal ratios from the imaging sensor 43.
  • the processor device 14 includes a central control unit 50, a medical image acquisition unit 60, a display control unit 70, an overall display frame rate recognition unit 90, a section display frame rate setting unit 100, a section photographing frame rate setting unit 110, and an interpolation frame image generation unit. 120 and a lesion recognition unit 130 (see FIG. 2).
  • the programs in the program memory are executed by the central control unit 50, which is composed of the light source processor 21 and the image processing processor (not shown), thereby realizing the functions of the medical image acquisition unit 60, the display control unit 70, the overall display frame rate recognition unit 90, the section display frame rate setting unit 100, the section imaging frame rate setting unit 110, the interpolation frame image generation unit 120, and the lesion recognition unit 130.
  • the first illumination light image and/or the second illumination light image acquired by the medical image acquisition unit 60 are transmitted to the display control unit 70.
  • the display control unit 70 generates a display image 71 as shown in FIG. 11 that displays the first illumination light image and/or the second illumination light image, and causes the display 15 to display it.
  • the display image 71 has one section of interest 72 and at least one semi-interest section 73.
  • in the example of FIG. 11, the display image 71 has one section of interest 72 and one sub-section of interest 73.
  • the target section 72 is a section that displays the type of medical image that a user such as a doctor wants to focus on most.
  • the semi-interest section 73 is a section that displays a type of medical image other than the medical image displayed in the interest section.
  • Either the first illumination light image or the second illumination light image can be selected as the medical image to be displayed in the section of interest.
  • the types of medical images to be displayed in the section of interest 72 and the semi-interest section 73 can be set via the display image setting section 80 of the display control unit 70 shown in FIG. 12.
  • the user can set the types of medical images to be displayed in the section of interest 72 and the semi-interest section 73, and the number of semi-interest sections 73, using the tabs 82 on the display image setting screen 81 illustrated in FIG. 13. Only one tab 82 is labeled for clarity.
  • the number of semi-interest sections 73 is one.
  • the type of medical image displayed in the section of interest 72 is the first illumination light image.
  • the type of medical image displayed in the semi-interest section 73 is the second illumination light image, and the second illumination light image in this case is captured using the second illumination first spectrum light.
  • the medical image displayed in the semi-interest section can be switched between display and non-display.
  • the display or non-display setting of the semi-interest medical image to be displayed in the semi-interest section 73 is performed on the display image setting screen 81, for example.
  • here, the display of the semi-interest medical image displayed in the semi-interest section is set to "present". Switching between display and non-display of the semi-interest medical image may also be performed on the display image 71.
  • the sizes of the target section 72 and the semi-target section 73 occupying the display image 71 can be arbitrarily changed.
  • the layout such as the size and arrangement of the target section 72 and the semi-target section 73 may be set by selecting the layout setting button 83 on the display image setting screen 81 .
  • the target section 72 may be larger than the semi-target section 73 (see FIG. 11), or, as shown in FIG. 14, the target section 72 and the semi-target section 73 may be the same size.
  • a layout template may be prepared and selected, or the setting may be performed by direct operation such as pinching in, pinching out, or dragging with two fingers.
  • the frame rate for displaying the target section and the semi-target section can each be set within the range of the overall display frame rate of the display 15 .
  • the overall display frame rate recognition section 90 recognizes the frame rate of the display 15 connected to the processor device 14 as the overall display frame rate, and transmits it to the section display frame rate setting section 100 .
  • the overall display frame rate is the number of display images displayed by the display 15 per unit time.
  • the target section display frame rate and the semi-target section display frame rate may be set within the range of the refresh rate of the display 15 .
  • the refresh rate refers to the number of display images that can be processed for display per unit time.
  • The section display frame rate setting unit 100 sets the section-of-interest display frame rate and the semi-interest-section display frame rate within the range of the overall display frame rate.
  • The section-of-interest display frame rate is the number of medical images displayed in the section of interest per unit time.
  • The semi-interest-section display frame rate is the number of medical images displayed in the semi-interest section per unit time.
  • Normally, the section-of-interest display frame rate is set higher than the semi-interest-section display frame rate. It is also possible to set the semi-interest-section display frame rate higher than the section-of-interest display frame rate, or to set the two to the same value.
  • Hereinafter, "section display frame rate" is used as a collective term when the section-of-interest display frame rate and the semi-interest-section display frame rate need not be distinguished.
  • The section display frame rate is set via the section display frame rate setting screen 101.
  • The overall display frame rate is displayed in the overall display frame rate display field 102.
  • The user sets the section display frame rates by entering numerical values in the section-of-interest display frame rate setting field 103 and the semi-interest-section display frame rate setting field 104, within the range shown in the overall display frame rate display field 102.
  • the overall display frame rate is 60 fps (frames per second).
  • The section-of-interest display frame rate and the semi-interest-section display frame rates of the semi-interest sections indicated by "semi-interest section (1)" and "semi-interest section (2)" are each set so as not to exceed 60 fps.
  • In the illustrated example, the section-of-interest display frame rate is set to 60 fps,
  • and the semi-interest-section display frame rate is set to 15 fps.
  • Input methods for the section-of-interest display frame rate setting field 103 and the semi-interest-section display frame rate setting field 104 include keyboard or voice input by the user, input using the tabs, automatic input of predetermined values, and the like; the input method is not limited to these.
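As a minimal sketch (the function and parameter names are illustrative, not part of the disclosure), the constraint that every section display frame rate must lie within the overall display frame rate could be checked as follows:

```python
def validate_section_display_rates(overall_fps, interest_fps, semi_interest_fps_list):
    """Check section display frame rates against the overall display frame rate.

    Mirrors the rule described above: every section display frame rate must
    lie within the range of the overall display frame rate. Returns a list of
    human-readable problems (empty if the settings are valid).
    """
    sections = [("section of interest", interest_fps)] + [
        (f"semi-interest section ({i + 1})", fps)
        for i, fps in enumerate(semi_interest_fps_list)
    ]
    problems = []
    for name, fps in sections:
        if not 0 < fps <= overall_fps:
            problems.append(f"{name}: {fps} fps is outside 0..{overall_fps} fps")
    return problems

# The example in the text: overall 60 fps, section of interest at 60 fps,
# two semi-interest sections at 15 fps each.
print(validate_section_display_rates(60, 60, [15, 15]))  # -> []
print(validate_section_display_rates(60, 90, [15]))      # -> one violation
```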
  • The display control unit 70 displays medical images (the first illumination light image or the second illumination light image) in the section of interest 72 and the semi-interest section 73 of the display image 71, in accordance with the section-of-interest display frame rate and the semi-interest-section display frame rate set by the section display frame rate setting unit 100.
  • In the illustrated example, the first illumination light image is displayed in the section of interest 72, and the second illumination light image (indicated by diagonal lines) is displayed in the semi-interest section 73.
  • A configuration in which the section-of-interest display frame rate and the semi-interest-section display frame rate can be set independently is particularly effective in the multi-emission mode, in which multiple types of medical images are acquired sequentially.
  • The section imaging frame rate setting unit 110 sets the section-of-interest imaging frame rate and the semi-interest-section imaging frame rate.
  • The overall imaging frame rate is the imaging frame rate determined by the performance of the imaging sensor 43, that is, the number of medical images that can be acquired per unit time. With the medical image displayed in the section of interest called the medical image of interest and the medical image displayed in the semi-interest section called the semi-interest medical image, the section-of-interest imaging frame rate is the number of medical images of interest acquired per unit time, and the semi-interest-section imaging frame rate is the number of semi-interest medical images acquired per unit time.
  • The section-of-interest imaging frame rate can be set higher or lower than the section-of-interest display frame rate.
  • Likewise, the semi-interest-section imaging frame rate can be set higher or lower than the semi-interest-section display frame rate.
  • The sum of the section-of-interest imaging frame rate and the semi-interest-section imaging frame rate equals the overall imaging frame rate.
  • Hereinafter, "section imaging frame rate" is used as a collective term when the section-of-interest imaging frame rate and the semi-interest-section imaging frame rate need not be distinguished.
  • The section imaging frame rate is set via the section imaging frame rate setting screen 111.
  • the overall imaging frame rate is displayed in the overall imaging frame rate display field 112 of the section imaging frame rate setting screen 111 illustrated in FIG. 17 .
  • The user sets the section imaging frame rates by entering numerical values in the section-of-interest imaging frame rate setting field 113 and the semi-interest-section imaging frame rate setting field 114, within the range shown in the overall imaging frame rate display field 112.
  • In the illustrated example, the overall imaging frame rate is 60 fps,
  • the section-of-interest imaging frame rate (for the medical image of interest displayed in the section of interest) is set to 30 fps,
  • and the semi-interest-section imaging frame rate of each semi-interest section (for the semi-interest medical image displayed there) is set to 15 fps.
  • The type of illumination light used to acquire the medical image of interest and each semi-interest medical image can be set on the respective tabs 82.
  • In this example, the medical image of interest is the first illumination light image illuminated with the first illumination light, the first semi-interest medical image is the second illumination light image illuminated with the second illumination first spectrum light, and the second semi-interest medical image is the second illumination light image illuminated with the second illumination second spectrum light.
  • The emission pattern of the illumination light used to acquire the medical image of interest and the semi-interest medical images can be set in the emission pattern setting field 115.
  • In the illustrated example, the emission pattern is set to "1, 1, 2, 1, 1, 3", indicating that the lights are emitted sequentially in the repeating pattern "first illumination light, first illumination light, second illumination first spectrum light, first illumination light, first illumination light, second illumination second spectrum light".
  • the setting of the light emission pattern set on the section imaging frame rate setting screen 111 is transmitted to the light source processor 21 via the central control unit 50 .
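The mapping from an emission pattern to per-illumination imaging frame rates can be sketched as follows. This is a minimal illustration in Python; the 4-frame pattern below is hypothetical, chosen to match the 30 fps / 15 fps / 15 fps example above, and is not the "1, 1, 2, 1, 1, 3" pattern of the figure.

```python
from collections import Counter

def rates_from_emission_pattern(pattern, overall_fps):
    """Derive per-illumination acquisition frame rates from a cyclic emission
    pattern, given the overall imaging frame rate. Each entry in `pattern`
    names the illumination used for one imaging frame, so the per-illumination
    rates always sum to the overall imaging frame rate, as stated in the text.
    """
    cycle = len(pattern)
    return {light: overall_fps * n / cycle for light, n in Counter(pattern).items()}

# Hypothetical 4-frame pattern: first illumination light twice per cycle,
# each second-illumination spectrum light once.
rates = rates_from_emission_pattern([1, 2, 1, 3], 60)
print(rates)  # {1: 30.0, 2: 15.0, 3: 15.0}
```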
  • Table 1 shows specific examples of the overall display frame rate, section display frame rate, overall imaging frame rate, and section imaging frame rate.
  • "whole” in the lower part of “display frame rate” is “whole display frame rate”
  • "attention section” is “attention section display frame rate”
  • “semi-attention section” is “semi-attention section display frame rate”.
  • “whole” in the lower part of “imaging frame rate” indicates “whole imaging frame rate”
  • interested section indicates “interested section imaging frame rate”
  • “semi-interested section” indicates “semi-interested section imaging frame rate”.
  • the unit of the column in which only numerical values are written is fps.
  • In Table 1, the medical image of interest is the first illumination light image,
  • and the semi-interest medical image is the second illumination light image.
  • Example 1 in Table 1 is an example in which the overall display frame rate (60 fps) is higher than the overall imaging frame rate (45 fps), and the section-of-interest imaging frame rate (30 fps) is higher than the semi-interest-section imaging frame rate (15 fps).
  • Example 2 in Table 1 is an example in which the overall display frame rate (60 fps) is higher than the overall imaging frame rate (45 fps), and the section-of-interest imaging frame rate (15 fps) is lower than the semi-interest-section imaging frame rate (30 fps).
  • "15 fps × 2" in the remarks column for the semi-interest section indicates that there are two types of second illumination light and that a second illumination light image is acquired at 15 fps for each type.
  • The second illumination light images captured at the semi-interest-section imaging frame rate of 15 fps are displayed in the two semi-interest sections: in one semi-interest section, the second illumination light image captured with the second illumination first spectrum light is displayed, and in the other semi-interest section, whose semi-interest-section display frame rate is likewise 15 fps, the second illumination light image captured with the second illumination second spectrum light is displayed.
  • Example 3 in Table 1 is an example in which the overall display frame rate (60 fps) is lower than the overall imaging frame rate (90 fps), and the section-of-interest imaging frame rate (60 fps) is higher than the semi-interest-section imaging frame rate (30 fps).
  • "15 fps × 2" in the remarks column for the semi-interest section indicates that there are two types of second illumination light and that a second illumination light image is acquired at 15 fps for each type.
  • The relationship between the section imaging frame rate and the section display frame rate of the semi-interest medical images is the same as in Example 2.
  • Example 4 in Table 1 is an example in which the overall display frame rate (60 fps) is lower than the overall imaging frame rate (90 fps), and the section-of-interest imaging frame rate (30 fps) is lower than the semi-interest-section imaging frame rate (60 fps).
  • "15 fps × 4" in the remarks column for the semi-interest section indicates that there are four types of second illumination light and that a second illumination light image is acquired at 15 fps for each type.
  • The relationship between the section imaging frame rate and the section display frame rate of the semi-interest medical images is the same as in Example 2, in which second illumination light images captured with two types of second illumination light are displayed in two semi-interest sections.
  • Example 5 in Table 1 is an example in which the semi-interest-section display frame rate (45 fps) is higher than the semi-interest-section imaging frame rate (15 fps); this is a case in which interpolated frame images, described later, are required for the semi-interest section.
  • Example 6 in Table 1 is an example in which the section-of-interest display frame rate (60 fps) is lower than the section-of-interest imaging frame rate (90 fps); this is a case in which the medical images of interest to be displayed in the section of interest must be selected, as described later.
  • In this way, the image of particular interest is assigned to the section of interest, and the section-of-interest display frame rate is set higher than the semi-interest-section display frame rate, so that the image of particular interest can be observed smoothly.
  • The interpolation frame image generation unit 120 preferably generates interpolated frame images to be displayed in the section of interest using the medical images of interest. Specifically, as in Examples 1, 2, 4, and 5, an interpolated frame image is generated when the section-of-interest display frame rate is higher than the section-of-interest imaging frame rate, that is, when the captured medical images of interest alone cannot satisfy the section-of-interest display frame rate.
  • Likewise, the interpolation frame image generation unit 120 preferably generates interpolated frame images to be displayed in the semi-interest section using the semi-interest medical images.
  • An interpolated frame image is generated when the semi-interest-section display frame rate is higher than the semi-interest-section imaging frame rate, that is, when the captured semi-interest medical images alone cannot satisfy the semi-interest-section display frame rate.
  • By generating and displaying interpolated frame images whenever a section display frame rate is higher than the corresponding section imaging frame rate, the image of particular interest assigned to the section of interest is supplemented with interpolating images, and smooth observation becomes possible.
  • When the section-of-interest display frame rate does not exceed the section-of-interest imaging frame rate, no interpolated frame image is generated, and the captured medical images of interest are displayed sequentially in the section of interest.
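The case analysis above — interpolate when the display rate exceeds the imaging rate, select frames when the imaging rate exceeds the display rate, pass frames through otherwise — can be summarized in a short sketch (illustrative names):

```python
def section_strategy(display_fps, imaging_fps):
    """Decide how a section's display frames are produced, per the rules
    above: interpolated frames when the display rate exceeds the imaging
    rate, frame selection when imaging exceeds display, pass-through when
    the two rates are equal.
    """
    if display_fps > imaging_fps:
        return "interpolate"   # e.g. Example 1: display 60 fps, imaging 30 fps
    if display_fps < imaging_fps:
        return "select"        # e.g. Example 6: display 60 fps, imaging 90 fps
    return "pass-through"

print(section_strategy(60, 30))  # interpolate
print(section_strategy(60, 90))  # select
print(section_strategy(15, 15))  # pass-through
```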
  • The interpolated frame image is preferably generated by an arithmetic-mean method using at least two medical images of interest or semi-interest medical images.
  • The arithmetic-mean method generates an interpolated frame image by superimposing multiple frames of previously captured medical images of interest or semi-interest medical images.
  • The arithmetic-mean method will be described using a specific example (FIG. 18) in which the medical image of interest is the first illumination light image.
  • In one cycle of the emission pattern Pa1, two frames of the medical image of interest 121 are captured during the first illumination period in which the first illumination light L1 is emitted (imaging frames 1 and 2 in FIG. 18),
  • and one frame of the semi-interest medical image 122 is captured during the second illumination period in which the second illumination light L2 is emitted (imaging frame 3 in FIG. 18).
  • The interpolation frame image generation unit 120 superimposes imaging frames 1 and 2 to generate an interpolated frame image 123 (interpolation frame "1+2" in FIG. 18).
  • The display frames are shown in the section of interest in the order: imaging frame 1, interpolation frame "1+2", imaging frame 2, and so on.
  • In the next cycle of the emission pattern, imaging frames 4 and 5 are acquired as medical images of interest 121,
  • imaging frame 6 is acquired as the semi-interest medical image 122,
  • and an interpolated frame image 123 is likewise generated as interpolation frame "2+4".
  • Here, for simplicity, one interpolated frame image 123 is generated from two frames of the medical image of interest 121; an interpolated frame image 123 may also be generated from three or more frames.
  • When further interpolated frame images are needed, they are generated by the same procedure.
  • When the interpolated frame image 123 is generated by superimposing the medical images of interest 121 by the arithmetic-mean method, it may be generated by a simple averaging method in which the multiple medical images of interest 121 are superimposed at the same ratio, or by a weighted-averaging method in which the images are weighted before superimposition. Similarly, as in Example 5 of Table 1, when the interpolated frame image 123 is generated from semi-interest medical images, it is generated in the same manner.
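A minimal sketch of the arithmetic-mean method, covering both the simple-average and weighted-average variants; frames are modeled as flat lists of pixel values for brevity, and the names are illustrative:

```python
def mean_interpolated_frame(frame_a, frame_b, weight_a=0.5):
    """Generate one interpolated frame from two captured frames by the
    arithmetic-mean method. weight_a = 0.5 gives the simple averaging
    method (both frames superimposed at the same ratio); other weights
    give the weighted-averaging variant described above.
    """
    weight_b = 1.0 - weight_a
    return [weight_a * a + weight_b * b for a, b in zip(frame_a, frame_b)]

frame1 = [10, 20, 30]   # imaging frame 1 (toy pixel values)
frame2 = [30, 40, 50]   # imaging frame 2
print(mean_interpolated_frame(frame1, frame2))                 # [20.0, 30.0, 40.0]
print(mean_interpolated_frame(frame1, frame2, weight_a=0.75))  # [15.0, 25.0, 35.0]
```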
  • The interpolated frame image may also preferably be generated by a motion vector method using at least two medical images of interest or semi-interest medical images.
  • The motion vector method generates interpolated frame images by superimposing the medical images of interest or semi-interest medical images while taking into account the motion vector of the subject between multiple previously captured frames.
  • When generating an interpolated frame image by the motion vector method, the interpolation frame image generation unit 120 first analyzes the image signals of multiple medical images of interest or semi-interest medical images and, for example by pattern matching, searches the preceding and following frames for pixels corresponding to the same observation site; the spatial distance (movement amount) and direction (movement direction) of these pixels are calculated as a motion vector. Each frame is then weighted according to the motion vector, and the medical images of interest or semi-interest medical images are superimposed.
  • As shown in FIG. 19, when three medical images of interest 124, 125, and 126 acquired in chronological order are superimposed, a motion vector 127a of the subject S between the first-frame medical image of interest 124 and the second-frame medical image of interest 125, and a motion vector 127b of the subject S between the second-frame medical image of interest 125 and the third-frame medical image of interest 126, are calculated.
  • Since the motion vector 127b is larger than the motion vector 127a, the image of the third frame is given a relatively large weight, and the medical images of interest 124, 125, and 126 are then superimposed to generate the interpolated frame image 128.
  • In the interpolated frame image 128, for example, a superimposed image 128a (solid line) of the subject S calculated from the second-frame medical image of interest 125 and the third-frame medical image of interest 126 appears, and an afterimage 128b (dotted line) of the subject S may also appear.
  • The generated interpolated frame image 128 is displayed inserted, for example, between the second-frame medical image of interest 125 and the third-frame medical image of interest 126.
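The pattern-matching step of the motion vector method can be sketched as an exhaustive shift search. A real system would use 2-D block matching or optical flow; this 1-D version, with illustrative names, only shows the principle of finding the shift that best aligns the same observation site between frames:

```python
def estimate_motion(prev_frame, next_frame, max_shift=3):
    """Estimate a 1-D motion vector by exhaustive matching: find the shift
    that minimizes the sum of absolute differences (SAD) between the two
    frames over their overlapping pixels. The sign of the result is the
    movement direction; its magnitude is the movement amount.
    """
    n = len(prev_frame)
    best_shift, best_cost = 0, float("inf")
    for shift in range(-max_shift, max_shift + 1):
        cost = sum(
            abs(prev_frame[i] - next_frame[i + shift])
            for i in range(n)
            if 0 <= i + shift < n
        )
        if cost < best_cost:
            best_shift, best_cost = shift, cost
    return best_shift

# A bright spot (value 9) moves two pixels to the right between frames.
prev = [0, 0, 9, 0, 0, 0, 0]
nxt  = [0, 0, 0, 0, 9, 0, 0]
print(estimate_motion(prev, nxt, max_shift=2))  # 2
```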
  • The interpolated frame image may also preferably be generated by a simple copy method that duplicates a medical image of interest or a semi-interest medical image.
  • The simple copy method generates an interpolated frame image by duplicating at least one previously captured medical image of interest or semi-interest medical image. This allows interpolated frame images to be generated while reducing the image-analysis load on the processor device 14.
  • A specific example of the simple copy method will be described with reference to FIG. 20.
  • In one cycle of the emission pattern Pa1, two frames of the medical image of interest 121 are captured during the first illumination period in which the first illumination light L1 is emitted,
  • and one frame of the semi-interest medical image 122 is captured during the second illumination period in which the second illumination light L2 is emitted.
  • The interpolation frame image generation unit 120 duplicates the captured frames to generate interpolated frame images 123 (interpolation frames "1", "2", "4", "5", and "7" in FIG. 20).
  • The display frames are shown in the section of interest in the order: imaging frame 1, interpolation frame "1", imaging frame 2, and so on.
  • The method for generating the interpolated frame image includes, but is not limited to, the arithmetic-mean method, the motion vector method, and the simple copy method.
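The simple copy method amounts to inserting duplicates into the display sequence; a minimal sketch with illustrative names:

```python
def simple_copy_schedule(captured, copies_per_frame):
    """Build a display-frame sequence by the simple copy method: each
    captured frame is followed by duplicated (interpolated) copies, so no
    image analysis is needed. Mirrors FIG. 20, where display frames run
    'imaging frame 1, interpolation frame "1", imaging frame 2, ...'.
    """
    schedule = []
    for frame in captured:
        schedule.append(("captured", frame))
        schedule.extend(("copy", frame) for _ in range(copies_per_frame))
    return schedule

print(simple_copy_schedule([1, 2], 1))
# [('captured', 1), ('copy', 1), ('captured', 2), ('copy', 2)]
```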
  • The medical images of interest displayed in the section of interest, or the semi-interest medical images displayed in the semi-interest section, may be selected for display.
  • When the section display frame rate is lower than the section imaging frame rate,
  • the number of acquired medical images of interest or semi-interest medical images exceeds the number of images that can be displayed.
  • In that case, medical images are selected to match the number that can be displayed; the selection of images to display is preferably determined by the difference between the section display frame rate and the section imaging frame rate.
  • In Example 6 of Table 1, the section-of-interest display frame rate is 60 fps and the section-of-interest imaging frame rate is 90 fps.
  • The semi-interest-section display frame rate is 15 fps and the semi-interest-section imaging frame rate is 30 fps (in Example 6 of Table 1, one type of second illumination light is used, and the semi-interest medical image is captured at 30 fps); a selection is made so that one of every two acquired frames is displayed.
  • For example, the display control unit 70 selects one of every two frames of the acquired medical images of interest 121 for display; that is, the medical images of interest 121 are displayed as display frames in the order of imaging frames "1", "3", "5", "7", and so on.
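The selection step can be sketched with an accumulator that spreads the displayed frames evenly across the captured ones. The names are illustrative, and the patent does not prescribe a specific selection algorithm:

```python
def select_display_frames(num_captured, display_fps, imaging_fps):
    """Select which captured frames (0-based indices) to display when the
    section display frame rate is lower than the section imaging frame
    rate. An accumulator keeps display_fps/imaging_fps of the frames,
    spaced as evenly as possible.
    """
    kept, acc = [], 0
    for i in range(num_captured):
        acc += display_fps
        if acc >= imaging_fps:
            acc -= imaging_fps
            kept.append(i)
    return kept

# Keep one of every two frames (the every-other-frame cadence of the text):
print(select_display_frames(8, 15, 30))  # [1, 3, 5, 7]
# Keep two of every three frames for the 60 fps / 90 fps case of Example 6:
print(select_display_frames(6, 60, 90))  # [1, 2, 4, 5]
```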
  • The lesion recognition unit 130 detects a region of interest, which is a region to which the user should pay attention, from the medical image of interest and/or the semi-interest medical image, and further discriminates the region of interest.
  • a region of interest is a site such as a lesion that has a feature amount within a specific range.
  • The feature amount is preferably the shape or color of the subject, or a value obtained from the shape, color, or the like.
  • Items of the feature amount include, for example, blood vessel density, blood vessel shape, number of blood vessel branches, blood vessel thickness, blood vessel length, blood vessel tortuosity, blood vessel invasion depth, duct shape, duct opening shape, duct length, duct tortuosity, and color information.
  • The feature amount is preferably at least one of these, or a value obtained by combining two or more of them. The items of the feature amount are not limited to these and may be added as appropriate according to the usage situation.
  • Detection means discovering a region of interest from a medical image. Discrimination means identifying what properties the region of interest has (for example, neoplastic polyp, non-neoplastic polyp, or inflammation).
  • the lesion recognition unit 130 includes a detection classifier 131 and a discrimination classifier 132, as shown in FIG.
  • The detection classifier 131 receives the medical image 130a, which is a medical image of interest and/or a semi-interest medical image, from the medical image acquisition unit 60, and outputs a detection result 131a for the region of interest.
  • The discrimination classifier 132 receives the medical image 130a, which is a medical image of interest and/or a semi-interest medical image, from the medical image acquisition unit 60, and outputs a discrimination result 132a for the region of interest. Either one or both of detection and discrimination of the region of interest may be performed.
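The routing of a medical image through the two classifiers, with either step optional, can be sketched as follows; the stub classifiers stand in for trained models, and all names are purely illustrative:

```python
from dataclasses import dataclass
from typing import Callable, Optional

@dataclass
class RecognitionResult:
    detection: Optional[tuple]      # e.g. bounding box of the region of interest
    discrimination: Optional[str]   # e.g. "neoplastic" / "non-neoplastic"

def recognize(image, detector: Callable, discriminator: Callable,
              run_detection: bool = True, run_discrimination: bool = True):
    """Route a medical image through the detection and discrimination
    classifiers. Either step can be disabled, matching the note above
    that one or both of detection and discrimination may be performed.
    """
    box = detector(image) if run_detection else None
    label = discriminator(image) if run_discrimination else None
    return RecognitionResult(box, label)

# Stub classifiers standing in for trained learning models:
detector = lambda img: (10, 20, 40, 50)   # fixed toy bounding box
discriminator = lambda img: "neoplastic"
result = recognize("medical image 130a", detector, discriminator)
print(result.detection, result.discrimination)  # (10, 20, 40, 50) neoplastic
```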
  • the detection classifier 131 and the discrimination classifier 132 are preferably learning models that have been trained using teacher image data.
  • the addition of the detection result and discrimination result to the teacher image data may be performed by a skilled doctor, or may be automatically performed by a device other than the detection classifier 131 and the discrimination classifier 132 .
  • Information output by the detection classifier 131, the discrimination classifier 132, or another learning model may be added to the medical image and used as teacher image data for training the detection classifier 131 and the discrimination classifier 132.
  • Machine learning for generating the learning models includes deep learning as well as decision trees, support vector machines, random forests, regression analysis, supervised learning, semi-supervised learning, unsupervised learning, reinforcement learning, deep reinforcement learning, learning using neural networks, generative adversarial networks, and the like.
  • the detection result output by the detection classifier 131 and the discrimination result output by the discrimination classifier 132 are transmitted to the display control unit 70 .
  • The display control unit 70 preferably performs control to display the detection result or the discrimination result in the section of interest and/or the semi-interest section; control may also be performed to display both the detection result and the discrimination result.
  • When the detection result of the region of interest 135 detected from the semi-interest medical image 134 is shown in the section of interest 72 on the display image 71,
  • a frame 137 is displayed around the region of interest 135 included in the medical image of interest 136 displayed in the section of interest 72.
  • In addition, a marker 138 indicating the approximate position of the region of interest within the medical image of interest 136 is displayed.
  • In the illustrated example, the marker 138 indicates that the region of interest 135 is at the lower right.
  • The method of indicating the detection result is not limited to these; a circular or polygonal enclosure, an on-screen warning display, a warning sound, or the like may be used.
  • When the discrimination result obtained by discriminating the region of interest 135 from the semi-interest medical image 134 is shown in the section of interest 72 on the display image 71, a caption 139 indicating the discrimination result is displayed.
  • In the illustrated example, the caption 139 reads "tumor".
  • the marker 138 may be displayed in a different color according to the discrimination result. For example, the marker 138 is displayed in yellow when the discrimination result is neoplastic, and is displayed in green when it is non-neoplastic.
  • The method of indicating the discrimination result is not limited to these; a circular or polygonal enclosure, an on-screen warning display, a warning sound, or the like may be used.
  • The display image 71 may also be provided with a still image section 140 for displaying a still image and a text information section 141 for displaying text information, as shown in FIG.
  • An extended processor device 150 may be provided, and the lesion recognition unit 130 may be provided in the extended processor device 150.
  • The medical image of interest 151 and/or the semi-interest medical image 152 is input to the lesion recognition unit 130 of the extended processor device 150, which outputs the detection result and/or discrimination result for the region of interest 135.
  • The detection result and/or discrimination result is transmitted to the processor device 14 and displayed in the section of interest 72 or the semi-interest section 73.
  • the display image 71 shows the detection result of the region of interest 135 .
  • The present invention is not limited to the endoscope; other medical imaging devices, such as an ultrasonic imaging device or a radiographic imaging device, may be used.
  • a rigid scope or a flexible scope may be used as the endoscope 12 .
  • Part or all of the central control unit 50, the medical image acquisition unit 60, the display control unit 70, the overall display frame rate recognition unit 90, the section display frame rate setting unit 100, the section imaging frame rate setting unit 110, the interpolation frame image generation unit 120, and the lesion recognition unit 130 of the endoscope system 10 can be provided, for example, in an image processing device that communicates with the processor device 14 and cooperates with the endoscope system 10.
  • They can also be provided in a diagnosis support device that acquires images captured by the endoscope 12 directly from the endoscope system 10 or indirectly from a PACS.
  • They can likewise be provided in a medical service support device connected to various inspection devices, including the endoscope system 10, such as a first inspection device, a second inspection device, and so on.
  • In the above embodiments, the hardware structure of the processing units that execute various processes, such as the central control unit 50, the medical image acquisition unit 60, the display control unit 70, the overall display frame rate recognition unit 90, the section display frame rate setting unit 100, the section imaging frame rate setting unit 110, the interpolation frame image generation unit 120, and the lesion recognition unit 130, is one of the following various processors:
  • a CPU (Central Processing Unit), which is a general-purpose processor that executes software (programs) to function as various processing units; a programmable logic device (PLD), such as an FPGA (Field Programmable Gate Array), which is a processor whose circuit configuration can be changed after manufacture; a dedicated electric circuit, which is a processor having a circuit configuration designed exclusively to execute various types of processing; and the like.
  • One processing unit may be configured by one of these various processors, or by a combination of two or more processors of the same or different types (for example, multiple FPGAs or a combination of a CPU and an FPGA).
  • A plurality of processing units may also be configured by one processor.
  • As one example, one processor may be configured by a combination of one or more CPUs and software, with this processor functioning as a plurality of processing units.
  • As another example, typified by a System on Chip (SoC), a processor may be used that realizes the functions of an entire system including a plurality of processing units on a single chip.
  • In this way, the hardware structure of these various processors is, more specifically, an electric circuit combining circuit elements such as semiconductor elements.
  • The hardware structure of the storage unit is a storage device such as an HDD (hard disk drive) or SSD (solid state drive).


Abstract

The invention provides an endoscope system, a processor device, and an operation method therefor with which, when switching among multiple illumination lights and displaying the multiple images obtained under each illumination light, images of particular interest can be observed smoothly. A processor device (14) performs control to automatically switch between first illumination light and second illumination light, acquires a first illumination light image and a second illumination light image, and generates a display image (71) including a section of interest (72) and a semi-interest section (73). The display frame rate for the section of interest is higher than the display frame rate for the semi-interest section, and both the section-of-interest display frame rate and the semi-interest-section display frame rate lie within the range of the overall display frame rate.
PCT/JP2022/019386 2021-07-28 2022-04-28 Endoscope system, processor device, and operation method therefor WO2023007896A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2021123622 2021-07-28
JP2021-123622 2021-07-28

Publications (1)

Publication Number Publication Date
WO2023007896A1 true WO2023007896A1 (fr) 2023-02-02

Family

ID=85086528

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2022/019386 WO2023007896A1 (fr) 2021-07-28 2022-04-28 Endoscope system, processor device, and operation method therefor

Country Status (1)

Country Link
WO (1) WO2023007896A1 (fr)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010063589A (ja) * 2008-09-10 2010-03-25 Fujifilm Corp Endoscope system and drive control method therefor
JP2011188929A (ja) * 2010-03-12 2011-09-29 Olympus Corp Fluorescence endoscope device
WO2020008834A1 (fr) * 2018-07-05 2020-01-09 富士フイルム株式会社 Image processing device, method, and endoscope system
WO2020158165A1 (fr) * 2019-01-30 2020-08-06 富士フイルム株式会社 Endoscope system


Similar Documents

Publication Publication Date Title
JP7542585B2 (ja) Image processing device, endoscope system, and method of operating image processing device
JP7335399B2 (ja) Medical image processing device, endoscope system, and method of operating medical image processing device
WO2020162275A1 (fr) Medical image processing device, endoscope system, and medical image processing method
JP7374280B2 (ja) Endoscope device, endoscope processor, and method of operating endoscope device
WO2020170809A1 (fr) Medical image processing device, endoscope system, and medical image processing method
WO2022014235A1 (fr) Image analysis processing device, endoscopy system, method of operating the image analysis processing device, and program for the image analysis processing device
JP2020065685A (ja) Endoscope system
JP7386347B2 (ja) Endoscope system and method of operating same
JP7402314B2 (ja) Medical image processing system and method of operating medical image processing system
US20230101620A1 (en) Medical image processing apparatus, endoscope system, method of operating medical image processing apparatus, and non-transitory computer readable medium
WO2023007896A1 (fr) Endoscope system, processor device, and method of operating same
JP6731065B2 (ja) Endoscope system and method of operating same
WO2022071413A1 (fr) Image processing device, endoscope system, method of operating the image processing device, and program for the image processing device
JP7556960B2 (ja) Endoscope system and method of operating same
JP7411515B2 (ja) Endoscope system and method of operating same
US20240013392A1 (en) Processor device, medical image processing device, medical image processing system, and endoscope system
JP7524307B2 (ja) Endoscope system, control method, and control program
US20220022739A1 (en) Endoscope control device, method of changing wavelength characteristics of illumination light, and information storage medium
JP7307709B2 (ja) Treatment tool management system, endoscope system, and method of operating same
WO2021176890A1 (fr) Endoscope system, control method, and control program
WO2023058503A1 (fr) Endoscope system, medical image processing device, and method of operating same
WO2022044371A1 (fr) Endoscope system and method of operating same
JP2023178526A (ja) Image processing device, endoscope system, method of operating image processing device, and program for image processing device
JP2023018543A (ja) Endoscope system and method of operating same
JP2023055365A (ja) Image processing device and method of operating same

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
Ref document number: 22848978; Country of ref document: EP; Kind code of ref document: A1

NENP Non-entry into the national phase
Ref country code: DE

122 Ep: pct application non-entry in european phase
Ref document number: 22848978; Country of ref document: EP; Kind code of ref document: A1