CN110381807B - Endoscope system, processor device, and method for operating endoscope system


Info

Publication number
CN110381807B
CN110381807B (application CN201880015393.XA)
Authority
CN
China
Prior art keywords
endoscope
image
control unit
determination
observation target
Prior art date
Legal status
Active
Application number
CN201880015393.XA
Other languages
Chinese (zh)
Other versions
CN110381807A (en)
Inventor
大酒正明
山本拓明
Current Assignee
Fujifilm Corp
Original Assignee
Fujifilm Corp
Priority date
Filing date
Publication date
Application filed by Fujifilm Corp filed Critical Fujifilm Corp
Publication of CN110381807A publication Critical patent/CN110381807A/en
Application granted granted Critical
Publication of CN110381807B publication Critical patent/CN110381807B/en

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/000094 Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope, extracting biological structures
    • A61B1/00006 Operational features of endoscopes characterised by electronic signal processing of control signals
    • A61B1/000096 Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope, using artificial intelligence
    • A61B1/00055 Operational features of endoscopes provided with output arrangements for alerting the user
    • A61B1/0005 Display arrangement combining images e.g. side-by-side, superimposed or tiled
    • A61B1/041 Capsule endoscopes for imaging
    • A61B1/043 Instruments combined with photographic or television appliances, for fluorescence imaging
    • A61B1/0638 Instruments with illuminating arrangements providing two or more wavelengths
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/065 Determining position of the probe employing exclusively positioning means located on or in the probe, e.g. using position sensors arranged on the probe
    • A61B5/1032 Determining colour for diagnostic purposes

Abstract

The invention provides an endoscope system, a processor device, and a method for operating the endoscope system that suppress the processing load at times when a user such as a doctor does not want processing such as determination to be performed, and automatically perform such processing at the appropriate times when the user does, thereby acquiring information useful for diagnosis. An endoscope system (10) includes an image acquisition unit (54), a determination processing unit (82), and a determination processing control unit (83). The image acquisition unit (54) acquires an endoscopic image obtained by imaging an observation target with an endoscope (12). The determination processing unit (82) discriminates a portion of the observation target having a specific feature by performing determination processing using the endoscopic image. The determination processing control unit (83) controls the start or end of the determination processing on the basis of a change in the operation of the endoscope (12) or in the endoscopic image.

Description

Endoscope system, processor device, and method for operating endoscope system
Technical Field
The present invention relates to an endoscope system, a processor device, and a method for operating the endoscope system, which discriminate a portion having a specific feature from an endoscopic image obtained by imaging an observation target using an endoscope.
Background
In the medical field, diagnosis is generally performed by an endoscope system including a light source device, an endoscope, and a processor device. In recent years, there has been known an endoscope system which is intended to discriminate a portion having a specific characteristic such as a lesion using an endoscope image, in addition to displaying the endoscope image on a display unit such as a display. For example, patent document 1 describes an endoscope system that calculates a feature amount using an endoscope image and then discriminates a lesion using the calculated feature amount.
Further, an endoscope system is known which extracts a blood vessel or the like important for diagnosis using an endoscope image. For example, patent document 2 describes an endoscope system that performs a blood vessel extraction process while changing the execution frequency according to the movement amount.
Prior art documents
Patent document
Patent document 1: Japanese patent laid-open publication No. 2012-040075
Patent document 2: Japanese patent laid-open publication No. 2016-144507
Disclosure of Invention
Technical problem to be solved by the invention
As described above, there has been known an endoscope system that provides an endoscopic image and information that contributes to diagnosis in addition to the endoscopic image by discriminating or recognizing a portion having a specific feature such as a lesion using the endoscopic image.
However, the process of discriminating a portion having a specific feature such as a lesion is usually heavy and requires very high processing capability. It is therefore difficult in practice to perform such discrimination continuously in real time while heavily consuming the resources of the processor device throughout observation.
Therefore, it is practical to perform processing such as discrimination of a portion having a specific feature only when necessary, in response to a user request such as a button operation. On the other hand, since operating an endoscope system is inherently complicated, it is undesirable to place a further operational burden on a user such as a doctor.
Accordingly, an object of the present invention is to provide an endoscope system, a processor device, and a method of operating the endoscope system that suppress the processing load at times not desired by a user such as a doctor and automatically perform processing such as determination at the appropriate times desired by the user, thereby acquiring information that contributes to diagnosis.
Means for solving the technical problem
An endoscope system includes an image acquisition unit, a determination processing unit, and a determination processing control unit. The image acquisition unit acquires an endoscopic image obtained by imaging an observation target with an endoscope. The discrimination processing unit discriminates a portion having a specific feature in the observation target by performing discrimination processing using the endoscope image. The determination process control unit controls the start or end of the determination process based on a change in the operation of the endoscope or the endoscope image.
The discrimination processing control unit includes a feature amount calculation unit and a comparison unit. The feature amount calculation unit calculates a feature amount using the endoscopic image. The comparison unit compares the feature amount with a threshold value. In this case, the determination process control unit preferably starts or ends the determination process based on the comparison result in the comparison unit.
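As a rough sketch of this gating scheme (the patent does not specify a particular feature amount; the mean-brightness feature, class names, and threshold below are illustrative assumptions only):

```python
import numpy as np

def compute_feature(endoscope_image: np.ndarray) -> float:
    """Example feature amount: mean pixel intensity of the frame (assumed)."""
    return float(endoscope_image.mean())

class DiscriminationController:
    """Starts or ends discrimination processing based on a feature/threshold
    comparison, as in the control unit described above (sketch)."""

    def __init__(self, threshold: float):
        self.threshold = threshold
        self.running = False

    def update(self, endoscope_image: np.ndarray) -> bool:
        # Compare the calculated feature amount with the threshold and
        # start or end the discrimination processing accordingly.
        feature = compute_feature(endoscope_image)
        self.running = feature >= self.threshold
        return self.running
```

In a real system the feature amount would be chosen so that it distinguishes frames worth discriminating (e.g. magnified, well-lit mucosa) from ordinary screening frames.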
Preferably, the determination process control unit starts or ends the determination process when the endoscope enlarges or reduces the observation target.
Preferably, the determination processing control unit starts the determination processing when the illumination light used when the endoscope captures the observation target is switched to the specific illumination light, and ends the determination processing when the illumination light used when the endoscope captures the observation target is switched to illumination light other than the specific illumination light.
Preferably, the determination processing control unit starts the determination processing when the illumination light is switched to illumination light having a wavelength of 450nm or less.
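A minimal sketch of this wavelength-based trigger (the function name and the scalar center-wavelength input are assumptions; the patent only specifies the 450 nm boundary):

```python
def discrimination_enabled(center_wavelength_nm: float) -> bool:
    """Start discrimination for illumination at or below 450 nm (e.g.
    narrow-band violet/blue light, which emphasizes superficial blood
    vessels); end it for longer-wavelength illumination."""
    return center_wavelength_nm <= 450.0
```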
Preferably, the discrimination processing control unit starts or ends the discrimination processing when an object passes through a channel running through the endoscope.
Preferably, the discrimination processing control unit starts the discrimination processing when water passes through the channel.
Preferably, the discrimination processing control unit starts the discrimination processing when a coloring agent passes through the channel.
Preferably, the determination process control unit starts or ends the determination process when the position at which the endoscope captures the observation target changes or when the position at which the endoscope captures the observation target does not change.
Preferably, the determination processing control unit detects a position where the endoscope captures the observation target based on a change in the observation target captured in the endoscope image.
Preferably, the determination processing control unit detects a position where the endoscope captures the observation target by using a position sensor.
Preferably, the determination process control unit ends the determination process when the position at which the endoscope captures the observation target changes.
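One way to detect, from the endoscopic image alone, whether the imaging position is changing is a simple inter-frame difference. This is a sketch under assumptions: the patent allows detection either from image change or from a position sensor, and does not prescribe this particular measure or threshold.

```python
import numpy as np

def position_changed(prev_frame: np.ndarray, curr_frame: np.ndarray,
                     threshold: float = 10.0) -> bool:
    """Treat a large mean absolute inter-frame difference as a change in
    the position at which the endoscope captures the observation target."""
    diff = np.abs(curr_frame.astype(float) - prev_frame.astype(float))
    return bool(diff.mean() > threshold)
```

When `position_changed` returns True the controller would end the discrimination processing; when the view is stable it could be started.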
Preferably, the determination process control unit starts the determination process based on a change in the operation of the endoscope or the endoscope image, and ends the determination process after a predetermined time period has elapsed after the start of the determination process.
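The fixed-duration variant can be sketched as follows (the class name and the injected clock are assumptions, the latter purely for testability; the patent only specifies "a predetermined time after the start"):

```python
import time

class TimedDiscrimination:
    """Discrimination that starts on a trigger and automatically ends
    after a fixed duration has elapsed (sketch)."""

    def __init__(self, duration_s: float, clock=time.monotonic):
        self.duration_s = duration_s
        self.clock = clock
        self.started_at = None

    def start(self) -> None:
        self.started_at = self.clock()

    def is_active(self) -> bool:
        if self.started_at is None:
            return False
        if self.clock() - self.started_at >= self.duration_s:
            self.started_at = None  # auto-end after the fixed time
            return False
        return True
```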
Preferably, the determination process control unit starts or ends the determination process when the endoscope captures a still image.
Preferably, the endoscope system further includes a notification control unit that controls the start or end of notification of a determination result, which is the result of the determination process, based on a change in the operation of the endoscope or in the endoscope image.
Preferably, the notification control unit includes: a feature value calculation unit that calculates a feature value using the endoscopic image; and a comparison unit that compares the feature value with a threshold value, and the notification control unit starts or ends notification of the determination result based on the comparison result in the comparison unit.
Preferably, the notification control unit starts or ends the notification of the determination result when the endoscope enlarges or reduces the observation target.
Preferably, the notification control unit starts the notification of the determination result when the illumination light used when the endoscope captures the observation target is switched to the specific illumination light, and ends the notification of the determination result when the illumination light used when the endoscope captures the observation target is switched to illumination light other than the specific illumination light.
Preferably, the notification control unit starts the notification of the determination result when the illumination light is switched to illumination light having a wavelength of 450nm or less.
Preferably, the notification control unit starts or ends the notification of the determination result when an object passes through a channel running through the endoscope.
Preferably, the notification control unit starts the notification of the determination result when water passes through the channel.
Preferably, the notification control unit starts the notification of the determination result when a coloring agent passes through the channel.
Preferably, the notification control unit starts or ends the notification of the determination result when the position at which the endoscope captures the observation target changes or when the position at which the endoscope captures the observation target does not change.
Preferably, the notification control unit detects a position where the endoscope captures the observation target based on a change in the observation target captured in the endoscope image.
Preferably, the notification control unit detects a position at which the endoscope captures the observation target by using the position sensor.
Preferably, the notification control unit terminates the notification of the determination result when the position at which the endoscope captures the observation target changes.
Preferably, the notification control unit starts the notification of the determination result based on a change in the operation of the endoscope or the endoscope image, and ends the notification of the determination result after a predetermined fixed time period has elapsed after the start of the notification of the determination result.
Preferably, the notification control unit starts or ends the notification of the determination result when the endoscope captures the still image.
Preferably, the notification control unit notifies the determination result by using any one of voice, image, and message, or a combination thereof.
A processor device includes an image acquisition unit, a determination processing unit, and a determination processing control unit. The image acquisition unit acquires an endoscopic image obtained by imaging an observation target with an endoscope. The discrimination processing unit discriminates a portion having a specific feature in the observation target by performing discrimination processing using the endoscope image. The determination process control unit controls the start or end of the determination process based on a change in the operation of the endoscope or the endoscope image.
A method of operating an endoscope system of the present invention includes: a step in which an image acquisition unit acquires an endoscopic image obtained by imaging an observation target with an endoscope; a step in which a discrimination processing unit discriminates a portion of the observation target having a specific feature by performing discrimination processing using the endoscopic image; and a step in which a determination processing control unit controls the start or end of the discrimination processing based on a change in the operation of the endoscope or in the endoscopic image.
Effects of the invention
According to the endoscope system, the processor device, and the operating method of the endoscope system of the present invention, it is possible to acquire information that contributes to diagnosis by suppressing the processing load at a timing that is not desired by a user such as a doctor, and automatically executing processing such as determination at an appropriate timing that is desired by the user such as the doctor.
Drawings
Fig. 1 is an external view of an endoscope system.
Fig. 2 is a block diagram of an endoscope system.
Fig. 3 is a block diagram of the image processing section.
Fig. 4 is a flowchart showing a flow of the start and end of the determination process.
Fig. 5 is a display example when the discrimination processing is performed.
Fig. 6 is a display example when the discrimination processing is not performed.
Fig. 7 is a block diagram of an image processing unit according to embodiment 2.
Fig. 8 is a flowchart showing the flow of the start and end of the determination process in embodiment 2.
Fig. 9 is a block diagram of an image processing unit according to embodiment 3.
Fig. 10 is a block diagram of an image processing unit according to embodiment 3.
Fig. 11 is a block diagram of an image processing unit and the like in embodiment 4.
Fig. 12 is a flowchart showing the flow of the start and end of the determination process in embodiment 4.
Fig. 13 is an explanatory diagram of a modification.
Fig. 14 is a block diagram of an image processing unit according to embodiment 5.
Fig. 15 is a flowchart showing the flow of the start and end of the determination process in embodiment 5.
Fig. 16 is an explanatory diagram of a modification.
Fig. 17 is a block diagram of an image processing unit according to embodiment 6.
Fig. 18 is a flowchart showing the flow of the start and end of the determination process in embodiment 6.
Fig. 19 is a block diagram of an image processing unit according to embodiment 7.
Fig. 20 is a display example of a display in embodiment 7.
Fig. 21 is a schematic diagram of the capsule endoscope.
Detailed Description
[ embodiment 1 ]
As shown in fig. 1, the endoscope system 10 includes an endoscope 12, a light source device 14, a processor device 16, a display 18, and a console 19. The endoscope 12 photographs an observation target. The light source device 14 generates illumination light. The processor device 16 performs system control, image processing, and the like of the endoscope system 10. The display 18 is a display unit that displays an image for display (hereinafter, referred to as an endoscopic image; for example, see the endoscopic image 101 of fig. 5 or the endoscopic image 102 of fig. 6) generated by the processor device 16. The console 19 is an input device for inputting settings to the processor device 16 and the like.
The endoscope 12 includes an insertion portion 12a to be inserted into the subject, an operation portion 12b provided at the proximal end of the insertion portion 12a, and a bending portion 12c and a distal end portion 12d provided on the distal end side of the insertion portion 12a. The bending portion 12c bends when the angle knob 12e of the operation portion 12b is operated, which orients the distal end portion 12d in a desired direction. The distal end portion 12d is provided with a forceps port (not shown) through which a treatment instrument such as forceps is projected toward the observation target, a jet port (not shown) from which air, water, or the like is jetted toward the observation target, and the like.
The operation portion 12b is provided with a zoom operation unit 13a, a mode switch 13b, and the like, in addition to the angle knob 12e. By operating the zoom operation unit 13a, the observation target can be imaged while enlarged or reduced. Operating the mode switch 13b switches the observation mode. The endoscope system 10 has a plurality of observation modes, such as a normal observation mode in which the observation target is observed in natural colors using white light as illumination light, and a special observation mode in which blood vessels and the like on the mucosal surface layer are emphasized. These observation modes can be switched as appropriate at any time by operating the mode switch 13b.
As shown in fig. 2, the light source device 14 includes a light source unit 20 that emits illumination light, and a light source control unit 22 that controls driving of the light source unit 20.
The light source unit 20 includes, for example, a plurality of LEDs (Light Emitting Diodes) that emit light having different central wavelengths or wavelength ranges (hereinafter simply referred to as "different wavelengths") as light sources, and can emit a plurality of types of illumination light having different wavelengths by turning the LEDs on and off, adjusting their light amounts, and the like. For example, the light source unit 20 can emit wide-band violet light, blue light, green light, and red light as illumination light. In addition to these wide-band lights, the light source unit 20 can emit narrow-band (a wavelength range of about 10 nm to 20 nm) violet, blue, green, and red light as illumination light.
In addition, as the light source section 20, a combination of an LD (Laser Diode), a fluorescent material, and a band limiting filter, a combination of a lamp such as a xenon lamp, and a band limiting filter, or the like can be used instead of the LED. Of course, when the light source unit 20 is formed of an LED, a fluorescent material or a band limiting filter may be used in combination.
The light source control unit 22 independently controls the timing of turning on or off each light source constituting the light source unit 20, the amount of light emission at the time of turning on, and the like. As a result, the light source unit 20 can emit a plurality of types of illumination light having different wavelengths. The light source control unit 22 controls the light source unit 20 based on the imaging timing (so-called frame) of the image sensor 48.
The illumination light emitted from the light source unit 20 enters the light guide 41. The light guide 41 is built into the endoscope 12 and the universal cord, and transmits the illumination light to the distal end portion 12d of the endoscope 12. The universal cord connects the endoscope 12 to the light source device 14 and the processor device 16. A multimode fiber can be used as the light guide 41; for example, a thin optical fiber cable with a core diameter of 105 μm, a cladding diameter of 125 μm, and an overall diameter of 0.3 to 0.5 mm including the protective sheath can be used.
An illumination optical system 30a and an imaging optical system 30b are provided at the distal end portion 12d of the endoscope 12. The illumination optical system 30a has an illumination lens 45, and emits the illumination light toward the observation target via the illumination lens 45. The imaging optical system 30b includes an objective lens 46, a zoom lens 47, and an image sensor 48. The image sensor 48 images the observation target using light returning from the observation target in response to the illumination light (including, in addition to reflected light, scattered light, fluorescence emitted from the observation target, fluorescence due to a drug administered to the observation target, and the like). The zoom lens 47 is moved by operating the zoom operation unit 13a, thereby enlarging or reducing the observation target imaged by the image sensor 48.
The image sensor 48 is, for example, a color sensor having primary-color filters, and includes 3 kinds of pixels: B pixels (blue pixels) having a blue color filter, G pixels (green pixels) having a green color filter, and R pixels (red pixels) having a red color filter. The blue color filter transmits mainly violet to blue light, the green color filter mainly green light, and the red color filter mainly red light. When the observation target is imaged with the primary-color image sensor 48 described above, up to 3 kinds of images can be obtained simultaneously: a B image (blue image) from the B pixels, a G image (green image) from the G pixels, and an R image (red image) from the R pixels.
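The "three images at once" idea can be sketched by splitting a color-filter mosaic into per-color planes. The RGGB Bayer layout and function name below are assumptions for illustration; actual sensor layouts vary, and the empty positions are what the demosaicing process later interpolates.

```python
import numpy as np

def split_bayer_rggb(raw: np.ndarray):
    """Split a raw RGGB Bayer frame into sparse B, G, R planes.
    Positions covered by another color's filter have no value there
    (left as 0) until demosaicing fills them in."""
    r = np.zeros_like(raw)
    g = np.zeros_like(raw)
    b = np.zeros_like(raw)
    r[0::2, 0::2] = raw[0::2, 0::2]  # R filter sites
    g[0::2, 1::2] = raw[0::2, 1::2]  # G sites on R rows
    g[1::2, 0::2] = raw[1::2, 0::2]  # G sites on B rows
    b[1::2, 1::2] = raw[1::2, 1::2]  # B filter sites
    return b, g, r
```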
As the image sensor 48, a CCD (Charge Coupled Device) sensor or a CMOS (Complementary Metal Oxide Semiconductor) sensor can be used. The image sensor 48 of the present embodiment is a primary-color sensor, but a complementary-color sensor may be used instead. A complementary-color sensor includes, for example, cyan pixels provided with a cyan filter, magenta pixels provided with a magenta filter, yellow pixels provided with a yellow filter, and green pixels provided with a green filter. When a complementary-color sensor is used, the images obtained from the pixels of the respective colors can be converted into a B image, a G image, and an R image by complementary-to-primary color conversion. A monochrome sensor without color filters can also be used as the image sensor 48 instead of a color sensor; in that case, the observation target is imaged sequentially with illumination light of each color such as B, G, and R, and an image of each color is obtained.
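A sketch of the complementary-to-primary conversion mentioned above, under the idealized assumption that Cy = G + B, Mg = R + B, and Ye = R + G (real conversions use calibrated matrices specific to the sensor):

```python
def complementary_to_primary(cy: float, mg: float, ye: float):
    """Invert the ideal complementary-filter model to recover R, G, B.
    Assumes Cy = G + B, Mg = R + B, Ye = R + G."""
    r = (mg + ye - cy) / 2.0
    g = (ye + cy - mg) / 2.0
    b = (cy + mg - ye) / 2.0
    return r, g, b
```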
The processor device 16 includes a control unit 52, an image acquisition unit 54, an image processing unit 61, and a display control unit 66.
The control unit 52 performs overall control of the endoscope system 10, such as synchronizing the illumination light irradiation timing with the imaging timing. When the zoom operation unit 13a is operated, the control unit 52 receives an operation signal from the zoom operation unit 13a and inputs a control signal indicating the imaging size of the observation target (that is, the degree of enlargement or reduction of the observation target; hereinafter referred to as an imaging size signal) to the image processing unit 61. Based on the imaging size signal, the image processing unit 61 can perform or skip predetermined processing, or adjust its intensity, according to the imaging size of the observation target.
When the mode switch 13b is operated, the control unit 52 receives an operation signal from the mode switch 13b and inputs a control signal designating the observation mode (hereinafter referred to as an observation mode designation signal) to the light source control unit 22 and the image processing unit 61. The observation mode designation signal specifies the type of illumination light to be used, the light emission order, and the like. The light source control unit 22 causes the light source unit 20 to emit illumination light in accordance with the observation mode designation signal. Likewise, in accordance with the observation mode designation signal, the image processing unit 61 can perform or skip predetermined processing, or adjust its intensity, according to the observation mode.
The image acquisition unit 54 acquires an image of the observation target from the image sensor 48. This image acquired from the image sensor 48 is an endoscopic image, and so is the image for display generated from it. Hereinafter, when the distinction is necessary, the image of the observation target acquired from the image sensor 48 is referred to as the "captured endoscopic image" or simply the "image", and the image generated for display is referred to as the "endoscopic image for display" or simply the "endoscopic image". In the present embodiment, since the image sensor 48 has color filters, the image acquisition unit 54 acquires an image (captured endoscopic image) for each illumination light and for each color filter.
The image acquisition unit 54 includes a DSP (Digital Signal Processor) 56, a noise reduction unit 58, and a conversion unit 59, and uses these components to perform various processes on the acquired image as necessary.
The DSP 56 performs various processes such as defect correction processing, offset processing, gain correction processing, linear matrix processing, gamma conversion processing, demosaicing processing, and YC conversion processing on the acquired image as necessary.
The defect correction processing corrects the pixel values of pixels corresponding to defective pixels of the image sensor 48. The offset processing removes the dark current component from the defect-corrected image to set an accurate zero level. The gain correction processing adjusts the signal level of each image by multiplying the offset-processed image by a gain. The linear matrix processing improves the color reproducibility of the offset-processed image, and the gamma conversion processing adjusts the lightness and chroma of the image after the linear matrix processing. The demosaicing processing (also referred to as equalization or synchronization processing) interpolates the pixel values of missing pixels and is applied to the image after the gamma conversion processing. Missing pixels are pixels that have no pixel value because of the color filter arrangement (because pixels of other colors occupy those positions in the image sensor 48). For example, since the B image is obtained by imaging the observation target at the B pixels, it has no pixel values at the positions corresponding to the G and R pixels; the demosaicing processing interpolates the B image to generate pixel values at the positions of the G and R pixels of the image sensor 48. The YC conversion processing converts the demosaiced image into a luminance channel Y and color difference channels Cb and Cr.
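As a concrete illustration of the demosaicing step for the B image, the following sketch fills in missing B values by averaging the neighboring B-filtered pixels. The averaging scheme is an assumption made here for illustration; the patent does not specify how the missing pixel values are interpolated.

```python
# Simplified demosaicing sketch for the B channel only: pixel values exist
# where the Bayer pattern has a B filter, and the missing positions are filled
# by averaging the available B neighbors in the surrounding 3x3 window.

def demosaic_b_channel(mosaic, is_b_pixel):
    """mosaic: 2D list of raw values; is_b_pixel: 2D list of booleans."""
    h, w = len(mosaic), len(mosaic[0])
    out = [row[:] for row in mosaic]
    for y in range(h):
        for x in range(w):
            if is_b_pixel[y][x]:
                continue  # this position already has a B value
            # collect the B-filtered neighbors in the 3x3 window
            neighbors = []
            for dy in (-1, 0, 1):
                for dx in (-1, 0, 1):
                    ny, nx = y + dy, x + dx
                    if 0 <= ny < h and 0 <= nx < w and is_b_pixel[ny][nx]:
                        neighbors.append(mosaic[ny][nx])
            out[y][x] = sum(neighbors) / len(neighbors) if neighbors else 0
    return out
```

The same interpolation is then repeated for the G and R images so that every channel has a value at every pixel position.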
The noise reduction unit 58 performs noise reduction processing on the luminance channel Y and the color difference channels Cb and Cr by, for example, a moving average method or a median filtering method. The conversion unit 59 then converts the noise-reduced luminance channel Y and color difference channels Cb and Cr back into images of the respective BGR colors.
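The two steps around the noise reduction unit 58 can be sketched as follows. The BT.601 conversion coefficients and the one-dimensional median window are assumptions for illustration; the patent names the median filtering method but does not fix the coefficients or window size.

```python
# RGB -> Y/Cb/Cr conversion (BT.601-style coefficients, an assumption) and a
# simple median filter standing in for the noise reduction on each channel.

def rgb_to_ycbcr(r, g, b):
    """Convert one RGB sample to a luminance and two color difference values."""
    y  =  0.299 * r + 0.587 * g + 0.114 * b
    cb = -0.169 * r - 0.331 * g + 0.500 * b
    cr =  0.500 * r - 0.419 * g - 0.081 * b
    return y, cb, cr

def median_filter(signal, radius=1):
    """Median-filter a 1-D channel; edge samples keep their original values."""
    out = list(signal)
    for i in range(radius, len(signal) - radius):
        window = sorted(signal[i - radius:i + radius + 1])
        out[i] = window[len(window) // 2]
    return out
```

A single impulse such as `[0, 100, 0, 0, 0]` is removed by the median filter while flat regions pass through unchanged, which is why median filtering is a common choice for shot-noise-like artifacts.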
The image processing unit 61 generates the endoscopic image to be displayed on the display 18, using the image acquired by the image acquisition unit 54. As necessary, the image processing unit 61 also performs discrimination processing that discriminates a portion of the observation target having a specific feature from the endoscopic image. For these purposes, as shown in fig. 3, the image processing unit 61 includes an image generating unit 81, a discrimination processing unit 82, and a discrimination process control unit 83.
The image generating unit 81 generates an endoscopic image corresponding to the observation mode from the image acquired by the image acquisition unit 54. When the observation mode is the normal observation mode, the image generating unit 81 generates an endoscopic image in which the observation target can be observed in natural colors. When the observation mode is the special observation mode, it generates an endoscopic image suited to the purpose of that mode, for example by using a combination of images different from that of the normal observation mode, or by assigning the images used to color channels differently from the normal observation mode. For example, it generates an endoscopic image in which blood vessels and the like in the surface layer of the mucosa are emphasized.
When generating an endoscopic image corresponding to the observation mode, the image processing unit 61 performs color conversion processing, color emphasis processing, and structure emphasis processing on the image acquired by the image acquisition unit 54, as necessary. In the color conversion processing, 3 × 3 matrix processing, gradation conversion processing, three-dimensional LUT (look-up table) processing, and the like are applied to the images of the respective BGR colors. The color emphasis processing emphasizes the colors of the image, and the structure emphasis processing emphasizes tissues or structures of the observation target such as blood vessels or pit patterns.
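A minimal sketch of the color conversion step, assuming a 3 × 3 matrix applied per pixel followed by gradation (gamma) conversion. The matrix values and the gamma of 2.2 are placeholders chosen for illustration, not values taken from the patent.

```python
# 3x3 matrix processing on one BGR pixel, followed by gradation conversion.

def apply_color_matrix(pixel, matrix):
    """pixel: (b, g, r) tuple; matrix: 3x3 row-major list of lists."""
    return tuple(sum(m * p for m, p in zip(row, pixel)) for row in matrix)

def gamma_convert(value, gamma=2.2, max_value=255.0):
    """Gradation conversion of one channel value normalized to max_value."""
    return max_value * (value / max_value) ** (1.0 / gamma)
```

With the identity matrix and the endpoint values 0 and 255, both functions leave the input unchanged, which is a convenient sanity check when wiring up a real pipeline.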
The discrimination processing unit 82 discriminates a portion of the observation target having a specific feature (hereinafter referred to as a specific portion) by performing discrimination processing using the endoscopic image generated by the image generating unit 81. The "specific portion" is a portion that can be distinguished from other portions (for example, normal mucosa) by the presence or absence, shape (thickness, length, the state of the border with other tissue, disturbance of the edge, or meandering of a linear tissue), amount (area, density, or the like), distribution, or color of a tissue or structure, or by the state of the tissue or structure (for example, an internal parameter such as the depth under the mucosa or the oxygen saturation of hemoglobin contained in a blood vessel). As described above, the "specific feature" that specifies the specific portion is a feature that can be known by observing the endoscopic image, or information obtained by calculation using the endoscopic image or the images used to generate it. Hereinafter, quantified information on the specific features that specify a specific portion is referred to as the "feature amount".
"determination" means that information contributing to diagnosis is further obtained based on the characteristics of the specific portion or the like, and "determination processing" means processing performed by the determination processing portion 82 for determination. For example, the identification process of identifying whether or not a specific portion is a lesion based on a feature quantity indicating a feature of the specific portion is a discrimination process. Further, the determination process includes a process of calculating information indicating the probability of a lesion, such as an index indicating the probability of a lesion in the specific portion, based on the feature amount indicating the feature of the specific portion. When the specific portion is a lesion, a process of classifying or classifying the lesion, recognizing the benign or malignant state, recognizing the progress or invasion degree of the lesion, or the like is also a discrimination process.
The discrimination processing unit 82 performs the discrimination processing described above using, for example, a convolutional neural network (CNN) 71 (see fig. 2), another neural network, an SVM (Support Vector Machine), an AdaBoost (adaptive boosting) classifier, or the like, trained in advance for a specific task (more specifically, a specific classification). The result of the discrimination processing (hereinafter referred to as the discrimination result) is, for example, whether or not the specific portion is a lesion (including information from which this can be discriminated). Accordingly, when the specific portion is a lesion, the classification or grade of the lesion, whether it is benign or malignant, the degree of progression or invasion of the lesion, and the like are also discrimination results. A score indicating the certainty of the determination that the specific portion is a lesion, or the certainty of the classification of the lesion, can also be used as the discrimination result. In the present embodiment, the discrimination processing unit 82 determines whether or not the specific portion is a lesion and uses a score indicating the certainty of that determination as the discrimination result. The score is, for example, a value from 0.0 to 1.0. When the discrimination processing is performed, the discrimination processing unit 82 stores information on the endoscopic image used for the discrimination processing in the storage unit 72 (see fig. 2). This information includes, in addition to the endoscopic image itself, its frame number, the score as the discrimination result, and the like. The discrimination processing unit 82 stores the endoscopic image, the frame number, the score as the discrimination result, and the like in the storage unit 72 in association with each other.
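The association of image, frame number, and score described above might be structured as in the sketch below. The classifier is a stub returning a fixed score; in the patent it is a trained CNN, SVM, AdaBoost classifier, or similar, and `storage` here merely stands in for the storage unit 72.

```python
# Sketch of how the discrimination processing unit 82 could store each
# discrimination result in association with its endoscopic image.

def classify(image):
    """Placeholder for the learned model: returns a lesion score in [0, 1]."""
    return 0.5  # fixed value for illustration only

storage = []  # stands in for the storage unit 72 (memory or hard disk)

def run_discrimination(image, frame_number):
    """Score one endoscopic image and record image, frame number and score."""
    score = classify(image)
    storage.append({"frame": frame_number, "image": image, "score": score})
    return score
```

Keeping the frame number alongside the score lets the stored results be matched back to the displayed endoscopic images later, which is the point of storing them in association with each other.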
The storage unit 72 is a storage device such as a memory or a hard disk.
The discrimination processing unit 82 uses the endoscopic image for display generated by the image generation unit 81 for the discrimination processing; since the display endoscopic image is generated from the captured endoscopic image, the discrimination processing unit 82 as a result performs the discrimination processing using an endoscopic image obtained by imaging the observation target with the endoscope. The discrimination processing unit 82 can also perform the discrimination processing using the captured endoscopic image directly, instead of the display endoscopic image.
The discrimination process control unit 83 controls the start or end of the discrimination processing in the discrimination processing unit 82 in accordance with a change in the operation of the endoscope 12 or with the endoscopic image. A "change in the operation of the endoscope 12" is a change or switching of an operation mode of the endoscope 12 that normally has no direct relationship to the discrimination processing. For example, enlargement or reduction of the observation target and switching of the illumination light or the observation mode are changes in the operation of the endoscope 12. "In accordance with the endoscopic image" means in accordance with a change in the features of the observation target captured in the endoscopic image, for example, a change in the feature amount.
The term "start of the discrimination processing" means that when the discrimination processing is not performed on the endoscopic image of any one frame, the discrimination processing is performed on the endoscopic image of at least 1 or more frames consecutive to the endoscopic image. That is, the "start of the discrimination processing" also includes a case where the discrimination processing is performed only on endoscopic images of 1 frame when frames for which the discrimination processing is not performed are consecutive. The term "end of the discrimination processing" means that when the discrimination processing is performed on the endoscopic image of any one frame, the discrimination processing is not performed on the endoscopic images of 1 or more consecutive frames. That is, the term "end of the discrimination processing" includes a case where the discrimination processing is not performed only on endoscopic images of 1 frame when frames on which the discrimination processing is performed are consecutive.
In the present embodiment, the determination process control unit 83 controls the start and end of the determination process in the determination processing unit 82 based on the endoscopic image. For this purpose, the determination processing control unit 83 includes a feature value calculation unit 86 and a comparison unit 87.
The feature amount calculation unit 86 acquires the endoscopic image from the image generation unit 81 and uses it to calculate a feature amount of the observation target captured in the endoscopic image. The feature amount is calculated for some (for example, every several frames) or all of the endoscopic images generated by the image generating unit 81.
The comparison unit 87 compares the feature amount calculated by the feature amount calculation unit 86 with a preset feature amount of an endoscopic image in which a normal observation target is captured. The result of this comparison (hereinafter referred to as the comparison result) indicates whether the value, or the distribution of values, of the calculated feature amount is within the normal range, and thus indicates a change in the features of the observation target captured in the sequentially acquired endoscopic images. In the present embodiment, the comparison unit 87 uses the preset feature amount of an endoscopic image of a normal observation target as a threshold value, and compares the calculated feature amount with this threshold value in sequence, each time a feature amount is calculated. A feature amount at or above the threshold value corresponds, for example, to an endoscopic image showing redness or the like, which is essentially a case where the discrimination processing is desired. Conversely, a feature amount below the threshold value corresponds, for example, to a normal observation target without redness or the like, for which the discrimination processing is essentially unnecessary.
The discrimination process control unit 83 starts or ends the discrimination processing in accordance with the comparison result of the comparison unit 87; in this way it controls the start and end of the discrimination processing in the discrimination processing unit 82 based on the endoscopic image. More specifically, in the present embodiment, the discrimination processing is started when the feature amount becomes equal to or greater than the threshold value and is ended when the feature amount falls below the threshold value. As a result, the discrimination processing unit 82 automatically executes the discrimination processing while the feature amount is equal to or greater than the threshold value and automatically ends it when the feature amount falls below the threshold value.
The display control unit 66 acquires the endoscopic images from the image processing unit 61, converts the acquired endoscopic images into a format suitable for display, and sequentially outputs and displays the images on the display 18. This enables a doctor or the like to observe an observation target using an endoscopic image. When the determination processing section 82 performs the determination processing, the display control section 66 acquires a score as a result of the determination from the determination processing section 82 in addition to the endoscopic image. Then, the score as the discrimination result is displayed on the display 18 together with the endoscopic image.
Next, the start and end of the discrimination processing in the endoscope system 10 will be described with reference to the flowchart shown in fig. 4. First, an observation mode is selected and the observation target is imaged (S11). The image acquisition unit 54 thereby acquires the captured endoscopic image from the image sensor 48, and the image generation unit 81 generates an endoscopic image for display from it.
When the endoscopic image is acquired, the discrimination process control unit 83 calculates the feature amount in the feature amount calculation unit 86 (S12) and then compares it with the threshold value in the comparison unit 87 (S13). When the feature amount is equal to or greater than the threshold value as a result of the comparison (YES in S13), the discrimination process control unit 83 causes the discrimination processing unit 82 to execute the discrimination processing (S14). In general, since the imaging rate is sufficiently fast relative to changes in the endoscopic image, once the feature amount becomes equal to or greater than the threshold value it remains so for some time thereafter. In practice, therefore, the discrimination processing starts when the feature amount becomes equal to or greater than the threshold value. When the discrimination processing is started, the display control unit 66 sequentially displays the endoscopic image 101 used in the discrimination processing and the score 99 as the discrimination result on the display 18, as shown in fig. 5.
On the other hand, when the feature amount is smaller than the threshold value as a result of the comparison by the comparison unit 87 (NO in S13), the discrimination process control unit 83 skips the discrimination processing in the discrimination processing unit 82. Again, since the imaging rate is sufficiently fast relative to changes in the endoscopic image, once the feature amount falls below the threshold value it remains below it for some time thereafter. In practice, therefore, the discrimination processing ends when the feature amount falls below the threshold value. When the discrimination processing is ended, the display control unit 66 sequentially displays the endoscopic images 102 that have not been subjected to the discrimination processing on the display 18, as shown in fig. 6.
Then, the start and end of the discrimination processing are repeated in this way until the observation is completed (S15). Accordingly, when the feature amounts of the sequentially generated endoscopic images are calculated and compared with the threshold value one after another, the discrimination processing starts automatically, without a button operation or the like, as soon as the feature amount rises from below the threshold value to the threshold value or above, and the score as the discrimination result can be provided to a doctor or the like to support diagnosis. Likewise, when the need for the discrimination processing diminishes and the feature amount falls below the threshold value, the discrimination processing ends automatically, without a button operation or the like, which suppresses the processing load on the processor device 16 (consumption of resources such as memory).
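The S11 to S15 flow above can be condensed into a small loop. The feature amount used here (the mean of a per-pixel "redness" channel) is an illustrative assumption; the patent only requires some feature amount that can be compared against a threshold derived from normal mucosa.

```python
# Condensed sketch of embodiment 1: per frame, compute a feature amount,
# compare it with the threshold (S13), and run discrimination only while the
# feature amount is at or above the threshold (S14).

def feature_amount(image):
    """Toy feature amount: mean value of a flat 'redness' channel."""
    return sum(image) / len(image)

def process_frames(frames, threshold, discriminate):
    """Return (frame_index, score) pairs for the frames that were discriminated."""
    results = []
    for i, frame in enumerate(frames):
        if feature_amount(frame) >= threshold:       # S13: compare with threshold
            results.append((i, discriminate(frame)))  # S14: discrimination runs
        # otherwise the discrimination processing is skipped for this frame
    return results
```

Because consecutive frames change slowly relative to the imaging rate, crossing the threshold in this loop behaves as the "start" and "end" of discrimination described in the text rather than as isolated single-frame events.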
[Embodiment 2]
In embodiment 1 described above, the discrimination process control unit 83 starts and ends the discrimination processing in accordance with a change in the feature amount (a change in the endoscopic image); instead, the discrimination processing may be started or ended when the endoscope 12 enlarges or reduces the observation target.
In this case, as shown in fig. 7, the image processing unit 61 is provided with a determination processing control unit 283 in place of the discrimination process control unit 83 of embodiment 1. The determination processing control unit 283 acquires information on the operation state of the zoom operation unit 13a via the control unit 52; specifically, it acquires the image pickup size signal indicating the imaging size of the observation target. It then starts or ends the discrimination processing according to the presence or absence, or the degree, of enlargement or reduction of the observation target, which is known from the image pickup size signal.
For example, as shown in fig. 8, when the observation target is imaged (S211) and an endoscopic image is acquired, the determination processing control unit 283 detects from the image pickup size signal whether or not the observation target is under magnified observation (S212). When magnified observation is in progress (YES in S212), the determination processing control unit 283 causes the discrimination processing unit 82 to execute the discrimination processing (S213). When the discrimination processing was not executed in the previous frame, this in practice starts the discrimination processing. On the other hand, when magnified observation is not in progress (NO in S212), the determination processing control unit 283 skips the discrimination processing in the discrimination processing unit 82. When the discrimination processing was executed in the previous frame, this in practice ends the discrimination processing.
By repeating the above operations until the end of observation (S214), the discrimination processing is performed automatically while magnified observation is in progress and, when the magnified observation ends, the discrimination processing also ends automatically in conjunction with it. When a doctor or the like magnifies part of the observation target, that part may be, or may contain, a lesion. Therefore, if the discrimination processing is started and ended according to the presence or absence of magnification as described above, the discrimination processing can be performed at the appropriate times when a doctor or the like is likely to want it, and the score 99 as the discrimination result can be provided. Moreover, since unnecessary discrimination processing is not performed at other times, the processing load on the processor device 16 can be reduced.
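The S211 to S214 gating can be sketched as a filter over per-frame zoom values. The numeric zoom values and the threshold of 1.0 (no magnification) are hypothetical stand-ins for whatever the image pickup size signal actually encodes.

```python
# Sketch of embodiment 2: discrimination runs only for frames whose image
# pickup size signal indicates magnified observation.

def frames_to_discriminate(image_size_signals, magnify_threshold=1.0):
    """Return the indices of frames captured under magnified observation (S212)."""
    return [i for i, zoom in enumerate(image_size_signals)
            if zoom > magnify_threshold]
```

A run of indices in the returned list corresponds to one automatic "start" of the discrimination processing, and the gap that follows it to the automatic "end".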
In embodiment 2 described above, the determination processing control unit 283 starts the discrimination processing when the observation target is enlarged, but it may instead start the discrimination processing when the observation target is reduced. The case where the observation target is reduced, that is, imaged by so-called "pulling back", is, for example, a case of checking for the presence of a lesion or of a portion that may be a lesion. In this case, performing the discrimination processing and displaying the score as the discrimination result makes it easy to confirm whether a lesion, or a portion that may be a lesion, exists within the observation range.
In embodiment 2 and the modification described above, the discrimination processing is started when the observation target is enlarged or reduced, but it may also be started or ended according to the degree of enlargement or reduction, for example when the observation target is enlarged to a specific magnification or more, reduced to a specific magnification or less, or when the magnification (reduction) ratio falls within a specific range.
In embodiment 2 and the modification described above, the determination processing control unit 283 starts or ends the discrimination processing based on the image pickup size signal indicating the operation state of the zoom operation unit 13a; instead of the image pickup size signal, however, it may detect the enlargement or reduction of the observation target, or the degree thereof, from the endoscopic image, and start or end the discrimination processing based on that detection result. In this case, the determination processing control unit 283 acquires the endoscopic image from the image acquisition unit 54 or the image generation unit 81 and holds the endoscopic images of at least one or more past frames. When an endoscopic image is newly acquired, it is compared with the held past endoscopic images to detect enlargement or reduction of the observation target, or the degree thereof. The determination processing control unit 283 then also functions as an enlargement/reduction detection unit that detects the enlargement or reduction of the observation target, or the degree thereof.
[Embodiment 3]
The endoscope system 10 can also start or end the discrimination processing in accordance with a "change in the operation of the endoscope 12 or the endoscopic image" different from those of embodiments 1 and 2 described above. For example, the discrimination processing can be started or ended when the observation mode is switched, that is, when the illumination light used when the endoscope 12 images the observation target is switched.
In this case, as shown in fig. 9, the image processing unit 61 is provided with a discrimination processing control section 383 in place of the discrimination process control unit 83 of embodiment 1 or the determination processing control unit 283 of embodiment 2. The discrimination processing control section 383 acquires information on the operation status of the mode changeover switch 13b via the control unit 52; specifically, it acquires the observation mode designation signal designating the observation mode. It then detects switching of the illumination light from the type of observation mode designated by the observation mode designation signal, and starts or ends the discrimination processing accordingly.
For example, as shown in fig. 10, when the observation target is imaged (S311) and an endoscopic image is acquired, the discrimination processing control section 383 detects whether or not the specific illumination light is in use, based on the presence or absence of an input observation mode designation signal and the type of observation mode it designates (S312). The specific illumination light is the illumination light of an observation mode in which a doctor or the like often wants the discrimination processing to be executed. When the illumination light is switched to the specific illumination light (YES in S312), the discrimination processing control section 383 causes the discrimination processing unit 82 to perform the discrimination processing (S313). When the discrimination processing was not executed in the previous frame, this in practice starts the discrimination processing. On the other hand, when the illumination light is switched to illumination light other than the specific illumination light (NO in S312), the discrimination processing control section 383 skips the discrimination processing in the discrimination processing unit 82. When the discrimination processing was executed in the previous frame, this in practice ends the discrimination processing.
By repeating the above operations until the observation is completed (S314), the discrimination processing is performed automatically while the observation mode uses the specific illumination light, and ends automatically in conjunction with a change to another observation mode that does not use it. Since the observation mode using the specific illumination light is one in which a doctor or the like often wants to use the discrimination processing at the same time, starting the discrimination processing on switching to that mode and ending it on switching to a mode using other illumination light allows the discrimination processing to be performed at the appropriate times and the score 99 as the discrimination result to be provided. Moreover, since unnecessary discrimination processing is not performed at other times, the processing load on the processor device 16 can be reduced.
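The S311 to S314 gating can be sketched as follows. The mode names are hypothetical; the patent only stipulates that certain observation modes use the "specific illumination light" (light of 450 nm or less is given later as one example).

```python
# Sketch of embodiment 3: the observation mode designation signal determines
# the illumination in use, and discrimination runs only for frames captured
# under the specific illumination light.

SPECIFIC_MODES = {"special_450nm"}  # modes assumed to use the specific light

def should_discriminate(observation_mode):
    """S312: execute discrimination only under the specific illumination light."""
    return observation_mode in SPECIFIC_MODES

def discriminated_frames(mode_per_frame):
    """Indices of the frames on which the discrimination processing runs."""
    return [i for i, mode in enumerate(mode_per_frame)
            if should_discriminate(mode)]
```

Switching the mode sequence from a normal mode into a specific mode and back produces exactly the automatic start and end behavior described above.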
In embodiment 3 described above, the discrimination processing control section 383 starts or ends the discrimination processing based on the observation mode designation signal, but it can instead detect switching of the illumination light or the observation mode from the endoscopic image. In this case, the discrimination processing control section 383 acquires the endoscopic image from the image acquisition unit 54 or the image generation unit 81 and detects switching of the illumination light or the observation mode based on features (such as color tone) of the acquired endoscopic image.
The specific illumination light is, for example, illumination light having a wavelength of 450 nm or less; that is, it is preferable to start the discrimination processing when the illumination light is switched to illumination light having a wavelength of 450 nm or less. Since illumination light having a wavelength of 450 nm or less makes it easy to observe fine structures such as blood vessels and pit patterns in the mucosal surface layer, a doctor or the like often wants both detailed observation and discrimination results for diagnosis in this situation. "Use of illumination light having a wavelength of 450 nm or less" means that such illumination light is used substantially alone, so that the observation target is imaged using essentially only the reflected light of illumination light having a wavelength of 450 nm or less.
[Embodiment 4]
The endoscope system 10 can also start or end the discrimination processing in accordance with a "change in the operation of the endoscope 12 or the endoscopic image" different from those of embodiments 1, 2, and 3 described above. For example, the discrimination processing can be started when an object passes through a channel running through the endoscope 12. The channels running through the endoscope 12 are, for example, a forceps channel 401 through which a treatment instrument such as forceps is inserted and projected from the distal end portion 12d toward the observation target, and an air/water supply channel 402 through which water, air, or the like is fed and discharged from the distal end portion 12d toward the observation target (see fig. 11).
When the discrimination processing is to be started or ended when an object passes through a channel of the endoscope 12, a passage sensor 403 that detects the passage of a treatment instrument such as forceps is provided, for example, partway along the forceps channel 401 (for example, near the distal end portion 12d), as shown in fig. 11. The image processing unit 61 is provided with a discrimination processing control section 483 in place of the discrimination process control unit 83 of embodiment 1 and the like. The discrimination processing control section 483 acquires the detection signal of the treatment instrument from the passage sensor 403 via the control unit 52 and, when the distal end of the treatment instrument or the like is detected passing through the forceps channel 401, starts or ends the discrimination processing.
Specifically, as shown in fig. 12, when the observation target is imaged (S411) and an endoscopic image is acquired, the determination processing control unit 483 acquires the detection signal from the passage sensor 403 and detects, based on the acquired signal, whether or not the treatment instrument has passed through the forceps channel 401 (S412). When the treatment instrument has not passed through the forceps channel 401 (no in S412), the determination process is skipped. When passage of the treatment instrument is detected from the state in which no treatment instrument was inserted in the forceps channel 401 (yes in S412), the treatment instrument is protruding from the distal end portion 12d toward the observation target (yes in S413), so the determination processing control unit 483 causes the determination processing unit 82 to execute the determination process (S414). When the determination process was not executed in the previous frame, this effectively starts the determination process. On the other hand, when passage is detected again after a first passage has already been detected (yes in S412), the treatment instrument is being withdrawn (no in S413), so the determination processing control unit 483 skips the determination process in the determination processing unit 82. When the determination process was executed in the previous frame, this effectively ends the determination process.
By repeating the above operations until observation ends (S415), the determination process is automatically performed only while the treatment instrument protrudes from the distal end portion 12d, and automatically ends when the treatment instrument is withdrawn. When no treatment instrument is used at all, the determination process is not performed. A treatment instrument is used when there is a lesion or the like that requires treatment, and in such cases the determination process is usually desired. Therefore, by performing the determination process when the treatment instrument is used as described above, the determination process can be performed at the timing at which a doctor or the like desires it, and the score 99 as the determination result can be provided automatically. Further, since unnecessary determination processes are not performed at other times, the processing load on the processor device 16 can be reduced.
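The S411-S415 flow can be sketched as a small state machine: the first passage event through the forceps channel (instrument advanced) starts the determination process, and the next passage event (instrument withdrawn) ends it. All names here are illustrative assumptions; only the toggle logic follows the text.

```python
class ForcepsPassageController:
    """Per-frame controller mirroring S411-S415: determination runs only
    while the treatment instrument protrudes from the distal end portion."""

    def __init__(self):
        self.instrument_out = False  # True while the instrument protrudes
        self.determining = False

    def on_frame(self, passage_detected: bool) -> bool:
        """Called once per captured frame with the passage-sensor signal.
        Returns whether the determination process runs for this frame."""
        if passage_detected:
            # Each passage event toggles between "advanced" and "withdrawn"
            # (S413 distinguishes the first detection from the second).
            self.instrument_out = not self.instrument_out
        self.determining = self.instrument_out
        return self.determining
```

For example, per-frame sensor signals `False, True, False, True` yield skip, run (start), run, skip (end), matching the description above.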
In embodiment 4 described above, the passage sensor 403 is provided in the forceps channel 401 to detect passage of the treatment instrument, but passage of water or the like through the air/water supply channel 402 may be detected instead. In this case, as shown in fig. 13, a passage sensor 404 that detects passage of water or the like is provided partway along the air/water supply channel 402 (for example, near the distal end portion 12d). The determination processing control unit 483 then starts the determination process when water or the like passes through the air/water supply channel 402. When water or the like is ejected, the portion it strikes or its periphery is, for example, to be observed in more detail, so the determination process is often desired at the same time. Therefore, by detecting the ejection of water or the like and performing the determination process at that time, the determination process can be performed at the appropriate timing desired by a doctor or the like, and the score 99 as the determination result can be provided automatically. Further, since unnecessary determination processes are not performed at other times, the processing load on the processor device 16 can be reduced.
In addition, the object detected by the passage sensor 404 in the air/water supply channel 402 is preferably water or a coloring agent for coloring the structure or the like of the observation target. This is because a coloring agent is used to make the structure of the observation target easier to observe, so the moment the coloring agent is dispersed is also a timing at which a doctor or the like desires the determination process.
When the determination process is started when water, a coloring agent, or the like passes through the air/water supply channel 402 as described above, it is preferable to end the determination process after a predetermined fixed time has elapsed from its start. This is because the time required for water or a coloring agent to pass through the air/water supply channel 402 is extremely short, and the fixed time after the water or coloring agent is ejected from the distal end portion 12d is the period during which the observation target is observed while the effect of the ejection is obtained.
In embodiment 4 and the modification described above, the passage sensor 403 or the passage sensor 404 is used, but the determination processing control unit 483 may instead detect water, a coloring agent, or the like from the endoscopic image. In this case, the determination processing control unit 483 acquires an endoscopic image from the image acquisition unit 54 or the image generation unit 81, and detects the ejection of water, a coloring agent, or the like from a change in the characteristics of the acquired endoscopic image (for example, disappearance of residue, or a change in the color of the observation target due to the coloring agent).
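A minimal sketch of detecting the ejection of water or a coloring agent from a change in image characteristics, as described above, is shown below. The mean-color heuristic and the threshold are assumptions for illustration, not the patent's method.

```python
import numpy as np

def agent_ejected(prev_frame: np.ndarray, curr_frame: np.ndarray,
                  threshold: float = 20.0) -> bool:
    """Treat a large sudden shift in the frame's mean per-channel color
    (e.g. the blue cast of a dye spreading over the mucosa) as evidence
    that water or a coloring agent was ejected. Frames are HxWx3 arrays."""
    prev_mean = prev_frame.reshape(-1, 3).mean(axis=0)
    curr_mean = curr_frame.reshape(-1, 3).mean(axis=0)
    return float(np.abs(curr_mean - prev_mean).max()) > threshold
```

A real implementation would likely use a calibrated color model rather than raw channel means, but the trigger structure would be the same.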
[5th embodiment]
The endoscope system 10 can also start or end the determination process based on a "change in the operation of the endoscope 12 or in the endoscopic image" different from those of embodiments 1 to 4 described above. For example, the determination process can be started or ended when the position at which the endoscope 12 images the observation target changes, or when it does not change. The change in the imaging position referred to here includes not only a change caused by insertion and removal of the insertion portion 12a but also a change caused by a change in the direction of the distal end portion 12d with respect to the observation target.
When the determination process is started or ended in response to a change (or no change) in the position at which the endoscope 12 images the observation target, the image processing unit 61 is provided with a determination processing control unit 583 in place of the determination processing control unit 83 of embodiment 1 and the like, as shown in fig. 14. The determination processing control unit 583 includes a position detection unit 586, which detects, using the endoscopic image, whether the position at which the endoscope 12 images the observation target (hereinafter, the imaging position) has changed. For example, the position detection unit 586 acquires endoscopic images from the image acquisition unit 54 or the image generation unit 81 and holds at least one past frame. When a new endoscopic image is acquired, it is compared with the held past endoscopic image, and a change in the imaging position is detected from the change in the observation target captured in the images.
Using the detection result of the position detection unit 586, the determination processing control unit 583 thus detects a change in the imaging position from the change in the observation target captured in the endoscopic image, and controls the start and end of the determination process in the determination processing unit 82 based on whether the imaging position has changed. In the present embodiment, the determination processing control unit 583 starts the determination process when the imaging position does not change (that is, when the change stops), and ends the determination process when the imaging position changes.
Specifically, as shown in fig. 15, when the observation target is imaged (S511) and the determination processing control unit 583 and the position detection unit 586 acquire the endoscopic image, the position detection unit 586 compares the held past endoscopic image with the newly acquired one and detects whether the imaging position has changed. When the imaging position has not changed, or when it has stopped changing (no in S512), the determination processing control unit 583 causes the determination processing unit 82 to execute the determination process (S513). When the determination process was not executed in the previous frame, this effectively starts the determination process. On the other hand, when the imaging position has changed (yes in S512), the determination processing control unit 583 skips the determination process in the determination processing unit 82. When the determination process was executed in the previous frame, this effectively ends the determination process.
By repeating the above operations until observation ends (S514), the determination process is automatically performed while the imaging position is unchanged, and automatically ends when the imaging position changes. The determination process is performed when the imaging position is unchanged because an unchanged imaging position indicates that the observation target there needs to be examined in detail for a lesion or the like, and the determination process is desired at the same time. Therefore, by starting and ending the determination process based on changes in the imaging position as described above, the determination process can be performed at the appropriate timing desired by a doctor or the like, and the score 99 as the determination result can be provided. Further, since unnecessary determination processes are not performed at other times, the processing load on the processor device 16 can be reduced.
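The frame-comparison logic of the position detection unit 586 might look like the following sketch, where the mean-absolute-difference measure and the threshold are illustrative assumptions rather than the patent's stated method.

```python
import numpy as np

class PositionChangeDetector:
    """Holds the previous frame and treats a small inter-frame difference
    as 'no change in the imaging position' (run the determination process)
    and a large one as motion (skip it), mirroring S511-S514."""

    def __init__(self, motion_threshold: float = 10.0):
        self.motion_threshold = motion_threshold
        self.prev_frame = None

    def run_determination(self, frame: np.ndarray) -> bool:
        if self.prev_frame is None:
            self.prev_frame = frame
            return False  # no past frame yet, so stillness cannot be judged
        diff = float(np.mean(np.abs(frame.astype(float) -
                                    self.prev_frame.astype(float))))
        self.prev_frame = frame
        return diff <= self.motion_threshold  # still scene -> determine
```

Inverting the final comparison would give the screening-oriented variant described next, in which the determination process runs while the imaging position is changing.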
In embodiment 5 described above, the determination process is performed when the imaging position is unchanged, but the determination process may instead be started when the imaging position changes and ended when the change stops. This is useful when the specific content of the determination process mainly serves to search for the presence or absence of a lesion or the like (so-called screening), since screening involves large changes in the imaging position and can thus be assisted by the determination process. Of course, when both a determination process suited to detailed observation and one suited to screening are available, the former can be performed when the imaging position is unchanged and the latter when it is changing.
"No change in the imaging position" means that the change in the imaging position is small compared with that during screening or the like, that is, the imaging position is substantially stationary. Conversely, "a change in the imaging position" means that the imaging position changes more greatly than during detailed observation, that is, the imaging position is not substantially stationary.
In embodiment 5 and the modification described above, the determination processing control unit 583 detects the imaging position using the endoscopic image, but the imaging position may instead be detected using a position sensor 591 (see fig. 16). The position sensor 591 is provided, for example, at the distal end portion 12d of the endoscope 12, and detects the relative position of the distal end portion 12d with respect to an examination table (not shown) or the like. The determination processing control unit 583 then detects a change (or no change) in the imaging position from a change (or no change) in the position of the distal end portion 12d detected by the position sensor 591. The position sensor 591 may instead be a sensor that detects the insertion length of the insertion portion 12a into the subject.
[6th embodiment]
The endoscope system 10 can also start or end the determination process based on a "change in the operation of the endoscope 12 or in the endoscopic image" different from those of embodiments 1 to 5 described above. For example, the determination process can be started or ended when the endoscope 12 captures a still image of the observation target (that is, in response to a change in the operation of the endoscope 12).
When still images are to be captured, the endoscope 12 is provided with a still image imaging instruction unit 601 for inputting an instruction to capture a still image, as shown in fig. 17. The still image imaging instruction unit 601 is, for example, a so-called freeze button or release button provided on the operation unit 12b; an optical switch (not shown) or the like can also be used. The still image imaging instruction unit 601 inputs a still image imaging instruction to the control unit 52. In response, the control unit 52 causes the endoscope 12 to image the observation target under still-image imaging conditions, and the image processing unit 61 generates at least one still image of the observation target.
The image processing unit 61 is provided with a determination processing control unit 683 in place of the determination processing control unit 83 and the like of the above embodiments. The determination processing control unit 683 acquires the still image imaging instruction from the still image imaging instruction unit 601 via the control unit 52, and thereby detects that the endoscope 12 has captured a still image. The determination processing control unit 683 starts the determination process when the endoscope 12 captures a still image.
Specifically, as shown in fig. 18, while the observation target is being imaged as a moving image (S611), the determination processing control unit 683 executes the determination process (S613) when the endoscope 12 captures a still image of the observation target (yes in S612). When the determination process was not executed in the previous frame, this effectively starts the determination process. On the other hand, when the endoscope 12 does not capture a still image (no in S612), the determination processing control unit 683 skips the determination process. Accordingly, when a still image was captured and the determination process executed in the previous frame, this effectively ends the determination process.
By repeating the above operations until observation ends (S614), the determination process is automatically performed when the endoscope 12 captures a still image of the observation target, and is not performed otherwise. A still image of the observation target is captured when there is a lesion or the like that needs to be observed in detail. Therefore, by starting and ending the determination process when the endoscope 12 captures a still image as described above, the determination process can be performed at the appropriate timing desired by a doctor or the like, and the score 99 as the determination result can be provided. Further, since unnecessary determination processes are not performed at other times, the processing load on the processor device 16 can be reduced.
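The S611-S614 behavior, in which the determination process effectively starts on the first frame where a still image is captured and ends on the next frame without one, can be sketched as follows; the function and variable names are assumptions for illustration.

```python
def determination_events(still_capture_flags):
    """For a sequence of per-frame 'still image captured' flags, return
    (runs, events): runs[i] tells whether the determination process runs
    on frame i, and events marks the frames where it effectively starts
    or ends (illustrative sketch)."""
    runs, events, prev = [], [], False
    for i, captured in enumerate(still_capture_flags):
        runs.append(captured)  # S612/S613: determine iff a still was taken
        if captured and not prev:
            events.append((i, "start"))
        elif prev and not captured:
            events.append((i, "end"))
        prev = captured
    return runs, events
```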
Further, each determination processing control unit in embodiments 1 to 5 above, such as the determination processing control unit 83, controls both the start and the end of the determination process in the determination processing unit 82 based on a change in the operation of the endoscope 12 or in the endoscopic image; however, each of these determination processing control units may instead control only one of the start and the end of the determination process on that basis.
In addition, each determination processing control unit of embodiments 1 to 5 above can end the determination process when a predetermined fixed time has elapsed after the determination process was started in response to a change in the operation of the endoscope 12 or in the endoscopic image. When the determination process is ended after a fixed time in this way, no change in the operation of the endoscope 12 or in the endoscopic image needs to be detected in order to end it, so the processing of each determination processing control unit can be lightened.
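The fixed-duration variant can be sketched as a simple timer: any trigger starts the determination process, which then ends on its own once the fixed time elapses, so no end trigger has to be detected. The duration value and all names are arbitrary placeholders.

```python
class TimedDetermination:
    """Ends the determination process automatically a fixed time after it
    was started, regardless of further endoscope operation (sketch)."""

    def __init__(self, duration_s: float = 5.0):
        self.duration_s = duration_s
        self.started_at = None

    def start(self, now_s: float):
        """Record the start time; called by whichever trigger applies."""
        self.started_at = now_s

    def active(self, now_s: float) -> bool:
        """Return whether the determination process runs at time now_s."""
        if self.started_at is None:
            return False
        if now_s - self.started_at >= self.duration_s:
            self.started_at = None  # fixed time elapsed: end automatically
            return False
        return True
```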
[7th embodiment]
In embodiments 1 to 6 described above, the start or end of the determination process is controlled based on a change in the operation of the endoscope 12 or in the endoscopic image; alternatively, the start or end of notification of the determination result may be controlled on the same basis. In this case, as shown in fig. 19, the image processing unit 61 is provided with a notification control unit 690 in addition to the determination processing control unit 83 and the like of embodiment 1.
The notification control unit 690 acquires the determination result from the determination processing unit 82 and inputs it to the display control unit 66 or to another output device (for example, a speaker that outputs sound or voice, or a light-emitting indicator). The notification control unit 690 thereby notifies the user of the determination result using, for example, voice, an image, a message, or a combination thereof.
The notification control unit 690 does not always output the acquired determination result to the display control unit 66 or the like; instead, like the determination processing control units of the above embodiments, it decides whether to output the result based on a change in the operation of the endoscope 12 or in the endoscopic image. The notification control unit 690 thus starts or ends notification of the determination result on that basis. For example, when the score 99 is acquired as the determination result and it is determined from such a change that notifying the result is appropriate, the notification control unit 690 inputs the acquired score 99 to the display control unit 66, so that the determination result is notified by displaying the score 99 on the display 18 (see fig. 5).
When a plurality of determination results are acquired, the notification control unit 690 can control the start or end of notification for each determination result based on a change in the operation of the endoscope 12 or in the endoscopic image. The notification control unit 690 can start or end notification of the determination result using the same criterion as that for starting or ending the determination process, or using a different criterion. For example, when the determination processing control unit 83 controls the start or end of the determination process by the method of embodiment 1, the notification control unit 690 can start or end notification of the determination result using the same criterion as the determination processing control unit 283 of embodiment 2.
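The idea that notification may use its own start/end criteria, independent of those of the determination process, can be sketched with two pluggable predicates. Both predicates and the state argument are assumptions for illustration, not the patent's interface.

```python
class NotificationGate:
    """Starts and ends notification of the determination result according
    to criteria that may differ from those of the determination process."""

    def __init__(self, start_when, end_when):
        self.start_when = start_when  # e.g. feature value exceeds threshold
        self.end_when = end_when      # e.g. magnification switched off
        self.notifying = False

    def update(self, state) -> bool:
        """Evaluate the criteria against the current state and return
        whether the determination result is being notified."""
        if not self.notifying and self.start_when(state):
            self.notifying = True
        elif self.notifying and self.end_when(state):
            self.notifying = False
        return self.notifying
```

For example, notification could start when a feature value exceeds 0.5 and end only when it drops below 0.2, giving hysteresis so the displayed score 99 does not flicker.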
The notification control unit 690 can also notify information related to the determination result together with the determination result itself. For example, as shown in fig. 20, together with a score 692 and a degree of invasion 695 of a lesion, which are examples of the determination result, the notification control unit 690 can notify an outline 691 of the specific portion (lesion), an image 694 of a similar case, and a message 693 such as "there is a possibility that cancer exists in this region".
The outline 691 of the specific portion (lesion) can be obtained, for example, from the positional information, known to the determination processing unit 82, of the specific portion subjected to the determination process in the endoscopic image. The message 693 can be obtained by matching the determination result against a message database (not shown) prepared in advance. In fig. 20, the message 693 is displayed on the display 18, but it may instead be notified by voice. The image 694 of a similar case can be obtained, for example, by matching the determination result against a case database (not shown). In fig. 20, one similar-case image 694 is displayed, but when there are a plurality of similar-case images 694, the plurality of images can be displayed on the display 18.
In fig. 20, a plurality of scores 692 are shown in the form of a graph; the same display can be used in each of the above embodiments such as embodiment 1. Likewise, although fig. 20 shows the degree of invasion 695 of a lesion as an example of the determination result, the determination result may be displayed in the form of a graph, a table, or the like in each of the above embodiments; the same applies to the score 692 and to determination results other than the degree of invasion 695. The outline 691 of the lesion, the message 693, and the similar-case image 694 are examples of information related to the determination result, and the notification control unit 690 can notify other related information as well, for example the position of the specific portion in the subject or an endoscopic image captured in a past diagnosis (a past image).
In embodiment 7, the image processing unit 61 includes the notification control unit 690 in addition to the determination processing control units of the above embodiments, such as the determination processing control unit 83; however, the image processing unit 61 may include only the notification control unit 690 in place of the determination processing control unit 83. In this case, the endoscope system 10 includes: the image acquisition unit 54 that acquires an endoscopic image obtained by imaging the observation target with the endoscope; the determination processing unit 82 that performs the determination process using the endoscopic image to determine a portion of the observation target having a specific feature; and the notification control unit 690 that controls the start or end of notification of the determination result, which is the result of the determination process, based on a change in the operation of the endoscope 12 or in the endoscopic image.
Note that, in either the case where the image processing unit 61 includes the notification control unit 690 together with the determination process control unit 83 or the like or the case where the image processing unit 61 includes only the notification control unit 690 instead of the determination process control unit 83 or the like, the notification control unit 690 may be configured similarly to the determination process control unit 83 of embodiment 1.
For example, the notification control unit 690 preferably includes a feature value calculation unit 86 that calculates a feature value using the endoscopic image, and a comparison unit 87 that compares the feature value with a threshold value, and starts or ends notification of the determination result based on the comparison result of the comparison unit 87.
The notification control unit 690 can start or end the notification of the determination result when the endoscope 12 enlarges or reduces the observation target.
The notification control unit 690 can start the notification of the determination result when the illumination light used when the endoscope 12 images the observation target is switched to the specific illumination light, and can end the notification of the determination result when the illumination light used when the endoscope images the observation target is switched to the illumination light other than the specific illumination light.
The notification control unit 690 can start notification of the determination result when the illumination light is switched to illumination light having a wavelength of 450nm or less.
The notification control unit 690 can start or end the notification of the determination result when the object passes through the channel inserted into the endoscope 12.
The notification control unit 690 can start notification of the determination result when water passes through the channel.
The notification control unit 690 can start notification of the determination result when the coloring agent passes through the channel.
The notification control unit 690 can start or end the notification of the determination result when the position at which the endoscope captures the observation target changes or when the position at which the endoscope captures the observation target does not change.
The notification control unit 690 can detect the position where the endoscope captures the observation target from the change in the observation target captured in the endoscope image.
The notification control unit 690 can detect the position of the observation target imaged by the endoscope using the position sensor.
The notification control unit 690 can end the notification of the determination result when the position at which the endoscope 12 captures the observation target changes.
The notification control unit 690 can start notification of the determination result based on a change in the operation of the endoscope 12 or in the endoscopic image, and end the notification after a predetermined fixed time has elapsed from its start.
The notification control unit 690 can start or end the notification of the determination result when the endoscope 12 captures a still image.
In the above embodiments, the present invention is applied to the endoscope system 10 that performs observation by inserting the endoscope 12 provided with the image sensor 48 into the subject, but the present invention is also suitable for a capsule endoscope system. As shown in fig. 17, for example, the capsule endoscope system includes at least a capsule endoscope 700 and a processor device (not shown).
The capsule endoscope 700 includes a light source unit 702, a control unit 703, an image sensor 704, an image processing unit 706, and a transmitting/receiving antenna 708. The light source unit 702 corresponds to the light source unit 20, and the control unit 703 functions similarly to the light source control unit 22 and the control unit 52. The control unit 703 can communicate wirelessly with the processor device of the capsule endoscope system via the transmitting/receiving antenna 708. The processor device of the capsule endoscope system is substantially the same as the processor device 16 of the above embodiments, except that the image processing unit 706, which corresponds to the image acquisition unit 54 and the image processing unit 61, is provided in the capsule endoscope 700, and the endoscopic image is transmitted to the processor device via the transmitting/receiving antenna 708. The image sensor 704 is configured similarly to the image sensor 48.
Description of the symbols
10-endoscope system, 12-endoscope, 12a-insertion portion, 12b-operation unit, 12c-bending portion, 12d-distal end portion, 12e-angle knob, 13a-zoom operation unit, 13b-mode selector switch, 14-light source device, 16-processor device, 18-display, 19-console, 20-light source unit, 22-light source control unit, 30a-illumination optical system, 30b-imaging optical system, 41-light guide, 45-illumination lens, 46-objective lens, 47-zoom lens, 48-image sensor, 52-control unit, 54-image acquisition unit, 56-DSP (Digital Signal Processor), 58-noise reduction unit, 59-conversion unit, 61-image processing unit, 66-display control unit, 71-CNN (Convolutional Neural Network), 72-storage unit, 81-image generation unit, 82-determination processing unit, 83, 283, 383, 483, 583, 683-determination processing control unit, 86-feature value calculation unit, 87-comparison unit, 99, 692-score, 101, 102-endoscopic image, 401-forceps channel, 402-air/water supply channel, 403, 404-passage sensor, 586-position detection unit, 591-position sensor, 601-still image imaging instruction unit, 690-notification control unit, 691-outline, 693-message, 694-image of similar case, 695-degree of invasion of lesion, 700-capsule endoscope, 702-light source unit, 703-control unit, 704-image sensor, 706-image processing unit, 708-transmitting/receiving antenna.

Claims (29)

1. An endoscope system comprising:
an image acquisition unit that acquires an endoscopic image obtained by imaging an observation target with an endoscope;
a determination processing unit configured to determine a portion having a specific feature in the observation target by performing determination processing using the endoscopic image; and
a determination process control unit for controlling the start or end of the determination process based on a change in the operation of the endoscope or the endoscope image,
the determination process control unit starts the determination process based on a change in the operation of the endoscope or the endoscopic image, and ends the determination process after a predetermined fixed time period has elapsed after the determination process is started.
2. The endoscope system of claim 1, wherein
the determination processing control unit includes:
a feature value calculation unit that calculates a feature value using the endoscopic image; and
a comparison unit that compares the feature value with a threshold value,
and the determination processing control unit starts or ends the determination processing based on the comparison result of the comparison unit.
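The control mechanism of claim 2 (compute a feature value from the endoscopic image, compare it with a threshold, and start or end the determination processing accordingly) can be sketched as follows. This is an illustrative sketch only: the mean-brightness feature, the class and function names, and the threshold value are hypothetical and are not specified by the claims.

```python
# Hypothetical sketch of the feature-value/threshold gating in claim 2.
# The claims do not name a concrete feature; mean brightness is an example.

def mean_brightness(image):
    """Feature value: mean pixel intensity of a grayscale endoscopic frame."""
    flat = [p for row in image for p in row]
    return sum(flat) / len(flat)

class DeterminationController:
    """Stand-in for the determination processing control unit (83)."""

    def __init__(self, threshold=128.0):
        self.threshold = threshold
        self.running = False  # whether determination processing is active

    def update(self, image):
        """Start or end determination processing based on the comparison result."""
        feature = mean_brightness(image)
        self.running = feature >= self.threshold
        return self.running

bright = [[200, 210], [190, 205]]  # e.g. a well-lit close-up view
dark = [[10, 20], [15, 5]]         # e.g. a distant view during insertion

ctrl = DeterminationController()
print(ctrl.update(bright))  # True  -> determination processing started
print(ctrl.update(dark))    # False -> determination processing ended
```

In an actual system the feature value would more plausibly reflect magnification, blur, or illumination mode, but the start/end gating logic has the same shape.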
3. The endoscope system of claim 1, wherein
the determination processing control unit starts or ends the determination processing when the endoscope magnifies or reduces the observation target.
4. The endoscope system of claim 1, wherein
the determination processing control unit starts the determination processing when the illumination light used by the endoscope to image the observation target is switched to specific illumination light, and ends the determination processing when that illumination light is switched to illumination light other than the specific illumination light.
5. The endoscope system of claim 4, wherein
the determination processing control unit starts the determination processing when the illumination light is switched to illumination light having a wavelength of 450 nm or less.
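The illumination-light trigger of claims 4 and 5 reduces to a simple predicate: determination processing runs while the specific illumination light (here, light with a wavelength of 450 nm or less) is selected, and stops on any other light. The wavelength values and names below are illustrative assumptions, not details from the patent.

```python
# Hypothetical sketch of the illumination-switch trigger in claims 4-5.
SPECIFIC_MAX_WAVELENGTH_NM = 450  # "450 nm or less" per claim 5

def should_run_determination(center_wavelength_nm):
    """True while the specific (short-wavelength) illumination light is in use."""
    return center_wavelength_nm <= SPECIFIC_MAX_WAVELENGTH_NM

print(should_run_determination(410))  # True: e.g. narrow-band blue light
print(should_run_determination(550))  # False: e.g. a green/white-light component
```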
6. The endoscope system of claim 1, wherein
the determination processing control unit starts or ends the determination processing when the position at which the endoscope images the observation target changes, or when that position does not change.
7. The endoscope system of claim 6, wherein
the determination processing control unit detects the position at which the endoscope images the observation target based on a change in the observation target captured in the endoscopic image.
8. The endoscope system of claim 6, wherein
the determination processing control unit detects the position at which the endoscope images the observation target using a position sensor.
9. The endoscope system of claim 6, wherein
the determination processing control unit ends the determination processing when the position at which the endoscope images the observation target changes.
10. The endoscope system of claim 1, wherein
the determination processing control unit starts or ends the determination processing when the endoscope captures a still image.
11. An endoscope system comprising:
an image acquisition unit that acquires an endoscopic image obtained by imaging an observation target with an endoscope;
a determination processing unit that performs determination processing using the endoscopic image to discriminate a portion of the observation target that has a specific feature; and
a determination processing control unit that controls the start or end of the determination processing based on a change in the operation of the endoscope or in the endoscopic image,
wherein the determination processing control unit starts or ends the determination processing when an object passes through a channel that runs through the endoscope.
12. The endoscope system of claim 11, wherein
the determination processing control unit starts the determination processing when water passes through the channel.
13. The endoscope system of claim 11, wherein
the determination processing control unit starts the determination processing when a coloring agent passes through the channel.
14. An endoscope system comprising:
an image acquisition unit that acquires an endoscopic image obtained by imaging an observation target with an endoscope;
a determination processing unit that performs determination processing using the endoscopic image to discriminate a portion of the observation target that has a specific feature;
a determination processing control unit that controls the start or end of the determination processing based on a change in the operation of the endoscope or in the endoscopic image; and
a notification control unit that controls the start or end of notification of a determination result, which is a result of the determination processing, based on a change in the operation of the endoscope or in the endoscopic image,
wherein the notification control unit starts the notification of the determination result based on a change in the operation of the endoscope or in the endoscopic image, and ends the notification of the determination result when a predetermined fixed time has elapsed after the start of the notification.
15. The endoscope system of claim 14, wherein
the notification control unit includes:
a feature value calculation unit that calculates a feature value using the endoscopic image; and
a comparison unit that compares the feature value with a threshold value,
and the notification control unit starts or ends the notification of the determination result based on the comparison result of the comparison unit.
16. The endoscope system of claim 14, wherein
the notification control unit starts or ends the notification of the determination result when the endoscope magnifies or reduces the observation target.
17. The endoscope system of claim 14, wherein
the notification control unit starts the notification of the determination result when the illumination light used by the endoscope to image the observation target is switched to specific illumination light, and ends the notification of the determination result when that illumination light is switched to illumination light other than the specific illumination light.
18. The endoscope system of claim 17, wherein
the notification control unit starts the notification of the determination result when the illumination light is switched to illumination light having a wavelength of 450 nm or less.
19. The endoscope system of claim 14, wherein
the notification control unit starts or ends the notification of the determination result when the position at which the endoscope images the observation target changes, or when that position does not change.
20. The endoscope system of claim 19, wherein
the notification control unit detects the position at which the endoscope images the observation target based on a change in the observation target captured in the endoscopic image.
21. The endoscope system of claim 19, wherein
the notification control unit detects the position at which the endoscope images the observation target using a position sensor.
22. The endoscope system of claim 19, wherein
the notification control unit ends the notification of the determination result when the position at which the endoscope images the observation target changes.
23. The endoscope system of claim 14, wherein
the notification control unit starts or ends the notification of the determination result when the endoscope captures a still image.
24. The endoscope system of claim 14, wherein
the notification control unit notifies the determination result using any one of a sound, an image, and a message, or a combination thereof.
25. An endoscope system comprising:
an image acquisition unit that acquires an endoscopic image obtained by imaging an observation target with an endoscope;
a determination processing unit that performs determination processing using the endoscopic image to discriminate a portion of the observation target that has a specific feature;
a determination processing control unit that controls the start or end of the determination processing based on a change in the operation of the endoscope or in the endoscopic image; and
a notification control unit that controls the start or end of notification of a determination result, which is a result of the determination processing, based on a change in the operation of the endoscope or in the endoscopic image,
wherein the notification control unit starts or ends the notification of the determination result when an object passes through a channel that runs through the endoscope.
26. The endoscope system of claim 25, wherein
the notification control unit starts the notification of the determination result when water passes through the channel.
27. The endoscope system of claim 25, wherein
the notification control unit starts the notification of the determination result when a coloring agent passes through the channel.
28. A processor device is provided with:
an image acquisition unit that acquires an endoscopic image obtained by imaging an observation target with an endoscope;
a determination processing unit that performs determination processing using the endoscopic image to discriminate a portion of the observation target that has a specific feature; and
a determination processing control unit that controls the start or end of the determination processing based on a change in the operation of the endoscope or in the endoscopic image,
wherein the determination processing control unit starts the determination processing based on a change in the operation of the endoscope or in the endoscopic image, and ends the determination processing when a predetermined fixed time has elapsed after the start of the determination processing.
29. A processor device is provided with:
an image acquisition unit that acquires an endoscopic image obtained by imaging an observation target with an endoscope;
a determination processing unit that performs determination processing using the endoscopic image to discriminate a portion of the observation target that has a specific feature; and
a determination processing control unit that controls the start or end of the determination processing based on a change in the operation of the endoscope or in the endoscopic image,
wherein the determination processing control unit starts or ends the determination processing when an object passes through a channel that runs through the endoscope.
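The timing behavior common to claims 1 and 28 (once started, the determination processing ends automatically after a predetermined fixed time) can be sketched with a simulated clock. The class name and the ten-second duration are arbitrary illustrative choices, not values from the patent.

```python
# Hypothetical sketch of the fixed-duration auto-end in claims 1 and 28.
class TimedDetermination:
    """Runs determination processing for a predetermined fixed time after start."""

    def __init__(self, duration_s=10.0):
        self.duration_s = duration_s
        self.started_at = None

    def start(self, now_s):
        """Record the start time of the determination processing."""
        self.started_at = now_s

    def is_running(self, now_s):
        """True only within the fixed time window after the start."""
        if self.started_at is None:
            return False
        return (now_s - self.started_at) < self.duration_s

det = TimedDetermination(duration_s=10.0)
det.start(now_s=0.0)
print(det.is_running(5.0))   # True: within the predetermined fixed time
print(det.is_running(12.0))  # False: ended after the fixed time elapsed
```

A real implementation would read a monotonic clock rather than take timestamps as arguments; the explicit clock here just keeps the window logic testable.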
CN201880015393.XA 2017-03-03 2018-02-22 Endoscope system, processor device, and method for operating endoscope system Active CN110381807B (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JP2017-040344 2017-03-03
JP2017040344 2017-03-03
PCT/JP2018/006547 WO2018159461A1 (en) 2017-03-03 2018-02-22 Endoscope system, processor device, and method of operating endoscope system

Publications (2)

Publication Number Publication Date
CN110381807A (en) 2019-10-25
CN110381807B (en) 2022-01-18

Family

ID=63370447

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201880015393.XA Active CN110381807B (en) 2017-03-03 2018-02-22 Endoscope system, processor device, and method for operating endoscope system

Country Status (5)

Country Link
US (2) US11759092B2 (en)
EP (1) EP3590415A4 (en)
JP (1) JP6785942B2 (en)
CN (1) CN110381807B (en)
WO (1) WO2018159461A1 (en)

Families Citing this family (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2018098465A1 (en) 2016-11-28 2018-05-31 Inventio, Inc. Endoscope with separable, disposable shaft
US10810460B2 (en) * 2018-06-13 2020-10-20 Cosmo Artificial Intelligence—AI Limited Systems and methods for training generative adversarial networks and use of trained generative adversarial networks
US11100633B2 (en) 2018-06-13 2021-08-24 Cosmo Artificial Intelligence—AI Limited Systems and methods for processing real-time video from a medical image device and detecting objects in the video
CN112584746A (en) * 2018-08-23 2021-03-30 富士胶片株式会社 Medical image processing apparatus, endoscope system, and method for operating medical image processing apparatus
JP7326308B2 (en) * 2018-09-11 2023-08-15 富士フイルム株式会社 MEDICAL IMAGE PROCESSING APPARATUS, OPERATION METHOD OF MEDICAL IMAGE PROCESSING APPARATUS, ENDOSCOPE SYSTEM, PROCESSOR DEVICE, DIAGNOSTIC SUPPORT DEVICE, AND PROGRAM
JP6872581B2 (en) * 2018-12-04 2021-05-19 Hoya株式会社 Information processing equipment, endoscope processors, information processing methods and programs
WO2020116115A1 (en) * 2018-12-04 2020-06-11 Hoya株式会社 Information processing device and model generation method
US20220148182A1 (en) * 2019-03-12 2022-05-12 Nec Corporation Inspection device, inspection method and storage medium
JP7214886B2 (en) * 2019-10-23 2023-01-30 富士フイルム株式会社 Image processing device and its operating method
USD1018844S1 (en) 2020-01-09 2024-03-19 Adaptivendo Llc Endoscope handle
WO2021140644A1 (en) * 2020-01-10 2021-07-15 日本電気株式会社 Endoscopy assistance device, endoscopy assistance method, and computer-readable recording medium
WO2021149552A1 (en) * 2020-01-20 2021-07-29 富士フイルム株式会社 Medical image processing device, method for operating medical image processing device, and endoscope system
CN115279247A (en) * 2020-03-11 2022-11-01 奥林巴斯株式会社 Processing system, image processing method, and learning method
EP4119999A4 (en) 2020-03-11 2023-09-06 FUJIFILM Corporation Endoscope system, control method, and control program
CN115397303A (en) * 2020-04-08 2022-11-25 富士胶片株式会社 Processor device and working method thereof
WO2022097294A1 (en) * 2020-11-09 2022-05-12 オリンパス株式会社 Information processing system, endoscope system, and information processing method
WO2024013848A1 (en) * 2022-07-12 2024-01-18 日本電気株式会社 Image processing device, image processing method, and storage medium

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102201109A (en) * 2010-03-26 2011-09-28 株式会社岛津制作所 Image processing method and radiographic apparatus using the same
CN102753078A (en) * 2010-09-24 2012-10-24 奥林巴斯医疗株式会社 Image-display device and capsule-type endoscope system
WO2013065473A1 (en) * 2011-10-31 2013-05-10 オリンパスメディカルシステムズ株式会社 Medical device
CN103356292A (en) * 2012-03-05 2013-10-23 株式会社东芝 Medical image processing system
EP2910173A1 (en) * 2012-10-18 2015-08-26 Olympus Medical Systems Corp. Image processing device, and image processing method
JP2016007355A (en) * 2014-06-24 2016-01-18 富士フイルム株式会社 Light source device, endoscope system, operation method of light source device, and operation method of endoscope system
JP2016067706A (en) * 2014-09-30 2016-05-09 富士フイルム株式会社 Processor device, endoscope system, method for operating processor device, and program
JP2016067775A (en) * 2014-09-30 2016-05-09 富士フイルム株式会社 Endoscope system, processor device, method for operating endoscope system, and method for operating processor device
JP2016144507A (en) * 2015-02-06 2016-08-12 富士フイルム株式会社 Endoscope system, processor device, and operation method for endoscope system
CN105916430A (en) * 2014-11-25 2016-08-31 索尼公司 Endoscope system, method for operating endoscope system, and program

Family Cites Families (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4622954A (en) * 1984-05-15 1986-11-18 Fuji Photo Optical Co., Ltd. Endoscope having a plate-like image sensor for forming images
JP4063933B2 (en) * 1997-12-01 2008-03-19 オリンパス株式会社 Surgery simulation device
JP4936528B2 (en) * 2007-03-28 2012-05-23 富士フイルム株式会社 Capsule endoscope system and method for operating capsule endoscope system
CA2682940A1 (en) * 2007-04-11 2008-10-23 Forth Photonics Limited A supporting structure and a workstation incorporating the supporting structure for improving, objectifying and documenting in vivo examinations of the uterus
JP2011131002A (en) * 2009-12-25 2011-07-07 Fujifilm Corp Fluorescent image capturing apparatus
JP5528255B2 (en) 2010-08-16 2014-06-25 Hoya株式会社 Endoscopic image processing system
JP5562808B2 (en) * 2010-11-11 2014-07-30 オリンパス株式会社 Endoscope apparatus and program
JP5657375B2 (en) * 2010-12-24 2015-01-21 オリンパス株式会社 Endoscope apparatus and program
JP5329593B2 (en) * 2011-04-01 2013-10-30 富士フイルム株式会社 Biological information acquisition system and method of operating biological information acquisition system
JP5611892B2 (en) * 2011-05-24 2014-10-22 富士フイルム株式会社 Endoscope system and method for operating endoscope system
WO2015040570A1 (en) * 2013-09-18 2015-03-26 Illumigyn Ltd. Optical speculum
WO2017027638A1 (en) * 2015-08-10 2017-02-16 The Board Of Trustees Of The Leland Stanford Junior University 3d reconstruction and registration of endoscopic data
WO2017075085A1 (en) * 2015-10-28 2017-05-04 Endochoice, Inc. Device and method for tracking the position of an endoscope within a patient's body
WO2022003493A1 (en) * 2020-06-30 2022-01-06 Auris Health, Inc. Robotic medical system with collision proximity indicators
US20210401527A1 (en) * 2020-06-30 2021-12-30 Auris Health, Inc. Robotic medical systems including user interfaces with graphical representations of user input devices
US11759270B2 (en) * 2020-07-23 2023-09-19 Cilag Gmbh International Robotic surgical tool with replaceable carriage
US11751954B2 (en) * 2020-07-23 2023-09-12 Cilag Gmbh International Robotic surgical tool with translatable drive puck
US11974823B2 (en) * 2020-07-23 2024-05-07 Cilag Gmbh International Robotic surgical tool with drop in instrumentation core
US11918195B2 (en) * 2020-09-29 2024-03-05 Cilag Gmbh International Decoupling mechanisms for surgical tools
US11678870B2 (en) * 2020-09-29 2023-06-20 Cilag Gmbh International Manual drive function for surgical tools
US11701116B2 (en) * 2020-09-29 2023-07-18 Cilag Gmbh International Manual drive functions for surgical tool having carriage architecture

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102201109A (en) * 2010-03-26 2011-09-28 株式会社岛津制作所 Image processing method and radiographic apparatus using the same
CN102753078A (en) * 2010-09-24 2012-10-24 奥林巴斯医疗株式会社 Image-display device and capsule-type endoscope system
WO2013065473A1 (en) * 2011-10-31 2013-05-10 オリンパスメディカルシステムズ株式会社 Medical device
CN103356292A (en) * 2012-03-05 2013-10-23 株式会社东芝 Medical image processing system
EP2910173A1 (en) * 2012-10-18 2015-08-26 Olympus Medical Systems Corp. Image processing device, and image processing method
JP2016007355A (en) * 2014-06-24 2016-01-18 富士フイルム株式会社 Light source device, endoscope system, operation method of light source device, and operation method of endoscope system
JP2016067706A (en) * 2014-09-30 2016-05-09 富士フイルム株式会社 Processor device, endoscope system, method for operating processor device, and program
JP2016067775A (en) * 2014-09-30 2016-05-09 富士フイルム株式会社 Endoscope system, processor device, method for operating endoscope system, and method for operating processor device
CN105916430A (en) * 2014-11-25 2016-08-31 索尼公司 Endoscope system, method for operating endoscope system, and program
JP2016144507A (en) * 2015-02-06 2016-08-12 富士フイルム株式会社 Endoscope system, processor device, and operation method for endoscope system

Also Published As

Publication number Publication date
CN110381807A (en) 2019-10-25
WO2018159461A1 (en) 2018-09-07
JPWO2018159461A1 (en) 2019-12-26
EP3590415A4 (en) 2020-03-18
US20190380617A1 (en) 2019-12-19
JP6785942B2 (en) 2020-11-18
EP3590415A1 (en) 2020-01-08
US20230404365A1 (en) 2023-12-21
US11759092B2 (en) 2023-09-19

Similar Documents

Publication Publication Date Title
CN110381807B (en) Endoscope system, processor device, and method for operating endoscope system
US10799098B2 (en) Medical image processing device, endoscope system, diagnosis support device, and medical service support device
US11259692B2 (en) Endoscope system, processor device, and method for operating endoscope system
US10335014B2 (en) Endoscope system, processor device, and method for operating endoscope system
US10709310B2 (en) Endoscope system, processor device, and method for operating endoscope system
JP5269921B2 (en) Electronic endoscope system and method for operating electronic endoscope system
CN110461209B (en) Endoscope system and processor device
JP6533180B2 (en) Endoscope system, processor device, and operation method of endoscope system
JP6941233B2 (en) Image processing equipment, endoscopic system, and image processing method
WO2016121556A1 (en) Processor device for endoscope, method for operating same, and control program
US11044416B2 (en) Endoscope system, processor device, and endoscope system operation method
JP7374280B2 (en) Endoscope device, endoscope processor, and method of operating the endoscope device
JP2019030406A (en) Endoscope system
JP2016158837A (en) Endoscope light source device, endoscope system, and operation method of endoscope light source device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant