CN111035348A - Endoscope system - Google Patents

Endoscope system

Info

Publication number
CN111035348A (granted as CN111035348B)
Application number
CN201910962096.2A
Authority
CN (China)
Prior art keywords
image, view, display, direct, region
Legal status
Granted; Active
Other languages
Chinese (zh)
Inventor
北野亮
Current and original assignee
Fujifilm Corp

Classifications

    • A61B1/00181: Optical arrangements characterised by the viewing angles, for multiple fixed viewing angles
    • A61B1/00177: Optical arrangements characterised by the viewing angles, for 90 degrees side-viewing
    • A61B1/00006: Operational features of endoscopes characterised by electronic signal processing of control signals
    • A61B1/00009: Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope
    • A61B1/000094: Electronic signal processing of image signals, extracting biological structures
    • A61B1/00045: Output arrangements; display arrangement
    • A61B1/0005: Display arrangement combining images, e.g. side-by-side, superimposed or tiled
    • A61B1/00055: Output arrangements for alerting the user
    • A61B1/00091: Insertion part distal tip features; nozzles
    • A61B1/00096: Insertion part distal tip features; optical elements
    • A61B1/005: Flexible endoscopes
    • A61B1/04: Endoscopes combined with photographic or television appliances
    • A61B1/05: Image sensor, e.g. camera, in the distal end portion
    • A61B1/0684: Endoscope light sources using light emitting diodes [LED]
    • A61B1/07: Illuminating arrangements using light-conductive means, e.g. optical fibres
    • A61B1/126: Cooling or rinsing arrangements provided with means for cleaning in-use
    • G02B23/2423: Instruments for viewing the inside of hollow bodies; optical details of the distal end
    • G02B23/2446: Instruments for viewing the inside of hollow bodies; optical details of the image relay
    • H04N23/45: Generating image signals from two or more image sensors of different type or operating in different modes
    • H04N23/555: Constructional details for picking-up images in sites inaccessible due to their dimensions or hazardous conditions, e.g. endoscopes or borescopes
    • H04N23/60: Control of cameras or camera modules
    • H04N23/71: Circuitry for evaluating the brightness variation
    • H04N23/74: Compensating brightness variation in the scene by influencing the scene brightness using illuminating means
    • H04N23/76: Compensating brightness variation in the scene by influencing the image signals
    • H04N5/2628: Alteration of picture size, shape, position or orientation, e.g. zooming, rotation
    • H04N5/268: Signal distribution or switching

Abstract

The invention provides an endoscope system that can maintain the user's concentration even when a region of interest is detected while either a direct-view observation image or a side-view observation image is displayed on a display unit. A direct-view observation image (100) is displayed on a display (18), while a side-view observation image (102) is kept in a non-display state. A region of interest is detected using the side-view observation image (102). The observation display region (100a) in which the direct-view observation image (100) is displayed on the display (18) is maintained regardless of whether a region of interest is detected.

Description

Endoscope system
Technical Field
The present invention relates to an endoscope system capable of direct-view and side-view observation.
Background
In the medical field, diagnosis is generally performed using an endoscope system including a light source device, an endoscope, and a processor device. In an endoscope system, the light source device generates illumination light. The endoscope has a flexible insertion portion; when the insertion portion is inserted into a subject, an image of the observation target is captured using, for example, an image sensor mounted in the distal end portion of the insertion portion. The processor device then generates an image of the observation target and displays it on a display.
Endoscopes used in conventional endoscope systems are known in a direct-view type, which images an observation target located in the direct-view direction, i.e., the distal end direction of the distal end portion (the front direction along the insertion direction of the insertion portion), and a side-view type, which images an observation target located in the side-view direction, i.e., the lateral direction of the distal end portion (the outer peripheral direction of the insertion portion). In recent years, endoscopes configured to observe both the direct-view direction and the side-view direction have been proposed in order to obtain a wider viewing angle (see patent documents 1 and 2).
However, when both the direct-view direction and the side-view direction are observed, the direct-view observation image (the image in the direct-view direction) and the side-view observation image (the image in the side-view direction) are displayed on one screen, which makes it difficult for the user to read the information in each image. To address this, in patent document 3 only the direct-view observation image is normally displayed on the screen, and the side-view observation image is displayed only when a lesion feature amount is automatically detected. By directing the user's attention to the direct-view observation image and displaying the side-view observation image only when necessary, such as when a lesion is detected, the user's concentration is maintained.
Patent document 1: japanese patent No. 5583873
Patent document 2: japanese patent No. 5698879
Patent document 3: japanese patent No. 6001219
However, in patent document 3, when the side-view observation image is displayed because a region of interest such as a lesion has been detected, the direct-view observation image is reduced in size to make room for the side-view observation image on the screen. If the size of the direct-view observation image changes before and after the detection of a lesion in this way, it may be difficult to maintain the user's concentration.
Disclosure of Invention
An object of the present invention is to provide an endoscope system capable of maintaining the user's concentration even when the detection of a region of interest is notified while either a direct-view observation image or a side-view observation image is displayed on a display unit.
An endoscope system of the present invention includes: an endoscope having an insertion portion to be inserted into an observation target, a direct-view observation portion having a field of view in a distal end direction of the insertion portion, and a side-view observation portion having a field of view in a side surface direction of the insertion portion; an image acquisition unit that acquires a direct-view observation image using the direct-view observation unit and acquires a side-view observation image using the side-view observation unit; a display control unit that displays one of the direct-view image and the side-view image on the display unit as a display image and displays the other as a non-display image; and a region-of-interest detection unit that detects the region of interest using the display image or the non-display image, wherein the display control unit maintains the 1 st display region in which the display image is displayed on the display unit, regardless of whether or not the region of interest is detected by the region-of-interest detection unit.
When the region of interest is detected, the display control unit preferably displays, on the display unit, a detection marker indicating that the region of interest has been detected. The detection marker preferably indicates the position of the region of interest. The detection marker is preferably displayed in the 2 nd display region provided at a position different from the 1 st display region. The detection marker is preferably displayed in a 3 rd display region, the 3 rd display region being provided around the 1 st display region and formed in a shape corresponding to the 1 st display region. The 3 rd display region preferably has a ring shape. The detection marker preferably changes in color or shape according to the size or type of the region of interest.
When the region of interest is detected, the display control unit preferably displays, on the display unit, a region-of-interest image including the portion of the non-display image in which the region of interest is detected. When the region of interest is detected, the display control unit preferably cancels the non-display of the portion where the region of interest is detected and displays that portion on the display unit. The display control unit preferably electronically enlarges the display image and displays it on the display unit. The image acquisition unit preferably acquires the direct-view observation image and the side-view observation image from an image obtained by one image sensor. Alternatively, the direct-view observation image is preferably obtained by an image sensor for direct-view image acquisition, and the side-view observation image by an image sensor for side-view image acquisition that is different from the image sensor for direct-view image acquisition. The endoscope system preferably includes a mode selector switch for switching between a normal display mode, in which both the direct-view observation image and the side-view observation image are displayed on the display unit, and a specific display mode, in which the display image is displayed on the display unit and the non-display image is not displayed.
The endoscope system preferably includes a luminance information acquisition unit that acquires luminance information for the non-display image from the non-display image, and the non-display image is preferably subjected to gain processing for setting a target luminance based on that luminance information. The endoscope system also preferably includes an image processing unit that performs different image processing on the non-display image for each frame. Preferably, the endoscope system includes a direct-view observation window provided in the direct-view observation unit, a side-view observation window provided in the side-view observation unit, a side-view liquid supply line for supplying a cleaning liquid to a side-view nozzle used for cleaning the side-view observation window, and a direct-view liquid supply line for supplying a cleaning liquid to a direct-view nozzle used for cleaning the direct-view observation window, the side-view liquid supply line and side-view nozzle being provided separately from the direct-view liquid supply line and direct-view nozzle, and the cleaning liquid is automatically blown onto the side-view observation window through the side-view liquid supply line and the side-view nozzle when adhesion of a foreign matter to the side-view observation window is detected.
Effects of the invention
According to the present invention, when either the direct-view observation image or the side-view observation image is displayed on the display unit, the user's concentration can be maintained even when the detection of a region of interest is notified.
Drawings
Fig. 1 is an external view of an endoscope system.
Fig. 2 is an external perspective view of the tip portion.
Fig. 3 is a front view of the front end portion.
Fig. 4 is a partial cross-sectional view of the 1 st projection.
Fig. 5 is a cross-sectional view of the 2 nd projection.
Fig. 6 is a block diagram of an endoscope system.
Fig. 7 is an explanatory diagram showing an image sensor.
Fig. 8 is an image diagram of a display for displaying a direct-view image and a side-view image.
Fig. 9 is an image diagram of a display that displays only a direct-view image.
Fig. 10 is an image diagram of a display for displaying a direct-view image and a detection marker.
Fig. 11 is an image diagram of a display for displaying a direct-view image and a detection marker displayed in a marker display region.
Fig. 12 is an image diagram of a display for displaying a direct-view image and a region-of-interest image.
Fig. 13(A) is an image diagram of a display showing the display form when a region of interest is not detected, and Fig. 13(B) is an image diagram of a display showing the display form when a region of interest is detected.
Fig. 14 is an image diagram of a display that displays a direct-view image, an arrow of an image sensor that detects a region of interest, and an image of the region of interest.
Fig. 15 is an explanatory diagram illustrating AE control in the normal display mode.
Fig. 16 is an explanatory diagram illustrating AE control in the specific display mode.
Fig. 17 is a block diagram showing an endoscope system including a direct-view pump, a side-view pump, and the like.
Fig. 18 is an explanatory diagram showing an operation related to cleaning of the direct-view observation window or the side-view observation window performed when the normal display mode is set.
Fig. 19 is an explanatory diagram showing an operation related to cleaning of the direct-view observation window or the side-view observation window performed when the specific display mode is set.
Description of the symbols
10-endoscope system, 11-universal cord, 12-endoscope, 12a-insertion portion, 12b-operation portion, 12c-bending portion, 12d-distal end portion, 12e-angle knob, 13a-cleaning switch, 13b-mode selector switch, 14-light source device, 16-processor device, 17-tank, 18-display, 19-user interface, 21-distal end surface, 31-1 st protrusion, 32-2 nd protrusion, 41-direct-view observation unit, 41A-direct-view observation window, 42-side-view observation unit, 42A-side-view observation window, 43-direct-view side-view illumination unit, 43A-direct-view side-view illumination window, 51, 52, 53-nozzles, 54-direct-view illumination unit, 54A-direct-view illumination window, 61-imaging lens, 62-front group lens, 63-reflection lens, 64-rear group lens, 66-image sensor, 66a-direct-view imaging region, 66b-side-view imaging region, 67-cover glass, 71-light guide, 72-reflection member, 73-filling member, 77-light guide, 78-illumination lens, 81-direct-view illumination unit, 81A-direct-view illumination window, 82-jaw opening, 84-light guide, 91-light source, 92-light source control unit, 93-light guide, 96-control unit, 96a-luminance information calculation unit, 97-image acquisition unit, 98-display control unit, 99-image processing unit, 99a-region-of-interest detection unit, 99b-foreign matter detection unit, 100-direct-view observation image, 100a-observation display region, 100b-observation display region, 100c-observation display region, 102-side-view observation image, 102a-part of side-view observation image, 103-blind spot, 105-image, 108-detection marker, 110-peripheral region portion, 110a-sub-screen, 114-marker display region, 116-detection marker, 120-arrow, 122-image, 130-direct-view liquid supply line, 132-side-view liquid supply line, 134-side-view pump, 136-direct-view pump, ROI-region of interest.
Detailed Description
As shown in fig. 1, the endoscope system 10 includes an endoscope 12 that images an observation target, a light source device 14 that generates illumination light, a processor device 16 that generates an image for observation (hereinafter, referred to as an observation image) using an image obtained by imaging the observation target (hereinafter, referred to as a captured image), a display 18 that serves as a display unit for displaying the observation image, and a user interface 19 for receiving input operations. The endoscope 12 is optically connected to the light source device 14 and electrically connected to the processor device 16 via the universal cord 11. The endoscope 12 is also connected, via the universal cord 11, to a tank 17 that stores a cleaning liquid (e.g., water) or the like. A mechanism such as a pump for feeding the cleaning liquid or the like from the tank 17 is provided, for example, in the light source device 14. In fig. 1, a keyboard is provided as the user interface 19, but a mouse, a touch panel, or the like may be provided.
The endoscope 12 includes an insertion portion 12a to be inserted into the subject, an operation portion 12b located at the proximal end of the insertion portion 12a, and a bending portion 12c and a distal end portion 12d located on the distal end side of the insertion portion 12a. When the angle knob 12e on the operation portion 12b is operated, the bending portion 12c bends. As a result of this bending, the distal end portion 12d is oriented in a desired direction.
In addition to the angle knob 12e, the operation portion 12b includes, for example, a cleaning switch 13a for ejecting a cleaning liquid from a nozzle located at the distal end portion 12d. When dirt adheres to the distal end portion 12d due to contact with the observation target or the like, pressing the cleaning switch 13a discharges the cleaning liquid from the nozzle toward at least a part of the distal end portion 12d, so that the portion to which the cleaning liquid is discharged can be cleaned. In the endoscope system 10, the cleaning liquid is a liquid such as water or a drug solution. In the present specification, the term "cleaning liquid" is used for convenience of description, but as long as it is used for cleaning, a gas such as air ejected from the nozzle, or a mixture of substances in different phases such as a solid and a liquid, is also included in the "cleaning liquid".
The operation portion 12b is also provided with a mode selector switch 13b. In the present embodiment, as will be described later, the endoscope system 10 has two modes: a normal display mode (see fig. 8), in which a direct-view observation image 100 obtained using the direct-view observation unit 41 is displayed in the central portion of the display 18 and a side-view observation image 102 obtained using the side-view observation unit 42 is displayed around the outer periphery of the direct-view observation image 100; and a specific display mode, in which the direct-view observation image 100 is displayed on the display 18 in an enlarged manner and the side-view observation image is used only for detecting the region of interest and is not displayed on the display 18. Switching between the normal display mode and the specific display mode is performed with the mode selector switch 13b.
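As a rough illustration of the two modes described above, a minimal sketch of the mode toggle follows; the class and method names are illustrative only and are not taken from the patent.

```python
from enum import Enum, auto

class DisplayMode(Enum):
    NORMAL = auto()    # direct-view image in the centre, side-view image around it (fig. 8)
    SPECIFIC = auto()  # direct-view image enlarged; side-view image hidden, used only for detection (fig. 9)

class ModeController:
    """Toggles the display mode each time the mode selector switch 13b is pressed."""

    def __init__(self) -> None:
        self.mode = DisplayMode.NORMAL

    def on_mode_selector_pressed(self) -> DisplayMode:
        """Called when the mode selector switch 13b is operated."""
        self.mode = (DisplayMode.SPECIFIC if self.mode is DisplayMode.NORMAL
                     else DisplayMode.NORMAL)
        return self.mode

controller = ModeController()
print(controller.on_mode_selector_pressed())  # DisplayMode.SPECIFIC
print(controller.on_mode_selector_pressed())  # DisplayMode.NORMAL
```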
As shown in fig. 2 and 3, the distal end portion 12d of the insertion portion 12a to be inserted into the observation target has two projecting portions, i.e., a 1 st projecting portion 31 and a 2 nd projecting portion 32, which project further from the distal end surface 21 of the distal end portion 12d toward the Z direction, which is the distal end direction of the insertion portion 12 a. The 1 st projection 31 and the 2 nd projection 32 are adjacent to each other. Hereinafter, the direction in which the 1 st protrusion 31 exists with respect to the 2 nd protrusion 32 is referred to as the Y direction, and the direction perpendicular to the Z direction and the Y direction is referred to as the X direction. The Z direction is also referred to as a direct-view direction since it is a visual field direction of the direct-view observation unit 41. The Y direction is also referred to as a side view direction since it is a view direction of the side view observation unit 42. The positive side in the X direction of the insertion portion 12a, the distal end portion 12d, the 1 st projection 31, or the 2 nd projection 32 is referred to as "left", and the negative side in the X direction is referred to as "right". The positive Z-direction side of the insertion portion 12a, the distal end portion 12d, the 1 st projection 31, or the 2 nd projection 32 is the "front" or "distal end (distal direction)", and the negative Z-direction side is the "base end (base direction)".
The 1 st projecting portion 31 has a substantially cylindrical shape as a whole, and includes a direct-view observation window 41A as an observation window of the direct-view observation portion 41 at a distal end thereof, and a side-view observation window 42A as an observation window of the side-view observation portion 42 at a side surface thereof. The direct-view observation unit 41 has a field of view in the distal end direction of the insertion unit 12a, and images an observation target located in the distal end direction of the insertion unit 12 a. The direct-view observation unit 41 includes, for example, an imaging lens, an image sensor, and the like. An optical component such as an imaging lens or a transparent protective member for protecting the optical component such as the imaging lens constituting the direct-view observation unit 41 is exposed to the tip (surface facing the Z direction) of the 1 st projecting unit 31. The portion exposed at the distal end of the 1 st protruding portion 31 is a direct-view observation window 41A into which light incident from an observation target located in the distal end direction with respect to the insertion portion 12a is taken.
The side view observation unit 42 has a field of view in the side surface direction of the insertion portion 12a, and photographs an observation target located in the side surface direction of the insertion portion 12 a. The side-view observation unit 42 includes, for example, an imaging lens, an image sensor, and the like, as in the direct-view observation unit 41. Among them, the optical components such as the imaging lens and the like constituting the side-view observation unit 42, and the transparent protective member for protecting the optical components such as the imaging lens and the like are exposed to the side surface of the 1 st projecting portion 31 (the surface forming the outer periphery of the 1 st projecting portion 31). The exposed portion on the side surface of the 1 st protruding portion 31 is a side view observation window 42A that takes in light incident from an observation target located in the side surface direction with respect to the insertion portion 12A. In the endoscope 12 of the present embodiment, the side-view observation portion 42 is exposed to the entire circumference of the 1 st protruding portion 31 in the circumferential direction of the 1 st protruding portion 31 except for the joint portion between the 1 st protruding portion 31 and the 2 nd protruding portion 32, and forms one side-view observation window 42A in a band shape.
The 1 st projecting portion 31 further includes a direct-view side-view illumination window 43A, which is an illumination window of the direct-view side-view illumination portion 43, in addition to the direct-view observation window 41A and the side-view observation window 42A. The direct-view side illumination section 43 emits illumination light from the direct-view side illumination window 43A toward the fields of view of the direct-view observation section 41 and the side-view observation section 42. The direct-view side illumination unit 43 includes, for example, a light guide for guiding illumination light emitted from the light source device 14, and optical members such as lenses and mirrors for diffusing the illumination light guided to the distal end portion 12d by the light guide toward the fields of view of the direct-view observation unit 41 and the side-view observation unit 42. An optical member such as a mirror constituting the direct-view side illumination unit 43 or a transparent protective member for protecting the optical member such as a mirror is exposed on the side surface of the 1 st projection 31. The portion exposed to the side surface of the 1 st protruding portion 31 is a direct-view side illumination window 43A that emits illumination light in the side surface direction of the insertion portion 12 a. In the endoscope 12 of the present embodiment, a part of the outer periphery of the 1 st protruding part 31 excluding the joint portion of the 1 st protruding part 31 and the 2 nd protruding part 32 becomes a direct-view side illumination window.
In the present embodiment, as shown in fig. 4, the direct-view observation unit 41 and the side-view observation unit 42 include a common imaging lens 61 and an image sensor 66. The imaging lens 61 includes a front lens group 62, a reflection lens 63 formed by joining two lenses, and a rear lens group 64. The front surface of the front group lens 62 is exposed to the front end of the 1 st projection 31. That is, the front surface of the front group lens 62 constitutes the direct-view observation window 41A of the direct-view observation section 41. The side surface of the reflection lens 63 is exposed to the side surface of the 1 st protruding portion 31. Therefore, the side surface of the reflection lens 63 constitutes the side-view observation window 42A of the side-view observation unit 42.
Light incident from an observation object located in the front end direction of the insertion portion 12a via the front group lens 62 is guided to the rear group lens 64 by the reflection lens 63. Then, the image is formed on the image forming surface of the image sensor 66 through the cover glass 67. Thereby, the imaging lens 61 and the image sensor 66 as the direct-view observation unit 41 capture an image of the observation target located in the distal end direction of the insertion unit 12 a.
On the other hand, light incident from an observation object located in the side surface direction of the insertion portion 12a via the side surface of the reflection lens 63 is reflected by the reflection lens 63 in order on the joint surface of the two lenses forming the reflection lens 63 and the front surface of the reflection lens 63 and guided to the rear group lens 64. Then, the image is formed on the image forming surface of the image sensor 66 through the cover glass 67. Thereby, the imaging lens 61 and the image sensor 66 as the side-view observation unit 42 capture an image of the observation target located in the lateral direction of the insertion portion 12 a.
The direct-view side illumination unit 43 includes a light guide 71, a reflecting member 72, and a filling member 73. The light guide 71 is optically connected to the light source device 14, and guides illumination light emitted from the light source device 14. Then, the light is emitted from the end surface of the light guide 71 to the reflecting member 72 through the filling member 73. The reflecting member 72 diffuses the illumination light incident from the light guide 71 toward the side surface of the insertion portion 12a, and emits the illumination light in a range including at least the visual field of the side-view observation portion 42. The filling member 73 is a transparent member that protects the emission end surface of the light guide 71 and the reflection member 72. The filling member 73 smoothly fills a groove portion formed between the light guide 71 and the reflection member 72 along the side surface of the 1 st protruding portion 31. Therefore, the filling member 73 constitutes the direct-view side illumination window 43A.
In the endoscope 12 of the present embodiment, the side-view observation window 42A is provided on the distal end side of the side surface of the 1 st protruding portion 31, and the direct-view side-view illumination window 43A is provided on the proximal end side of the side surface of the 1 st protruding portion 31. Providing the side-view observation window 42A as close to the distal end of the side surface of the 1 st protruding portion 31 as possible is preferable, because vignetting by the distal end surface 21 or the like is prevented and the field of view of the side-view observation unit 42 is easily secured.
As shown in fig. 2 and 3, the 2 nd projecting portion 32 has a nozzle for discharging a cleaning liquid to clean the tip portion 12 d. More specifically, the 2 nd protrusion 32 includes a nozzle 51 and a nozzle 52 for discharging the cleaning liquid toward the side-view observation window 42A. The nozzle 51 is located at the right side surface of the 2 nd protrusion 32, and the nozzle 52 is disposed at the left side surface of the 2 nd protrusion 32. The nozzle 51 and the nozzle 52 are provided in the 2 nd projecting portion 32, and have the same property in that the side-view observation window 42A is cleaned by discharging the cleaning liquid toward the side-view observation window 42A. The 2 nd projecting portion 32 has a nozzle 53 at the tip of the 2 nd projecting portion 32. The nozzle 53 cleans the direct-view observation window 41A by ejecting a cleaning liquid toward the direct-view observation window 41A, which is an exposed portion of the direct-view observation section 41. In the present embodiment, the operation related to the cleaning of the direct-view observation window 41A or the side-view observation window 42A performed in the normal display mode is different from the operation related to the cleaning of the direct-view observation window 41A or the side-view observation window 42A performed in the specific display mode. The details of the operation related to the cleaning in these two modes will be described later.
As shown in fig. 3 and 5, the 2 nd projecting portion 32 has a direct-view illumination window 54A which is an illumination window of the direct-view illumination portion 54 that emits illumination light toward the field of view of the direct-view observation portion 41. The direct-view illumination unit 54 includes, for example, a light guide 77 that guides illumination light emitted from the light source device 14, an illumination lens 78 that diffuses and emits illumination light guided to the distal end portion 12d by the light guide 77 toward the field of view of the direct-view observation unit 41, and the like. An illumination lens constituting the direct-view illumination section 54 or a transparent protection member protecting the illumination lens is exposed to the tip of the 2 nd projecting section 32. The portion exposed to the front end of the 2 nd projecting portion 32 is a direct-view illumination window 54A. In the present embodiment, the front surface of the illumination lens 78 is exposed to the front end of the 2 nd protrusion 32. Thus, the front surface of the illumination lens 78 forms the direct-view illumination window 54A.
As shown in fig. 2 and 3, the distal end surface 21 of the distal end portion 12d has a direct-view illumination window 81A and a jaw opening 82, which are illumination windows of the direct-view illumination unit 81, at positions on the right side with respect to the 1 st projection 31 and the 2 nd projection 32. The direct-view illumination section 81 emits illumination light toward the field of view of the direct-view observation section 41, as in the direct-view illumination section 54 located in the 2 nd projecting section 32. An illumination lens constituting the direct-view illumination section 81 or a transparent protective member for protecting the illumination lens is exposed to the distal end surface 21. The portion exposed to the front end face 21 is a direct-view illumination window 81A.
The jaw 82 is an outlet of a treatment instrument such as forceps. When a treatment instrument such as a forceps is inserted from an inlet (not shown) of a proximal end portion of the endoscope 12, the treatment instrument reaches the jaw opening 82 via the forceps channel, and the distal end thereof can be projected from the jaw opening 82. The forceps channel communicates with the distal end portion 12d, the insertion portion 12a, and the operation portion 12 b.
As shown in fig. 6, the light source device 14 includes a light source 91 that generates illumination light, and a light source control unit 92 that controls the light source 91. The light source 91 is composed of, for example, a plurality of LEDs (Light Emitting Diodes) that can be controlled independently and that emit light in mutually different wavelengths or wavelength ranges. Instead of LEDs, other semiconductor light sources such as LDs (Laser Diodes) may be used as the light source 91. A semiconductor light source may also be combined with a phosphor or the like that emits light of another color using the light emitted from the semiconductor light source as excitation light. A lamp light source such as a xenon lamp may also be used as the light source 91. The light source 91 may further be configured by combining a semiconductor light source, a phosphor, or a lamp light source with a filter for adjusting the wavelength band or spectral spectrum. For example, by combining a white LED with filters, a variety of illumination lights can be emitted.
The light source control unit 92 controls the turning on/off and the amount of light of the LEDs and the like constituting the light source 91 according to the driving timing of the image sensor 66. In particular, when a single observation image is generated using a plurality of captured images (i.e., a multi-frame observation mode), the light source control unit 92 can change the wavelength band or the spectral spectrum of the illumination light for each captured frame in which the plurality of captured images used for generating the observation image are obtained as a result of controlling the LEDs and the like.
Illumination light generated by the light source 91 is incident on the light guide 93. The light guide 93 runs from the light source device 14 through the universal cord and the endoscope 12, and transmits the illumination light to the distal end portion 12d of the endoscope 12. The light guide 93 branches at least into the light guide 77 constituting the direct-view illumination unit 54, the light guide 84 constituting the direct-view illumination unit 81, and the light guide 71 constituting the direct-view side-view illumination unit 43, and transmits the illumination light to these illumination units. Multimode fibers can be used as the light guide 93 and the branched light guides such as the light guide 71. For example, a thin fiber cable with a core diameter of 105 μm, a cladding diameter of 125 μm, and an overall diameter of 0.3 to 0.5 mm including the protective sheath layer can be used.
The processor device 16 includes a control unit 96, an image acquisition unit 97, a display control unit 98, and an image processing unit 99. The control unit 96 is a CPU (Central Processing Unit) or the like that centrally controls the endoscope system 10. The control unit 96 performs, for example, synchronization control for matching the imaging timing of the image sensor 66 with the light emission timing of each LED or the like constituting the light source 91. The light emission timing of each LED constituting the light source 91 is controlled via the light source control unit 92. In the present embodiment, the control unit 96 drives the image sensor 66 at a constant timing. The control unit 96 also performs automatic exposure control (AE (Auto Exposure) control). The AE control differs between the normal display mode and the specific display mode; the details of the AE control in each mode will be described later.
The image acquisition unit 97 acquires a direct-view observation image using the direct-view observation unit 41 and a side-view observation image using the side-view observation unit 42. In the present embodiment, the imaging lens 61 and the image sensor 66 are shared by the direct-view observation unit 41 and the side-view observation unit 42, and the image acquisition unit 97 acquires the image captured by the image sensor 66. The acquired image is subjected to specific processing to obtain a captured image including the direct-view observation image and the side-view observation image. As shown in fig. 7, the image obtained in the circular direct-view imaging region 66a in the central portion of the image sensor 66 is used as the direct-view observation image, and the image obtained in the side-view imaging region 66b outside the direct-view imaging region is used as the side-view observation image. The direct-view observation image and the side-view observation image obtained by the image acquisition unit 97 are sent to the display control unit 98 or the image processing unit 99.
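The split of one sensor frame into the two observation images can be pictured with the following sketch, which assumes the single-sensor layout of fig. 7 (circular central region 66a for the direct-view image, outer annular region 66b for the side-view image); the radius and array sizes are illustrative only.

```python
import numpy as np

def split_captured_image(frame: np.ndarray, r_direct: float) -> tuple[np.ndarray, np.ndarray]:
    """Split one sensor frame into a direct-view image (circular centre, region 66a)
    and a side-view image (outer annulus, region 66b). Pixels outside each region are zeroed."""
    h, w = frame.shape[:2]
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    yy, xx = np.ogrid[:h, :w]
    dist = np.sqrt((yy - cy) ** 2 + (xx - cx) ** 2)
    direct_mask = dist <= r_direct            # circular direct-view imaging region
    side_mask = ~direct_mask                  # annular side-view imaging region
    direct_view = np.where(direct_mask[..., None], frame, 0)
    side_view = np.where(side_mask[..., None], frame, 0)
    return direct_view, side_view

# Illustrative use with a dummy 480x480 RGB frame; the real radius depends on the optics.
frame = np.random.randint(0, 256, (480, 480, 3), dtype=np.uint8)
direct_img, side_img = split_captured_image(frame, r_direct=150.0)
```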
The display controller 98 acquires the direct-view image and the side-view image from the image acquirer 97. Then, the display control unit 98 displays the direct-view observation image or the side-view observation image on the display 18. As shown in fig. 8, when the normal display mode is set, the display control unit 98 displays both the direct-view image 100 and the side-view image 102 on the display 18. The side-view observation image 102 includes a blind spot 103 that cannot be imaged because the observation target is not within the field of view of the side-view observation unit 42.
On the other hand, as shown in fig. 9, when the specific display mode is set, the display control unit 98 displays only the direct-view observation image 100 (display image) on the display 18, and the side-view observation image 102 (non-display image) is not displayed on the display 18. The side-view observation image 102 is instead used by the region-of-interest detection unit 99a (see fig. 6) to detect a region of interest included in the observation target. The display control unit 98 controls the display on the display 18 according to the detection result of the region of interest; the details of this display control are described later. In the specific display mode, the direct-view observation image 100 is electronically enlarged and displayed on the entire screen of the display 18, and the portion 105 of the side-view observation image 102 that would otherwise appear on the screen of the display 18 is hidden by masking or the like. In the specific display mode, the side-view observation image may instead be used as the display image and the direct-view observation image as the non-display image.
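The display-side behaviour of the specific display mode (electronic enlargement of the direct-view observation image, with the side-view observation image simply never rendered) might look like the following sketch; the zoom method and screen dimensions are placeholders, not the patent's actual processing.

```python
import numpy as np

def enlarge_nearest(img: np.ndarray, out_h: int, out_w: int) -> np.ndarray:
    """Simple nearest-neighbour electronic zoom (a real system would use better interpolation)."""
    h, w = img.shape[:2]
    rows = np.arange(out_h) * h // out_h
    cols = np.arange(out_w) * w // out_w
    return img[rows][:, cols]

def render_specific_mode(direct_view: np.ndarray, r_direct: int,
                         screen_h: int, screen_w: int) -> np.ndarray:
    """Specific display mode: crop the circular direct-view region to its bounding box and
    enlarge it to fill the screen. The side-view observation image is never drawn here;
    it is only handed to the region-of-interest detector."""
    h, w = direct_view.shape[:2]
    cy, cx = h // 2, w // 2
    crop = direct_view[cy - r_direct:cy + r_direct, cx - r_direct:cx + r_direct]
    return enlarge_nearest(crop, screen_h, screen_w)

direct_view = np.zeros((480, 480, 3), dtype=np.uint8)  # placeholder direct-view image
screen = render_specific_mode(direct_view, r_direct=150, screen_h=720, screen_w=960)
```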
The image processing unit 99 performs various image processing, corresponding to the normal display mode or the specific display mode, on the direct-view observation image or the side-view observation image. The various image processing includes gain processing, color enhancement processing, structure enhancement processing, and the like. When the normal display mode is set, both the direct-view observation image and the side-view observation image are displayed on the display 18, so it is preferable to perform the same image processing on both images. On the other hand, when the specific display mode is set, only the direct-view observation image is displayed on the display 18 and the side-view observation image is not displayed, so the image processing performed on the direct-view observation image may differ from the image processing performed on the side-view observation image.
For example, since the direct-view observation image is displayed on the display 18, the image processing performed on it is preferably processing that makes the image easy for the user to view. On the other hand, since the side-view observation image is used for detecting the region of interest, the image processing performed on it is preferably processing that makes the region of interest easy to detect. For example, if the region of interest is a thin superficial blood vessel or the like, high-frequency enhancement processing is preferably performed as the structure enhancement processing; if the region of interest is a reddened area or the like, red enhancement processing for enhancing red is preferably performed. In addition, in order to make it easier to detect a plurality of kinds of regions of interest in the side-view observation image, different image processing may be performed for each frame. Since the side-view observation image is not displayed on the display 18, changing the image processing for each frame does not place stress on the user.
For example, when the side-view observation image is acquired using a plurality of frames including the 1 st frame and the 2 nd frame, the side-view observation image acquired from the 1 st frame may be subjected to the high-frequency enhancement processing, and the side-view observation image acquired from the 2 nd frame may be subjected to the red enhancement processing. In this case, it becomes easy to detect superficial blood vessels from the side-view observation image of the 1 st frame as the region of interest, and to detect redness of the skin from the side-view observation image of the 2 nd frame as the region of interest.
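A minimal sketch of such frame-dependent processing of the non-display image is given below, with a simple unsharp-mask style high-frequency enhancement on one frame and a red-gain enhancement on the next; the filter size, gains, and RGB channel order are assumptions made for illustration.

```python
import numpy as np

def box_blur(img: np.ndarray, k: int = 5) -> np.ndarray:
    """Very small separable box blur used by the unsharp-mask style enhancement below."""
    pad = k // 2
    padded = np.pad(img.astype(np.float32), ((pad, pad), (pad, pad), (0, 0)), mode="edge")
    out = np.zeros(img.shape, dtype=np.float32)
    for dy in range(k):
        for dx in range(k):
            out += padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (k * k)

def enhance_high_frequency(img: np.ndarray, amount: float = 1.0) -> np.ndarray:
    """Structure enhancement emphasising fine structures such as thin superficial vessels."""
    blurred = box_blur(img)
    sharpened = img.astype(np.float32) + amount * (img.astype(np.float32) - blurred)
    return np.clip(sharpened, 0, 255).astype(np.uint8)

def enhance_red(img: np.ndarray, gain: float = 1.2) -> np.ndarray:
    """Red enhancement making reddened mucosa easier to detect (assumes RGB channel order)."""
    out = img.astype(np.float32)
    out[..., 0] *= gain
    return np.clip(out, 0, 255).astype(np.uint8)

def process_side_view(side_view: np.ndarray, frame_index: int) -> np.ndarray:
    """Apply different processing to the non-display image on alternating capture frames,
    as suggested for the 1st and 2nd frames in the text above."""
    if frame_index % 2 == 0:
        return enhance_high_frequency(side_view)
    return enhance_red(side_view)

# frame 0 -> high-frequency enhancement, frame 1 -> red enhancement, and so on.
dummy = np.random.randint(0, 256, (64, 64, 3), dtype=np.uint8)
processed = [process_side_view(dummy, i) for i in range(4)]
```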
The image processing unit 99 further includes a region-of-interest detection unit 99a that performs region-of-interest detection processing for detecting a region of interest from the side-view observation image when the specific display mode is set, and a foreign matter detection unit 99b that detects, from the side-view observation image, whether a foreign matter has adhered to the side-view observation window 42A when the specific display mode is set. When the foreign matter detection unit 99b detects a foreign matter, processing for automatically blowing the cleaning liquid onto the side-view observation window 42A is performed; the details of this automatic cleaning based on foreign matter detection are described later. Even if a region of interest is missed in the direct-view direction, the missed region of interest often passes through the side-view field of view, so the region-of-interest detection unit 99a preferably detects the region of interest from the side-view observation image. Limiting the range of region-of-interest detection in this way also reduces the computational load of detection. However, the region-of-interest detection unit 99a may also detect the region of interest from the direct-view observation image.
As the region-of-interest detection processing, for example, an NN (Neural Network), a CNN (Convolutional Neural Network), AdaBoost, a random forest, or the like may be used. In the region-of-interest detection processing, the region of interest may also be detected based on feature values obtained from color information of the side-view observation image, gradients of pixel values, and the like. Such feature values represent, for example, changes in the shape of the subject (e.g. overall undulation, or a local depression or bulge of the mucous membrane), its color (e.g. color changes such as whitening due to inflammation, bleeding, redness, or atrophy), its tissue characteristics (e.g. the thickness, depth, or density of blood vessels, or combinations thereof), or its structural characteristics (e.g. a pit pattern).
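The following toy sketch illustrates only the feature-value branch of such detection (a red/green colour ratio plus a mean pixel-value gradient per block); it is not the patent's detector, and in practice an NN/CNN or boosted classifier as listed above could be substituted. All thresholds and the block size are invented for illustration.

```python
import numpy as np

def detect_regions_of_interest(side_view: np.ndarray,
                               redness_thresh: float = 1.4,
                               gradient_thresh: float = 25.0,
                               block: int = 32):
    """Toy region-of-interest detector over the (non-displayed) side-view image.

    Scores each block by a colour feature (red/green ratio) and a pixel-value gradient
    feature, mimicking the feature-value based detection mentioned in the text."""
    img = side_view.astype(np.float32)
    h, w = img.shape[:2]
    candidates = []
    for y in range(0, h - block + 1, block):
        for x in range(0, w - block + 1, block):
            patch = img[y:y + block, x:x + block]
            redness = patch[..., 0].mean() / (patch[..., 1].mean() + 1e-6)
            gy, gx = np.gradient(patch.mean(axis=2))
            gradient = np.sqrt(gy ** 2 + gx ** 2).mean()
            if redness > redness_thresh or gradient > gradient_thresh:
                candidates.append((x, y, block, block))   # (x, y, width, height)
    return candidates

# Illustrative use on a dummy side-view image.
dummy = np.random.randint(0, 256, (256, 256, 3), dtype=np.uint8)
rois = detect_regions_of_interest(dummy)
```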
The region of interest detected by the region-of-interest detection unit 99a is, for example, a region including a lesion such as cancer, a benign tumor, an inflamed portion (including, besides so-called inflammation, portions with changes such as bleeding or atrophy), a diverticulum of the large intestine, a treatment trace (an EMR (Endoscopic Mucosal Resection) scar, an ESD (Endoscopic Submucosal Dissection) scar, or a clipped portion), a bleeding point, a perforation, a vascular abnormality, a cauterization mark, a marking portion marked by coloring with a coloring agent or a fluorescent agent, or a biopsy implementation portion where a biopsy has been performed. That is, the region of interest can be a region including a lesion, a region where a lesion may be present, a region where some treatment such as biopsy has been performed, a region where a treatment instrument such as a clip or forceps is present, a dark region (a region behind a fold or deep in a lumen that the observation light has difficulty reaching), or any other region that requires detailed observation regardless of the likelihood of a lesion. In the endoscope system 10, the region-of-interest detection unit 99a detects, as the region of interest, a region including any one of a lesion portion, a benign tumor portion, an inflamed portion, a diverticulum of the large intestine, a treatment trace, a bleeding point, a perforation, a vascular abnormality, a marking portion, and a biopsy implementation portion.
Hereinafter, display control based on the detection result of the region of interest will be described. When the region of interest is detected, as shown in fig. 10, the display control unit 98 notifies the user on the display 18 that the region of interest has been detected while maintaining the observation display region 100a (1st display region) in which the direct-view observation image 100 is displayed. Thus, even when the region of interest is detected, the display form of the direct-view observation image 100 that the user is gazing at is maintained, so the user's attention is not disturbed. Maintaining the observation display region means, for example, maintaining the size or shape of the observation display region. In the screen of the display 18 shown in fig. 10, the ratio of the horizontal to the vertical (screen aspect ratio) is preferably 4:3.
The notification of the detection of the region of interest is performed by displaying a detection marker 108, which indicates that the region of interest has been detected or indicates the position of the region of interest, in a peripheral region portion 110 (2nd display region) provided on the display 18 at a position different from the observation display region 100a in which the direct-view observation image is displayed. The arrow of the detection marker 108 indicates that the region of interest was detected in the upper-left portion of the side-view observation image. When the detection marker 108 is displayed in this manner, the user operates the angle knob 12e to move the distal end portion 12d of the endoscope so that the region of interest enters the field of view of the direct-view observation unit 41. Once the region of interest enters the field of view of the direct-view observation unit 41, the user can grasp its content. The peripheral region portion 110 is provided at the four corners of the screen of the display 18, and the detection marker 108 is displayed at any one of the four corners. In fig. 10, an arrow of a fixed size and color is used as the detection marker, but the color or shape of the detection marker may be changed according to the size or type of the region of interest. For example, the detection marker may be blue when the region of interest is a treatment instrument, and red when the region of interest is a lesion.
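The corner selection and the color rule described above can be summarized, purely as an illustrative sketch, as follows; the type labels, the default color, and the function name are assumptions of the sketch.

    def marker_corner_and_color(roi_center, image_size, roi_type):
        # roi_center: (x, y) of the detected region in the side-view observation image.
        # image_size: (width, height) of that image.
        # Returns which corner peripheral area should carry the detection marker and its color.
        x, y = roi_center
        w, h = image_size
        horiz = "left" if x < w / 2 else "right"
        vert = "top" if y < h / 2 else "bottom"
        corner = f"{vert}-{horiz}"

        # Example color rule from the description: blue for a treatment instrument, red for a lesion.
        color = {"treatment_instrument": "blue", "lesion": "red"}.get(roi_type, "yellow")
        return corner, color

    # e.g. a region found in the upper left of the side-view image, classified as a lesion:
    # marker_corner_and_color((120, 80), (640, 480), "lesion") -> ("top-left", "red")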
As shown in fig. 11, a marker display region 114 (3rd display region) may be provided around the observation display region 100b in which the direct-view observation image 100 is displayed, and when the region of interest is detected, a detection marker 116 may be displayed in the marker display region 114 while the observation display region 100b is maintained. The marker display region 114 is formed in a shape corresponding to the observation display region 100b. Since the observation display region 100b is circular, the marker display region 114 has a ring (doughnut) shape. When the detection marker 116 is displayed in the ring-shaped marker display region 114, it is displayed at the portion corresponding to the position of the region of interest detected in the side-view observation image 102. Thus, the detection marker 116 can indicate the position of the region of interest without using an arrow such as the detection marker 108 of fig. 10. In the screen of the display 18 shown in fig. 11, the ratio of the horizontal to the vertical (screen aspect ratio) is preferably 4:3.
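As an illustrative sketch only, placing the detection marker 116 on the ring at the angle corresponding to the detected position could be done as follows; the coordinate convention and the function name are assumptions of the sketch.

    import math

    def marker_position_on_ring(roi_center, image_center, ring_radius):
        # Place the detection marker on the ring-shaped marker display region at the angle
        # corresponding to the position of the region of interest in the side-view image.
        dx = roi_center[0] - image_center[0]
        dy = roi_center[1] - image_center[1]
        angle = math.atan2(dy, dx)
        mx = image_center[0] + ring_radius * math.cos(angle)
        my = image_center[1] + ring_radius * math.sin(angle)
        return mx, my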
When the region of interest is detected, as shown in fig. 12, a region-of-interest image including the portion of the side-view observation image 102 in which the region of interest ROI (Region Of Interest) was detected may be cut out and displayed on a sub-screen 110a of the peripheral region portion 110 of the display 18. Here, as described above, the observation display region 100a in which the direct-view observation image is displayed is maintained before and after the detection of the region of interest. In the screen of the display 18 shown in fig. 12, the ratio of the horizontal to the vertical (screen aspect ratio) is preferably 16:9.
The region-of-interest image is an image cut out in an arc shape from the side-view observation image 102 so as to include the image of the region of interest. The position of the region of interest can therefore be grasped from the arc shape of the region-of-interest image. Further, since the region-of-interest image is obtained by cutting out a part of the side-view observation image that has not been electronically magnified, it has high image quality and allows the region of interest to be grasped clearly.
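A minimal sketch of such an arc-shaped cut-out, assuming the side-view observation image is an annular image centered on the optical axis, might look like the following; the parameter names and the 20-degree half-width default are illustrative assumptions, not values from the disclosure.

    import numpy as np

    def crop_arc_region(side_view_img, center, r_inner, r_outer, angle_deg, half_width_deg=20.0):
        # Cut an arc-shaped patch of the ring-shaped side-view image around the detected
        # region of interest; pixels outside the arc are zeroed, so the arc shape itself
        # conveys where in the side view the region was found.
        h, w = side_view_img.shape[:2]
        yy, xx = np.mgrid[0:h, 0:w]
        dx, dy = xx - center[0], yy - center[1]
        radius = np.hypot(dx, dy)
        theta = np.degrees(np.arctan2(dy, dx))

        ang_diff = (theta - angle_deg + 180.0) % 360.0 - 180.0  # signed angular distance
        in_arc = (radius >= r_inner) & (radius <= r_outer) & (np.abs(ang_diff) <= half_width_deg)

        patch = np.zeros_like(side_view_img)
        patch[in_arc] = side_view_img[in_arc]
        return patch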
In the above-described embodiment, only the electronically magnified direct-view observation image 100 is displayed on the display 18 in the specific display mode, but as shown in fig. 13(A), the direct-view observation image 100 may instead be displayed on the display 18 without being magnified, and the side-view observation image 102 may be kept non-displayed on the display 18 by masking (for example, graying out). When the region of interest ROI is detected from the side-view observation image 102, as shown in fig. 13(B), only the portion 102a in which the region of interest ROI was detected is released from non-display and displayed on the display 18 (mask release processing), while non-display is maintained for the portions other than the portion in which the region of interest was detected (mask maintenance processing). In fig. 13, as described above, the observation display region 100c in which the direct-view observation image 100 is displayed is maintained before and after the detection of the region of interest.
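Purely as an illustrative sketch of the masking, mask release, and mask maintenance processing described above (the gray level, mask representation, and function name are assumptions of the sketch):

    import numpy as np

    def apply_side_view_mask(frame, side_view_mask, roi_mask=None, gray_level=80):
        # frame: H x W x 3 uint8 composite frame containing the direct-view and side-view areas.
        # side_view_mask: boolean H x W array, True where the side-view observation image lies.
        # roi_mask: boolean H x W array, True only where the region of interest was detected (or None).
        out = frame.copy()
        hide = side_view_mask.copy()
        if roi_mask is not None:
            hide &= ~roi_mask          # mask release processing: keep the detected portion visible
        out[hide] = gray_level         # mask maintenance processing: gray out the rest
        return out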
In the above-described embodiment, both the direct-view observation image and the side-view observation image are acquired using the single image sensor 66, but the direct-view observation image and the side-view observation image may be acquired using different image sensors. For example, in addition to the image sensor for obtaining the direct-view observation image, an image sensor for obtaining a left side-view observation image and an image sensor for obtaining a right side-view observation image may be provided. In this case, when the region of interest is detected from the image sensor for obtaining the right side-view observation image, as shown in fig. 14, in addition to the direct-view observation image 100, an arrow 120 indicating that the detection occurred on the right side-view image sensor and an image 122 showing the region of interest ROI obtained by that image sensor are displayed in the peripheral region portion 110 of the display 18. In fig. 14, as described above, the observation display region 100a in which the direct-view observation image 100 is displayed is maintained before and after the detection of the region of interest. In the screen of the display 18 shown in fig. 14, the ratio of the horizontal to the vertical (screen aspect ratio) is preferably 16:9.
Next, AE (automatic exposure) control in the normal display mode and the specific display mode will be described. In the normal display mode, since the display 18 displays both the direct-view observation image and the side-view observation image, the luminance of the direct-view observation image and that of the side-view observation image are preferably balanced. Therefore, in the normal display mode, the control unit 96 calculates luminance information for direct viewing from the direct-view observation image and luminance information for side viewing from the side-view observation image in the luminance information calculation unit 96a (see fig. 6). The control unit 96 then calculates luminance information for the normal display mode from the luminance information for direct viewing and the luminance information for side viewing, and transmits it to the light source control unit 92. The luminance information for the normal display mode is preferably, for example, the average of the luminance information for direct viewing and the luminance information for side viewing. The light source control unit 92 controls the light source 91 so as to obtain a target light amount based on the luminance information for the normal display mode.
When the luminance in the direct-view direction differs from the luminance in the side-view direction, it becomes difficult to keep the luminance balanced if the light source is controlled based on only one of the two pieces of luminance information. For example, when the side-view direction is dark, increasing the light amount of the illumination light based on the luminance information for the side-view direction may, depending on the shape of the organ, make the direct-view direction excessively bright, so that the luminance balance is lost. In the present embodiment, therefore, the light source is controlled using the average of the luminance information for the direct-view direction and the side-view direction, which makes the luminance balance easy to maintain.
On the other hand, in the specific display mode, only the direct-view observation image is displayed on the display 18, so the luminance of the direct-view observation image and that of the side-view observation image may differ. Therefore, in the specific display mode, the control unit 96 calculates luminance information for direct viewing from the direct-view observation image and luminance information for side viewing (luminance information for the non-display image) from the side-view observation image. The control unit 96 transmits the luminance information for direct viewing to the light source control unit 92, and the light source control unit 92 controls the light source 91 so as to obtain the target light amount based on the luminance information for direct viewing. The control unit 96 transmits the luminance information for side viewing to the image processing unit 99, and the image processing unit 99 performs gain processing on the side-view observation image so as to obtain a target luminance based on the luminance information for side viewing. The side-view observation image subjected to the gain processing is used for the detection of the region of interest. As described above, in the specific display mode the side-view observation image is not displayed on the display 18, so even if the gain processing increases the noise, there is no problem as long as the structure or color of a lesion remains discernible, that is, as long as the noise does not affect the detection accuracy of the region of interest.
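The two exposure strategies can be summarized in the following illustrative sketch; the mean-brightness measure, the target value of 128, and the function name are assumptions of the sketch, not values given in the disclosure.

    import numpy as np

    def exposure_control(direct_img, side_img, mode, target=128.0):
        # Returns the luminance value used for light-source control and the gain applied
        # to the (non-displayed) side-view observation image before ROI detection.
        direct_luma = float(np.asarray(direct_img).mean())   # luminance information for direct viewing
        side_luma = float(np.asarray(side_img).mean())       # luminance information for side viewing

        if mode == "normal":
            # Normal display mode: drive the light source from the average of both so that
            # neither direction becomes excessively bright or dark; no extra gain is applied.
            return 0.5 * (direct_luma + side_luma), 1.0

        # Specific display mode: the light source follows the direct-view luminance only,
        # and the side-view image is gain-corrected toward the target luminance before it
        # is handed to the region-of-interest detection.
        gain = target / max(side_luma, 1e-6)
        return direct_luma, gain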
Next, the operations related to cleaning the direct-view observation window 41A and the side-view observation window 42A in the normal display mode and in the specific display mode will be described. As shown in fig. 17, the endoscope system 10 is provided with a direct-view liquid feed line 130 for feeding a cleaning liquid from the tank 17 to the direct-view window cleaning nozzle 53 (direct-view nozzle) and, separately from it, a side-view liquid feed line 132 for feeding the cleaning liquid from the tank 17 to the side-view window cleaning nozzles 51 and 52 (side-view nozzles). The endoscope system 10 is further provided with a direct-view pump 134 that sucks the cleaning liquid from the tank 17 and feeds it to the direct-view liquid feed line 130, and a side-view pump 136 that sucks the cleaning liquid from the tank 17 and feeds it to the side-view liquid feed line 132.
When the normal display mode is set, as shown in fig. 18, operating the wash switch 13a drives both the direct-view pump 134 and the side-view pump 136. Thereby, the cleaning liquid from the tank 17 is blown toward the direct-view observation window 41A via the direct-view liquid feed line 130 and the nozzle 53, and the cleaning liquid from the tank 17 is blown toward the side-view observation window 42A via the side-view liquid feed line 132 and the nozzles 51 and 52.
When the specific display mode is set, as shown in fig. 19, operating the wash switch 13a drives only the direct-view pump 134. Thereby, the cleaning liquid from the tank 17 is blown toward the direct-view observation window 41A via the direct-view liquid feed line 130 and the nozzle 53. On the other hand, when the foreign matter detection unit 99b of the image processing unit 99 detects a foreign matter, the side-view pump 136 is driven, and the cleaning liquid from the tank 17 is blown toward the side-view observation window 42A via the side-view liquid feed line 132 and the nozzles 51 and 52. In the specific display mode, the side-view observation image is not displayed on the display 18, so blowing the cleaning liquid onto the side-view observation window 42A at an arbitrary timing does not affect the user's observation.
Further, since the direct-view liquid feed line 130 and the side-view liquid feed line 132 are independent of each other, the cleaning liquid used to clean the side-view observation window 42A is not blown onto the direct-view observation window 41A. The user can therefore continue the observation without interrupting the examination. In addition, the wash switch 13a needs to be operated only when the direct-view observation window 41A becomes dirty, so the number of operations of the wash switch 13a is reduced and the burden on the user can be lessened. In the specific display mode, the side-view observation window 42A is automatically cleaned based on the foreign matter detection, but the side-view observation window 42A may instead be cleaned at fixed timings without relying on the foreign matter detection.
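The pump-drive logic of figs. 18 and 19 and of the automatic cleaning can be summarized, as an illustrative sketch only, as follows; the function and parameter names are assumptions of the sketch.

    def drive_cleaning_pumps(mode, wash_switch_pressed, foreign_matter_on_side_window):
        # Decide which pumps to drive, following the behavior described for figs. 18 and 19.
        drive_direct_pump = False   # feeds the direct-view liquid feed line 130 / nozzle 53
        drive_side_pump = False     # feeds the side-view liquid feed line 132 / nozzles 51, 52

        if wash_switch_pressed:
            drive_direct_pump = True
            if mode == "normal":
                drive_side_pump = True   # normal display mode: both windows are cleaned together

        if mode == "specific" and foreign_matter_on_side_window:
            drive_side_pump = True       # specific display mode: side window is cleaned automatically

        return drive_direct_pump, drive_side_pump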
In the above embodiment, the hardware structure of the processing units that execute various kinds of processing, such as the control unit 96, the image acquisition unit 97, the display control unit 98, and the image processing unit 99 (the region-of-interest detection unit 99a and the foreign matter detection unit 99b), is implemented by various processors as described below. The various processors include a CPU (Central Processing Unit) and a GPU (Graphics Processing Unit), which are general-purpose processors that execute software (programs) to function as the various processing units; a programmable logic device (PLD) such as an FPGA (Field Programmable Gate Array), which is a processor whose circuit configuration can be changed after manufacture; and a dedicated electric circuit, which is a processor having a circuit configuration designed exclusively to execute specific processing.
One processing unit may be constituted by one of these various processors, or by a combination of two or more processors of the same kind or of different kinds (for example, a plurality of FPGAs, a combination of a CPU and an FPGA, or a combination of a CPU and a GPU). Alternatively, a plurality of processing units may be constituted by one processor. As a first example of constituting a plurality of processing units with one processor, one processor may be constituted by a combination of one or more CPUs and software, as typified by a computer such as a client or a server, and this processor functions as the plurality of processing units. As a second example, a processor that realizes the functions of the entire system, including the plurality of processing units, with a single IC (Integrated Circuit) chip may be used, as typified by a System on Chip (SoC). In this manner, the various processing units are configured using one or more of the above-described various processors as their hardware structure.
More specifically, the hardware structure of these various processors is electric circuitry in which circuit elements such as semiconductor elements are combined.

Claims (19)

1. An endoscope system, comprising:
an endoscope having an insertion portion to be inserted into an observation target, a direct-view observation portion having a field of view in a distal end direction of the insertion portion, and a side-view observation portion having a field of view in a side surface direction of the insertion portion;
an image acquisition unit that acquires a direct-view observation image using the direct-view observation unit and acquires a side-view observation image using the side-view observation unit;
a display control unit that displays one of the direct-view observation image and the side-view observation image on a display unit as a display image and displays the other as a non-display image; and
a region-of-interest detection unit that detects a region of interest using the display image or the non-display image,
the display control unit maintains the 1st display region in which the display image is displayed on the display unit, regardless of whether or not the region of interest is detected by the region-of-interest detection unit.
2. The endoscopic system of claim 1,
when the region of interest is detected, the display control unit displays, on the display unit, a detection marker indicating that the region of interest has been detected.
3. The endoscopic system of claim 2,
the detection marker indicates a position of the region of interest.
4. The endoscopic system of claim 2,
the detection marker is displayed in a 2nd display region provided at a position different from the 1st display region.
5. The endoscopic system of claim 3,
the detection marker is displayed in a 2nd display region provided at a position different from the 1st display region.
6. The endoscopic system of claim 2,
the detection marker is displayed in a 3rd display region that is provided around the 1st display region and is formed in a shape corresponding to the 1st display region.
7. The endoscopic system of claim 3,
the detection marker is displayed in a 3rd display region that is provided around the 1st display region and is formed in a shape corresponding to the 1st display region.
8. The endoscopic system of claim 6,
the 3rd display region has a ring shape.
9. The endoscopic system of claim 7,
the 3rd display region has a ring shape.
10. The endoscopic system of any of claims 2 to 9,
the detection marker changes in color or shape according to the size or type of the region of interest.
11. The endoscopic system of claim 1,
when the region of interest is detected, the display control unit displays a region of interest image including a portion of the non-display image in which the region of interest is detected on the display unit.
12. The endoscopic system of claim 1,
when the region of interest is detected, the display control unit releases the non-display of the portion where the region of interest is detected and displays the portion on the display unit.
13. The endoscopic system of any of claims 1 to 9,
the display control unit electronically enlarges the display image and displays the enlarged display image on the display unit.
14. The endoscopic system of any of claims 1 to 9,
the image acquisition unit acquires the direct-view observation image and the side-view observation image from an image obtained by one image sensor.
15. The endoscopic system of any of claims 1 to 9,
the direct-view observation image is obtained by an image sensor for direct-view observation image acquisition, and the side-view observation image is obtained by an image sensor for side-view observation image acquisition that is different from the image sensor for direct-view observation image acquisition.
16. The endoscopic system of any of claims 1 to 9, further comprising:
a mode switching switch that switches between a normal display mode in which the display unit displays both the direct-view observation image and the side-view observation image, and a specific display mode in which the display unit displays the display image and does not display the non-display image.
17. The endoscope system according to any one of claims 1 to 9, comprising:
a luminance information acquiring unit that acquires luminance information for a non-display image from the non-display image,
wherein the non-display image is subjected to gain processing so as to obtain a target luminance based on the luminance information for the non-display image.
18. The endoscope system according to any one of claims 1 to 9, comprising:
and an image processing unit that performs different image processing on the non-display image for each frame.
19. The endoscopic system of any of claims 1 to 9,
a direct-view observation window is provided in the direct-view observation portion, and a side-view observation window is provided in the side-view observation portion,
a side-view liquid feed line for feeding a cleaning liquid to a side-view nozzle used for cleaning the side-view observation window and a direct-view liquid feed line for feeding the cleaning liquid to a direct-view nozzle used for cleaning the direct-view observation window are provided separately from each other, and
when a foreign matter is detected in the non-display image, the cleaning liquid is automatically blown toward the side-view observation window through the side-view liquid feed line and the side-view nozzle.
CN201910962096.2A 2018-10-11 2019-10-10 Endoscope system Active CN111035348B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2018192464A JP7092633B2 (en) 2018-10-11 2018-10-11 Endoscope system
JP2018-192464 2018-10-11

Publications (2)

Publication Number Publication Date
CN111035348A 2020-04-21
CN111035348B 2024-03-26

Family

ID=70160653

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910962096.2A Active CN111035348B (en) 2018-10-11 2019-10-10 Endoscope system

Country Status (3)

Country Link
US (1) US10694119B2 (en)
JP (1) JP7092633B2 (en)
CN (1) CN111035348B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20230181002A1 (en) * 2021-12-14 2023-06-15 Karl Storz Imaging, Inc. Frame processing of imaging scope data for user interface presentation

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102469930A (en) * 2009-11-06 2012-05-23 奥林巴斯医疗株式会社 Endoscope system
JP2012130429A (en) * 2010-12-20 2012-07-12 Fujifilm Corp Endoscope apparatus
CN104540438A (en) * 2012-07-17 2015-04-22 Hoya株式会社 Image processing device and endoscopic instrument
JP2016007445A (en) * 2014-06-25 2016-01-18 オリンパス株式会社 Endoscope system
CN105939650A (en) * 2014-02-14 2016-09-14 奥林巴斯株式会社 Endoscope system
US20170085762A1 (en) * 2014-11-06 2017-03-23 Olympus Corporation Endoscope system
US20170188798A1 (en) * 2014-10-28 2017-07-06 Olympus Corporation Endoscope system

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5583873B1 (en) 2012-09-28 2014-09-03 オリンパスメディカルシステムズ株式会社 Endoscope apparatus having cleaning mechanism
WO2014088076A1 (en) 2012-12-05 2014-06-12 オリンパスメディカルシステムズ株式会社 Endoscope device
CN106102550B (en) 2014-05-16 2018-04-03 奥林巴斯株式会社 Endoscopic system
WO2017179168A1 (en) * 2016-04-14 2017-10-19 オリンパス株式会社 Imaging device
WO2017217269A1 (en) * 2016-06-14 2017-12-21 オリンパス株式会社 Endoscope
JP6779089B2 (en) 2016-10-05 2020-11-04 富士フイルム株式会社 Endoscope system and how to drive the endoscope system

Also Published As

Publication number Publication date
US10694119B2 (en) 2020-06-23
JP2020058639A (en) 2020-04-16
CN111035348B (en) 2024-03-26
JP7092633B2 (en) 2022-06-28
US20200120288A1 (en) 2020-04-16

Similar Documents

Publication Publication Date Title
JP6785941B2 (en) Endoscopic system and how to operate it
JP4875319B2 (en) Endoscope
US20210045811A1 (en) Medical laser apparatus and system
US11426054B2 (en) Medical image processing system, endoscope system, diagnosis support apparatus, and medical service support apparatus
JP6779089B2 (en) Endoscope system and how to drive the endoscope system
JP6001219B1 (en) Endoscope system
WO2017051455A1 (en) Endoscope device
JP6827516B2 (en) Endoscope system and how to drive the endoscope system
JP7328432B2 (en) medical control device, medical observation system, control device and observation system
EP3838109A1 (en) Endoscope system
JP7374280B2 (en) Endoscope device, endoscope processor, and method of operating the endoscope device
JP5608580B2 (en) Endoscope
CN111035348B (en) Endoscope system
JP5468942B2 (en) Endoscope device
US9867557B2 (en) Probe
JP7324307B2 (en) Light source device, endoscope system and control method
JP2014104138A (en) Endoscope and endoscope system
JP2013215266A (en) Probe
JP2005013560A (en) Optical observation probe and endoscopic observation apparatus

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant