US10602916B2 - Endoscope system that adjusts luminance of frame image including images of a plurality of regions and actuating method for endoscope system - Google Patents

Endoscope system that adjusts luminance of frame image including images of a plurality of regions and actuating method for endoscope system

Info

Publication number
US10602916B2
US10602916B2 (application US15/448,764; US201715448764A)
Authority
US
United States
Prior art keywords
image
luminance
value
luminance value
range
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active, expires
Application number
US15/448,764
Other languages
English (en)
Other versions
US20170172392A1 (en)
Inventor
Takashi Ito
Kazuki Honda
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Olympus Corp
Original Assignee
Olympus Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Olympus Corp filed Critical Olympus Corp
Assigned to OLYMPUS CORPORATION reassignment OLYMPUS CORPORATION ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: HONDA, KAZUKI, ITO, TAKASHI
Publication of US20170172392A1 publication Critical patent/US20170172392A1/en
Application granted granted Critical
Publication of US10602916B2 publication Critical patent/US10602916B2/en
Active legal-status Critical Current
Adjusted expiration legal-status Critical

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 1/00 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B 1/00002 Operational features of endoscopes
    • A61B 1/00004 Operational features of endoscopes characterised by electronic signal processing
    • A61B 1/00006 Operational features of endoscopes characterised by electronic signal processing of control signals
    • A61B 1/00009 Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope
    • A61B 1/00064 Constructional details of the endoscope body
    • A61B 1/00071 Insertion part of the endoscope body
    • A61B 1/0008 Insertion part of the endoscope body characterised by distal tip features
    • A61B 1/00101 Insertion part of the endoscope body characterised by distal tip features the distal tip features being detachable
    • A61B 1/00131 Accessories for endoscopes
    • A61B 1/0014 Fastening element for attaching accessories to the outside of an endoscope, e.g. clips, clamps or bands
    • A61B 1/00163 Optical arrangements
    • A61B 1/00174 Optical arrangements characterised by the viewing angles
    • A61B 1/00181 Optical arrangements characterised by the viewing angles for multiple fixed viewing angles
    • A61B 1/04 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor combined with photographic or television appliances
    • A61B 1/045 Control thereof
    • A61B 1/05 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor combined with photographic or television appliances characterised by the image sensor, e.g. camera, being in the distal end portion
    • A61B 1/053 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor combined with photographic or television appliances characterised by the image sensor, e.g. camera, being in the distal end portion being detachable
    • A61B 1/06 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor with illuminating arrangements
    • A61B 1/0607 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor with illuminating arrangements for annular illumination
    • A61B 1/0615 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor with illuminating arrangements for radial illumination
    • A61B 1/0625 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor with illuminating arrangements for multiple fixed illumination angles
    • A61B 1/0655 Control therefor
    • A61B 1/0661 Endoscope light sources
    • A61B 1/0669 Endoscope light sources at proximal end of an endoscope
    • A61B 1/07 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor with illuminating arrangements using light-conductive means, e.g. optical fibres
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 23/00 Telescopes, e.g. binoculars; Periscopes; Instruments for viewing the inside of hollow bodies; Viewfinders; Optical aiming or sighting devices
    • G02B 23/24 Instruments or systems for viewing the inside of hollow bodies, e.g. fibrescopes
    • G02B 23/2407 Optical details
    • G02B 23/2415 Stereoscopic endoscopes
    • G02B 23/2461 Illumination
    • G02B 23/2476 Non-optical details, e.g. housings, mountings, supports
    • G02B 23/2484 Arrangements in relation to a camera or imaging device
    • G06T 5/009
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 5/00 Image enhancement or restoration
    • G06T 5/90 Dynamic range modification of images or parts thereof
    • G06T 5/92 Dynamic range modification of images or parts thereof based on global image properties
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/56 Cameras or camera modules comprising electronic image sensors; Control thereof provided with illuminating means
    • H04N 23/70 Circuitry for compensating brightness variation in the scene
    • H04N 23/74 Circuitry for compensating brightness variation in the scene by influencing the scene brightness using illuminating means
    • H04N 23/76 Circuitry for compensating brightness variation in the scene by influencing the image signals
    • H04N 23/95 Computational photography systems, e.g. light-field imaging systems
    • H04N 23/951 Computational photography systems, e.g. light-field imaging systems by using two or more images to influence resolution, frame rate or aspect ratio
    • H04N 5/2256
    • H04N 5/23232
    • H04N 5/2354
    • H04N 5/243
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 Image acquisition modality
    • G06T 2207/10068 Endoscopic image
    • G06T 2207/20 Special algorithmic details
    • G06T 2207/20172 Image enhancement details
    • G06T 2207/20208 High dynamic range [HDR] image processing
    • H04N 2005/2255
    • H04N 23/50 Constructional details
    • H04N 23/555 Constructional details for picking-up images in sites, inaccessible due to their dimensions or hazardous conditions, e.g. endoscopes or borescopes

Definitions

  • the present invention relates to an endoscope system that acquires a first image and a second image in different regions in a same subject and an actuating method for the endoscope system.
  • Examples of the wide angle endoscope include a type that forms, on one image pickup device, a front-view subject image via a front-view observation window and a side-view subject image via a side-view observation window, described in Japanese Patent No. 4782900, and a type in which cameras obtained by combining optical systems and image pickup devices are provided respectively for front view and side view, described in Japanese Patent Application Laid-Open Publication No. 2013-542467 (International Publication No. 2012/056453).
  • Regarding illuminance when a subject is irradiated with light having constant luminance, in general, illuminance is high in a proximity portion and low in a remote portion, so the proximity portion is observed more brightly and the remote portion is observed more darkly. More specifically, when, for example, a subject having a luminal shape is observed using the wide angle endoscope, an observation region by front view is a remote portion in the axial direction of the lumen and is dark, whereas an observation region by side view is a proximity portion of the lumen inner wall and is bright.
  • International Publication No. 2011/055613 describes an endoscope system that individually detects brightness of a front-view field of view image and brightness of a side-view field of view image and controls, on the basis of a detection result, a light source device such that one field of view image reaches a brightness target value suitable for observation.
  • Japanese Patent Application Laid-Open Publication No. 2013-066648 describes an image processing apparatus for endoscopes that acquires a forward image corresponding to a forward field of view and a sideward image corresponding to a sideward field of view, performs forward magnification chromatic aberration correction processing when a processing target image signal is the forward image, and performs sideward magnification chromatic aberration correction processing when the processing target image signal is the sideward image.
  • Japanese Patent Application Laid-Open Publication No. 2001-290103 describes a technique for, in an observation system that observes a front-view forward field of view image with light transmitted through a half mirror and observes an endoscopic image of a liquid crystal monitor with light reflected by the half mirror, changing brightness of an image displayed on the liquid crystal monitor to switch the observation of the forward field of view image and the observation of the endoscopic image.
  • An endoscope system includes: an image forming section configured to form an image of one frame in which a first image obtained by picking up an image of a first region of a subject and a second image obtained by picking up an image of a second region of the subject are arranged to be adjacent to each other; a light source section configured to emit illumination light having brightness based on a brightness evaluation result of the image of the one frame to the first and second regions; a luminance detecting section configured to detect a minimum luminance value and a maximum luminance value in the first image included in the image of the one frame and detect a minimum luminance value and a maximum luminance value in the second image included in the image of the one frame; and an image processing section configured to determine, among the first image and the second image, whether or not a luminance range, which is a difference value between the maximum luminance value and the minimum luminance value, is larger than a predetermined luminance range defined by a difference value between a predetermined luminance upper limit value and a predetermined luminance lower limit value, and to perform gradation conversion on an image for which the luminance range is determined to be larger such that the luminance range of the image fits within the predetermined luminance range.
  • An actuating method for an endoscope system is an actuating method for an endoscope system including an illuminating section configured to radiate light on a first region of a subject and a second region of the subject different from the first region, the actuating method including: a first subject-image acquiring section provided in an insertion section picking up an image of the first region; a second subject-image acquiring section provided in the insertion section picking up an image of the second region; an image forming section arranging a first image picked up by the first subject-image acquiring section and a second image picked up by the second subject-image acquiring section to be adjacent to each other to form an image of one frame; a light source section emitting illumination light having brightness based on a brightness evaluation result of the image of the one frame to the first and second regions; a luminance detecting section detecting a minimum luminance value and a maximum luminance value in the first image included in the image of the one frame and detecting a minimum luminance value and a maximum luminance value in the second image included in the image of the one frame.
  • FIG. 1 is a diagram showing a configuration of an endoscope system in a first embodiment of the present invention
  • FIG. 2 is a diagram showing a display example of a screen of a monitor in the first embodiment
  • FIG. 3 is a top view of a situation in which an endoscope is inserted in a state in which the endoscope is in close proximity to an inner wall on a right side of a subject formed in a luminal shape in the first embodiment;
  • FIG. 4 is a diagram showing an example of a luminance distribution of images obtained in the state shown in FIG. 3 in the first embodiment
  • FIG. 5 is a diagram showing an example of a luminance distribution of images obtained in the state shown in FIG. 3 when light adjustment is performed in a peak mode in the first embodiment
  • FIG. 6 is a diagram showing an example of a luminance distribution of images obtained in the state shown in FIG. 3 when light adjustment is performed in an average mode in the first embodiment
  • FIG. 7 is a diagram showing a situation of a change of a luminance distribution by gradation conversion in the first embodiment
  • FIG. 8 is a diagram showing an example of a gradation conversion curve for fitting images within a proper luminance range in the first embodiment
  • FIG. 9 is a diagram showing an example of a luminance distribution obtained when gaps of black display are present between a forward image and a right sideward image and between the forward image and a left sideward image in the first embodiment;
  • FIG. 10 is a flowchart for explaining processing of luminance adjustment of the endoscope system in the first embodiment
  • FIG. 11 is a flowchart for explaining details of acquisition processing for a maximum luminance value Amax and a minimum luminance value Amin in step S 4 in FIG. 10 in the first embodiment;
  • FIG. 12 is a flowchart for explaining details of gradation conversion processing in step S 13 in FIG. 10 in the first embodiment
  • FIG. 13 is a diagram showing an example in which the luminance distribution of the images obtained in the state shown in FIG. 3 is subjected to the gradation conversion for each of the images in respective directions of fields of view in the first embodiment;
  • FIG. 14 is a diagram partially showing an internal configuration of an endoscope in a first modification of the first embodiment
  • FIG. 15 is a diagram showing a display example on a monitor of an image obtained from the endoscope in the first modification of the first embodiment
  • FIG. 16 is a perspective view partially showing a configuration of an endoscope in a second modification in a state in which a sideward-image acquiring unit is mounted in the first embodiment
  • FIG. 17 is a perspective view partially showing the configuration of the endoscope in the second modification in a state in which the sideward-image acquiring unit is detached in the first embodiment.
  • FIG. 18 is a diagram showing a modification of an endoscope system in which an illuminating section is configured by a light emitting element in the first embodiment.
  • FIGS. 1 to 18 show a first embodiment of the present invention.
  • FIG. 1 is a diagram showing a configuration of an endoscope system.
  • the endoscope system includes an endoscope 1 , a video processor 2 , and a monitor 3 .
  • the endoscope 1 is configured as an electronic endoscope including an insertion section 1 a inserted into an inside of a subject.
  • the endoscope 1 includes a first subject-image acquiring section provided in the insertion section 1 a and configured to acquire a first image pickup signal related to a first subject image (optical image) of a first region in the subject, a second subject-image acquiring section provided in the insertion section 1 a and configured to acquire a second image pickup signal related to a second subject image (optical image) of a second region in the subject different from the first region, and an illuminating section configured to radiate light on the first region and the second region.
  • the first region is a region including a forward direction (a region of a forward field of view) in the subject.
  • the first subject-image acquiring section includes an image pickup section 11 (a first image pickup section) disposed to be directed forward at a distal end portion of the insertion section 1 a , the image pickup section 11 being configured to photoelectrically convert the first subject image (optical image) of the first region in the subject including a forward direction along a longitudinal direction of the insertion section 1 a and to generate the first image pickup signal.
  • the second region different from the first region is a region including a sideward direction (a region of a sideward field of view) in the same subject.
  • the second subject-image acquiring section photoelectrically converts the second subject image (optical image) in the second region in the subject including the sideward direction crossing the longitudinal direction of the insertion section 1 a to generate the second image pickup signal. More specifically, the second subject-image acquiring section is disposed in plurality in a plurality of angle positions in a circumferential direction of the insertion section 1 a and acquires a plurality of second image pickup signals related to a plurality of second subject images.
  • the second subject-image acquiring sections include an image pickup section 12 (a second image pickup section separate from the first image pickup section) that picks up an image of a field of view region in a right sideward direction and an image pickup section 13 (a second image pickup section separate from the first image pickup section) that picks up an image of a field of view region in a left sideward direction.
  • the right sideward field of view and the left sideward field of view are, for example, equally divided two positions in the circumferential direction centering on a forward field of view.
  • the image pickup sections 11 to 13 include image pickup optical systems and image pickup devices.
  • the image pickup sections 11 to 13 photoelectrically convert, with the image pickup devices, subject images formed by the image pickup optical systems to generate image pickup signals and output the generated image pickup signals to the video processor 2 via a signal line 14 .
  • the first subject-image acquiring section and the second subject-image acquiring section respectively include image pickup optical systems and image pickup devices.
  • However, the first subject-image acquiring section and the second subject-image acquiring section may share at least one of the image pickup optical system and the image pickup device. That is, although the first subject-image acquiring section and the second subject-image acquiring section acquire optical images in respective directions of fields of view, they may share at least a part of the image pickup optical system, or the optical images they respectively acquire may be formed in different regions on the same image pickup device shared by both.
  • an illuminating section 15 that radiates light on an image pickup range by the image pickup section 11 , an illuminating section 16 that radiates light on an image pickup range by the image pickup section 12 , and an illuminating section 17 that radiates light on an image pickup range by the image pickup section 13 are provided. Therefore, the illuminating section 15 illuminates a forward direction, the illuminating section 16 illuminates a right sideward direction, and the illuminating section 17 illuminates a left sideward direction.
  • the illuminating sections 15 , 16 , and 17 are provided in plurality around the image pickup sections 11 , 12 , and 13 in order to reduce illumination unevenness.
  • Illumination light from the video processor 2 is supplied to the illuminating sections 15 , 16 , and 17 via a light guide 18 configured as an optical fiber bundle.
  • a proximal end side of the light guide 18 is converged and a distal end side of the light guide 18 branches to the respective illuminating sections 15 , 16 , and 17 .
  • the video processor 2 includes an image forming section 20 , a light source section 21 , a diaphragm 22 , a brightness-range calculating section 23 , an image processing section 24 , and an output section 25 .
  • the light source section 21 generates illumination light and emits the illumination light as parallel rays via a collimator lens or the like.
  • the diaphragm 22 limits a passing range of the illumination light emitted from the light source section 21 to thereby control a light amount of the illumination light that reaches the proximal end of the light guide 18 .
  • the distal end side of the light guide 18 branches to be connected to the respective illuminating sections 15 , 16 , and 17 . Therefore, in the configuration example shown in FIG. 1 , adjustment of light amounts of the illumination lights radiated from the respective illuminating sections 15 , 16 , and 17 on the subject is performed by simultaneously increasing the light amounts or performed by simultaneously reducing the light amounts.
  • the image forming section 20 forms a first image and a second image on the basis of image pickup signals corresponding to different regions in a same subject.
  • the image forming section 20 receives the first image pickup signal from the first image pickup section electrically connected via the signal line 14 to form a first image (image signal) and receives the second image pickup signals from the second image pickup sections to form second images (image signals).
  • the image forming section 20 forms, on the basis of the first image and the second images, an image in which the first image is set in the center and a plurality of second images are respectively arranged in a plurality of angle positions in the circumferential direction of the first image according to respective directions of fields of view of the image pickup sections 11 to 13 .
  • the image forming section 20 includes, for example, a frame buffer.
  • the image forming section 20 stores, in addresses corresponding to pixel positions in the frame buffer, image pickup signals sequentially inputted, for example, in pixel units from the image pickup sections 11 to 13 to thereby form an image for one frame formed by respective pixels of the first image and respective pixels of the second images.
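The following is a minimal sketch, not the patent's implementation, of how such a frame-buffer composition could look if the forward image and the two sideward images were grayscale numpy arrays of equal height; the function name and the simple horizontal concatenation are assumptions for illustration.

```python
import numpy as np

def form_frame(left_img: np.ndarray, forward_img: np.ndarray, right_img: np.ndarray) -> np.ndarray:
    """Place the forward image in the center and the sideward images on either
    side, adjacent to each other, to build the image of one frame."""
    assert left_img.shape[0] == forward_img.shape[0] == right_img.shape[0]
    # Horizontal concatenation plays the role of the frame-buffer write: each
    # source pixel lands at the (row, column) address of its position in the frame.
    return np.hstack([left_img, forward_img, right_img])
```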
  • the brightness-range calculating section 23 is a luminance detecting section that detects minimum luminance values and maximum luminance values in the first image and the second images. More specifically, the brightness-range calculating section 23 calculates a brightness range by detecting a minimum luminance value and a maximum luminance value of a pixel (excluding an excluded region explained below) out of all the pixels forming the images (the first image and the second images) formed by the image forming section 20 on the basis of the image pickup signals obtained from the respective image pickup sections 11 , 12 , and 13 .
  • the image processing section 24 receives an image signal (an image) from the image forming section 20 electrically connected via the brightness-range calculating section 23 .
  • the image processing section 24 sets a predetermined luminance range and subjects the first image and the second images to gradation conversion such that a luminance range from the minimum luminance value to the maximum luminance value is fit within a predetermined luminance range.
  • An example of the predetermined luminance range is a proper luminance range from a proper lower limit luminance value to a proper upper limit luminance value.
  • the proper luminance range is a luminance range set assuming that a three-dimensional appearance (feeling of convex-concave) and a distance feeling can be reproduced and a displayed image is suitable for observation.
  • the proper upper limit luminance value is an upper limit value of luminance set assuming that the upper limit value is suitable for observation of the subject.
  • the proper lower limit luminance value is a lower limit value of luminance set assuming that the lower limit value is suitable for the observation of the subject.
  • the image processing section 24 performs not only the gradation conversion but also general various kinds of image processing and the like.
  • the output section 25 is an image output section that generates, on the basis of the image subjected to the gradation conversion by the image processing section 24 , a display signal for causing the monitor 3 to display the image.
  • the monitor 3 functioning as a display section displays the image fit within the proper luminance range while maintaining a luminance magnitude relation among the pixels. Therefore, an observer can observe, at proper brightness, an image having shading and three-dimensional appearance with which a diagnosis is easily performed.
  • FIG. 2 is a diagram showing a display example of a screen 30 of the monitor 3 . Note that, in FIGS. 2 and 3 referred to below, a forward direction is indicated by F, a right sideward direction is indicated by R, and a left sideward direction is indicated by L.
  • On the screen 30 , a forward image 31 (based on the first image) based on the image pickup signal obtained from the image pickup section 11 is arranged and displayed in the center, a right sideward image 32 based on the image pickup signal obtained from the image pickup section 12 is arranged and displayed on the right of the forward image 31 , and a left sideward image 33 based on the image pickup signal obtained from the image pickup section 13 is arranged and displayed on the left of the forward image 31 (the right sideward image 32 and the left sideward image 33 are respectively based on the second images). That is, the image forming section 20 forms an image such that the first image and the second images are arranged to be adjacent to each other in the same screen.
  • Arrangement of the respective images 31 to 33 viewed from the observer is arrangement coinciding with respective directions of fields of view viewed from the endoscope 1 .
  • This image configuration makes it appear as if the observation were performed by one super-wide-angle camera. Note that, in the example shown in FIG. 2, the forward image 31 , the right sideward image 32 , and the left sideward image 33 are displayed on the screen 30 of one monitor 3 . However, the forward image 31 , the right sideward image 32 , and the left sideward image 33 may be respectively displayed on screens of separate monitors.
  • FIG. 3 is a top view showing a situation in which the endoscope 1 is inserted in a state in which the endoscope 1 is in close proximity to an inner wall on a right side of a subject OBJ formed in a luminal shape.
  • a portion of the subject OBJ present on the right sideward direction R side is a proximity portion and a portion of the subject OBJ present on the left sideward direction L side is a remote portion.
  • the subject OBJ present on the forward direction F side is a remote portion in a center portion and a slight proximity portion in a peripheral portion, reflecting the fact that the subject OBJ has the luminal shape. In such a state, a dynamic range of a luminance distribution of the subject OBJ easily increases.
  • FIG. 4 is a diagram showing an example of a luminance distribution of the images obtained in a state shown in FIG. 3 .
  • Pr indicates a horizontal pixel position of a boundary between the forward image 31 and the right sideward image 32
  • P 1 indicates a horizontal pixel position of a boundary between the forward image 31 and the left sideward image 33 .
  • Xmax indicates a proper upper limit luminance value, which is an upper limit of a proper luminance range
  • Xmin indicates a proper lower limit luminance value, which is a lower limit of the proper luminance range.
  • In FIG. 4, the luminance distribution is shown as a continuous curve.
  • At a given horizontal position of an image, as many pixels as there are rows of the image pickup device are present. Therefore, as many luminance values as there are rows are actually distributed at each horizontal position.
  • FIG. 4 shows, as an example, a luminance distribution of one row along the arrow shown in FIG. 2.
  • Since the respective images 31 , 32 , and 33 are configured by pluralities of rows, if an x axis indicates a horizontal pixel position, a y axis indicates a luminance value, and, further, a z axis indicates a row number, the luminance distribution is a two-dimensional distribution in three-dimensional coordinates.
  • a luminance value of the right sideward image 32 on a right side with respect to the horizontal pixel position Pr exceeds the proper upper limit luminance value Xmax.
  • the right sideward image 32 is in a state close to white exceeding a range in which an observation is easily performed.
  • the forward image 31 between the horizontal pixel position P 1 and the horizontal pixel position Pr is fit within the proper luminance range.
  • a minimum point of a luminance value occurs in a center portion reflecting the fact that the subject OBJ is formed in the luminal shape.
  • a luminance value of the left sideward image 33 on a left side with respect to the horizontal pixel position P 1 is fit within the proper luminance range but is close to the proper lower limit luminance value Xmin.
  • the left sideward image 33 is a relatively dark image portion.
  • a dynamic range of a luminance distribution of the subject is wider than a dynamic range of the proper luminance range.
  • FIG. 5 is a diagram showing an example of a luminance distribution of images obtained in the state shown in FIG. 3 when light adjustment is performed in a peak mode.
  • FIG. 6 is a diagram showing an example of a luminance distribution of images obtained in the state shown in FIG. 3 when light adjustment is performed in an average mode.
  • As methods of adjusting the brightness of images (light adjustment), a peak mode and an average mode are known.
  • the brightness adjustment can be performed by either adjusting illumination light amounts using the diaphragm 22 or shifting a luminance value of the entire images to a bright direction or a dark direction through image processing. It is assumed here that, for example, the brightness adjustment is performed using the diaphragm 22 .
  • the peak mode is a method of adjusting brightness of the images such that a maximum luminance value of the images (the forward image 31 , the right sideward image 32 , and the left sideward image 33 ) coincides with the proper upper limit luminance value, which is the upper limit of the proper luminance range.
  • a luminance distribution of the images obtained by applying the peak mode to the subject having the luminance distribution shown in FIG. 4 is, for example, as shown in FIG. 5 .
  • an upper limit of the luminance distribution is fit within the proper luminance range.
  • a lower limit of the luminance distribution is sometimes smaller than the proper lower limit luminance value Xmin exceeding the proper luminance range as shown in FIG. 5 .
  • a luminance distribution of the images obtained by applying the average mode to the subject having the luminance distribution shown in FIG. 4 is, for example, as shown in FIG. 6 .
  • For a subject whose luminance distribution has a dynamic range wider than the dynamic range of the proper luminance range as shown in FIG. 4, an upper limit of the luminance distribution is sometimes larger than the proper upper limit luminance value Xmax, exceeding the proper luminance range, and, further, a lower limit of the luminance distribution is sometimes smaller than the proper lower limit luminance value Xmin, exceeding the proper luminance range.
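As a rough sketch of the two light-adjustment targets described above, the snippet below computes a multiplicative brightness factor that a diaphragm or global gain could be driven toward; the function name, the mean-brightness target parameter, and the simple proportional update are assumptions, not the patent's control law.

```python
import numpy as np

def light_adjustment_factor(frame: np.ndarray, x_max: float,
                            mean_target: float, mode: str = "peak") -> float:
    """Return a brightness factor for the next frame under the chosen mode."""
    if mode == "peak":
        # Peak mode: bring the maximum luminance of the frame to the proper
        # upper limit luminance value Xmax.
        current, target = frame.max(), x_max
    else:
        # Average mode: bring the mean luminance of the frame to an assumed
        # target average brightness (not a value named in the patent).
        current, target = frame.mean(), mean_target
    return target / max(float(current), 1e-6)
```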
  • Action of the endoscope system in the present embodiment is explained according to flowcharts of FIGS. 10 to 12 with reference to FIGS. 7 to 9 .
  • the endoscope system in the present embodiment does not perform the processing shown in the flowcharts of FIGS. 10 to 12 as after-the-fact processing for separately fitting an endoscopic image acquired in advance within the proper luminance range, but performs it as real-time luminance adjustment, by the video processor 2 , of an endoscopic image acquired by the endoscope 1 .
  • FIG. 10 is a flowchart for explaining the processing of the luminance adjustment by the endoscope system.
  • the brightness-range calculating section 23 reads the proper upper limit luminance value Xmax and the proper lower limit luminance value Xmin set in advance (or set by a user) (step S 1 ).
  • the endoscope system controls the diaphragm 22 to thereby adjust a light amount of illumination light, for example, in the peak mode or the average mode explained above (step S 2 ).
  • the endoscope system radiates the illumination light, the light amount of which is adjusted in step S 2 , from the illuminating sections 15 to 17 on the subject.
  • the endoscope system acquires images of the subject for one frame with the image pickup sections 11 to 13 and the image forming section 20 (step S 3 ).
  • the brightness-range calculating section 23 compares luminance values of respective pixels in the acquired images (the forward image 31 , the right sideward image 32 , and the left sideward image 33 ) to thereby acquire the maximum luminance value Amax and the minimum luminance value Amin (step S 4 ).
  • the image processing section 24 determines whether a dynamic range (Amax-Amin) of the images is larger than a dynamic range (Xmax-Xmin) of the proper luminance range (step S 5 ).
  • When determining in step S 5 that the dynamic range (Amax-Amin) of the images is equal to or smaller than the dynamic range (Xmax-Xmin) of the proper luminance range, the image processing section 24 determines whether the maximum luminance value Amax of the images is larger than the proper upper limit luminance value Xmax (step S 6 ).
  • When determining that the maximum luminance value Amax is larger than the proper upper limit luminance value Xmax, the image processing section 24 shifts a luminance value of the entire images downward to thereby perform image processing such that a luminance dynamic range of the entire images is fit within the proper luminance range (step S 7 ).
  • When determining in step S 6 that the maximum luminance value Amax of the images is equal to or smaller than the proper upper limit luminance value Xmax, the image processing section 24 further determines whether the minimum luminance value Amin of the images is smaller than the proper lower limit luminance value Xmin (step S 8 ).
  • When determining that the minimum luminance value Amin is smaller than the proper lower limit luminance value Xmin, the image processing section 24 shifts the luminance value of the entire images upward to thereby perform image processing such that the luminance dynamic range of the entire images is fit within the proper luminance range (step S 9 ).
  • the image processing section 24 performs only a luminance shift with which a minimum luminance value is equal to or larger than the proper lower limit luminance value Xmin and a maximum luminance value is equal to or smaller than the proper upper limit luminance value Xmax.
  • On the other hand, when determining in step S 5 that (Amax−Amin)>(Xmax−Xmin), the image processing section 24 determines whether the maximum luminance value Amax of the images is larger than the proper upper limit luminance value Xmax (step S 10 ).
  • When determining that the maximum luminance value Amax is equal to or smaller than the proper upper limit luminance value Xmax, the image processing section 24 further determines whether the minimum luminance value Amin of the images is smaller than the proper lower limit luminance value Xmin (step S 11 ).
  • When determining in step S 10 that the maximum luminance value Amax of the images is larger than the proper upper limit luminance value Xmax or when determining in step S 11 that the minimum luminance value Amin of the images is smaller than the proper lower limit luminance value Xmin, the image processing section 24 generates a gradation conversion curve shown in FIG. 8 (step S 12 ).
  • FIG. 8 is a diagram showing an example of a gradation conversion curve for fitting the images within the proper luminance range.
  • the gradation conversion curve is a conversion curve for converting the inputted minimum luminance value Amin into the proper lower limit luminance value Xmin, converting the inputted maximum luminance value Amax into the proper upper limit luminance value Xmax, and, further, converting an input luminance value between the minimum luminance value Amin and the maximum luminance value Amax into an output luminance value between the proper lower limit luminance value Xmin and the proper upper limit luminance value Xmax that increases monotonically with the input luminance value. Consequently, the image processing section 24 is capable of performing the gradation conversion that does not invert a magnitude relation among the luminance values of the pixels forming the first image and the second images.
  • the gradation conversion may be performed by, for example, table reference instead of using equation 1.
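The exact "equation 1" is not reproduced in this text; as one possible monotonically increasing curve satisfying the description above, a linear mapping from [Amin, Amax] to [Xmin, Xmax] can be precomputed as a lookup table ("table reference"). The sketch below, including the function name and the 10-bit default, is an illustrative assumption.

```python
import numpy as np

def make_gradation_lut(a_min: int, a_max: int, x_min: int, x_max: int,
                       bits: int = 10) -> np.ndarray:
    """Build a lookup table over the full luminance dynamic range (0..1023 for
    a 10-bit signal) that maps [Amin, Amax] onto [Xmin, Xmax]."""
    levels = 1 << bits
    y_in = np.arange(levels, dtype=np.float64)
    # Linear, monotonically increasing curve; values outside [Amin, Amax] are
    # clipped so the output stays within the proper luminance range.
    y_out = x_min + (y_in - a_min) * (x_max - x_min) / max(a_max - a_min, 1)
    return np.clip(y_out, x_min, x_max).astype(np.uint16)
```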
  • the image processing section 24 performs the gradation conversion using the generated gradation conversion curve as explained below with reference to FIG. 12 (step S 13 ). Consequently, a luminance distribution before the gradation conversion indicated by an alternate long and short dash line in FIG. 7 changes to a luminance distribution after the gradation conversion indicated by a solid line. That is, the luminance distribution is fit within the proper luminance range of the luminance value equal to or larger than Xmin and equal to or smaller than Xmax.
  • FIG. 7 is a diagram showing a situation of a change in the luminance distribution by the gradation conversion.
  • When the processing in step S 7 , step S 9 , or step S 13 is performed or when it is determined in step S 8 or step S 11 that the minimum luminance value Amin of the images is equal to or larger than the proper lower limit luminance value Xmin (i.e., the images are fit within the proper luminance range even if the luminance adjustment is not performed), the endoscope system performs the other image processing with the image processing section 24 , generates a display signal with the output section 25 , and displays the display signal on the monitor 3 (step S 14 ).
  • the endoscope system determines whether to finish the processing (step S 15 ). When determining not to finish the processing, the endoscope system returns to step S 2 and performs the processing explained above. When determining to finish the processing, the endoscope system ends the processing.
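The decision logic of FIG. 10 after Amax and Amin have been measured can be summarized by the sketch below; it is a simplified assumption (excluded regions and the diaphragm control are omitted, and a linear conversion stands in for the curve of FIG. 8), not the patent's exact processing.

```python
import numpy as np

def adjust_luminance(frame: np.ndarray, a_min: float, a_max: float,
                     x_min: float, x_max: float) -> np.ndarray:
    """Fit a frame's luminance within [Xmin, Xmax] by shift or gradation conversion."""
    if (a_max - a_min) > (x_max - x_min):
        # Dynamic range of the images exceeds the proper luminance range:
        # gradation conversion is applied when either limit is exceeded (steps S10-S13).
        if a_max > x_max or a_min < x_min:
            scale = (x_max - x_min) / max(a_max - a_min, 1e-6)
            return np.clip(x_min + (frame - a_min) * scale, x_min, x_max)
        return frame
    # Dynamic range already fits: a simple luminance shift suffices.
    if a_max > x_max:
        return frame + (x_max - a_max)   # shift downward (step S7)
    if a_min < x_min:
        return frame + (x_min - a_min)   # shift upward (step S9)
    return frame                          # already within the proper range
```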
  • FIG. 11 is a flowchart for explaining details of the acquisition processing for the maximum luminance value Amax and the minimum luminance value Amin in step S 4 in FIG. 10 .
  • the processing is performed mainly by the brightness-range calculating section 23 .
  • the brightness-range calculating section 23 initializes the maximum luminance value Amax and the minimum luminance value Amin (step S 21 ). More specifically, the brightness-range calculating section 23 substitutes 0, which is a dynamic range minimum luminance value that can be taken as a luminance value, in the maximum luminance value Amax and substitutes a dynamic range maximum luminance value Ymax (i.e., a maximum value of a luminance dynamic range and is 1023 in the case of a 10-bit signal of 0 to 1023), which can be taken as a luminance value, in the minimum luminance value Amin.
  • the brightness-range calculating section 23 reads an upper threshold Yu (satisfying Yu>Xmax) and a lower threshold Yd (satisfying Yd<Xmin) indicating a luminance range of an excluded region set in advance (or set by the user) (step S 22 ).
  • the excluded region is a region excluded from a target for which the maximum luminance value Amax and the minimum luminance value Amin are calculated (i.e., both of the maximum luminance value Amax and the minimum luminance value Amin are not calculated from pixels in the excluded region) and is a region excluded from a target of the gradation conversion to be performed later.
  • In the display example shown in FIG. 2, the images 31 to 33 are arrayed without gaps.
  • gaps in which images are not displayed sometimes occur among the images 31 to 33 (see a gap 35 in FIG. 15 ) depending on arrangement and structure (see FIG. 14 ) of the image pickup sections 11 to 13 .
  • Such gaps and the like are directly displayed without performing the gradation conversion. Therefore, the gaps are the excluded regions.
  • a white void region and a black solid region in the images do not change to images in which details can be observed even if the gradation conversion is performed such that the regions are fit within the proper luminance range. Moreover, the gradation width of the entire images would be narrowed, making the images difficult to observe. Therefore, the white void region and the black solid region are also treated as excluded regions.
  • processing for setting, as the excluded regions, pixels having luminance values equal to or larger than a predetermined upper threshold Yu indicating a white void and pixels having luminance values equal to or smaller than a predetermined lower threshold Yd indicating a black solid is performed as explained below.
  • FIG. 9 is a diagram showing an example of a luminance distribution obtained when gaps of black display are present between the forward image 31 and the right sideward image 32 and between the forward image 31 and the left sideward image 33 .
  • a vicinity of the horizontal pixel position P 1 and a vicinity of the horizontal pixel position Pr are the gaps of the black display.
  • the luminance value is equal to or smaller than the lower threshold Yd.
  • the brightness-range calculating section 23 reads a luminance value Y of one pixel from the images (the forward image 31 , the right sideward image 32 , and the left sideward image 33 ) in appropriate order, for example, order of raster scan (step S 23 ) and determines whether the read luminance value Y is equal to or smaller than the lower threshold Yd (step S 24 ).
  • When determining that the luminance value Y is larger than the lower threshold Yd, the brightness-range calculating section 23 determines whether the luminance value Y is equal to or larger than the upper threshold Yu (step S 25 ).
  • When determining that the luminance value Y is smaller than the upper threshold Yu, the brightness-range calculating section 23 determines whether the luminance value Y is smaller than the minimum luminance value Amin set at present (step S 26 ).
  • When determining that the luminance value Y is smaller than the minimum luminance value Amin, the brightness-range calculating section 23 substitutes the luminance value Y in the minimum luminance value Amin (step S 27 ).
  • When determining that the luminance value Y is equal to or larger than the minimum luminance value Amin, the brightness-range calculating section 23 determines whether the luminance value Y is larger than the maximum luminance value Amax set at present (step S 28 ).
  • When determining that the luminance value Y is larger than the maximum luminance value Amax, the brightness-range calculating section 23 substitutes the luminance value Y in the maximum luminance value Amax (step S 29 ).
  • When determining in step S 24 that the luminance value Y is equal to or smaller than the lower threshold Yd, when determining in step S 25 that the luminance value Y is equal to or larger than the upper threshold Yu, when determining in step S 28 that the luminance value Y is equal to or smaller than the maximum luminance value Amax, when performing the processing in step S 27 , or when performing the processing in step S 29 , the brightness-range calculating section 23 determines whether the processing for all the pixels included in the images (the forward image 31 , the right sideward image 32 , and the left sideward image 33 ) has finished (step S 30 ).
  • When determining that the processing has not finished for all the pixels, the brightness-range calculating section 23 shifts to step S 23 and performs the processing explained above concerning the next unprocessed pixel.
  • When the processing has finished for all the pixels, the brightness-range calculating section 23 returns from this processing to the processing shown in FIG. 10.
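A compact, vectorized sketch of the scan in FIG. 11 is shown below, assuming the frame is a numpy luminance image; the fallback return value when every pixel falls in the excluded region is an assumption, since the patent does not specify that case here.

```python
import numpy as np

def measure_luminance_range(frame: np.ndarray, y_d: float, y_u: float):
    """Return (Amin, Amax) over all pixels, ignoring pixels at or below the
    lower threshold Yd (black solid / gaps) and at or above the upper
    threshold Yu (white void), as in steps S21 to S30."""
    valid = (frame > y_d) & (frame < y_u)   # pixels outside the excluded region
    if not valid.any():
        return None, None                    # assumed fallback, not from the patent
    return float(frame[valid].min()), float(frame[valid].max())
```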
  • FIG. 12 is a flowchart showing details of the gradation conversion processing in step S 13 in FIG. 10 .
  • the processing is performed mainly by the image processing section 24 .
  • the image processing section 24 reads the luminance value Y of one pixel from the respective images (the forward image 31 , the right sideward image 32 , and the left sideward image 33 ) in appropriate order, for example, order of raster scan (step S 41 ) and determines whether the read pixel is a pixel in the excluded region (step S 42 ).
  • the image processing section 24 may perform the determination of the excluded region by comparing the luminance value Y of the read pixel with the lower threshold Yd and the upper threshold Yu again.
  • the image processing section 24 may store a pixel position concerning the pixel determined as being in the excluded region in the processing shown in FIG. 11 and performs the determination of the excluded region by determining whether a position of the pixel read in step S 41 coincides with the stored pixel position of the excluded region.
  • the pixel position, which is the excluded region is stored in the latter case, it is possible to extend and apply the determination when pixels forming boundaries between the first image and the second images are pixels having luminance values other than white pixels and black pixels, for example, gray pixels.
  • When the pixel position of the excluded region is stored in advance, it is possible to exclude the pixel position from the target of the gradation conversion in the determination in step S 42 .
  • the image processing section 24 performs adjustment of the luminance values if the images are continuous (a gap, which is a non-image portion, is absent between the images). However, the image processing section 24 does not perform the adjustment of the luminance value, for example, when the gap is present between the images.
  • the image processing section 24 subjects the pixel to the gradation conversion according to the gradation conversion curve shown in FIG. 8 (step S 43 ).
  • the image processing section 24 determines whether the processing has finished concerning all pixels included in the images (the forward image 31 , the right sideward image 32 , and the left sideward image 33 ) (step S 44 ).
  • the image processing section 24 shifts to step S 41 and performs the processing concerning the next unprocessed pixel as explained above.
  • the image processing section 24 returns from the processing to the processing shown in FIG. 10 .
  • the image processing section 24 does not perform the adjustment of the luminance value concerning pixels having luminance values equal to or larger than the upper threshold Yu and pixels having luminance values equal to or smaller than the lower threshold Yd.
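Similarly, the per-pixel gradation conversion of steps S 41 to S 44 could be sketched as below. The excluded-region test shown here is the threshold-comparison variant, and convert_gradation stands in for the conversion along the gradation conversion curve of FIG. 8 ; all names are illustrative assumptions rather than terms defined in the patent.

    def apply_gradation_conversion(luminance_values, lower_threshold_yd, upper_threshold_yu, convert_gradation):
        """Apply the gradation conversion to every pixel except those in the
        excluded regions (luminance <= Yd or >= Yu), which are left unchanged."""
        output = []
        for y in luminance_values:                   # step S 41: read one pixel in raster order
            if y <= lower_threshold_yd or y >= upper_threshold_yu:
                output.append(y)                     # step S 42: excluded region, keep as-is
            else:
                output.append(convert_gradation(y))  # step S 43: convert along the curve
        return output                                # step S 44: all pixels processed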
  • FIG. 13 is a diagram showing an example in which the luminance distribution of the images obtained in the state shown in FIG. 3 is subjected to the gradation conversion for each of the images in respective directions of fields of view.
  • the gradation conversion explained above may be individually performed on necessary images among a plurality of images.
  • Concerning an image on which the gradation conversion applied to the necessary images is not performed, an input luminance value may be directly outputted as an output luminance value.
  • necessary images among the first image (e.g., the forward image 31 ) and the second images (e.g., the right sideward image 32 and the left sideward image 33 ) may be individually subjected to the gradation conversion such that a luminance range from a minimum luminance value to a maximum luminance value is fit within a predetermined luminance range.
  • FIG. 14 is a diagram partially showing an internal configuration of the endoscope 1 in the first modification.
  • FIG. 15 is a diagram showing a display example on the monitor 3 of an image obtained from the endoscope 1 in the first modification.
  • the distal end portion of the insertion section 1 a of the endoscope 1 is provided with a compound optical system 41 that transmits and acquires a forward optical image (an image of a field of view in a region including a forward direction in a subject) and reflects and acquires an optical image (an image of a field of view in a region including a sideward direction different from the forward direction in the same subject) in a predetermined angle range (an entire circumference in a circumferential direction when the predetermined angle range is 360°) in the circumferential direction in the sideward direction and an imaging optical system 42 that forms an optical image from the compound optical system 41 on an image pickup device 43 explained below.
  • a first subject-image acquiring section is configured by portions that form a forward optical image of the compound optical system 41 and the imaging optical system 42 and a portion that picks up the forward optical image of the image pickup device 43 .
  • the first subject-image acquiring section is disposed to be directed forward at the distal end portion of the insertion section 1 a.
  • a second subject-image acquiring section is configured by portions that form the sideward optical image of the compound optical system 41 and the imaging optical system 42 and a portion that picks up the sideward optical image of the image pickup device 43 .
  • the second subject-image acquiring section is disposed in a circumferential surface section of the insertion section 1 a to be capable of picking up a subject image in a predetermined angle range in the circumferential direction.
  • both of the forward optical image and the sideward optical image in the circumferential direction are formed in different image pickup regions on the same image pickup device 43 and image pickup signals are generated.
  • the first subject-image acquiring section and the second subject-image acquiring section share and include one image pickup section (the image pickup device 43 ).
  • An optical image (a first subject image) of a subject present in a first field of view is formed in a part of the image pickup section and a first image pickup signal is generated.
  • An optical image (a second subject image) of a subject present in a second field of view is formed in another part of the image pickup section and a second image pickup signal is generated.
  • the image pickup signals generated by the image pickup device 43 are outputted to the video processor 2 via the signal line 14 .
  • An image including a first image and second images is formed by the image forming section 20 explained above in the video processor 2 and is processed by the brightness-range calculating section 23 and the image processing section 24 (i.e., the image pickup section is electrically connected to the image forming section 20 ).
  • a distal end side of the light guide 18 branches.
  • One branch is connected to the illuminating section 15 that radiates light in a forward direction.
  • Another branch is connected to the illuminating section 16 that radiates light, for example, in a right sideward direction.
  • Still another branch is connected to the illuminating section 17 that radiates light in a left sideward direction. Note that, in a configuration in the modification, a subject image in the circumferential direction is picked up concerning a sideward direction. Therefore, an illuminating section that radiates light in an upper sideward direction, an illuminating section that radiates light in a lower sideward direction, an illuminating section that radiates light in another sideward direction, or the like may be provided.
  • An image obtained from the endoscope 1 having such a configuration is displayed on the screen 30 of the monitor 3 , for example, as shown in FIG. 15 .
  • the forward optical image acquired from the compound optical system 41 is formed by the imaging optical system 42 as a circular optical image in a center of the image pickup device 43 .
  • a first image pickup signal of a forward field of view is generated.
  • a circular forward image 31 A is formed by the image forming section 20 on the basis of the first image pickup signal of the forward field of view.
  • the sideward optical image in the circumferential direction acquired by the compound optical system 41 is formed by the imaging optical system 42 as an optical image in a predetermined angle range (an annular optical image when the predetermined angle range is 360°) in a ring surrounding the circular optical image in the center explained above in the image pickup device 43 .
  • a second image pickup signal in a sideward field of view is generated.
  • an annular sideward image 32 A in an outer circumferential section of the forward image 31 A is formed by the image forming section 20 on the basis of the second image pickup signal in the sideward field of view.
  • the gap 35 occurs between the forward image 31 A and the sideward image 32 A because of a configuration, disposition, and the like of the compound optical system 41 .
  • the gap 35 is a dark portion in which an optical image of a subject is not formed on the image pickup device 43 . Therefore, the gap 35 assumes a black frame shape. Therefore, as explained above with reference to FIG. 9 , the gap 35 is an excluded region excluded from a target on which the gradation conversion is performed.
  • the image pickup device 43 includes a rectangular image pickup surface.
  • image circles of the compound optical system 41 and the imaging optical system 42 are regions surrounded by a circle smaller than the image pickup surface as explained with reference to FIG. 15 .
  • a size and a shape of an image circle on the image pickup device 43 are known as design values once a configuration of an optical system is determined. Therefore, when the image pickup device 43 is an imager such as a CMOS capable of reading out a pixel at a desired pixel position, an increase in speed of the readout may be achieved by reading out only the pixels in the image circle when readout from the image pickup device 43 is performed. Consequently, it is possible to improve a frame rate of an image and reduce power consumption during the readout.
  • readout from the image pickup device 43 may not be performed concerning pixels corresponding to the gap 35 to achieve a further increase in the speed of the readout.
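As an illustration of restricting readout to the image circle, the sketch below computes, for each row of the image pickup surface, the horizontal window of columns that lies inside a circle of known center and radius; only these windows would then be read out. The function and its parameters are assumptions made for this sketch, since the patent only states that the size and shape of the image circle are known from the design values.

    import math

    def readout_windows_per_row(width, height, cx, cy, radius):
        """For each sensor row, return the (start, end) column range lying inside
        the image circle, or None when the row is entirely outside the circle and
        need not be read out at all."""
        windows = []
        for row in range(height):
            dy = row - cy
            if abs(dy) > radius:
                windows.append(None)                     # row entirely outside the image circle
                continue
            half = math.sqrt(radius * radius - dy * dy)  # half-width of the circle at this row
            start = max(0, int(math.ceil(cx - half)))
            end = min(width - 1, int(math.floor(cx + half)))
            windows.append((start, end) if start <= end else None)
        return windows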
  • the image forming section 20 forms the forward image 31 A as a first image in a circular shape, forms the sideward image 32 A as a second image in a shape of a predetermined angle range in a ring surrounding the forward image 31 A, and forms one image.
  • necessary images among the first image and the second images may be individually subjected to the gradation conversion.
  • When the gradation conversion is individually performed on the first image and the second images, it is sufficient to perform processing for disassembling the image (the image signal) by respectively cutting out a region of the forward image (the first image) and regions of the sideward images (the second images), performing, on each of them, gradation conversion processing of the same kind as the gradation conversion processing explained in the first embodiment, and combining the processed forward image (the first image) and sideward images (the second images) to form the original one image.
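Organizationally, the disassemble, convert, and combine processing described in the item above might look like the sketch below, where region masks identify the forward image region and the sideward image regions and convert_region applies the same kind of gradation conversion to one region at a time. The dictionary-based data layout and the names are assumptions made for illustration only.

    def convert_regions_individually(image, region_masks, convert_region):
        """Cut out each region (the first image and the second images) from one
        combined image, convert each region individually, and combine the results
        back into a single image.
        `image` maps a pixel index to its luminance; each mask is a set of indices."""
        output = dict(image)                        # start from the original combined image
        for mask in region_masks:                   # one mask per forward/sideward region
            region = {i: image[i] for i in mask}    # disassemble: cut out the region
            converted = convert_region(region)      # individual gradation conversion
            output.update(converted)                # combine back into one image
        return output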
  • FIG. 16 is a perspective view partially showing a configuration of an endoscope in the second modification in a state in which a sideward-image acquiring unit 50 is mounted.
  • FIG. 17 is a perspective view partially showing the configuration of the endoscope in the second modification in a state in which the sideward-image acquiring unit 50 is detached.
  • the endoscope in the modification includes, as shown in FIGS. 16 and 17 , an endoscope main body 1 A and the sideward-image acquiring unit 50 .
  • the endoscope main body 1 A includes the image pickup section 11 that acquires a first image pickup signal related to the forward image 31 , the illuminating section 15 that radiates light in a forward direction to an image pickup range by the image pickup section 11 , and a forceps channel 19 through which a treatment instrument such as forceps is inserted.
  • the endoscope main body 1 A can be used as a general front-view type endoscope as well.
  • the sideward-image acquiring unit 50 is detachably attached to the endoscope main body 1 A.
  • the sideward-image acquiring unit 50 includes the image pickup section 12 that acquires a second image pickup signal related to the right sideward image 32 , the illuminating section 16 that radiates light to a right sideward direction to an image pickup range by the image pickup section 12 , the image pickup section 13 that acquires a second image pickup signal related to the left sideward image 33 , the illuminating section 17 that radiates light in a left sideward direction to an image pickup range by the image pickup section 13 , a fitting arm section 51 that fits in the endoscope main body 1 A to attach the sideward-image acquiring unit 50 to the endoscope main body 1 A, and a locking band 52 for locking a cord on a proximal end side of the sideward-image acquiring unit 50 to the endoscope main body 1 A.
  • the embodiment explained above is also applicable to an endoscope configured by combining the front-view endoscope main body 1 A and the side-view sideward-image acquiring unit 50 detachably attachable to the endoscope main body 1 A.
  • FIG. 18 is a diagram showing a modification of an endoscope system in which the illuminating sections 15 to 17 are configured by light emitting elements.
  • the illuminating sections 15 to 17 shown in FIG. 18 are configured using light emitting elements such as light emitting diodes.
  • a light-source control section 26 for adjusting light amounts of the illuminating sections 15 to 17 formed by the light emitting elements according to, for example, current control or pulse width modulation (PWM) is provided.
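As a simple illustration of the light-amount adjustment, a light-source control routine could translate a requested light amount into a PWM duty cycle as sketched below. The 0-to-1 duty representation and the clamping are assumptions of this sketch and not a specification taken from the patent.

    def light_amount_to_pwm_duty(requested_amount, max_amount):
        """Convert a requested light amount into a PWM duty cycle in [0.0, 1.0]
        with which the light-source control section could drive a light emitting element."""
        if max_amount <= 0:
            raise ValueError("max_amount must be positive")
        duty = requested_amount / max_amount
        return min(max(duty, 0.0), 1.0)  # clamp to the valid duty range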
  • the images in the plurality of fields of view are subjected to the gradation conversion such that the luminance range is fit within the predetermined luminance range (e.g., the proper luminance range). Therefore, it is possible to observe an image in a proper brightness range without spoiling shading and three-dimensional appearance of the image.
  • the gradation conversion for not reversing the magnitude relation among the luminance values of the pixels forming the first image and the second images is performed. Therefore, it is possible to substantially accurately maintain shading and three-dimensional appearance during image acquisition.
  • the gradation conversion is performed according to the gradation conversion curve for converting the minimum luminance value into the proper lower limit luminance value, converting the maximum luminance value into the proper upper limit luminance value, and converting a luminance value between the minimum luminance value and the maximum luminance value into a luminance value between the proper lower limit luminance value and the proper upper limit luminance value with a monotonic increase with respect to the input luminance value. Therefore, it is possible to fit the luminance range within the proper luminance range while maintaining the shading and the three-dimensional appearance without performing a complicated arithmetic operation.
  • Since equation 1 is a linear function, there is an advantage that a processing load of the arithmetic operation is light.
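Equation 1 itself is not reproduced in this excerpt. A linear gradation conversion of the kind described, mapping the range from the minimum luminance value Amin to the maximum luminance value Amax onto the proper luminance range, could be sketched as follows; the symbols b_d and b_u for the proper lower limit and proper upper limit luminance values are assumptions of this sketch, and such a function could serve as the convert_gradation placeholder used in the earlier sketch.

    def convert_linear(y_in, a_min, a_max, b_d, b_u):
        """Linearly map a luminance value from [Amin, Amax] onto the proper
        luminance range [Bd, Bu]; the mapping is monotonically increasing, so the
        magnitude relation among input luminance values is preserved."""
        if a_max == a_min:
            return b_d  # degenerate range: map every pixel to the proper lower limit
        return b_d + (y_in - a_min) * (b_u - b_d) / (a_max - a_min)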
  • the adjustment of the luminance values is not performed concerning the pixels having the luminance values equal to or larger than the predetermined upper threshold Yu and the pixels having the luminance values equal to or smaller than the lower threshold Yd. Therefore, it is possible to give a wide dynamic range to pixels effective for the observation of the subject.
  • the adjustment of the luminance values is not performed concerning the pixels forming the boundaries between the first image and the second images adjacent to each other. Therefore, the gap 35 or the like between the images is maintained at a constant luminance value. The luminance value of the gap 35 or the like does not change for each of the images. Therefore, a clear image is obtained.
  • the images of the forward field of view and the sideward fields of view of the insertion section 1 a are acquired. Therefore, it is possible to realize a super-wide angle endoscope without using an expensive and large super-wide angle lens.
  • the image output section generates the display signal from the image (the image signal) processed by the image processing section. Therefore, it is possible to display and observe the image on the monitor 3 or the like.
  • the plurality of second subject-image acquiring sections disposed in the plurality of angle positions in the circumferential direction of the insertion section 1 a acquire the plurality of second image pickup signals.
  • the image forming section 20 forms, for example, as shown in FIG. 2 , the image in which the first image is present in the center and the plurality of second images are respectively arranged in the plurality of angle positions in the circumferential direction of the first image. Therefore, it is possible to observe an image coinciding with a direction of field of view during image pickup.
  • the first image pickup signal related to the forward subject image of the insertion section 1 a and the second image pickup signal related to the subject image in the predetermined angle range in the circumferential direction of the insertion section 1 a are acquired.
  • the image forming section 20 forms the first image in the circular shape on the basis of the first image pickup signal related to the forward subject image, forms the second images in the shape of the predetermined angle range in the ring surrounding the first image on the basis of the second image pickup signal related to the subject image in the circumferential direction, and forms the image signal (the image).
  • In the endoscope system including the wide angle endoscope including the front-view observation optical system and the side-view observation optical system, it is possible to observe an image in a proper brightness range without spoiling shading and three-dimensional appearance of the image.
  • the first subject-image acquiring section and the second subject-image acquiring section include, for example, the image pickup optical systems and the image pickup devices.
  • the image pickup optical systems may be disposed in the endoscope 1 (or the endoscope main body 1 A and the sideward-image acquiring unit 50 ).
  • the image pickup devices may be disposed in the video processor 2 . In this case, optical images formed by the image pickup optical systems only have to be transmitted to the image pickup devices in the video processor 2 via a transmission optical system or the like.
  • the endoscope system is mainly explained above.
  • the present invention may be an actuating method for actuating the endoscope system as explained above or may be a processing program for causing a computer to actuate the endoscope system as explained above, a computer-readable non-transitory recording medium that records the processing program, or the like.
  • the present invention is not limited to the embodiment explained above per se.
  • the constituent elements can be modified and embodied in a range not departing from the spirit of the present invention.
  • Various inventions can be formed by appropriate combinations of the plurality of constituent elements disclosed in the embodiment. For example, several constituent elements may be deleted from all the constituent elements described in the embodiment. Further, the constituent elements described in different embodiments may be combined as appropriate. In this way, it goes without saying that various modifications and applications are possible within a range not departing from the spirit of the invention.

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Surgery (AREA)
  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Optics & Photonics (AREA)
  • General Health & Medical Sciences (AREA)
  • Radiology & Medical Imaging (AREA)
  • Medical Informatics (AREA)
  • Veterinary Medicine (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Molecular Biology (AREA)
  • Animal Behavior & Ethology (AREA)
  • Biophysics (AREA)
  • Public Health (AREA)
  • Pathology (AREA)
  • Signal Processing (AREA)
  • Multimedia (AREA)
  • General Physics & Mathematics (AREA)
  • Astronomy & Astrophysics (AREA)
  • Theoretical Computer Science (AREA)
  • Computing Systems (AREA)
  • Endoscopes (AREA)
  • Instruments For Viewing The Inside Of Hollow Bodies (AREA)
  • Closed-Circuit Television Systems (AREA)
US15/448,764 2014-09-08 2017-03-03 Endoscope system that adjusts luminance of frame image including images of a pluraliry of regions and actuating method for endoscope system Active 2036-10-14 US10602916B2 (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JP2014-182417 2014-09-08
JP2014182417 2014-09-08
PCT/JP2015/075215 WO2016039269A1 (ja) 2014-09-08 2015-09-04 内視鏡システム、内視鏡システムの作動方法

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2015/075215 Continuation WO2016039269A1 (ja) 2014-09-08 2015-09-04 内視鏡システム、内視鏡システムの作動方法

Publications (2)

Publication Number Publication Date
US20170172392A1 US20170172392A1 (en) 2017-06-22
US10602916B2 true US10602916B2 (en) 2020-03-31

Family

ID=55459017

Family Applications (1)

Application Number Title Priority Date Filing Date
US15/448,764 Active 2036-10-14 US10602916B2 (en) 2014-09-08 2017-03-03 Endoscope system that adjusts luminance of frame image including images of a pluraliry of regions and actuating method for endoscope system

Country Status (3)

Country Link
US (1) US10602916B2 (ja)
JP (1) JP6121058B2 (ja)
WO (1) WO2016039269A1 (ja)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111626958A (zh) * 2020-05-27 2020-09-04 重庆紫光华山智安科技有限公司 曝光调节方法、装置、计算机可读存储介质和电子设备

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6022109B2 (ja) * 2014-07-28 2016-11-09 オリンパス株式会社 内視鏡システム
JP6580778B2 (ja) * 2016-03-30 2019-09-25 富士フイルム株式会社 内視鏡画像信号処理装置およびプログラム
CN107726053B (zh) * 2016-08-12 2020-10-13 通用电气公司 探头系统和检测方法
EP3649913A4 (en) * 2017-08-03 2020-07-08 Sony Olympus Medical Solutions Inc. MEDICAL OBSERVATION DEVICE

Citations (27)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5940126A (en) * 1994-10-25 1999-08-17 Kabushiki Kaisha Toshiba Multiple image video camera apparatus
JP2001290103A (ja) 2001-02-13 2001-10-19 Olympus Optical Co Ltd 観察システム
JP2002017667A (ja) 1991-03-11 2002-01-22 Olympus Optical Co Ltd 画像処理装置
US20030020974A1 (en) * 2001-06-11 2003-01-30 Yuki Matsushima Image processing apparatus, image processing method and information recording medium
US6694051B1 (en) * 1998-06-24 2004-02-17 Canon Kabushiki Kaisha Image processing method, image processing apparatus and recording medium
US20050128539A1 (en) * 2003-05-09 2005-06-16 Konica Minolta Photo Imaging, Inc. Image processing method, image processing apparatus and image recording apparatus
EP1548646A2 (en) 2003-12-26 2005-06-29 Canon Kabushiki Kaisha Adjustment of dynamic range of image
US20050141002A1 (en) * 2003-12-26 2005-06-30 Konica Minolta Photo Imaging, Inc. Image-processing method, image-processing apparatus and image-recording apparatus
JP2006033520A (ja) 2004-07-16 2006-02-02 Sony Corp 画像処理装置、画像出力装置、画像処理方法、プログラム及び記録媒体
US20060023273A1 (en) * 2004-07-30 2006-02-02 Casio Computer Co., Ltd. Image pickup device with brightness correcting function and method of correcting brightness of image
US20060215908A1 (en) * 2005-03-24 2006-09-28 Konica Minolta Holdings, Inc. Image pickup apparatus and image processing method
US20070040914A1 (en) * 2005-08-16 2007-02-22 Konica Minolta Holdings, Inc. Image sensing apparatus and image processing method
JP2007190060A (ja) 2006-01-17 2007-08-02 Olympus Corp 内視鏡装置
US20090073287A1 (en) * 2007-09-18 2009-03-19 Olympus Corporation Image capturing device
US20090073469A1 (en) * 2007-09-14 2009-03-19 Canon Kabushiki Kaisha Color image forming apparatus and color adjustment method
WO2011055613A1 (ja) 2009-11-06 2011-05-12 オリンパスメディカルシステムズ株式会社 内視鏡システム
JP4782900B2 (ja) 2009-11-06 2011-09-28 オリンパスメディカルシステムズ株式会社 内視鏡
US20110267542A1 (en) * 2010-04-30 2011-11-03 Canon Kabushiki Kaisha Image processing apparatus and control method thereof
JP2012157577A (ja) 2011-02-01 2012-08-23 Olympus Medical Systems Corp 内視鏡
US20120253121A1 (en) * 2011-03-31 2012-10-04 Ryou Kitano Electronic endoscope
JP2013066648A (ja) 2011-09-26 2013-04-18 Olympus Corp 内視鏡用画像処理装置及び内視鏡装置
JP2013542467A (ja) 2010-10-28 2013-11-21 エンドチョイス イノベーション センター リミテッド マルチセンサ内視鏡のための光学系
US20140330078A1 (en) * 2013-05-03 2014-11-06 Samsung Electronics Co., Ltd. Endoscope and image processing apparatus using the same
US9019433B2 (en) * 2006-08-31 2015-04-28 Canon Kabushiki Kaisha Image processing apparatus and method
US20150356904A1 (en) * 2014-06-05 2015-12-10 Canon Kabushiki Kaisha Image processing apparatus and image processing method
US20160015258A1 (en) * 2014-07-21 2016-01-21 Endochoice, Inc. Multi-Focal, Multi-Camera Endoscope Systems
US20160054969A1 (en) * 2014-08-21 2016-02-25 Canon Kabushiki Kaisha Display control apparatus controlling gradation characteristics of display apparatus, display system, and display control method

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006115964A (ja) * 2004-10-20 2006-05-11 Fujinon Corp 電子内視鏡装置
JP2009100936A (ja) * 2007-10-23 2009-05-14 Fujinon Corp 画像処理装置
JP5762344B2 (ja) * 2012-03-28 2015-08-12 富士フイルム株式会社 画像処理装置及び内視鏡システム

Patent Citations (30)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002017667A (ja) 1991-03-11 2002-01-22 Olympus Optical Co Ltd 画像処理装置
US5940126A (en) * 1994-10-25 1999-08-17 Kabushiki Kaisha Toshiba Multiple image video camera apparatus
US6694051B1 (en) * 1998-06-24 2004-02-17 Canon Kabushiki Kaisha Image processing method, image processing apparatus and recording medium
JP2001290103A (ja) 2001-02-13 2001-10-19 Olympus Optical Co Ltd 観察システム
US20030020974A1 (en) * 2001-06-11 2003-01-30 Yuki Matsushima Image processing apparatus, image processing method and information recording medium
US20050128539A1 (en) * 2003-05-09 2005-06-16 Konica Minolta Photo Imaging, Inc. Image processing method, image processing apparatus and image recording apparatus
EP1548646A2 (en) 2003-12-26 2005-06-29 Canon Kabushiki Kaisha Adjustment of dynamic range of image
US20050141780A1 (en) * 2003-12-26 2005-06-30 Canon Kabushiki Kaisha Image processing apparatus, image processing method, program, and storage medium
US20050141002A1 (en) * 2003-12-26 2005-06-30 Konica Minolta Photo Imaging, Inc. Image-processing method, image-processing apparatus and image-recording apparatus
JP2005192154A (ja) 2003-12-26 2005-07-14 Canon Inc 画像処理装置、画像処理方法、プログラム及び記憶媒体
JP2006033520A (ja) 2004-07-16 2006-02-02 Sony Corp 画像処理装置、画像出力装置、画像処理方法、プログラム及び記録媒体
US20060023273A1 (en) * 2004-07-30 2006-02-02 Casio Computer Co., Ltd. Image pickup device with brightness correcting function and method of correcting brightness of image
US20060215908A1 (en) * 2005-03-24 2006-09-28 Konica Minolta Holdings, Inc. Image pickup apparatus and image processing method
US20070040914A1 (en) * 2005-08-16 2007-02-22 Konica Minolta Holdings, Inc. Image sensing apparatus and image processing method
JP2007190060A (ja) 2006-01-17 2007-08-02 Olympus Corp 内視鏡装置
US9019433B2 (en) * 2006-08-31 2015-04-28 Canon Kabushiki Kaisha Image processing apparatus and method
US20090073469A1 (en) * 2007-09-14 2009-03-19 Canon Kabushiki Kaisha Color image forming apparatus and color adjustment method
US20090073287A1 (en) * 2007-09-18 2009-03-19 Olympus Corporation Image capturing device
WO2011055613A1 (ja) 2009-11-06 2011-05-12 オリンパスメディカルシステムズ株式会社 内視鏡システム
JP4782900B2 (ja) 2009-11-06 2011-09-28 オリンパスメディカルシステムズ株式会社 内視鏡
US20110273549A1 (en) * 2009-11-06 2011-11-10 Olympus Medical Systems Corp. Endoscope system
US20110267542A1 (en) * 2010-04-30 2011-11-03 Canon Kabushiki Kaisha Image processing apparatus and control method thereof
JP2013542467A (ja) 2010-10-28 2013-11-21 エンドチョイス イノベーション センター リミテッド マルチセンサ内視鏡のための光学系
JP2012157577A (ja) 2011-02-01 2012-08-23 Olympus Medical Systems Corp 内視鏡
US20120253121A1 (en) * 2011-03-31 2012-10-04 Ryou Kitano Electronic endoscope
JP2013066648A (ja) 2011-09-26 2013-04-18 Olympus Corp 内視鏡用画像処理装置及び内視鏡装置
US20140330078A1 (en) * 2013-05-03 2014-11-06 Samsung Electronics Co., Ltd. Endoscope and image processing apparatus using the same
US20150356904A1 (en) * 2014-06-05 2015-12-10 Canon Kabushiki Kaisha Image processing apparatus and image processing method
US20160015258A1 (en) * 2014-07-21 2016-01-21 Endochoice, Inc. Multi-Focal, Multi-Camera Endoscope Systems
US20160054969A1 (en) * 2014-08-21 2016-02-25 Canon Kabushiki Kaisha Display control apparatus controlling gradation characteristics of display apparatus, display system, and display control method

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
International Search Report dated Dec. 8, 2015 issued in PCT/JP2015/075215.

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111626958A (zh) * 2020-05-27 2020-09-04 重庆紫光华山智安科技有限公司 曝光调节方法、装置、计算机可读存储介质和电子设备
CN111626958B (zh) * 2020-05-27 2021-01-26 重庆紫光华山智安科技有限公司 曝光调节方法、装置、计算机可读存储介质和电子设备

Also Published As

Publication number Publication date
WO2016039269A1 (ja) 2016-03-17
US20170172392A1 (en) 2017-06-22
JPWO2016039269A1 (ja) 2017-04-27
JP6121058B2 (ja) 2017-04-26

Similar Documents

Publication Publication Date Title
US10602916B2 (en) Endoscope system that adjusts luminance of frame image including images of a pluraliry of regions and actuating method for endoscope system
US9769394B2 (en) Image pickup apparatus, image processing apparatus, and computer-readable storage device
JP5814698B2 (ja) 自動露光制御装置、制御装置、内視鏡装置及び内視鏡装置の作動方法
JP5355799B2 (ja) 内視鏡装置および内視鏡装置の作動方法
US9215366B2 (en) Endoscope system, control method, and imaging device
US10548465B2 (en) Medical imaging apparatus and medical observation system
US8545399B2 (en) Medical instrument
WO2012105445A1 (ja) 蛍光観察装置
US20200297184A1 (en) Medical light source device and medical observation system
US10893247B2 (en) Medical signal processing device and medical observation system
US10021312B2 (en) Endoscope system and method for operating endoscope system
US10901199B2 (en) Endoscope system having variable focal length lens that switches between two or more values
US20220225857A1 (en) Medical control device and medical observation system
US11700456B2 (en) Medical control device and medical observation system
US11743596B1 (en) Adaptive brightness non-uniformity correction in endoscope visualization
US20210290037A1 (en) Medical image processing apparatus and medical observation system
JP2007020762A (ja) プロセッサ及び電子内視鏡システム
JP2010005339A (ja) 内視鏡装置

Legal Events

Date Code Title Description
AS Assignment

Owner name: OLYMPUS CORPORATION, JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:ITO, TAKASHI;HONDA, KAZUKI;REEL/FRAME:041455/0623

Effective date: 20170127

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS

STPP Information on status: patent application and granting procedure in general

Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT RECEIVED

STPP Information on status: patent application and granting procedure in general

Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED

STCF Information on status: patent grant

Free format text: PATENTED CASE

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 4