WO2016170604A1 - Endoscope apparatus - Google Patents
Endoscope apparatus
- Publication number
- WO2016170604A1 (application PCT/JP2015/062149)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- unit
- color
- exposure time
- imaging
- Prior art date
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/06—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor with illuminating arrangements
- A61B1/0638—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor with illuminating arrangements providing two or more wavelengths
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/00002—Operational features of endoscopes
- A61B1/00004—Operational features of endoscopes characterised by electronic signal processing
- A61B1/00009—Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/00002—Operational features of endoscopes
- A61B1/00039—Operational features of endoscopes provided with input arrangements for the user
- A61B1/00042—Operational features of endoscopes provided with input arrangements for the user for mechanical operation
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/00002—Operational features of endoscopes
- A61B1/00043—Operational features of endoscopes provided with output arrangements
- A61B1/00045—Display arrangement
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/04—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor combined with photographic or television appliances
- A61B1/045—Control thereof
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/06—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor with illuminating arrangements
- A61B1/0646—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor with illuminating arrangements with illumination filters
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/06—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor with illuminating arrangements
- A61B1/0655—Control therefor
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/06—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor with illuminating arrangements
- A61B1/0661—Endoscope light sources
- A61B1/0669—Endoscope light sources at proximal end of an endoscope
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B23/00—Telescopes, e.g. binoculars; Periscopes; Instruments for viewing the inside of hollow bodies; Viewfinders; Optical aiming or sighting devices
- G02B23/24—Instruments or systems for viewing the inside of hollow bodies, e.g. fibrescopes
- G02B23/2407—Optical details
- G02B23/2461—Illumination
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B26/00—Optical devices or arrangements for the control of light using movable or deformable optical elements
- G02B26/007—Optical devices or arrangements for the control of light using movable or deformable optical elements the movable or deformable optical element controlling the colour, i.e. a spectral characteristic, of the light
- G02B26/008—Optical devices or arrangements for the control of light using movable or deformable optical elements the movable or deformable optical element controlling the colour, i.e. a spectral characteristic, of the light in the form of devices for effecting sequential colour changes, e.g. colour wheels
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/70—Circuitry for compensating brightness variation in the scene
- H04N23/71—Circuitry for evaluating the brightness variation
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/70—Circuitry for compensating brightness variation in the scene
- H04N23/73—Circuitry for compensating brightness variation in the scene by influencing the exposure time
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/70—Circuitry for compensating brightness variation in the scene
- H04N23/741—Circuitry for compensating brightness variation in the scene by increasing the dynamic range of the image compared to the dynamic range of the electronic image sensors
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/00002—Operational features of endoscopes
- A61B1/00004—Operational features of endoscopes characterised by electronic signal processing
- A61B1/00009—Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope
- A61B1/000096—Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope using artificial intelligence
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B23/00—Telescopes, e.g. binoculars; Periscopes; Instruments for viewing the inside of hollow bodies; Viewfinders; Optical aiming or sighting devices
- G02B23/24—Instruments or systems for viewing the inside of hollow bodies, e.g. fibrescopes
- G02B23/2407—Optical details
- G02B23/2461—Illumination
- G02B23/2469—Illumination using optical fibres
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/50—Constructional details
- H04N23/555—Constructional details for picking-up images in sites, inaccessible due to their dimensions or hazardous conditions, e.g. endoscopes or borescopes
Definitions
- the present invention relates to an endoscope apparatus.
- There is known an endoscope apparatus that obtains an endoscopic image with an expanded dynamic range by photographing a subject multiple times with different exposure times and synthesizing the acquired images (for example, Patent Document 1).
- The endoscope apparatus of Patent Document 1 uses a color CCD to photograph a subject twice, with a long exposure time (1/60 second) and a short exposure time (1/240 second), and separates the two digital signals obtained into R, G, and B signals. Next, an R signal with an expanded dynamic range is generated by combining the short-exposure R signal with the long-exposure R signal. The dynamic ranges of the G signal and the B signal are expanded in the same manner as that of the R signal. A color endoscopic image having a dynamic range wider than the dynamic range of the CCD is then generated from the expanded R, G, and B signals.
- the endoscope apparatus of Patent Document 1 is suitable for photographing a subject having a large difference in brightness, such as a cylindrical digestive tract.
- In endoscopic diagnosis, a method may be used in which a blue pigment, which has a high contrast with the reddish color of living tissue, is dispersed over the diagnosis region to enhance the visibility of its unevenness.
- the color of the endoscopic image is often biased to red or blue, and red and blue information is particularly important in endoscopic image diagnosis.
- The endoscope apparatus of Patent Document 1 uniformly adjusts the dynamic ranges of the R, G, and B signals based on the brightness of the entire image, and cannot adjust the dynamic ranges of the R, G, and B signals individually. In this case, color saturation occurs in deep red or deep blue regions, or blackout occurs in dark regions. Therefore, there is the problem that differences in the reds and blues of biological tissue, which are important in endoscopic image diagnosis, cannot be accurately reproduced in the endoscopic image.
- the present invention has been made in view of the above-described circumstances, and an object of the present invention is to provide an endoscope apparatus capable of obtaining an endoscopic image that accurately reproduces a difference in color of a living tissue.
- the present invention provides the following means.
- The present invention provides an endoscope apparatus comprising: an illumination unit that sequentially irradiates a subject with illumination light of three colors, red, green, and blue; an imaging unit that captures the illumination light reflected by the subject and acquires an image; a control unit that causes the imaging unit to sequentially acquire component images of the three colors by controlling the imaging unit to perform photographing in synchronization with the irradiation of the three colors of illumination light from the illumination unit; a dynamic range expanding unit that generates an expanded component image, in which the dynamic range is expanded, for a component image of at least one color other than green among the three color component images acquired by the imaging unit; and an image generation unit that combines the expanded component image of the at least one color generated by the dynamic range expanding unit with the component images of the other colors to generate a color endoscopic image, wherein the control unit controls the imaging unit to photograph the illumination light of the at least one color a plurality of times with different exposure times, thereby causing the imaging unit to acquire a plurality of component images of the at least one color with different brightness.
- The three-color component images are acquired by the imaging unit photographing the subject in synchronization with the illumination light irradiated from the illumination unit onto the subject being switched among red, green, and blue.
- An RGB-format endoscopic image is generated from the acquired three-color component images by the image generation unit.
- The control unit causes the imaging unit to photograph illumination light of the same color a plurality of times with different exposure times, so that a plurality of component images with different brightness are acquired.
- The dynamic range expanding unit generates a red and/or blue expanded component image, having a dynamic range wider than that of the green component image, by combining the plurality of component images of the same color with different brightness.
- By using red and/or blue component images with a wide dynamic range, it is possible to obtain an endoscopic image that accurately reproduces the colors of living tissue, particularly its reds and/or blues.
- The control unit may control the imaging unit so that each of the red and blue illumination lights is photographed a plurality of times with different exposure times, and may control the exposure times for photographing the red illumination light and the exposure times for photographing the blue illumination light independently of each other.
- An exposure time setting unit may be provided that sets the exposure time for the next photographing of the illumination light of the at least one color based on the distribution of gradation values of the plurality of component images of the at least one color.
- The exposure time setting unit determines whether the exposure time is excessive or insufficient based on the gradation value distribution. If the exposure time is insufficient, the exposure time for the next photographing is set longer; if it is excessive, the exposure time for the next photographing is set shorter. Thereby, a component image with appropriate contrast can be acquired in the next photographing.
- An attention area setting unit may be provided that sets an attention area within the photographing range of the component image by the imaging unit, and the exposure time setting unit may set the exposure time for the next photographing of the at least one color of illumination light based on the distribution of gradation values within the attention area set by the attention area setting unit in the plurality of component images of the at least one color. In this way, the dynamic range of the red and/or blue expanded component image is optimized for the color and brightness of the attention area, so high color reproducibility of the attention area can be ensured in the endoscopic image.
- FIG. 1 is an overall configuration diagram of an endoscope apparatus according to a first embodiment of the present invention. FIG. 2 is a front view of the color variable filter in the illumination unit of the endoscope apparatus of FIG. 1. FIG. 3 is a timing chart showing the timing of irradiation of the illumination light and of exposure of the image sensor. FIG. 4 is a diagram explaining the processing in the dynamic range expanding unit and the compression unit of the endoscope apparatus of FIG. 1. FIG. 5 is a flowchart showing the operation of the endoscope apparatus of FIG. 1.
- An example of an endoscopic image (upper), the image signal with a long exposure time along line A-A of the endoscopic image (middle), and the image signal after the long exposure time is extended (lower).
- An example of an endoscopic image (upper), the image signal with a short exposure time along line A-A of the endoscopic image (middle), and the image signal after the short exposure time is shortened (lower). A graph showing the relationship between the brightness of the subject and the gradation value of the long-exposure image signal. A graph showing the relationship between the brightness of the subject and the gradation value of the short-exposure image signal.
- A block diagram of the image processor in the endoscope apparatus according to the third embodiment of the present invention. An example of an endoscopic image in which an attention area is set. A flowchart showing the exposure time setting routine in the operation of that endoscope apparatus.
- The endoscope apparatus 1 sequentially irradiates living tissue (the subject) with illumination light of three colors, red (R), green (G), and blue (B). It is a frame-sequential apparatus that acquires the three color image signals in order and generates a color endoscopic image from the acquired three color image signals.
- the endoscope apparatus 1 includes an elongated insertion portion 2 that is inserted into a living body, and an illumination unit 3 and an image processor 4 that are connected to the proximal end of the insertion portion 2.
- The insertion portion 2 includes an illumination lens 5 and an objective lens 6 provided on its distal end surface, a condenser lens 7 provided on its proximal end surface, a light guide 8 disposed along the longitudinal direction between the illumination lens 5 and the condenser lens 7, and an image sensor (imaging unit) 9 disposed on the proximal end side of the objective lens 6.
- the condenser lens 7 condenses the illumination light incident from the illumination unit 3 on the base end surface of the light guide 8.
- the light guide 8 guides the illumination light incident on the base end surface from the condenser lens 7 to the front end surface and emits the light toward the illumination lens 5 from the front end surface.
- the illumination lens 5 diffuses the illumination light incident from the light guide 8 and irradiates the living tissue S.
- the objective lens 6 images the illumination light reflected by the living tissue S and incident on the objective lens 6 on the imaging surface of the imaging element 9.
- the image sensor 9 is a monochrome CCD image sensor or a monochrome CMOS image sensor.
- As will be described later, the image sensor 9 is controlled by the control unit 14 so as to perform photographing in synchronization with the irradiation of the living tissue S with the illumination light LR, LG, and LB. After exposure, the image sensor 9 generates an image signal by photoelectric conversion and transmits the generated image signal to an image memory 15 (described later) in the image processor 4.
- In the present embodiment, the flexible insertion portion 2 is provided with the image sensor 9 at its distal end. Alternatively, a rigid insertion portion including a relay optical system that relays the image formed by the objective lens 6 to the proximal end side of the objective lens 6 may be used, with the imaging element arranged at the proximal end.
- The illumination unit 3 includes a light source (for example, a xenon lamp) 10 that emits white light, two condenser lenses 11 and 12 disposed on the output optical axis of the light source 10, and a color variable filter 13 disposed between the two condenser lenses 11 and 12.
- The condenser lens 11 condenses the light emitted from the light source 10 and makes it enter the color variable filter 13.
- The condenser lens 12 condenses the light transmitted through the color variable filter 13 and makes it enter the condenser lens 7 of the insertion portion 2.
- The color variable filter 13 includes three color filters 13R, 13G, and 13B arranged evenly around a rotation axis 13a that is parallel to the output optical axis of the light source 10.
- The R filter 13R transmits only the R light LR, the G filter 13G transmits only the G light LG, and the B filter 13B transmits only the B light LB.
- As the color variable filter 13 rotates about the rotation axis 13a, the filters 13R, 13G, and 13B are placed on the output optical axis in order, so that the R light LR, the G light LG, and the B light LB enter the condenser lens 7 from the color variable filter 13 in order.
- The rotation speed of the color variable filter 13 is constant, and the three filters 13R, 13G, and 13B all have the same shape and dimensions. Accordingly, as shown in FIG. 3, the R light LR, the G light LG, and the B light LB are irradiated sequentially from the illumination lens 5 onto the living tissue S at predetermined time intervals, and the irradiation times per irradiation of the R light LR, the G light LG, and the B light LB are equal to each other.
- The rotation speed of the color variable filter 13 is preferably from 30 rps to 60 rps, so that the frame rate of the endoscopic image is from 30 fps to 60 fps, which is suitable for moving images.
- The image processor 4 includes a control unit 14 that controls the image sensor 9; an image memory 15 that temporarily stores the image signals SRL, SRS, SG, SBL, and SBS received from the image sensor 9; a dynamic range expanding unit 16 that performs dynamic range expansion processing on the R image signals SRL and SRS and the B image signals SBL and SBS; a compression unit 17 that compresses the gradation values of the expanded R image signal SRL+SRS and the expanded B image signal SBL+SBS; and an image generation unit 18 that generates an endoscopic image from the image signals SRL'+SRS', SG, and SBL'+SBS'.
- The control unit 14 acquires information on the irradiation timing of the R light LR, the G light LG, and the B light LB from the illumination unit 3. Based on the acquired timing information, as shown in FIG. 3, the control unit 14 causes the image sensor 9 to perform photographing with the preset exposure times TRL, TRS, TG, TBL, and TBS in synchronization with the irradiation of the R light LR, the G light LG, and the B light LB. Thus, in one frame period, the control unit 14 causes the image sensor 9 to photograph the R light LR, the G light LG, and the B light LB in this order.
- During the illumination period of the G light LG, the control unit 14 causes the image sensor 9 to perform photographing only once, with the exposure time TG. Thereby, one G image signal SG is acquired by the image sensor 9 during one frame period.
- During the illumination period of the R light LR, the control unit 14 causes the image sensor 9 to perform photographing twice, with the long exposure time TRL and then with the short exposure time TRS, which is shorter than TRL. Accordingly, two R image signals SRL and SRS with different exposure times are sequentially acquired by the image sensor 9 during one frame period.
- Similarly, two B image signals SBL and SBS with different exposure times are sequentially acquired by the image sensor 9 during one frame period.
- The long-exposure image signals SRL and SBL capture the dark regions of the living tissue S clearly, with high contrast.
- The short-exposure image signals SRS and SBS capture the bright regions of the living tissue S clearly, with high contrast.
- The exposure times TRL, TRS, TG, TBL, and TBS are set in the control unit 14 by, for example, an observer inputting arbitrary values using an input device (not shown) connected to the image processor 4.
- The exposure times TRL and TRS for the R image signals SRL and SRS and the exposure times TBL and TBS for the B image signals SBL and SBS can be set independently of each other.
- For example, the exposure time TG is set to 15 msec, the long exposure time TRL to 10 msec, and the short exposure times TRS and TBS to 5 msec.
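As a rough illustration of the acquisition scheme described above (G photographed once per frame; R and B each photographed with a long and a short exposure), the following sketch models the sensor response as linear in exposure time and clipping at 255. The toy scene values, the linear model, and the choice of TBL equal to TRL are assumptions for illustration, not details taken from the patent.

```python
import numpy as np

# Exposure times in msec from the text's example (T_BL assumed equal to T_RL).
EXPOSURES = {"S_RL": 10.0, "S_RS": 5.0, "S_G": 15.0, "S_BL": 10.0, "S_BS": 5.0}

def acquire_one_frame(scene):
    """One frame period of the frame-sequential scheme: five image signals,
    each the scene radiance scaled by its exposure time and clipped to the
    8-bit range of the image sensor."""
    return {name: np.clip(scene * t, 0, 255).astype(np.uint8)
            for name, t in EXPOSURES.items()}

scene = np.array([[1.0, 10.0],
                  [20.0, 30.0]])   # toy per-pixel radiance of the tissue
frame = acquire_one_frame(scene)
# Bright pixels saturate in the long exposure but survive the short one:
# frame["S_RL"][1, 1] == 255 while frame["S_RS"][1, 1] == 150
```

This makes explicit why both exposures are kept: the long exposure preserves dark-region contrast while the short exposure preserves bright-region contrast, as described for SRL/SBL and SRS/SBS above.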
- the image memory 15 sequentially receives the R image signal S RL , the R image signal S RS , the G image signal S G , the B image signal S BL , and the B image signal S BS during one frame period.
- The image memory 15 transmits only the G image signal SG, constituting the G component image, to the image generation unit 18, and transmits the R image signals SRL and SRS, constituting the R component image, and the B image signals SBL and SBS, constituting the B component image, to the dynamic range expanding unit 16.
- FIG. 4 shows processing of the R image signals S RL and S RS in the dynamic range expansion unit 16 and the compression unit 17.
- Although FIG. 4 shows only the R image signals SRL, SRS, SRL+SRS, and SRL'+SRS' as an example, the B image signals SBL, SBS, SBL+SBS, and SBL'+SBS' have the same characteristics.
- When the dynamic range expanding unit 16 receives the two R image signals SRL and SRS from the image memory 15, it adds the gradation value of each pixel in the R image signal SRL to the gradation value of the corresponding pixel in the R image signal SRS, thereby generating the R expanded image signal SRL+SRS constituting the R expanded component image.
- Similarly, the dynamic range expanding unit 16 adds the gradation value of each pixel in the B image signal SBL to the gradation value of the corresponding pixel in the B image signal SBS, thereby generating the B expanded image signal SBL+SBS constituting the B expanded component image.
- The expanded image signals SRL+SRS and SBL+SBS have a dynamic range wider than the dynamic range of the image sensor 9, and twice the number of gradations of the image signals SRL, SRS, SG, SBL, and SBS.
- the dynamic range expansion unit 16 transmits the generated R expanded image signal S RL + S RS and B expanded image signal S BL + S BS to the compression unit 17.
- the compression unit 17 compresses the number of gradations of the R enlarged image signal S RL + S RS and the B enlarged image signal S BL + S BS in half. Accordingly, the number of gradations of the R enlarged image signal S RL + S RS and the B enlarged image signal S BL + S BS becomes equal to the number of gradations of the G image signal S G.
- the compressing unit 17 transmits the compressed R enlarged image signal S RL '+ S RS ' and B enlarged image signal S BL '+ S BS ' to the image generating unit 18.
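The addition-then-halving performed by the dynamic range expanding unit 16 and the compression unit 17 can be sketched numerically as follows. The pixel values are made up, and integer halving is one plausible reading of "compressing the number of gradations in half"; the patent does not specify the exact compression arithmetic.

```python
import numpy as np

def expand_and_compress(s_long, s_short):
    """Add the long- and short-exposure signals pixel-wise, which doubles the
    number of gradations (0..255 -> 0..510), then halve the gradation count so
    the result again matches the gradation count of the G image signal."""
    expanded = s_long.astype(np.uint16) + s_short.astype(np.uint16)  # 0..510
    return (expanded // 2).astype(np.uint8)                          # 0..255

s_rl = np.array([[255, 120], [10, 0]], dtype=np.uint8)  # long exposure: dark detail
s_rs = np.array([[200,  60], [ 5, 0]], dtype=np.uint8)  # short exposure: bright detail
r_compressed = expand_and_compress(s_rl, s_rs)  # plays the role of S_RL'+S_RS'
```

Note how a pixel saturated in the long exposure (255) is pulled back into range (227) by the unsaturated short exposure, which is the point of the expansion.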
- The image generation unit 18 generates a color endoscopic image by RGB-combining the unprocessed G image signal SG received from the image memory 15 with the R expanded image signal SRL'+SRS' and the B expanded image signal SBL'+SBS' received from the compression unit 17.
- the image generation unit 18 transmits the generated endoscopic image to the display unit 24.
- the display unit 24 sequentially displays the received endoscopic images.
- First, the exposure times TRL, TRS, TG, TBL, and TBS are initialized, for example by the observer (step S1). When the illumination unit 3 starts operating, the R light LR, the G light LG, and the B light LB enter the light guide 8 in the insertion portion 2 via the condenser lenses 12 and 7 in order, and the R light LR, the G light LG, and the B light LB are repeatedly irradiated in order from the distal end of the insertion portion 2 toward the living tissue S (step S2).
- During the irradiation period of the G light LG (YES in step S3), the control unit 14 causes the image sensor 9 to perform photographing only once (step S4), and one G image signal SG is acquired (step S5).
- During the irradiation period of the R light LR (NO in step S3), the control unit 14 causes the image sensor 9 to perform photographing with the long exposure time and then with the short exposure time (step S6), and two R image signals SRL and SRS are acquired (step S7).
- Similarly, during the irradiation period of the B light LB, two B image signals SBL and SBS are acquired (steps S6 and S7).
- the two R image signals S RL and S RS are added to each other in the dynamic range expanding unit 16 to generate an R expanded image signal S RL + S RS with an expanded dynamic range (step S8).
- the two B image signals S BL and S BS are added to each other in the dynamic range expanding unit 16 to generate a B expanded image signal S BL + S BS with an expanded dynamic range (step S8).
- the R enlarged image signal S RL + S RS and the B enlarged image signal S BL + S BS are transmitted to the image generation unit 18 after the number of gradations is compressed by the compression unit 17 (step S9).
- When the G image signal SG is input from the image sensor 9 via the image memory 15 and the R expanded image signal SRL'+SRS' and the B expanded image signal SBL'+SBS' are input from the compression unit 17 (YES in step S10), the three color image signals SG, SRL'+SRS', and SBL'+SBS' are combined to generate a color endoscopic image (step S11).
- the generated endoscopic images are sequentially displayed on the display unit 24 as moving images (step S12).
- The brightness of the entire endoscopic image is ensured by the G image signal SG, which is photographed with an exposure time TG longer than the exposure times of the R image signals SRL and SRS and the B image signals SBL and SBS.
- Furthermore, only the control of the image sensor 9 by the image processor 4 and the processing of the image signals SRL, SRS, SG, SBL, and SBS need to be changed from the conventional endoscope apparatus. Therefore, there is the advantage that an endoscopic image with high color reproducibility can be obtained without complicating the configuration, while maintaining the high resolution and high frame rate of the conventional endoscope apparatus.
- an endoscope apparatus according to a second embodiment of the present invention will be described with reference to FIGS.
- This endoscope apparatus differs from the endoscope apparatus 1 of the first embodiment in that the exposure times TRL, TRS, TBL, and TBS for the next photographing of the R light LR and the B light LB are feedback-controlled based on the distributions of the gradation values of the R image signals SRL and SRS and the B image signals SBL and SBS.
- the endoscope apparatus of the present embodiment includes an image processor 41 in FIG. 6 instead of the image processor 4.
- the configuration other than the image processor 41 is the same as that of the endoscope apparatus 1 according to the first embodiment shown in FIG.
- the image processor 41 further includes a threshold processing unit 19 and an exposure time setting unit 20.
- the image memory 15 transmits the R image signals S RL and S RS and the B image signals S BL and S BS to the threshold processing unit 19 in addition to the dynamic range expansion unit 16.
- The threshold processing unit 19 holds a threshold αRL for the gradation values of the R image signal SRL, a threshold αRS for the gradation values of the R image signal SRS, a threshold αBL for the gradation values of the B image signal SBL, and a threshold αBS for the gradation values of the B image signal SBS.
- The thresholds αRL and αBL for the long-exposure R image signal SRL and B image signal SBL are set to the minimum gradation value 0, or to a value slightly larger than the minimum gradation value.
- When the threshold processing unit 19 receives the R image signal SRL from the image memory 15, it counts the number MR of pixels having a gradation value equal to or smaller than the threshold αRL among all the pixels of the R image signal SRL. When the threshold processing unit 19 receives the R image signal SRS from the image memory 15, it counts the number NR of pixels having a gradation value equal to or larger than the threshold αRS among all the pixels of the R image signal SRS. Similarly, when the threshold processing unit 19 receives the B image signal SBL from the image memory 15, it counts the number MB of pixels having a gradation value equal to or smaller than the threshold αBL among all the pixels of the B image signal SBL. When it receives the B image signal SBS from the image memory 15, it counts the number NB of pixels having a gradation value equal to or larger than the threshold αBS among all the pixels of the B image signal SBS.
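The per-channel counting performed by the threshold processing unit 19 can be sketched as follows. This is a minimal sketch assuming 8-bit component images held as NumPy arrays; the function name is illustrative, not from the patent:

```python
import numpy as np

def count_clipped_pixels(long_img, short_img, alpha_low=0, alpha_high=255):
    """Count candidate under/over-exposed pixels for one colour channel,
    as the threshold processing unit 19 does.

    long_img  -- long-exposure image (e.g. S_RL), 8-bit NumPy array
    short_img -- short-exposure image (e.g. S_RS), 8-bit NumPy array
    Returns (M, N): M = pixels at or below alpha_low in the long exposure,
    N = pixels at or above alpha_high in the short exposure.
    """
    m = int(np.count_nonzero(long_img <= alpha_low))    # blocked-up shadows
    n = int(np.count_nonzero(short_img >= alpha_high))  # blown-out highlights
    return m, n
```

With the default thresholds, M counts only pixels at the minimum gradation value 0 and N only pixels at the maximum gradation value 255, matching steps S131 and S134.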
- FIGS. 7 and 8 each show an example of an endoscopic image (upper stage) and the distribution of gradation values (middle and lower stages) along the line A-A of the endoscopic image.
- FIGS. 9 and 10 represent the relationship between the brightness of the living tissue S and the gradation values of the image signals SRL, SRS, SBL, and SBS.
- When the living tissue S has convex and concave portions, the region of the convex portion in the endoscopic image 25 becomes relatively bright and the region of the concave portion becomes relatively dark.
- The threshold processing unit 19 measures the numbers MR and MB of pixels in the underexposed region having the minimum gradation value 0, and the numbers NR and NB of pixels in the overexposed region having the maximum gradation value 255.
- When the exposure time setting unit 20 receives the number MR of pixels, it calculates an extension time of the long exposure time TRL based on the number MR of pixels, and calculates the next long exposure time TRL by adding the calculated extension time to the current long exposure time TRL.
- For the calculation of the extension time, a first lookup table (LUT) in which the number MR of pixels and the extension time are associated in advance is used.
- As shown in FIG. 11, for example, the number MR of pixels and the extension time are associated in the first LUT such that the extension time is zero when the number MR of pixels is zero and the extension time increases in proportion to the number MR of pixels.
- Similarly, the exposure time setting unit 20 calculates a shortening time of the short exposure time TRS based on the number NR of pixels, and calculates the next short exposure time TRS by subtracting the calculated shortening time from the current short exposure time TRS.
- For the calculation of the shortening time, a second LUT in which the number NR of pixels and the shortening time are associated in advance is used.
- As shown in FIG. 12, for example, the number NR of pixels and the shortening time are associated in the second LUT such that the shortening time is zero when the number NR of pixels is zero and the shortening time increases in proportion to the number NR of pixels.
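The proportional LUT behaviour of FIGS. 11 and 12 (zero correction when the count is zero, correction growing in proportion to the count) amounts to a simple feedback update of the two exposure times. In the sketch below the gains standing in for the LUT slopes are illustrative assumptions:

```python
def next_exposure_times(t_long, t_short, m, n, k_ext=0.001, k_short=0.001):
    """Feedback update of the long/short exposure times, mimicking the
    first and second LUTs with proportional mappings.

    m -- number of blocked-up pixels in the long-exposure image
    n -- number of blown-out pixels in the short-exposure image
    k_ext, k_short -- assumed gains (LUT slopes), in time units per pixel
    Returns (next_t_long, next_t_short) in the same time units.
    """
    extension = k_ext * m      # zero when m == 0, grows with m (FIG. 11)
    shortening = k_short * n   # zero when n == 0, grows with n (FIG. 12)
    next_t_long = t_long + extension
    next_t_short = max(t_short - shortening, 0.0)  # never negative
    return next_t_long, next_t_short
```

When both counts are zero, the current exposure times are carried over unchanged, which is the steady state the feedback loop converges to.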
- When the exposure time setting unit 20 receives the number MB of pixels, it calculates the next long exposure time TBL based on the number MB of pixels in the same manner as the long exposure time TRL, and calculates the next short exposure time TBS based on the number NB of pixels in the same manner as the short exposure time TRS.
- The exposure time setting unit 20 transmits the calculated next exposure times TRL, TRS, TBL, and TBS to the control unit 14.
- The control unit 14 sets the exposure times TRL, TBL, TRS, and TBS received from the exposure time setting unit 20 as the exposure times in the next imaging of the R light LR and the B light LB.
- After the two R image signals SRL and SRS are acquired in step S7, the exposure times TRL and TRS for the acquisition of the next R image signals SRL and SRS are set based on the R image signals SRL and SRS (step S13).
- The threshold processing unit 19 measures the number MR of pixels having the minimum gradation value 0 among all the pixels included in the R image signal SRL of the long exposure time TRL (step S131).
- The measured number MR of pixels represents the size of the solid-black region in the R image signal SRL; the larger the underexposed region, the larger the number MR of pixels.
- The exposure time setting unit 20 calculates the next long exposure time TRL such that the larger the number MR of pixels, the longer the next long exposure time TRL becomes (step S132), and sets the calculated next long exposure time TRL in the control unit 14 (step S133).
- In the next step S4, imaging of the R light LR is performed with the lengthened long exposure time TRL (step S4).
- As a result, the blocked-up shadows are eliminated, and an image signal SRL having contrast in the dark region is acquired.
- In that case, the current long exposure time TRL is calculated and set, as it is, as the next long exposure time TRL.
- Next, the threshold processing unit 19 measures the number NR of pixels having the maximum gradation value 255 among all the pixels included in the R image signal SRS of the short exposure time TRS (step S134).
- The measured number NR of pixels represents the size of the overexposed region in the R image signal SRS; the larger the overexposed region, the larger the number NR of pixels.
- In the next step S4, imaging of the R light LR is performed with the shortened short exposure time TRS (step S4). With the shorter short exposure time TRS, the blown-out highlights are reduced.
- An input unit 21 may further be provided for the observer to select which of the elimination of underexposure and the elimination of overexposure is prioritized.
- As a result of calculating the next long exposure times TRL, TBL and short exposure times TRS, TBS as described above, the sums of the long and short exposure times, TRL + TRS and TBL + TBS, may exceed the irradiation time per irradiation of the illumination light LR, LB. In such a case (YES in step S137), the exposure time setting unit 20 proceeds as shown in FIG. 17.
- Depending on which of the elimination of overexposure and the elimination of underexposure is selected by the input unit 21 (step S139), the long exposure times TRL, TBL and the short exposure times TRS, TBS are determined (steps S138 to S143).
- When priority is given to the elimination of underexposure, the exposure time setting unit 20 determines the long exposure time TRL preferentially (step S140), and determines the short exposure time TRS as the time obtained by subtracting the long exposure time TRL from the irradiation time (step S141).
- When priority is given to the elimination of overexposure, the exposure time setting unit 20 determines the short exposure time TRS preferentially (step S142), and determines the long exposure time TRL as the time obtained by subtracting the short exposure time TRS from the irradiation time (step S143).
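The constraint handling of steps S137 to S143 can be sketched as a clamp on the pair of exposure times. The function name and the `prefer` flag are illustrative assumptions standing in for the selection made through the input unit 21:

```python
def fit_to_irradiation(t_long, t_short, irradiation, prefer="shadows"):
    """Clamp the long/short exposure times so their sum does not exceed
    the irradiation time of one illumination period (steps S137-S143).

    prefer="shadows"    -- keep the long exposure (underexposure first)
    prefer="highlights" -- keep the short exposure (overexposure first)
    """
    if t_long + t_short <= irradiation:
        return t_long, t_short            # already fits: keep both as-is
    if prefer == "shadows":
        t_long = min(t_long, irradiation)     # determine T_long first
        return t_long, irradiation - t_long   # T_short gets the remainder
    t_short = min(t_short, irradiation)       # determine T_short first
    return irradiation - t_short, t_short     # T_long gets the remainder
```

The same clamp is applied independently to the R pair (TRL, TRS) and the B pair (TBL, TBS), since each colour has its own irradiation period.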
- The endoscope apparatus according to this embodiment is a modification of the endoscope apparatus of the second embodiment, and differs from the endoscope apparatus of the second embodiment in that the exposure times TRL, TRS, TBL, and TBS in the next imaging of the R light LR and the B light LB are feedback-controlled based on the distribution of gradation values of the pixels within a region of interest, rather than of all the pixels of the image signals SRL, SRS, SBL, and SBS.
- The endoscope apparatus of this embodiment includes an image processor 42 shown in FIG. 18 instead of the image processor 4.
- The configuration other than the image processor 42 is the same as that of the endoscope apparatus 1 according to the first embodiment shown in FIG.
- As shown in FIG. 18, the image processor 42 further includes an attention area input unit (attention area setting unit) 22 and a position information setting unit 23.
- The attention area input unit 22 is, for example, a pointing device, such as a stylus pen or a mouse, that can specify a position on an endoscopic image displayed on the display unit 24. As shown in FIG. 19, the observer can designate any region within the imaging range of the endoscopic image 25 displayed on the display unit 24 as the attention area B using the attention area input unit 22.
- The position information setting unit 23 acquires the position of the designated attention area B from the attention area input unit 22, converts the acquired position into addresses of pixels in the endoscopic image 25, and transmits the addresses to the threshold processing unit 19.
- The threshold processing unit 19 selects, from among the pixels of the R image signals SRL, SRS and the B image signals SBL, SBS received from the image memory 15, the pixels in the attention area B according to the addresses received from the position information setting unit 23. Next, the threshold processing unit 19 measures the number MR, NR, MB, or NB of pixels by comparing the gradation values of the selected pixels with the threshold αRL, αRS, αBL, or αBS.
- The exposure time setting unit 20 calculates the next long exposure times TRL, TBL based on the numbers MR, MB of pixels, and calculates the next short exposure times TRS, TBS based on the numbers NR, NB of pixels.
- The maximum values of the numbers MR, NR, MB, and NB of pixels differ depending on the size of the attention area B. Therefore, the exposure time setting unit 20 calculates the ratio C / CB of the total number C of pixels in the entire endoscopic image to the total number CB of pixels existing in the attention area B.
- By multiplying the numbers MR, NR, MB, and NB of pixels by C / CB, correction values MR × C / CB, NR × C / CB, MB × C / CB, and NB × C / CB of the numbers MR, NR, MB, and NB of pixels are obtained. Then, the exposure time setting unit 20 calculates the next exposure times TRL, TBL, TRS, and TBS from the LUTs of FIGS. 11 and 12 using the obtained correction values instead of the raw counts.
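The attention-area correction M × C / CB can be sketched as follows, assuming the attention area B is represented by a boolean mask over the image (a minimal sketch; the function name is illustrative):

```python
import numpy as np

def roi_corrected_count(img, roi_mask, threshold, below=True):
    """Count clipped pixels inside the attention area B and scale the
    count to the whole image, as in the correction M x C / C_B.

    img       -- 8-bit component image as a NumPy array
    roi_mask  -- boolean array, True for pixels in the attention area B
    below     -- True counts pixels <= threshold (M), False counts >= (N)
    Returns the corrected count (a float), usable as the LUT input.
    """
    roi_pixels = img[roi_mask]                            # pixels inside B
    if below:
        raw = np.count_nonzero(roi_pixels <= threshold)   # M inside B
    else:
        raw = np.count_nonzero(roi_pixels >= threshold)   # N inside B
    c = img.size          # C: total number of pixels in the whole image
    c_b = roi_pixels.size # C_B: total number of pixels in attention area B
    return raw * c / c_b  # correction value, e.g. M_R x C / C_B
```

Scaling by C / CB lets the same LUTs of FIGS. 11 and 12 be reused regardless of how large the observer draws the attention area.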
- Alternatively, the exposure time setting unit 20 may hold a plurality of LUTs corresponding to the breadth of the attention area B, i.e., to the total number CB of pixels.
- In the plurality of LUTs, the relationship between the numbers MR, NR, MB, NB of pixels and the extension time or shortening time has already been corrected according to the total number CB.
- The exposure time setting unit 20 can calculate the next exposure times TRL, TBL, TRS, and TBS by selecting the optimum LUT according to the total number CB.
- The main routine of this embodiment is the same as the main routine of the second embodiment in FIG. 13, but the contents of the exposure time setting routine (step S13) differ from those of the second embodiment.
- In the exposure time setting routine S13, the exposure times TRL and TRS for the acquisition of the next R image signals SRL and SRS are set.
- In step S144, it is determined whether or not the attention area B is set.
- When the attention area B is not set (NO in step S144), the next exposure times TRL, TRS, TBL, and TBS are set according to the same procedure as in the second embodiment (steps S131 to S136).
- When the attention area B is set, the threshold processing unit 19 measures the number MR of pixels having the minimum gradation value 0 among the pixels constituting the attention area B (step S145). The measured number MR of pixels is corrected according to the total number CB of pixels in the attention area B (step S146), and the next long exposure time TRL is calculated and set based on the correction value MR × C / CB (steps S147 and S148). Subsequently, the threshold processing unit 19 measures the number NR of pixels having the maximum gradation value 255 among the pixels constituting the attention area B (step S149). The measured number NR of pixels is corrected according to the total number CB of pixels in the attention area B (step S150), and the next short exposure time TRS is calculated and set based on the correction value NR × C / CB (steps S151 and S152).
- Regarding the B image signals SBL and SBS, the next long exposure time TBL and the next short exposure time TBS are set by steps S145 to S152 in the same manner as for the R image signals SRL and SRS.
- In this manner, the next exposure times TRL, TRS, TBL, and TBS are adjusted based on the distribution of gradation values in the attention area B.
- As a result, an endoscopic image 25 with higher contrast in the attention area B can be obtained, and there is an advantage that the attention area B can be observed more accurately.
- In the embodiments above, shooting is performed twice with different exposure times within one irradiation period of the R light and the B light; instead, shooting may be performed three times or more. In this case, it is preferable that the exposure times of the three or more shootings all differ.
- In the embodiments above, the dynamic ranges of both the R image signal and the B image signal are expanded; instead, the dynamic range of only one of the R image signal and the B image signal may be expanded. In this case, only the image signal whose dynamic range is to be expanded needs to be acquired a plurality of times by a plurality of shootings.
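The combination of a long- and a short-exposure component image performed by the dynamic range expansion unit 16 can be sketched as a simple exposure merge. The patent does not specify this particular merge rule; the weighting below (trust the long exposure except where it is saturated) is an illustrative assumption:

```python
import numpy as np

def expand_dynamic_range(long_img, short_img, exposure_ratio):
    """Merge long- and short-exposure 8-bit images of the same scene into
    one higher-dynamic-range image (a generic sketch, not the patent's
    exact method).

    exposure_ratio -- T_long / T_short, used to bring the short exposure
    onto the radiometric scale of the long exposure.
    Returns a float array; gradation values may exceed 255.
    """
    long_f = long_img.astype(np.float64)
    short_f = short_img.astype(np.float64) * exposure_ratio  # rescale
    saturated = long_img >= 255            # blown-out in the long exposure
    return np.where(saturated, short_f, long_f)
```

A compression unit such as unit 17 would then map the extended range back to the display range before the image generation unit combines the channels.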
Abstract
Description
The present invention provides an endoscope apparatus including: an illumination unit that sequentially irradiates a subject with illumination light of three colors, red, green, and blue; an imaging unit that captures the illumination light reflected by the subject to acquire images; a control unit that causes the imaging unit to sequentially acquire component images of the three colors red, green, and blue by controlling the imaging unit so as to perform imaging in synchronization with the irradiation of the three colors of illumination light from the illumination unit; a dynamic range expansion unit that generates, for a component image of at least one color other than green among the three-color component images acquired by the imaging unit, an expanded component image having an expanded dynamic range; and an image generation unit that generates a color endoscopic image by combining the expanded component image of the at least one color generated by the dynamic range expansion unit with the component images of the other colors, wherein the control unit causes the imaging unit to acquire a plurality of component images of the at least one color by controlling the imaging unit so as to capture the illumination light of the at least one color a plurality of times with different exposure times, and the dynamic range expansion unit generates the expanded component image by combining the plurality of component images of the at least one color with each other.
In this case, in the imaging of the red and/or blue illumination light, the control unit causes the imaging unit to capture illumination light of the same color a plurality of times with different exposure times, whereby a plurality of component images with different brightnesses are acquired.
By doing so, the dynamic ranges of the red expanded component image and the blue expanded component image can be controlled independently of each other, and an endoscopic image with even higher color reproducibility of the living tissue can be obtained.
When the exposure time is insufficient, the distribution of gradation values is biased toward the minimum gradation value; when the exposure time is excessive, the distribution of gradation values is biased toward the maximum gradation value. The exposure time setting unit judges whether the exposure time is excessive or insufficient based on the distribution of gradation values: when the exposure time is insufficient, it sets a longer exposure time for the next imaging, and when the exposure time is excessive, it sets a shorter exposure time for the next imaging. Thus, a component image with appropriate contrast can be acquired in the next imaging.
By doing so, the dynamic range of the red and/or blue expanded component image is optimized for the color and brightness of the region of interest. Therefore, high color reproducibility of the region of interest in the endoscopic image can be ensured.
An endoscope apparatus 1 according to a first embodiment of the present invention will be described below with reference to FIGS. 1 to 5.
The endoscope apparatus 1 according to this embodiment is of a frame-sequential type that sequentially irradiates a living tissue (subject) with illumination light of three colors, red (R), green (G), and blue (B), sequentially acquires image signals of the three colors R, G, and B, and generates a color endoscopic image from the acquired three-color image signals.
The insertion portion 2 includes an illumination lens 5 and an objective lens 6 provided on the distal end face of the insertion portion 2, a condenser lens 7 provided on the proximal end face of the insertion portion 2, a light guide 8 disposed along the longitudinal direction between the illumination lens 5 and the condenser lens 7, and an image sensor (imaging unit) 9 disposed on the proximal side of the objective lens 6.
The light guide 8 guides the illumination light entering its proximal end face from the condenser lens 7 to its distal end face, and emits the light from the distal end face toward the illumination lens 5.
The illumination lens 5 diffuses the illumination light entering from the light guide 8 and irradiates the living tissue S with it.
The objective lens 6 focuses the illumination light reflected by the living tissue S and entering the objective lens 6 onto the imaging surface of the image sensor 9.
The condenser lens 11 condenses the light emitted from the light source 10 and makes it enter the color-variable filter 13. The condenser lens 12 condenses the light transmitted through the color-variable filter 13 and makes it enter the condenser lens 7 of the insertion portion 2.
Here, during the irradiation period of the G light LG, the control unit 14 causes the image sensor 9 to perform imaging only once with the exposure time TG. As a result, one G image signal SG is acquired by the image sensor 9 during one frame period.
The display unit 24 sequentially displays the received endoscopic images.
First, the exposure times TRL, TRS, TG, TBL, and TBS are initially set, for example, by the observer (step S1). Next, when the illumination unit 3 starts operating, the R light LR, the G light LG, and the B light LB sequentially enter the light guide 8 in the insertion portion 2 via the condenser lenses 12 and 7, and the R light LR, the G light LG, and the B light LB are repeatedly irradiated in sequence from the distal end of the insertion portion 2 toward the living tissue S (step S2). The R light LR, the G light LG, and the B light LB reflected by the living tissue S are condensed by the objective lens 6 and sequentially captured by the image sensor 9, and the image signals SRL, SRS, SG, SBL, and SBS are acquired in sequence (steps S3 to S7).
On the other hand, during the irradiation period of the R light LR (NO in step S3), the control unit 14 causes the image sensor 9 to sequentially perform long-exposure imaging and short-exposure imaging (step S6), whereby two R image signals SRL and SRS are acquired (step S7). Similarly, during the irradiation period of the B light LB (NO in step S3), the control unit 14 causes the image sensor 9 to sequentially perform long-exposure imaging and short-exposure imaging (step S6), whereby two B image signals SBL and SBS are acquired (step S7).
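The acquisition sequence above (a single G exposure and paired long/short R and B exposures per cycle) can be sketched as one loop iteration. The `sensor` callable, assumed to expose for a given time and return one image signal, is an illustration, not part of the patent:

```python
def acquire_one_cycle(sensor, t_g, t_rl, t_rs, t_bl, t_bs):
    """One frame-sequential cycle of the first embodiment: R, G, and B
    illumination in turn, with two exposures (long, short) for R and B
    and a single exposure for G.

    sensor -- callable (color, exposure_time) -> image signal (assumed)
    Returns the five image signals of one cycle, keyed by name.
    """
    signals = {}
    signals["S_RL"] = sensor("R", t_rl)  # long R exposure
    signals["S_RS"] = sensor("R", t_rs)  # short R exposure
    signals["S_G"] = sensor("G", t_g)    # single G exposure
    signals["S_BL"] = sensor("B", t_bl)  # long B exposure
    signals["S_BS"] = sensor("B", t_bs)  # short B exposure
    return signals
```

Only the R and B channels yield exposure pairs, which is why the dynamic range expansion is applied to those channels while the G image signal is used as-is.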
Next, an endoscope apparatus according to a second embodiment of the present invention will be described with reference to FIGS. 6 to 17.
The endoscope apparatus according to this embodiment differs from the endoscope apparatus 1 of the first embodiment in that the exposure times TRL, TRS, TBL, and TBS in the next imaging of the R light LR and the B light LB are feedback-controlled based on the distribution of gradation values of the R image signals SRL, SRS and the B image signals SBL, SBS.
As shown in FIG. 6, the image processor 41 further includes a threshold processing unit 19 and an exposure time setting unit 20.
In this embodiment, the image memory 15 transmits the R image signals SRL, SRS and the B image signals SBL, SBS not only to the dynamic range expansion unit 16 but also to the threshold processing unit 19.
Similarly, when the threshold processing unit 19 receives the B image signal SBL from the image memory 15, it counts the number MB of pixels having a gradation value equal to or smaller than the threshold αBL among all the pixels of the B image signal. When the threshold processing unit 19 receives the B image signal SBS from the image memory 15, it counts the number NB of pixels having a gradation value equal to or larger than the threshold αBS among all the pixels of the B image signal SBS.
As shown in FIGS. 7 and 8, when the living tissue S has convex portions and concave portions, the region of the convex portion in the endoscopic image 25 becomes relatively bright, and the region of the concave portion becomes relatively dark.
The exposure time setting unit 20 transmits the calculated next exposure times TRL, TRS, TBL, and TBS to the control unit 14.
The control unit 14 sets the exposure times TRL, TBL, TRS, and TBS received from the exposure time setting unit 20 as the exposure times in the next imaging of the R light LR and the B light LB.
According to the endoscope apparatus of this embodiment, as shown in FIG. 13, after the two R image signals SRL and SRS are acquired in step S7, the exposure times TRL and TRS for the acquisition of the next R image signals SRL and SRS are set based on the R image signals SRL and SRS (step S13).
As a result of calculating the next long exposure times TRL, TBL and short exposure times TRS, TBS as described above, the sum of the long exposure time and the short exposure time, TRL + TRS or TBL + TBS, may exceed the irradiation time per irradiation of the illumination light LR, LB. In such a case (YES in step S137), as shown in FIG. 17, the exposure time setting unit 20 determines the long exposure times TRL, TBL and the short exposure times TRS, TBS (steps S138 to S143) according to which of the elimination of overexposure and the elimination of underexposure is selected by the input unit 21 (step S139).
The same applies to the exposure times TBL and TBS for the B image signals.
Next, an endoscope apparatus according to a third embodiment of the present invention will be described with reference to FIGS. 18 to 20.
The endoscope apparatus according to this embodiment is a modification of the endoscope apparatus of the second embodiment, and differs from the endoscope apparatus of the second embodiment in that the exposure times TRL, TRS, TBL, and TBS in the next imaging of the R light LR and the B light LB are feedback-controlled based on the distribution of gradation values of the pixels within the region of interest, rather than of all the pixels of the image signals SRL, SRS, SBL, and SBS.
As shown in FIG. 18, the image processor 42 further includes an attention area input unit (attention area setting unit) 22 and a position information setting unit 23.
The position information setting unit 23 acquires the position of the designated attention area B from the attention area input unit 22, converts the acquired position into addresses of pixels in the endoscopic image 25, and transmits the addresses to the threshold processing unit 19.
The main routine of this embodiment is the same as the main routine of the second embodiment in FIG. 13, but the content of the exposure time setting routine (step S13) differs from that of the second embodiment.
According to the endoscope apparatus of this embodiment, as in the second embodiment, after the two R image signals SRL and SRS are acquired in step S7, the exposure times TRL and TRS for the acquisition of the next R image signals SRL and SRS are set in the exposure time setting routine S13.
When the attention area B is not set (NO in step S144), the next exposure times TRL, TRS, TBL, and TBS are set according to the same procedure as in the second embodiment (steps S131 to S136).
Also, in the first to third embodiments, the dynamic ranges of both the R image signal and the B image signal are expanded; instead, the dynamic range of only one of the R image signal and the B image signal may be expanded. In this case, only the image signal whose dynamic range is to be expanded needs to be acquired a plurality of times by a plurality of shootings.
2 Insertion portion
3 Illumination unit (illumination portion)
4, 41, 42 Image processor
5 Illumination lens
6 Objective lens
7, 11, 12 Condenser lens
8 Light guide
9 Image sensor (imaging unit)
10 Light source
13 Color-variable filter
14 Control unit
15 Image memory
16 Dynamic range expansion unit
17 Compression unit
18 Image generation unit
19 Threshold processing unit
20 Exposure time setting unit
21 Input unit
22 Attention area input unit (attention area setting unit)
23 Position information setting unit
24 Display unit
25 Endoscopic image
Claims (4)
- An endoscope apparatus comprising: an illumination unit that sequentially irradiates a subject with illumination light of three colors, red, green, and blue;
an imaging unit that captures the illumination light reflected by the subject to acquire images;
a control unit that causes the imaging unit to sequentially acquire component images of the three colors red, green, and blue by controlling the imaging unit so as to perform imaging in synchronization with the irradiation of the three colors of illumination light from the illumination unit;
a dynamic range expansion unit that generates, for a component image of at least one color other than green among the three-color component images acquired by the imaging unit, an expanded component image having an expanded dynamic range; and
an image generation unit that generates a color endoscopic image by combining the expanded component image of the at least one color generated by the dynamic range expansion unit with the component images of the other colors,
wherein the control unit causes the imaging unit to acquire a plurality of component images of the at least one color by controlling the imaging unit so as to capture the illumination light of the at least one color a plurality of times with different exposure times, and
the dynamic range expansion unit generates the expanded component image by combining the plurality of component images of the at least one color with each other.
- The endoscope apparatus according to claim 1, wherein the control unit controls the imaging unit so as to capture each of the red and blue illumination light a plurality of times with different exposure times, and controls the exposure times in the imaging of the red illumination light and the exposure times in the imaging of the blue illumination light independently of each other.
- The endoscope apparatus according to claim 1 or 2, further comprising an exposure time setting unit that sets exposure times for the next plurality of imaging operations of the illumination light of the at least one color based on the distribution of gradation values of the plurality of component images of the at least one color.
- The endoscope apparatus according to claim 3, further comprising an attention area setting unit that sets an attention area within the imaging range of the component images captured by the imaging unit,
wherein the exposure time setting unit sets the exposure times for the next plurality of imaging operations of the illumination light of the at least one color based on the distribution of gradation values in the attention area set by the attention area setting unit among the plurality of component images of the at least one color.
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE112015006338.2T DE112015006338T5 (de) | 2015-04-21 | 2015-04-21 | Endoskopvorrichtung |
JP2017513870A JPWO2016170604A1 (ja) | 2015-04-21 | 2015-04-21 | 内視鏡装置 |
CN201580078819.2A CN107529961A (zh) | 2015-04-21 | 2015-04-21 | 内窥镜装置 |
PCT/JP2015/062149 WO2016170604A1 (ja) | 2015-04-21 | 2015-04-21 | 内視鏡装置 |
US15/786,675 US20180049632A1 (en) | 2015-04-21 | 2017-10-18 | Endoscope apparatus |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2015/062149 WO2016170604A1 (ja) | 2015-04-21 | 2015-04-21 | 内視鏡装置 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/786,675 Continuation US20180049632A1 (en) | 2015-04-21 | 2017-10-18 | Endoscope apparatus |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2016170604A1 true WO2016170604A1 (ja) | 2016-10-27 |
Family
ID=57143843
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2015/062149 WO2016170604A1 (ja) | 2015-04-21 | 2015-04-21 | 内視鏡装置 |
Country Status (5)
Country | Link |
---|---|
US (1) | US20180049632A1 (ja) |
JP (1) | JPWO2016170604A1 (ja) |
CN (1) | CN107529961A (ja) |
DE (1) | DE112015006338T5 (ja) |
WO (1) | WO2016170604A1 (ja) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2020170568A1 (ja) * | 2019-02-22 | 2020-08-27 | パナソニックIpマネジメント株式会社 | 加熱調理器 |
JPWO2019087790A1 (ja) * | 2017-10-31 | 2020-11-12 | 富士フイルム株式会社 | 検査支援装置、内視鏡装置、検査支援方法、及び検査支援プログラム |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP3424403B1 (en) * | 2016-03-03 | 2024-04-24 | Sony Group Corporation | Medical image processing device, system, method, and program |
WO2019234831A1 (ja) * | 2018-06-05 | 2019-12-12 | オリンパス株式会社 | 内視鏡システム |
CN110996016A (zh) * | 2019-12-11 | 2020-04-10 | 苏州新光维医疗科技有限公司 | 一种内窥镜图像色彩调整方法 |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH0438093A (ja) * | 1990-06-04 | 1992-02-07 | Fuji Photo Optical Co Ltd | 電子内視鏡装置 |
JPH11151203A (ja) * | 1997-11-20 | 1999-06-08 | Olympus Optical Co Ltd | 内視鏡撮像装置 |
JPH11155808A (ja) * | 1997-11-27 | 1999-06-15 | Olympus Optical Co Ltd | 内視鏡撮像装置 |
JPH11305144A (ja) * | 1998-04-27 | 1999-11-05 | Olympus Optical Co Ltd | 内視鏡装置 |
JP2002306411A (ja) * | 2001-04-10 | 2002-10-22 | Asahi Optical Co Ltd | 電子スコープ用プロセッサ |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP0912047B1 (en) * | 1997-10-23 | 2004-04-07 | Olympus Optical Co., Ltd. | Imaging apparatus comprising means for expanding the dynamic range |
JP4033565B2 (ja) * | 1997-12-03 | 2008-01-16 | オリンパス株式会社 | 内視鏡装置 |
US6635011B1 (en) * | 2000-01-14 | 2003-10-21 | Pentax Corporation | Electronic endoscope system |
JP4294440B2 (ja) * | 2003-10-30 | 2009-07-15 | オリンパス株式会社 | 画像処理装置 |
CN103258199A (zh) * | 2013-06-07 | 2013-08-21 | 浙江大学 | 一种获取完整的手掌静脉图像的系统及其方法 |
-
2015
- 2015-04-21 JP JP2017513870A patent/JPWO2016170604A1/ja active Pending
- 2015-04-21 WO PCT/JP2015/062149 patent/WO2016170604A1/ja active Application Filing
- 2015-04-21 CN CN201580078819.2A patent/CN107529961A/zh active Pending
- 2015-04-21 DE DE112015006338.2T patent/DE112015006338T5/de not_active Withdrawn
-
2017
- 2017-10-18 US US15/786,675 patent/US20180049632A1/en not_active Abandoned
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH0438093A (ja) * | 1990-06-04 | 1992-02-07 | Fuji Photo Optical Co Ltd | 電子内視鏡装置 |
JPH11151203A (ja) * | 1997-11-20 | 1999-06-08 | Olympus Optical Co Ltd | 内視鏡撮像装置 |
JPH11155808A (ja) * | 1997-11-27 | 1999-06-15 | Olympus Optical Co Ltd | 内視鏡撮像装置 |
JPH11305144A (ja) * | 1998-04-27 | 1999-11-05 | Olympus Optical Co Ltd | 内視鏡装置 |
JP2002306411A (ja) * | 2001-04-10 | 2002-10-22 | Asahi Optical Co Ltd | 電子スコープ用プロセッサ |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPWO2019087790A1 (ja) * | 2017-10-31 | 2020-11-12 | 富士フイルム株式会社 | 検査支援装置、内視鏡装置、検査支援方法、及び検査支援プログラム |
US11302092B2 (en) | 2017-10-31 | 2022-04-12 | Fujifilm Corporation | Inspection support device, endoscope device, inspection support method, and inspection support program |
WO2020170568A1 (ja) * | 2019-02-22 | 2020-08-27 | パナソニックIpマネジメント株式会社 | 加熱調理器 |
JP7470903B2 (ja) | 2019-02-22 | 2024-04-19 | パナソニックIpマネジメント株式会社 | 加熱調理器 |
Also Published As
Publication number | Publication date |
---|---|
DE112015006338T5 (de) | 2017-11-30 |
US20180049632A1 (en) | 2018-02-22 |
CN107529961A (zh) | 2018-01-02 |
JPWO2016170604A1 (ja) | 2018-03-15 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9675238B2 (en) | Endoscopic device | |
EP2926718B1 (en) | Endoscope system | |
WO2016170604A1 (ja) | 内視鏡装置 | |
US9595084B2 (en) | Medical skin examination device and method for enhancing and displaying lesion in photographed image | |
CN107529975B (zh) | 光源控制装置、光源控制方法和成像系统 | |
WO2013027455A1 (ja) | 自動露光制御装置、制御装置、内視鏡装置及び自動露光制御方法 | |
US9232153B2 (en) | Flicker compensation method using two frames | |
JP5699482B2 (ja) | 画像処理装置、画像処理方法及び撮像装置 | |
CN107079107B (zh) | 用于获得物体的高动态范围合成图像的显微镜和方法 | |
JP2014230708A (ja) | 内視鏡 | |
WO2017022324A1 (ja) | 画像信号処理方法、画像信号処理装置および画像信号処理プログラム | |
JP2007028088A (ja) | 撮像装置及び画像処理方法 | |
JP2013106169A (ja) | 画像合成装置、及びプログラム | |
JP2016086246A (ja) | 画像処理装置及び方法、及び撮像装置 | |
JP6203452B1 (ja) | 撮像システム | |
JP5244164B2 (ja) | 内視鏡装置 | |
JP2007215907A (ja) | 内視鏡プロセッサ、内視鏡システム、及びブラックバランス調整プログラム | |
JP6458205B1 (ja) | 画像処理装置、画像処理方法および画像処理プログラム | |
JP2000197604A (ja) | 内視鏡装置 | |
TWI594630B (zh) | Night photography system and its method | |
JP2001203910A (ja) | 映像信号処理装置 | |
JP7174064B2 (ja) | 画像信号処理装置、画像信号処理方法、プログラム | |
JP2009300811A (ja) | 被写体情報測定方法及び被写体情報測定装置、並びに露光制御方法及び、露光制御装置 | |
JP2016170768A (ja) | コード読取装置、コード読取方法、およびプログラム | |
JP2005164349A (ja) | 距離検出方法及び装置 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 15889844 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2017513870 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 112015006338 Country of ref document: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 15889844 Country of ref document: EP Kind code of ref document: A1 |