WO2016129162A1 - Image processing apparatus, image processing method, program, and image processing system - Google Patents
Image processing apparatus, image processing method, program, and image processing system
- Publication number
- WO2016129162A1 (application PCT/JP2015/082324)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- image processing
- processing apparatus
- unit
- size
- Prior art date
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/00002—Operational features of endoscopes
- A61B1/00004—Operational features of endoscopes characterised by electronic signal processing
- A61B1/00009—Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope
- A61B1/000095—Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope for image enhancement
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/04—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor combined with photographic or television appliances
- A61B1/045—Control thereof
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/06—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor with illuminating arrangements
- A61B1/0655—Control therefor
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B23/00—Telescopes, e.g. binoculars; Periscopes; Instruments for viewing the inside of hollow bodies; Viewfinders; Optical aiming or sighting devices
- G02B23/24—Instruments or systems for viewing the inside of hollow bodies, e.g. fibrescopes
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/60—Analysis of geometric attributes
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/50—Constructional details
- H04N23/555—Constructional details for picking-up images in sites, inaccessible due to their dimensions or hazardous conditions, e.g. endoscopes or borescopes
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/56—Cameras or camera modules comprising electronic image sensors; Control thereof provided with illuminating means
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/70—Circuitry for compensating brightness variation in the scene
- H04N23/71—Circuitry for evaluating the brightness variation
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/70—Circuitry for compensating brightness variation in the scene
- H04N23/72—Combination of two or more compensation controls
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/70—Circuitry for compensating brightness variation in the scene
- H04N23/73—Circuitry for compensating brightness variation in the scene by influencing the exposure time
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/70—Circuitry for compensating brightness variation in the scene
- H04N23/74—Circuitry for compensating brightness variation in the scene by influencing the scene brightness using illuminating means
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
- H04N7/183—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a single remote source
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/06—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor with illuminating arrangements
- A61B1/0661—Endoscope light sources
- A61B1/0669—Endoscope light sources at proximal end of an endoscope
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10024—Color image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10068—Endoscopic image
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/10—Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from different wavelengths
- H04N23/12—Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from different wavelengths with one sensor only
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/63—Control of cameras or camera modules by using electronic viewfinders
Definitions
- The present disclosure relates to an image processing apparatus, an image processing method, a program, and an image processing system.
- The observation area is the area of the endoscopic image other than the black area.
- According to the present disclosure, there is provided an image processing apparatus including: an area extraction unit that extracts, as an extraction area, an area corresponding to the size of the insertion unit from an endoscopic image based on imaging by an image sensor; and an exposure control unit that performs exposure control based on the image sensor output value in the extraction area.
- According to the present disclosure, there is also provided an image processing method including: extracting, as an extraction region, a region corresponding to the size of the insertion unit from an endoscopic image based on imaging by an image sensor; and performing, by a processor, exposure control based on the image sensor output value in the extraction region.
- According to the present disclosure, there is also provided a program for causing a computer to function as an image processing apparatus including: an area extraction unit that extracts, as an extraction area, an area corresponding to the size of the insertion section from an endoscopic image based on imaging by an image sensor; and an exposure control unit that performs exposure control based on the image sensor output value in the extraction area.
- According to the present disclosure, there is also provided an image processing system including: a light source unit that emits light; an image sensor that receives the reflected light of the light emitted by the light source unit and captures an endoscopic image; an area extraction unit that extracts, as an extraction area, an area corresponding to the size of the insertion unit; and an exposure control unit that performs exposure control based on the image sensor output value in the extraction area.
- In this specification and the drawings, constituent elements having substantially the same functional configuration may be distinguished by appending different letters or numbers to the same reference numeral.
- When it is not necessary to distinguish such constituent elements from one another, only the same reference numeral is given.
- FIG. 1 is a diagram illustrating a configuration example of an image processing system according to an embodiment of the present disclosure.
- the image processing system 1 includes an image processing apparatus 100, an insertion unit 200, a light source unit 300, a display unit 400, and an operation unit 500.
- the light source unit 300 includes a white light source 310 and a condenser lens 320.
- the white light source 310 emits white light.
- In the present embodiment, an example using white light is mainly described, but the color of the light is not particularly limited. A light source that emits visible light other than white light may therefore be used instead of the white light source 310 (for example, an RGB laser whose R, G, and B outputs can be variably controlled).
- the condensing lens 320 condenses the light emitted by the white light source 310 on the light guide 210 described later.
- The insertion unit 200 corresponds to a scope that is inserted into the body. Specifically, the insertion unit 200 may be a rigid endoscope or a flexible endoscope.
- the insertion unit 200 includes a light guide 210, an illumination lens 220, an imaging unit 230, and a memory 240.
- the imaging unit 230 includes an objective lens 231, an image sensor (imaging device) 232, and an A / D (analog / digital) conversion unit 233.
- the light guide 210 guides the light collected by the light source unit 300 to the tip of the insertion unit 200.
- the illumination lens 220 diffuses the light guided to the tip by the light guide 210 and irradiates the observation target (subject Su).
- the objective lens 231 forms an image of reflected light returning from the observation target (subject Su) on the image sensor 232.
- the image sensor 232 outputs an analog signal (endoscopic image) captured by receiving the reflected light to the A / D conversion unit 233.
- the image sensor 232 has, for example, a primary color Bayer array.
- the endoscopic image obtained by the image sensor 232 is a primary color Bayer image.
- the primary color Bayer image is an image in which each pixel has one of RGB signals, and is an image in which the RGB pixels are arranged in a checkered pattern.
- the image sensor 232 is not limited to the primary color Bayer array. That is, the endoscopic image is not limited to a primary color Bayer image.
- the endoscopic image may be an image acquired by an endoscopic imaging method other than the primary color Bayer such as complementary color or frame sequential.
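The checkered single-color-per-pixel arrangement described above can be illustrated with a small sketch. This is our own illustration, not part of the disclosure: the RGGB tiling below is one common Bayer variant, and the patent does not specify which variant the image sensor 232 uses.

```python
import numpy as np

def bayer_channel_map(height, width):
    """Return the per-pixel channel label ('R', 'G', or 'B') for an
    assumed RGGB Bayer mosaic; each pixel carries exactly one sample."""
    tile = np.array([['R', 'G'],
                     ['G', 'B']])
    return np.tile(tile, (height // 2, width // 2))

m = bayer_channel_map(4, 4)
# Green appears twice per 2x2 tile, so half of all pixels are green
# samples, arranged in the checkered pattern the text describes.
```

A demosaicing step would later interpolate the two missing channels at each pixel; here only the layout itself is shown.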
- The A/D conversion unit 233 converts the analog signal (endoscopic image) output from the image sensor 232 into a digital signal based on a control signal output from the control unit 130 (described later), and outputs the digital signal (endoscopic image) to the image processing apparatus 100.
- The memory 240 stores a program that, when executed by an arithmetic device (not shown), realizes the functions of the image processing apparatus 100.
- the insertion unit 200 may be appropriately described as a scope.
- a type of scope corresponding to the diagnosis site can be used.
- Each scope is given an identification number for specifying a target diagnostic region and a function such as a zoom function.
- the identification number may be referred to as a scope ID.
- the memory 240 stores a scope ID.
- the image processing apparatus 100 includes an automatic exposure control unit 110, an image processing unit 120, and a control unit 130.
- the endoscopic image acquired by the imaging unit 230 is output to the automatic exposure control unit 110 and the image processing unit 120.
- the automatic exposure control unit 110 is connected to the white light source 310 and the image sensor 232 and controls the white light source 310 and the image sensor 232.
- the image processing unit 120 is connected to the display unit 400.
- the control unit 130 is bidirectionally connected to the imaging unit 230, the image processing unit 120, the display unit 400, and the operation unit 500, and controls these components.
- The automatic exposure control unit 110 performs automatic exposure control of the image sensor 232 so that the luminance of the endoscopic image acquired by the imaging unit 230 becomes a value suitable for observation (hereinafter also referred to as an "appropriate value"). Details of the automatic exposure control unit 110 will be described later.
- the image processing unit 120 performs image processing on the endoscopic image captured by the imaging unit 230. For example, the image processing unit 120 performs gradation conversion processing and noise reduction processing.
- the image processing unit 120 outputs the image after image processing to the display unit 400.
- the control unit 130 is connected to the imaging unit 230, the image processing unit 120, the display unit 400, and the operation unit 500, and outputs a control signal for controlling them.
- the display unit 400 outputs the endoscopic image output by the image processing unit 120 to an image display device such as an endoscopic monitor.
- the operation unit 500 is an interface for accepting an operation from the user.
- The operation unit 500 includes a power switch for turning the power on and off, a shutter button for starting imaging, and a mode switching button for switching the shooting mode and various other modes.
- FIGS. 2 and 3 are diagrams for describing a specific example of exposure control.
- the analog signal captured by the image sensor 232 is converted into a digital signal (endoscopic image) by the A / D converter 233.
- the output value from the image sensor 232 is shown on the vertical axis.
- the horizontal axis indicates the image plane illuminance of the image sensor 232 corresponding to each output value.
- the output value from the image sensor 232 may be an average value of output values corresponding to each pixel.
- In FIGS. 2 and 3, the appropriate value of the output value from the image sensor 232 is indicated as "U0", and the image plane illuminance of the image sensor 232 corresponding to the appropriate value U0 is indicated as "L0".
- the output value U2 from the image sensor 232 is smaller than the appropriate value U0.
- exposure control can be performed by adjusting parameters for controlling exposure.
- Various parameters are assumed as parameters for controlling the exposure.
- the parameter for controlling the exposure may include at least one of an electronic shutter speed of the image sensor 232 and a gain multiplied by an analog signal captured by the image sensor 232.
- The parameters for controlling the exposure may also include the brightness of the white light source 310 (or, when an RGB laser is used instead of the white light source 310, exposure control may be performed by light source control that changes the output of each of R, G, and B).
- For example, the exposure control for decreasing the output value from the image sensor 232 by dU1 shown in FIG. 2 may be executed by increasing the electronic shutter speed by an amount corresponding to dU1, or by reducing the gain multiplied by the analog signal captured by the image sensor 232 by an amount corresponding to dU1.
- exposure control that reduces the output value from the image sensor 232 may be executed by reducing the brightness of the white light source 310 by an amount corresponding to dU1.
- Similarly, the exposure control for increasing the output value from the image sensor 232 by dU2 shown in FIG. 3 may be executed by decreasing the electronic shutter speed by an amount corresponding to dU2, or by increasing the gain multiplied by the analog signal captured by the image sensor 232 by an amount corresponding to dU2.
- exposure control that increases the output value from the image sensor 232 may be executed by increasing the brightness of the white light source 310 by an amount corresponding to dU2.
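The adjustments above can be summarized numerically. The sketch below is ours, not the patent's: it assumes the sensor output responds linearly to image plane illuminance, so that scaling the exposure time, the analog gain, or the light source brightness by a common factor moves the output toward the appropriate value U0. The function name and sample values are illustrative.

```python
def exposure_correction_factor(output_value, appropriate_value):
    """Multiplicative correction to apply to an exposure parameter
    (shutter time, gain, or light source brightness), assuming a
    linear sensor response."""
    return appropriate_value / output_value

# FIG. 2 case: the output exceeds U0 by dU1, so the factor is < 1
# (shorten the exposure, lower the gain, or dim the light source).
factor_down = exposure_correction_factor(output_value=180.0, appropriate_value=120.0)

# FIG. 3 case: the output falls short of U0 by dU2, so the factor is > 1
# (lengthen the exposure, raise the gain, or brighten the light source).
factor_up = exposure_correction_factor(output_value=60.0, appropriate_value=120.0)
```

In practice the correction would be split across the available parameters and clamped to their hardware limits; a single factor is shown here only to make the direction of each adjustment concrete.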
- FIG. 4 is a block diagram illustrating a detailed functional configuration example of the automatic exposure control unit 110.
- The automatic exposure control unit 110 includes a size acquisition unit 111, a region extraction unit 112, and an exposure control unit 113.
- the size acquisition unit 111 acquires an endoscopic image from the imaging unit 230.
- FIG. 5 is a diagram showing an example of an endoscopic image.
- each pixel is arranged in a grid pattern.
- the endoscopic image Im1 includes a black region Rb1 in addition to the observation region Rs1.
- a line Hm indicates a boundary line between the black region Rb1 and the observation region Rs1.
- In FIG. 5, the shading of each pixel represents the luminance of that pixel.
- The region extraction unit 112 extracts a region corresponding to the size of the insertion unit 200 as an extraction region from the endoscopic image Im1 based on the imaging by the image sensor 232. The exposure control unit 113 then performs exposure control based on the image sensor output value in the extraction region. This reduces the possibility that the observation region Rs1 becomes excessively bright, so the luminance of the endoscopic image Im1 can be adjusted more appropriately.
- More specifically, when the exposure control unit 113 determines that the shadow of the insertion unit 200 is reflected in the endoscopic image Im1, it adjusts the parameters for controlling the exposure based on the image sensor output value in the extraction region.
- When the exposure control unit 113 determines that the shadow of the insertion unit 200 does not appear in the endoscopic image Im1, it may perform exposure control based on the image sensor output value in the entire endoscopic image Im1. The determination as to whether the shadow of the insertion unit 200 is reflected in the endoscopic image Im1 may be made in any manner.
- For example, the exposure control unit 113 may determine that the shadow of the insertion unit 200 is reflected in the endoscopic image Im1 when the luminance difference between the central region Ce and the peripheral regions N1 to N4 exceeds a first threshold, and that it is not reflected when the difference is less than the first threshold. When the luminance difference between the central region Ce and the peripheral regions N1 to N4 is equal to the first threshold, either determination may be made.
- Alternatively, the exposure control unit 113 may detect the first peak and the second peak, in order from the low-luminance side, in the pixel-count distribution for each luminance of the endoscopic image, and determine that the shadow of the insertion unit 200 is reflected in the endoscopic image Im1 when the luminance difference between the first peak and the second peak exceeds an upper limit value, and that it is not reflected when the difference does not exceed the upper limit value.
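The first determination method (central versus peripheral luminance) can be sketched as follows. This is a hedged illustration: the exact geometry of the central region Ce and the peripheral regions N1 to N4, and the threshold value, are our assumptions rather than values given in the disclosure.

```python
import numpy as np

def shadow_present(image, first_threshold):
    """Return True if the scope shadow is judged to appear in the image,
    by comparing mean luminance of an assumed central region Ce with
    assumed peripheral strips N1..N4."""
    h, w = image.shape
    ce = image[h // 4: 3 * h // 4, w // 4: 3 * w // 4]     # central region Ce
    peripheral = [image[:h // 8, :], image[-h // 8:, :],   # N1, N2 (top, bottom)
                  image[:, :w // 8], image[:, -w // 8:]]   # N3, N4 (left, right)
    peripheral_mean = np.mean([p.mean() for p in peripheral])
    # The shadow of the insertion unit darkens the periphery relative
    # to the center, so a large difference indicates a shadow.
    return (ce.mean() - peripheral_mean) > first_threshold
```

The histogram-peak variant described above would instead locate the two lowest-luminance peaks of the pixel-count distribution and compare their separation against the upper limit value.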
- FIG. 6 is a diagram illustrating another example of the peripheral region. As shown in FIG. 6, instead of the peripheral areas N1 to N4, pixel columns existing on the outermost side of the endoscopic image Im1 may be used as the peripheral areas N5 and N6.
- the size acquisition unit 111 acquires the size of the insertion unit 200.
- In the following description, the scope diameter (the diameter of the insertion unit 200) is used as the size of the insertion unit 200, but another length of the insertion unit 200 (for example, the radius of the insertion unit 200) may be used instead of the scope diameter.
- the size acquisition unit 111 may acquire the scope diameter from a predetermined location.
- the size acquisition unit 111 may acquire the scope diameter (or scope information including the scope diameter) through communication with the scope body or an external device.
- the size acquisition unit 111 may acquire the scope diameter by calculation based on the endoscope image Im1.
- the method for obtaining the scope diameter by calculation is not particularly limited.
- For example, when the size acquisition unit 111 scans in the first scanning direction from the first start position of the endoscopic image Im1 toward the first target position, it may calculate the scope diameter based on the first pixel position at which the magnitude relationship between the luminance and a second threshold value first switches. In this case, the size acquisition unit 111 may calculate twice the first distance between the first pixel position and the center position as the scope diameter.
- The second threshold value may be a fixed value, or may be set according to the situation of the endoscopic image Im1.
- For example, the size acquisition unit 111 may set the second threshold based on the pixel-count distribution for each luminance of the endoscopic image Im1. More specifically, the size acquisition unit 111 may take the median value of the pixel-count distribution for each luminance of the endoscopic image Im1 and set that median value as the second threshold value.
- FIGS. 7 and 8 are diagrams for explaining a calculation example of the scope diameter when scanning is performed in two directions.
- The size acquisition unit 111 scans in the first scanning direction Dr from the first start position of the endoscopic image Im1 and calculates the first pixel position at which the magnitude relationship between the luminance and the second threshold value first switches. Similarly, the size acquisition unit 111 scans in the second scanning direction Dl from the second start position of the endoscopic image Im1 and calculates the second pixel position at which the magnitude relationship between the luminance and the second threshold value first switches. The size acquisition unit 111 then calculates the scope diameter based on the first pixel position and the second pixel position.
- Specifically, when the difference between the first distance Wr (between the first pixel position and the center position) and the second distance Wl (between the second pixel position and the center position) is less than a third threshold value (a predetermined distance), the size acquisition unit 111 may calculate the sum of the first distance Wr and the second distance Wl as the scope diameter. On the other hand, when the difference between the first distance Wr and the second distance Wl exceeds the third threshold, the size acquisition unit 111 may calculate twice the larger of the first distance Wr and the second distance Wl as the scope diameter.
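The two-branch diameter rule can be written directly; this is a minimal sketch with illustrative names (Wr, Wl, and the third threshold are taken from the description, the function name is ours):

```python
def scope_diameter(wr, wl, third_threshold):
    """Estimate the scope diameter from the two half-width scan results.

    wr: first distance Wr (center to first switch, scanning direction Dr)
    wl: second distance Wl (center to first switch, scanning direction Dl)
    """
    if abs(wr - wl) < third_threshold:
        # Near-symmetric result: the scope is roughly centered, so the
        # diameter spans both half-widths.
        return wr + wl
    # Asymmetric result (e.g. the scope circle is off-center or one side
    # is occluded by bright subject matter): trust the larger half-width
    # and double it.
    return 2 * max(wr, wl)
```

For a centered scope with Wr = 100 and Wl = 98 pixels this returns 198; if one scan stopped early (Wr = 100, Wl = 60) it returns 200 instead.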
- the first start position may be the center position of the endoscopic image Im1, and the first target position may be the end position of the endoscopic image Im1.
- Similarly, the second start position may be the center position of the endoscopic image Im1, and the second target position may be the end position of the endoscopic image Im1. Since the subject Su is highly likely to appear with high brightness in the inner part of the endoscopic image Im1, more accurate calculation is possible when scanning from the inside toward the outside of the endoscopic image Im1.
- the first start position, the second start position, the first target position, and the second target position are not limited to the examples shown in FIGS.
- Alternatively, the first start position may be the end position of the endoscopic image Im1, and the first target position may be the center position of the endoscopic image Im1. Since a black area is highly likely to exist only at the end of the endoscopic image Im1, the calculation can be completed earlier when scanning is performed from the outside toward the inside of the endoscopic image Im1.
- the first scanning direction is the right direction and the second scanning direction is the left direction.
- the first scanning direction and the second scanning direction are not limited to this example.
- one of the first scanning direction and the second scanning direction may be the upward direction and the other may be the downward direction.
- Alternatively, one of the first scanning direction and the second scanning direction may be an oblique direction (from upper right to lower left, or from lower left to upper right), and the other may be the opposite oblique direction (from upper left to lower right, or from lower right to upper left).
- the region extraction unit 112 extracts a region corresponding to the scope diameter as an extraction region.
- A map indicating the area to be extracted (hereinafter also referred to as a "black area frame map") is associated in advance with each scope diameter.
- The area extraction unit 112 acquires the black area frame map corresponding to the scope diameter acquired by the size acquisition unit 111.
- FIG. 9 is a diagram illustrating an example of a black area frame map. Referring to FIG. 9, in the black area frame map Mp, the area to be extracted is indicated by "1" and the area to be excluded is indicated by "0".
- The region extraction unit 112 extracts, from the endoscopic image Im1, the pixels set as the area to be extracted in the black region frame map Mp. FIGS. 7 and 8 show the extraction region Rw2 and the exclusion region Rb2. The exposure control unit 113 then performs exposure control based on the image sensor output value in the extraction region extracted by the region extraction unit 112.
- the output value from the image sensor 232 in the extraction area may be an average value of output values corresponding to each pixel in the extraction area. Thereby, it becomes possible to adjust the luminance of the endoscope image Im1 more appropriately by reducing the possibility that the observation region Rs1 becomes excessively bright.
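The map-masked average described above can be sketched in a few lines. This is our illustration of the per-pixel multiplication and averaging; the map values follow FIG. 9 (1 = extract, 0 = exclude):

```python
import numpy as np

def extraction_region_mean(image, frame_map):
    """Mean image sensor output over the extraction region only.

    frame_map is a black area frame map Mp of the same shape as the
    image, with 1 marking the extraction region and 0 the black region.
    """
    masked = image * frame_map             # per-pixel multiplication (S18)
    return masked.sum() / frame_map.sum()  # average over extracted pixels only
```

Averaging over `frame_map.sum()` pixels rather than all pixels is the essential point: the dark excluded pixels no longer drag the average down, so the exposure target is driven by the observation region alone.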
- FIG. 10 is a flowchart showing an example of the exposure control operation according to the scope diameter. This flowchart merely shows one example, so the exposure control operation according to the scope diameter is not limited to the operation example shown in FIG. 10. In FIG. 10, the scope diameter is shown as the black area frame diameter.
- When the luminance difference between the central region Ce and the peripheral regions N1 to N4 (or N5 and N6) in the endoscopic image Im1 is lower than the first threshold ("No" in S11), the region extraction unit 112 ends the operation. On the other hand, when the luminance difference between the central region Ce and the peripheral regions N1 to N4 (or N5 and N6) exceeds the first threshold ("Yes" in S11), the region extraction unit 112 binarizes each pixel of the endoscopic image Im1 with the second threshold value (S12).
- the region extraction unit 112 performs line scanning on the binarized image (S13).
- The region extraction unit 112 then compares the scan result in the right direction (hereinafter also referred to as the "right scan result") with the scan result in the left direction (hereinafter also referred to as the "left scan result"). When the difference between the two is less than the third threshold value ("Yes" in S14), the sum of the right scan result and the left scan result is set as the black region frame diameter (S15), and the operation proceeds to S17. Otherwise ("No" in S14), the region extraction unit 112 sets twice the longer scan result as the black region frame diameter (S16), and the operation proceeds to S17. Subsequently, the region extraction unit 112 selects the black region frame map Mp corresponding to the black region frame diameter (S17), and multiplies the corresponding pixel values of the black region frame map Mp and the endoscopic image Im1 (S18). By this multiplication, the extraction region Rw2 is extracted.
- the exposure control unit 113 performs exposure control based on the multiplication result (S19).
- the exposure control unit 113 performs exposure control based on the image sensor output value in the extraction region Rw2 extracted by the region extraction unit 112. Thereby, it becomes possible to adjust the luminance of the endoscope image Im1 more appropriately by reducing the possibility that the observation region Rs1 becomes excessively bright.
- As described above, according to the embodiment of the present disclosure, there is provided the image processing apparatus 100 including the region extraction unit 112, which extracts, as an extraction region, a region corresponding to the size of the insertion unit 200 from the endoscopic image Im1 based on imaging by the image sensor 232, and the exposure control unit 113, which performs exposure control based on the image sensor output value in the extraction region. With such a configuration, it is possible to adjust the luminance of the endoscopic image Im1 more appropriately by reducing the possibility that the observation region Rs1 becomes excessively bright.
- An area extraction unit that extracts an area corresponding to the size of the insertion unit as an extraction area from an endoscopic image based on imaging by an image sensor;
- An exposure control unit that performs exposure control based on the output value of the image sensor in the extraction region;
- An image processing apparatus comprising: (2) The image processing apparatus includes a size acquisition unit that acquires a size of the insertion unit. The image processing apparatus according to (1).
- the size acquisition unit acquires the size of the insertion unit from the insertion unit; The image processing apparatus according to (2).
- the size acquisition unit acquires the size of the insertion unit based on the endoscope image by calculation, The image processing apparatus according to (2).
- (5) The image processing apparatus according to (4), wherein the size acquisition unit calculates the size of the insertion unit based on a first pixel position at which a magnitude relationship between luminance and a threshold first switches when scanning in a first scanning direction from a first start position of the endoscopic image toward a first target position.
- (6) The image processing apparatus according to (5), wherein the first start position is a center position of the endoscopic image, and the first target position is an end position of the endoscopic image.
- (7) The image processing apparatus according to (5), wherein the first start position is an end position of the endoscopic image, and the first target position is a center position of the endoscopic image.
- (8) The image processing apparatus according to (6) or (7), wherein the size acquisition unit calculates, as the size of the insertion unit, a first distance between the first pixel position and the center position, or twice the first distance.
- (9) The image processing apparatus according to any one of (5) to (8), wherein the first scanning direction is an upward, downward, leftward, rightward, or oblique direction.
- (10) The image processing apparatus according to (6), wherein the size acquisition unit calculates the size of the insertion unit based on the first pixel position and a second pixel position at which the magnitude relationship between luminance and the threshold first switches when scanning in a second scanning direction from a second start position of the endoscopic image.
- (11) The image processing apparatus according to (10), wherein, when a difference between a first distance between the first pixel position and the center position and a second distance between the second pixel position and the center position is less than a predetermined distance, the size acquisition unit calculates a sum of the first distance and the second distance as the size of the insertion unit.
- (12) The image processing apparatus according to (10), wherein, when the difference between the first distance between the first pixel position and the center position and the second distance between the second pixel position and the center position exceeds the predetermined distance, the size acquisition unit calculates twice the larger of the first distance and the second distance as the size of the insertion unit.
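Statements (10) to (12) describe how two scan distances combine into one size estimate. A minimal one-dimensional sketch along a single pixel row (the names `first_crossing` and `insertion_size`, and the fallback to the image edge when no crossing is found, are my assumptions):

```python
def first_crossing(row, start, step, threshold):
    # distance from the start position to the first pixel whose luminance
    # drops below the threshold, scanning in direction `step` (+1 or -1)
    x = start
    while 0 <= x < len(row):
        if row[x] < threshold:
            return abs(x - start)
        x += step
    return abs(x - start)  # no crossing found: fall back to the image edge

def insertion_size(row, center, threshold, tolerance):
    wr = first_crossing(row, center, +1, threshold)  # first distance Wr
    wl = first_crossing(row, center, -1, threshold)  # second distance Wl
    if abs(wr - wl) < tolerance:
        return wr + wl        # (11): scope roughly centred, use Wr + Wl
    return 2 * max(wr, wl)    # (12): off-centre, use twice the larger

# bright disc from index 2 to 7 on a dark background, scan from pixel 5
row = [0, 0, 200, 200, 200, 200, 200, 200, 0, 0]
print(insertion_size(row, 5, 50, 2))  # 7 (distances 3 and 4: sum)
print(insertion_size(row, 5, 50, 1))  # 8 (tolerance exceeded: 2 * 4)
```

Using the sum when the two distances roughly agree tolerates a slightly off-centre scope; doubling the larger distance guards against one scan terminating early on a dark artifact.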
- (13) The image processing apparatus according to any one of (5) to (12), wherein the size acquisition unit sets the threshold based on a distribution of the number of pixels for each luminance of the endoscopic image.
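Statement (13) only says that the threshold is derived from the per-luminance pixel-count distribution. One plausible concrete choice is Otsu's criterion, which picks the luminance that best separates the dark ring from the bright circular field; the use of Otsu here is my assumption and is not stated in the patent:

```python
import numpy as np

def histogram_threshold(image, bins=256):
    # build the pixel-count distribution per luminance value
    hist, _ = np.histogram(image, bins=bins, range=(0, bins))
    total = hist.sum()
    sum_all = float(np.dot(hist, np.arange(bins)))
    best_t, best_var = 0, -1.0
    w0 = 0      # pixel count at or below the candidate threshold
    sum0 = 0.0  # luminance mass at or below the candidate threshold
    for t in range(bins):
        w0 += hist[t]
        if w0 == 0:
            continue
        w1 = total - w0
        if w1 == 0:
            break
        sum0 += t * hist[t]
        mu0, mu1 = sum0 / w0, (sum_all - sum0) / w1
        var = w0 * w1 * (mu0 - mu1) ** 2   # between-class variance
        if var > best_var:
            best_var, best_t = var, t
    return best_t

# dark black-region pixels at 10, bright observation pixels at 200
pixels = np.array([10] * 100 + [200] * 100)
print(histogram_threshold(pixels))  # 10
```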
- (14) The image processing apparatus according to any one of (1) to (13), wherein the exposure control unit adjusts a parameter for controlling exposure based on the output value of the image sensor in the extraction region.
- (15) The image processing apparatus according to (14), wherein the parameter includes at least one of an electronic shutter speed of the image sensor and a gain by which an analog signal captured by the image sensor is multiplied.
- (16) The image processing apparatus according to (14), wherein the parameter includes a brightness of a light source.
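Statements (14) to (16) name three adjustable parameters: shutter speed, analog gain, and light-source brightness. A hedged sketch of one proportional control step that spends the needed correction on each parameter in turn (the ordering, step logic, and clamp limits are illustrative assumptions, not from the patent):

```python
def adjust_exposure(mean_output, target, shutter_s, gain, light_level):
    # multiplicative correction needed so that the extraction-region
    # mean sensor output reaches the target value
    ratio = target / max(mean_output, 1e-6)
    # spend the correction on the electronic shutter first ...
    new_shutter = min(max(shutter_s * ratio, 1 / 10000), 1 / 30)
    residual = (shutter_s * ratio) / new_shutter
    # ... then on the gain applied to the analog signal ...
    new_gain = min(max(gain * residual, 1.0), 16.0)
    residual *= gain / new_gain
    # ... and finally on the light source brightness
    new_light = min(max(light_level * residual, 0.0), 1.0)
    return new_shutter, new_gain, new_light

# region twice as bright as desired: the shutter alone absorbs the
# correction (halved to 1/120 s); gain and light stay unchanged
s, g, l = adjust_exposure(mean_output=200, target=100,
                          shutter_s=1 / 60, gain=2.0, light_level=0.8)
```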
- (17) The image processing apparatus according to any one of (1) to (16), wherein, when the exposure control unit determines that a shadow of the insertion unit appears in the endoscopic image, the exposure control unit performs the exposure control based on the image sensor output value in the extraction region.
- (18) An image processing method including: extracting, from an endoscopic image based on imaging by an image sensor, a region corresponding to a size of an insertion unit as an extraction region; and performing, by a processor, exposure control based on the image sensor output value in the extraction region.
Landscapes
- Engineering & Computer Science (AREA)
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Signal Processing (AREA)
- Surgery (AREA)
- Physics & Mathematics (AREA)
- Multimedia (AREA)
- Optics & Photonics (AREA)
- Medical Informatics (AREA)
- Veterinary Medicine (AREA)
- Radiology & Medical Imaging (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Biomedical Technology (AREA)
- Heart & Thoracic Surgery (AREA)
- Biophysics (AREA)
- Molecular Biology (AREA)
- Animal Behavior & Ethology (AREA)
- General Health & Medical Sciences (AREA)
- Public Health (AREA)
- Pathology (AREA)
- General Physics & Mathematics (AREA)
- Geometry (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Theoretical Computer Science (AREA)
- Astronomy & Astrophysics (AREA)
- Endoscopes (AREA)
- Instruments For Viewing The Inside Of Hollow Bodies (AREA)
- Closed-Circuit Television Systems (AREA)
- Studio Devices (AREA)
- Exposure Control For Cameras (AREA)
Abstract
Description
1. Embodiment of the present disclosure
1.1. System configuration example
1.2. Functional configuration example
1.3. Functional details of the automatic exposure control unit
2. Conclusion
[1.1. System configuration example]
First, a configuration example of an image processing system according to an embodiment of the present disclosure will be described with reference to the drawings. FIG. 1 is a diagram illustrating a configuration example of the image processing system according to the embodiment of the present disclosure. As illustrated in FIG. 1, the image processing system 1 includes an image processing apparatus 100, an insertion unit 200, a light source unit 300, a display unit 400, and an operation unit 500.
Next, a specific example of exposure control by the automatic exposure control unit 110 will be described. FIGS. 2 and 3 are diagrams for describing the specific example of exposure control. As described above, the analog signal captured by the image sensor 232 is converted into a digital signal (endoscope image) by the A/D conversion unit 233. In FIGS. 2 and 3, the output value from the image sensor 232 is plotted on the vertical axis, and the image-plane illuminance of the image sensor 232 corresponding to each output value is plotted on the horizontal axis. Note that the output value from the image sensor 232 may be an average of the output values corresponding to the individual pixels.
Next, detailed functions of the automatic exposure control unit 110 will be described. FIG. 4 is a block diagram illustrating a detailed functional configuration example of the automatic exposure control unit 110. As illustrated in FIG. 4, the automatic exposure control unit 110 includes a size acquisition unit 111, a region extraction unit 112, and an exposure control unit 113. Detailed functions of the size acquisition unit 111, the region extraction unit 112, and the exposure control unit 113 will be described below. First, the size acquisition unit 111 acquires an endoscope image from the imaging unit 230.
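The block structure described above (size acquisition unit 111, region extraction unit 112, exposure control unit 113) can be sketched as a simple per-frame pipeline. All interfaces here are assumptions for illustration; the patent describes the units functionally, not as code:

```python
class AutomaticExposureController:
    """Sketch of automatic exposure control unit 110: chains a size
    acquisition unit (111), a region extraction unit (112) and an
    exposure control unit (113) for each endoscope image frame."""

    def __init__(self, acquire_size, extract_region, control_exposure):
        self.acquire_size = acquire_size          # unit 111
        self.extract_region = extract_region      # unit 112
        self.control_exposure = control_exposure  # unit 113

    def process_frame(self, endoscope_image):
        size = self.acquire_size(endoscope_image)
        region = self.extract_region(endoscope_image, size)
        return self.control_exposure(region)

# stand-in units: fixed size, prefix slice as the "extraction region",
# and the region mean as the exposure decision
ctrl = AutomaticExposureController(
    acquire_size=lambda im: 4,
    extract_region=lambda im, size: im[:size],
    control_exposure=lambda region: sum(region) / len(region),
)
print(ctrl.process_frame([1, 2, 3, 4, 100, 100]))  # 2.5
```

The bright trailing values are excluded from the mean because the extraction step removed them, mirroring how the real unit keeps the observation region from being over-weighted.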
As described above, according to the embodiment of the present disclosure, there is provided an image processing apparatus 100 including: a region extraction unit 112 that extracts, from an endoscope image Im1 based on imaging by the image sensor 232, a region corresponding to the size of the insertion unit 200 as an extraction region; and an exposure control unit 113 that performs exposure control based on the image sensor output value in the extraction region. With such a configuration, the possibility that the observation region Rs1 becomes excessively bright is reduced, and the luminance of the endoscope image Im1 can be adjusted more appropriately.
(1)
An image processing apparatus including:
a region extraction unit that extracts, from an endoscopic image based on imaging by an image sensor, a region corresponding to a size of an insertion unit as an extraction region; and
an exposure control unit that performs exposure control based on an output value of the image sensor in the extraction region.
(2)
The image processing apparatus according to (1), including a size acquisition unit that acquires the size of the insertion unit.
(3)
The image processing apparatus according to (2), wherein the size acquisition unit acquires the size of the insertion unit from the insertion unit.
(4)
The image processing apparatus according to (2), wherein the size acquisition unit acquires the size of the insertion unit by calculation based on the endoscopic image.
(5)
The image processing apparatus according to (4), wherein the size acquisition unit calculates the size of the insertion unit based on a first pixel position at which a magnitude relationship between luminance and a threshold first switches when scanning in a first scanning direction from a first start position of the endoscopic image toward a first target position.
(6)
The image processing apparatus according to (5), wherein the first start position is a center position of the endoscopic image, and the first target position is an end position of the endoscopic image.
(7)
The image processing apparatus according to (5), wherein the first start position is an end position of the endoscopic image, and the first target position is a center position of the endoscopic image.
(8)
The image processing apparatus according to (6) or (7), wherein the size acquisition unit calculates, as the size of the insertion unit, a first distance between the first pixel position and the center position, or twice the first distance.
(9)
The image processing apparatus according to any one of (5) to (8), wherein the first scanning direction is an upward, downward, leftward, rightward, or oblique direction.
(10)
The image processing apparatus according to (6), wherein the size acquisition unit calculates the size of the insertion unit based on the first pixel position and a second pixel position at which the magnitude relationship between luminance and the threshold first switches when scanning in a second scanning direction from a second start position of the endoscopic image.
(11)
The image processing apparatus according to (10), wherein, when a difference between a first distance between the first pixel position and the center position and a second distance between the second pixel position and the center position is less than a predetermined distance, the size acquisition unit calculates a sum of the first distance and the second distance as the size of the insertion unit.
(12)
The image processing apparatus according to (10), wherein, when the difference between the first distance between the first pixel position and the center position and the second distance between the second pixel position and the center position exceeds the predetermined distance, the size acquisition unit calculates twice the larger of the first distance and the second distance as the size of the insertion unit.
(13)
The image processing apparatus according to any one of (5) to (12), wherein the size acquisition unit sets the threshold based on a distribution of the number of pixels for each luminance of the endoscopic image.
(14)
The image processing apparatus according to any one of (1) to (13), wherein the exposure control unit adjusts a parameter for controlling exposure based on the output value of the image sensor in the extraction region.
(15)
The image processing apparatus according to (14), wherein the parameter includes at least one of an electronic shutter speed of the image sensor and a gain by which an analog signal captured by the image sensor is multiplied.
(16)
The image processing apparatus according to (14), wherein the parameter includes a brightness of a light source.
(17)
The image processing apparatus according to any one of (1) to (16), wherein, when the exposure control unit determines that a shadow of the insertion unit appears in the endoscopic image, the exposure control unit performs the exposure control based on the image sensor output value in the extraction region.
(18)
An image processing method including:
extracting, from an endoscopic image based on imaging by an image sensor, a region corresponding to a size of an insertion unit as an extraction region; and
performing, by a processor, exposure control based on the output value of the image sensor in the extraction region.
(19)
A program for causing a computer to function as an image processing apparatus including:
a region extraction unit that extracts, from an endoscopic image based on imaging by an image sensor, a region corresponding to a size of an insertion unit as an extraction region; and
an exposure control unit that performs exposure control based on the output value of the image sensor in the extraction region.
(20)
An image processing system including an image processing apparatus that has:
a light source unit that emits light; and
an image sensor that receives reflected light of the light emitted by the light source unit and captures an endoscopic image,
the image processing apparatus including:
a region extraction unit that extracts, from the endoscopic image, a region corresponding to a size of an insertion unit as an extraction region; and
an exposure control unit that performs exposure control based on the output value of the image sensor in the extraction region.
100 Image processing apparatus
110 Automatic exposure control unit
111 Size acquisition unit
112 Region extraction unit
113 Exposure control unit
120 Image processing unit
130 Control unit
200 Insertion unit
210 Light guide
220 Illumination lens
230 Imaging unit
231 Objective lens
232 Image sensor
240 Memory
300 Light source unit
310 White light source
320 Condenser lens
400 Display unit
500 Operation unit
Ce Center region
Dl Second scanning direction
Dr First scanning direction
Im1 Endoscope image
Mp Black region frame map
N1-N6 Peripheral regions
Rb1 Black region
Rb2 Excluded region
Rs1 Observation region
Rw2 Extraction region
Su Subject
Wl Second distance
Wr First distance
Claims (20)
- An image processing apparatus including: a region extraction unit that extracts, from an endoscopic image based on imaging by an image sensor, a region corresponding to a size of an insertion unit as an extraction region; and an exposure control unit that performs exposure control based on an output value of the image sensor in the extraction region.
- The image processing apparatus according to claim 1, including a size acquisition unit that acquires the size of the insertion unit.
- The image processing apparatus according to claim 2, wherein the size acquisition unit acquires the size of the insertion unit from the insertion unit.
- The image processing apparatus according to claim 2, wherein the size acquisition unit acquires the size of the insertion unit by calculation based on the endoscopic image.
- The image processing apparatus according to claim 4, wherein the size acquisition unit calculates the size of the insertion unit based on a first pixel position at which a magnitude relationship between luminance and a threshold first switches when scanning in a first scanning direction from a first start position of the endoscopic image toward a first target position.
- The image processing apparatus according to claim 5, wherein the first start position is a center position of the endoscopic image, and the first target position is an end position of the endoscopic image.
- The image processing apparatus according to claim 5, wherein the first start position is an end position of the endoscopic image, and the first target position is a center position of the endoscopic image.
- The image processing apparatus according to claim 6, wherein the size acquisition unit calculates, as the size of the insertion unit, a first distance between the first pixel position and the center position, or twice the first distance.
- The image processing apparatus according to claim 5, wherein the first scanning direction is an upward, downward, leftward, rightward, or oblique direction.
- The image processing apparatus according to claim 6, wherein the size acquisition unit calculates the size of the insertion unit based on the first pixel position and a second pixel position at which the magnitude relationship between luminance and the threshold first switches when scanning in a second scanning direction from a second start position of the endoscopic image.
- The image processing apparatus according to claim 10, wherein, when a difference between a first distance between the first pixel position and the center position and a second distance between the second pixel position and the center position is less than a predetermined distance, the size acquisition unit calculates a sum of the first distance and the second distance as the size of the insertion unit.
- The image processing apparatus according to claim 10, wherein, when the difference between the first distance between the first pixel position and the center position and the second distance between the second pixel position and the center position exceeds the predetermined distance, the size acquisition unit calculates twice the larger of the first distance and the second distance as the size of the insertion unit.
- The image processing apparatus according to claim 5, wherein the size acquisition unit sets the threshold based on a distribution of the number of pixels for each luminance of the endoscopic image.
- The image processing apparatus according to claim 1, wherein the exposure control unit adjusts a parameter for controlling exposure based on the output value of the image sensor in the extraction region.
- The image processing apparatus according to claim 14, wherein the parameter includes at least one of an electronic shutter speed of the image sensor and a gain by which an analog signal captured by the image sensor is multiplied.
- The image processing apparatus according to claim 14, wherein the parameter includes a brightness of a light source.
- The image processing apparatus according to claim 1, wherein, when the exposure control unit determines that a shadow of the insertion unit appears in the endoscopic image, the exposure control unit performs the exposure control based on the image sensor output value in the extraction region.
- An image processing method including: extracting, from an endoscopic image based on imaging by an image sensor, a region corresponding to a size of an insertion unit as an extraction region; and performing, by a processor, exposure control based on the output value of the image sensor in the extraction region.
- A program for causing a computer to function as an image processing apparatus including: a region extraction unit that extracts, from an endoscopic image based on imaging by an image sensor, a region corresponding to a size of an insertion unit as an extraction region; and an exposure control unit that performs exposure control based on the output value of the image sensor in the extraction region.
- An image processing system including an image processing apparatus that has: a light source unit that emits light; and an image sensor that receives reflected light of the light emitted by the light source unit and captures an endoscopic image, the image processing apparatus including: a region extraction unit that extracts, from the endoscopic image, a region corresponding to a size of an insertion unit as an extraction region; and an exposure control unit that performs exposure control based on the output value of the image sensor in the extraction region.
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/546,111 US10623651B2 (en) | 2015-02-12 | 2015-11-17 | Image processing device, image processing method, and image processing system |
JP2016574627A JP6711286B2 (ja) | 2015-02-12 | 2015-11-17 | 画像処理装置、画像処理方法、プログラムおよび画像処理システム |
US16/823,848 US11228718B2 (en) | 2015-02-12 | 2020-03-19 | Image processing device, image processing method, and image processing system |
US17/567,389 US11737647B2 (en) | 2015-02-12 | 2022-01-03 | Image processing device, image processing method, program and image processing system |
US18/356,284 US20230363620A1 (en) | 2015-02-12 | 2023-07-21 | Image processing device, image processing method, program and image processing system |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2015025133 | 2015-02-12 | ||
JP2015-025133 | 2015-02-12 |
Related Child Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/546,111 A-371-Of-International US10623651B2 (en) | 2015-02-12 | 2015-11-17 | Image processing device, image processing method, and image processing system |
US16/823,848 Continuation US11228718B2 (en) | 2015-02-12 | 2020-03-19 | Image processing device, image processing method, and image processing system |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2016129162A1 true WO2016129162A1 (ja) | 2016-08-18 |
Family
ID=56615181
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2015/082324 WO2016129162A1 (ja) | 2015-02-12 | 2015-11-17 | 画像処理装置、画像処理方法、プログラムおよび画像処理システム |
Country Status (3)
Country | Link |
---|---|
US (4) | US10623651B2 (ja) |
JP (3) | JP6711286B2 (ja) |
WO (1) | WO2016129162A1 (ja) |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2018235608A1 (ja) * | 2017-06-21 | 2018-12-27 | ソニー株式会社 | 手術システムおよび手術用撮像装置 |
WO2020100184A1 (ja) * | 2018-11-12 | 2020-05-22 | オリンパス株式会社 | 内視鏡用光源装置、内視鏡装置及び内視鏡用光源装置の作動方法 |
CN111343387A (zh) * | 2019-03-06 | 2020-06-26 | 杭州海康慧影科技有限公司 | 一种摄像设备的自动曝光方法及装置 |
WO2022172733A1 (ja) | 2021-02-12 | 2022-08-18 | ソニーグループ株式会社 | 医療用観察装置、観察装置、観察方法及びアダプタ |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP7454409B2 (ja) * | 2020-03-03 | 2024-03-22 | ソニー・オリンパスメディカルソリューションズ株式会社 | 医療用制御装置及び医療用制御装置の制御方法 |
US20210275000A1 (en) * | 2020-03-05 | 2021-09-09 | Stryker Corporation | Systems and methods for endoscope type detection |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH08211306A (ja) * | 1995-02-01 | 1996-08-20 | Olympus Optical Co Ltd | 写真撮影装置 |
JPH08336160A (ja) * | 1995-06-09 | 1996-12-17 | Olympus Optical Co Ltd | 画像ファイル装置 |
JP2003180631A (ja) * | 2001-10-01 | 2003-07-02 | Pentax Corp | 内視鏡用自動調光装置および電子内視鏡装置 |
JP2004147924A (ja) * | 2002-10-31 | 2004-05-27 | Pentax Corp | 内視鏡用自動調光装置および電子内視鏡装置 |
Family Cites Families (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH07194528A (ja) | 1993-12-30 | 1995-08-01 | Olympus Optical Co Ltd | 撮像装置 |
JP2000287202A (ja) | 1999-03-30 | 2000-10-13 | Fuji Photo Optical Co Ltd | 電子内視鏡装置 |
US6773393B1 (en) * | 1999-08-05 | 2004-08-10 | Olympus Optical Co., Ltd. | Apparatus and method for detecting and displaying form of insertion part of endoscope |
JP2002263064A (ja) | 2001-03-07 | 2002-09-17 | Asahi Optical Co Ltd | 内視鏡用自動調光装置およびそれを含む電子内視鏡装置のプロセッサ |
US6980227B2 (en) * | 2001-10-01 | 2005-12-27 | Pentax Corporation | Electronic endoscope with light-amount adjustment apparatus |
JP2005006856A (ja) * | 2003-06-18 | 2005-01-13 | Olympus Corp | 内視鏡装置 |
JP2005131129A (ja) | 2003-10-30 | 2005-05-26 | Olympus Corp | 撮像装置及び内視鏡装置 |
JP2005279253A (ja) | 2004-03-02 | 2005-10-13 | Olympus Corp | 内視鏡 |
US8373748B2 (en) | 2005-12-14 | 2013-02-12 | Stryker Corporation | Automatic endoscope recognition and selection of image processing and display settings |
JP2010000183A (ja) | 2008-06-19 | 2010-01-07 | Fujinon Corp | 電子内視鏡用プロセッサ装置 |
JP5162374B2 (ja) * | 2008-08-21 | 2013-03-13 | 富士フイルム株式会社 | 内視鏡画像のズレ量測定装置及び方法、並びに電子内視鏡及び内視鏡用画像処理装置 |
JP5814698B2 (ja) | 2011-08-25 | 2015-11-17 | オリンパス株式会社 | 自動露光制御装置、制御装置、内視鏡装置及び内視鏡装置の作動方法 |
CN104135908B (zh) * | 2012-03-28 | 2016-07-06 | 富士胶片株式会社 | 摄像装置以及具备其的内窥镜装置 |
-
2015
- 2015-11-17 WO PCT/JP2015/082324 patent/WO2016129162A1/ja active Application Filing
- 2015-11-17 US US15/546,111 patent/US10623651B2/en active Active
- 2015-11-17 JP JP2016574627A patent/JP6711286B2/ja active Active
-
2020
- 2020-03-19 US US16/823,848 patent/US11228718B2/en active Active
- 2020-05-27 JP JP2020092701A patent/JP7010330B2/ja active Active
-
2022
- 2022-01-03 US US17/567,389 patent/US11737647B2/en active Active
- 2022-01-11 JP JP2022002189A patent/JP7484939B2/ja active Active
-
2023
- 2023-07-21 US US18/356,284 patent/US20230363620A1/en active Pending
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH08211306A (ja) * | 1995-02-01 | 1996-08-20 | Olympus Optical Co Ltd | 写真撮影装置 |
JPH08336160A (ja) * | 1995-06-09 | 1996-12-17 | Olympus Optical Co Ltd | 画像ファイル装置 |
JP2003180631A (ja) * | 2001-10-01 | 2003-07-02 | Pentax Corp | 内視鏡用自動調光装置および電子内視鏡装置 |
JP2004147924A (ja) * | 2002-10-31 | 2004-05-27 | Pentax Corp | 内視鏡用自動調光装置および電子内視鏡装置 |
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2018235608A1 (ja) * | 2017-06-21 | 2018-12-27 | ソニー株式会社 | 手術システムおよび手術用撮像装置 |
US11503980B2 (en) | 2017-06-21 | 2022-11-22 | Sony Corporation | Surgical system and surgical imaging device |
WO2020100184A1 (ja) * | 2018-11-12 | 2020-05-22 | オリンパス株式会社 | 内視鏡用光源装置、内視鏡装置及び内視鏡用光源装置の作動方法 |
CN113260297A (zh) * | 2018-11-12 | 2021-08-13 | 奥林巴斯株式会社 | 内窥镜用光源装置、内窥镜装置及内窥镜用光源装置的工作方法 |
CN113260297B (zh) * | 2018-11-12 | 2024-06-07 | 奥林巴斯株式会社 | 内窥镜装置、内窥镜用光源装置及其工作方法以及光源调整方法 |
CN111343387A (zh) * | 2019-03-06 | 2020-06-26 | 杭州海康慧影科技有限公司 | 一种摄像设备的自动曝光方法及装置 |
CN111343387B (zh) * | 2019-03-06 | 2022-01-21 | 杭州海康慧影科技有限公司 | 一种摄像设备的自动曝光方法及装置 |
WO2022172733A1 (ja) | 2021-02-12 | 2022-08-18 | ソニーグループ株式会社 | 医療用観察装置、観察装置、観察方法及びアダプタ |
Also Published As
Publication number | Publication date |
---|---|
JPWO2016129162A1 (ja) | 2017-11-24 |
US11737647B2 (en) | 2023-08-29 |
US20200221006A1 (en) | 2020-07-09 |
US10623651B2 (en) | 2020-04-14 |
JP2022044639A (ja) | 2022-03-17 |
US20220124236A1 (en) | 2022-04-21 |
JP7484939B2 (ja) | 2024-05-16 |
US20230363620A1 (en) | 2023-11-16 |
JP2020151492A (ja) | 2020-09-24 |
JP7010330B2 (ja) | 2022-01-26 |
US11228718B2 (en) | 2022-01-18 |
JP6711286B2 (ja) | 2020-06-17 |
US20180027165A1 (en) | 2018-01-25 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP7010330B2 (ja) | 画像処理システム、画像処理方法、画像処理装置およびプログラム | |
JP5814698B2 (ja) | 自動露光制御装置、制御装置、内視鏡装置及び内視鏡装置の作動方法 | |
US10306151B2 (en) | Image processing device, image processing method, program and image processing system | |
JP5698476B2 (ja) | 内視鏡システム、内視鏡システムの作動方法及び撮像装置 | |
JP5124102B2 (ja) | 内視鏡プロセッサ、画像処理プログラム、および内視鏡システム | |
US9113045B2 (en) | Electronic endoscopic apparatus and control method thereof | |
JP4701111B2 (ja) | パターンマッチングシステム及び被写体追尾システム | |
JP6430880B2 (ja) | 内視鏡システム、及び、内視鏡システムの作動方法 | |
JP2020537456A (ja) | 撮像素子、撮像機器及び画像情報処理方法 | |
WO2016088628A1 (ja) | 画像評価装置、内視鏡システム、画像評価装置の作動方法および画像評価装置の作動プログラム | |
US10739573B2 (en) | Image processing apparatus and image processing method for achieving visibility of an entire image used for capturing a microscopic image and accuracy of edge detection | |
JP2012020028A (ja) | 電子内視鏡用プロセッサ | |
US7822247B2 (en) | Endoscope processor, computer program product, endoscope system, and endoscope image playback apparatus | |
WO2015194204A1 (ja) | 内視鏡装置 | |
KR101551427B1 (ko) | 홍채 인식 기능을 지원하는 이동 단말기 | |
JP2017123997A (ja) | 撮像システムおよび処理装置 | |
JP2012120595A (ja) | 電子内視鏡用プロセッサ | |
JP2014130442A (ja) | 画像処理装置、撮像装置、および画像処理プログラム | |
JP2011234789A (ja) | 電子内視鏡、内視鏡プロセッサ、および内視鏡ユニット | |
JP2012020027A (ja) | 電子内視鏡用プロセッサ | |
JP2014039100A (ja) | 画像処理装置、撮像装置、および画像処理プログラム |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 15882036 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2016574627 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 15546111 Country of ref document: US |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 15882036 Country of ref document: EP Kind code of ref document: A1 |