WO2016088628A1 - Image evaluation device, endoscope system, and method and program for operating an image evaluation device - Google Patents


Info

Publication number
WO2016088628A1
Authority
WO
WIPO (PCT)
Prior art keywords
evaluation value
image
evaluation
blur
unit
Prior art date
Application number
PCT/JP2015/083140
Other languages
English (en)
Japanese (ja)
Inventor
山崎 隆一
Original Assignee
オリンパス株式会社
Priority date
Filing date
Publication date
Application filed by オリンパス株式会社 (Olympus Corporation)
Priority to JP2016533674A (JPWO2016088628A1)
Publication of WO2016088628A1

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00: Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/04: Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor combined with photographic or television appliances
    • G: PHYSICS
    • G02: OPTICS
    • G02B: OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B23/00: Telescopes, e.g. binoculars; Periscopes; Instruments for viewing the inside of hollow bodies; Viewfinders; Optical aiming or sighting devices
    • G02B23/24: Instruments or systems for viewing the inside of hollow bodies, e.g. fibrescopes
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60: Control of cameras or camera modules
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70: Circuitry for compensating brightness variation in the scene
    • H04N23/75: Circuitry for compensating brightness variation in the scene by influencing optical camera components

Definitions

  • The present invention relates to an image evaluation apparatus that performs focus and blur evaluation of images, an endoscope system including the image evaluation apparatus, and a method and program for operating the image evaluation apparatus.
  • An endoscope system is used to observe an organ of a subject such as a patient.
  • An endoscope system includes, for example, an endoscope having an insertion portion, provided with an imaging device at its distal end, that is inserted into a body cavity of the subject, and a processing device connected via a cable to the proximal end side of the insertion portion; the processing device sequentially processes the generated imaging signal into in-vivo images and displays them live on a display unit or the like.
  • An operator such as a doctor selects a desired in-vivo image for diagnosis or the like from the in-vivo images displayed live and freeze-displays it on the display unit.
  • As a technique for performing such freeze display, a method is disclosed in which, when a freeze instruction is input, images of a plurality of frames stored in a frame memory are evaluated based on a contrast value (focus value) and a color shift amount, and the in-vivo image to be frozen is extracted based on the evaluation result (see, for example, Patent Document 1).
  • In Patent Document 1, for the plurality of frames stored in the frame memory, the contrast value of a region of interest whose brightness is equal to or greater than a threshold value is calculated, and the amount of blur between consecutive frames is calculated as a color shift amount.
  • A weighted average is then computed from the contrast value and the blur amount, and the in-vivo image to be frozen is extracted based on the calculated weighted average.
  • The present invention has been made in view of the above, and an object thereof is to provide an image evaluation apparatus, an endoscope system, and a method and program for operating the image evaluation apparatus that can accurately evaluate the focus and blur of an image and extract the image to be frozen.
  • To solve the problem described above, an image evaluation apparatus according to the present invention acquires a signal containing an image of a subject and performs focus evaluation and blur evaluation of the image.
  • The apparatus includes: a region setting unit that sets, in the image, an evaluation target region for the focus evaluation and the blur evaluation; a focus evaluation value calculation unit that calculates a focus evaluation value representing the degree of focus of the evaluation target region set by the region setting unit; a blur evaluation value calculation unit that calculates a blur evaluation value representing the amount of blur of the subject image in the evaluation target region; and a comprehensive evaluation value calculation unit that calculates a comprehensive evaluation value based on the focus evaluation value calculated by the focus evaluation value calculation unit and the blur evaluation value calculated by the blur evaluation value calculation unit.
  • The image evaluation apparatus according to the above invention is characterized in that the evaluation target region is composed of a plurality of small regions.
  • In the above invention, the focus evaluation value calculation unit calculates a focus evaluation value for each of the small regions and outputs, to the comprehensive evaluation value calculation unit, a representative focus evaluation value based on the plurality of calculated focus evaluation values; likewise, the blur evaluation value calculation unit calculates a blur evaluation value for each of the small regions and outputs a representative blur evaluation value, based on the plurality of calculated blur evaluation values, to the comprehensive evaluation value calculation unit.
  • In the above invention, the focus evaluation value calculation unit uses the maximum value, the average value, or the mode of the focus evaluation values of the plurality of small regions as the representative focus evaluation value, and the blur evaluation value calculation unit uses the maximum value, the average value, or the mode of the blur evaluation values of the plurality of small regions as the representative blur evaluation value.
  • In the above invention, the region setting unit determines whether to exclude each small region from the evaluation target region based on the luminance value of that small region.
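The per-small-region evaluation and luminance-based exclusion described above can be sketched as follows. This is an illustrative Python sketch, not the patented implementation: the region records, the `luminance`/`focus` field names, and the luminance thresholds are all assumptions for demonstration.

```python
import statistics

def select_regions(regions, lum_min=20, lum_max=235):
    """Exclude small regions whose mean luminance is outside a usable range."""
    return [r for r in regions if lum_min <= r["luminance"] <= lum_max]

def representative(values, how="average"):
    """Reduce per-region evaluation values to one representative value
    (maximum, average, or mode, as the patent describes)."""
    if how == "max":
        return max(values)
    if how == "mode":
        return statistics.mode(values)
    return sum(values) / len(values)  # average

regions = [
    {"luminance": 128, "focus": 0.9},
    {"luminance": 250, "focus": 0.2},  # blown-out region: excluded
    {"luminance": 100, "focus": 0.7},
]
usable = select_regions(regions)
rep = representative([r["focus"] for r in usable], how="max")
```

The representative value (here the maximum over the remaining regions) is what would be forwarded to the comprehensive evaluation value calculation unit.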
  • In the above invention, the signal is an imaging signal acquired by an endoscope, and the region setting unit excludes from the evaluation target region any region in which foreign matter protruding from the insertion portion of the endoscope appears in the in-vivo image corresponding to the imaging signal.
  • In the above invention, the signal is acquired by irradiating the subject with light of different wavelength bands emitted in time series.
  • In the above invention, the focus evaluation value and the blur evaluation value are normalized values sharing the same maximum and minimum values.
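A minimal sketch of this shared normalization, assuming a simple min-max scheme over the buffered frames (the text only states that both evaluation values are normalized to the same maximum and minimum; the exact formula is not given here):

```python
def normalize(values, lo=0.0, hi=1.0):
    """Min-max normalize raw per-frame scores into a common [lo, hi] range."""
    vmin, vmax = min(values), max(values)
    if vmax == vmin:
        # Degenerate case: all frames score the same, so all map to hi.
        return [hi for _ in values]
    return [lo + (v - vmin) * (hi - lo) / (vmax - vmin) for v in values]

# Raw contrast values for three frames, mapped onto the common 0-to-1 scale.
contrast_raw = [120.0, 310.0, 95.0]
contrast_eval = normalize(contrast_raw)
```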
  • In the image evaluation apparatus according to the above invention, the comprehensive evaluation value calculation unit calculates the comprehensive evaluation value as a weighted sum of the focus evaluation value and the blur evaluation value.
  • In the above invention, the comprehensive evaluation value calculation unit can change the weights applied to the focus evaluation value and the blur evaluation value.
  • An endoscope system according to the present invention includes: an endoscope having an imaging unit that is inserted into a subject and captures in-vivo images of the subject; a storage unit that stores a plurality of the in-vivo images; a region setting unit that sets, in an in-vivo image, an evaluation target region for focus evaluation and blur evaluation; a focus evaluation value calculation unit that calculates a focus evaluation value representing the degree of focus of the region set by the region setting unit; a blur evaluation value calculation unit that calculates a blur evaluation value representing the amount of blur of the subject image in the region set by the region setting unit; a comprehensive evaluation value calculation unit that calculates a comprehensive evaluation value based on the focus evaluation value calculated by the focus evaluation value calculation unit and the blur evaluation value calculated by the blur evaluation value calculation unit; and an image selection unit that selects at least one in-vivo image from the plurality of in-vivo images stored in the storage unit based on the comprehensive evaluation value.
  • An operation method of an image evaluation apparatus according to the present invention acquires a signal containing an image of a subject and performs focus evaluation and blur evaluation of the image. The method includes: a region setting step in which a region setting unit sets, in the image, an evaluation target region for the focus evaluation and the blur evaluation; a focus evaluation value calculation step in which a focus evaluation value calculation unit calculates a focus evaluation value representing the degree of focus of the region set in the region setting step; a blur evaluation value calculation step in which a blur evaluation value calculation unit calculates a blur evaluation value representing the amount of blur of the subject image in the set region; and a comprehensive evaluation value calculation step in which a comprehensive evaluation value calculation unit calculates a comprehensive evaluation value based on the focus evaluation value calculated in the focus evaluation value calculation step and the blur evaluation value calculated in the blur evaluation value calculation step.
  • An operation program for an image evaluation apparatus according to the present invention causes the apparatus to acquire a signal containing an image of a subject and perform focus evaluation and blur evaluation of the image. The program includes: a region setting procedure in which a region setting unit sets, in the image, an evaluation target region for the focus evaluation and the blur evaluation; a focus evaluation value calculation procedure in which a focus evaluation value calculation unit calculates a focus evaluation value representing the degree of focus of the region set in the region setting procedure; a blur evaluation value calculation procedure in which a blur evaluation value calculation unit calculates a blur evaluation value representing the amount of blur of the subject image in the set region; and a comprehensive evaluation value calculation procedure in which a comprehensive evaluation value calculation unit calculates a comprehensive evaluation value based on the focus evaluation value calculated in the focus evaluation value calculation procedure and the blur evaluation value calculated in the blur evaluation value calculation procedure.
  • According to the present invention, it is possible to accurately evaluate the focus and blur of an image and extract the image to be frozen.
  • FIG. 1 is a diagram illustrating a schematic configuration of an endoscope system according to the first embodiment of the present invention.
  • FIG. 2 is a block diagram illustrating a schematic configuration of the endoscope system according to the first embodiment of the present invention.
  • FIG. 3 is a flowchart for explaining an evaluation value calculation process by the image processing unit of the endoscope system according to the first embodiment of the present invention.
  • FIG. 4 is a schematic diagram illustrating an evaluation value calculation process performed by the image processing unit of the endoscope system according to the second embodiment of the present invention.
  • FIG. 5 is a flowchart for explaining an evaluation value calculation process by the image processing unit of the endoscope system according to the second embodiment of the present invention.
  • FIG. 6 is a schematic diagram for explaining an evaluation value calculation process by the image processing unit of the endoscope system according to the third embodiment of the present invention.
  • FIG. 7 is a flowchart for explaining an evaluation value calculation process by the image processing unit of the endoscope system according to the third embodiment of the present invention.
  • FIG. 8 is a schematic diagram for explaining an evaluation value calculation process by the image processing unit of the endoscope system according to the fourth embodiment of the present invention.
  • FIG. 9 is a schematic diagram for explaining an evaluation value calculation process by the image processing unit of the endoscope system according to the fifth embodiment of the present invention.
  • FIG. 1 is a diagram illustrating a schematic configuration of an endoscope system according to the first embodiment of the present invention.
  • FIG. 2 is a block diagram illustrating a schematic configuration of the endoscope system according to the first embodiment.
  • The endoscope system 1 shown in FIGS. 1 and 2 includes: an endoscope 2 whose distal end portion is inserted into a body cavity of a subject to capture in-vivo images of the subject; a light source device 3 that generates the illumination light emitted from the distal end of the endoscope 2; a processing device 4 that performs predetermined signal processing on the imaging signal captured by the endoscope 2 and controls the overall operation of the endoscope system 1; and a display device 5 that displays the in-vivo images generated by the signal processing of the processing device 4.
  • The endoscope 2 includes: an insertion portion 21 having an elongated, flexible shape; an operation portion 22 that is connected to the proximal end side of the insertion portion 21 and receives input of various operation signals; and a universal cord 23 that extends from the operation portion 22 in a direction different from that of the insertion portion 21 and contains the various cables connecting to the light source device 3 and the processing device 4.
  • The insertion portion 21 includes: a distal end portion 24 (imaging unit) containing an imaging element 244 in which pixels that receive light and generate signals by photoelectric conversion are arranged two-dimensionally; a bendable bending portion 25 composed of a plurality of bending pieces; and a long, flexible tube portion 26 connected to the proximal end side of the bending portion 25.
  • The distal end portion 24 includes: a light guide 241, configured using glass fiber or the like, that forms a light guide path for the light emitted from the light source device 3; an illumination lens 242 provided at the distal end of the light guide 241; an optical system 243 that collects light; and an image sensor 244, provided at the image forming position of the optical system 243, that receives the light collected by the optical system 243, photoelectrically converts it into an electrical signal, and performs predetermined signal processing.
  • The optical system 243 is configured using one or more lenses and has an optical zoom function for changing the angle of view and a focus function for changing the focal point.
  • The image sensor 244 photoelectrically converts light from the optical system 243 to generate an electrical signal (imaging signal).
  • The image sensor 244 includes: a light receiving unit 244a, in which a plurality of pixels, each having a photodiode that accumulates charge according to the amount of light and a capacitor that converts the charge transferred from the photodiode into a voltage level, are arranged in a matrix, each pixel photoelectrically converting light from the optical system 243 into an electrical signal; and a reading unit 244b that sequentially reads out the electrical signals generated by the pixels set as reading targets among the plurality of pixels of the light receiving unit 244a and outputs them as an imaging signal.
  • the image sensor 244 controls various operations of the distal end portion 24 in accordance with the drive signal received from the processing device 4.
  • the image sensor 244 is realized using, for example, a CCD (Charge Coupled Device) image sensor or a CMOS (Complementary Metal Oxide Semiconductor) image sensor.
  • The operation portion 22 includes: a bending knob 221 that bends the bending portion 25 in the vertical and horizontal directions; a treatment instrument insertion portion 222 through which treatment instruments such as biopsy forceps, an electric knife, or an examination probe are inserted into the body cavity of the subject; and a plurality of switches 223, which are operation input units for inputting operation instruction signals for peripheral devices such as air supply means, water supply means, and screen display control.
  • The treatment tool inserted through the treatment tool insertion portion 222 emerges from an opening (not shown) of the distal end portion 24 via a treatment tool channel (not shown).
  • The universal cord 23 contains at least the light guide 241 and a collective cable 245 in which one or more signal lines are bundled.
  • The collective cable 245 includes signal lines for transmitting imaging signals, signal lines for transmitting the drive signals that drive the imaging element 244, and signal lines for transmitting and receiving unique information regarding the endoscope 2 (imaging element 244) and the like.
  • the light source device 3 includes an illumination unit 31 and an illumination control unit 32.
  • The illumination unit 31 sequentially switches between a plurality of illumination lights of different wavelength bands and emits them onto the subject under the control of the illumination control unit 32.
  • the illumination unit 31 includes a light source unit 31a, a light source driver 31b, a rotary filter 31c, a drive unit 31d, and a drive driver 31e.
  • the light source unit 31a is configured using a white LED and one or a plurality of lenses, and emits white light to the rotary filter 31c under the control of the light source driver 31b.
  • the white light generated by the light source unit 31a is emitted from the tip of the tip part 24 toward the subject via the rotary filter 31c and the light guide 241.
  • The light source driver 31b causes the light source unit 31a to emit white light by supplying a current to it under the control of the illumination control unit 32.
  • the rotary filter 31c is disposed on the optical path of white light emitted from the light source unit 31a and rotates to transmit only light in a predetermined wavelength band among the white light emitted from the light source unit 31a.
  • the rotary filter 31c includes a red filter 311, a green filter 312 and a blue filter 313 that transmit light having respective wavelength bands of red light (R), green light (G), and blue light (B).
  • the rotary filter 31c sequentially transmits light in the red, green, and blue wavelength bands (for example, red: 600 nm to 700 nm, green: 500 nm to 600 nm, blue: 400 nm to 500 nm) by rotating.
  • By rotating, the rotary filter 31c thus converts the white light (W illumination) emitted from the light source unit 31a into narrow-band red light (R illumination), green light (G illumination), or blue light (B illumination), which are emitted sequentially (frame sequential method).
  • the drive unit 31d is configured using a stepping motor, a DC motor, or the like, and rotates the rotary filter 31c.
  • the drive driver 31e supplies a predetermined current to the drive unit 31d under the control of the illumination control unit 32.
  • the illumination control unit 32 controls the amount of current supplied to the light source unit 31 a based on a control signal from the control unit 45 of the processing device 4. In addition, the illumination control unit 32 rotates the rotary filter 31c by driving the drive unit 31d via the light source driver 31b under the control of the control unit 45.
  • the light source unit 31a may be composed of a red LED, a green LED, and a blue LED, and the light source driver 31b may sequentially emit red light, green light, or blue light by supplying current to each LED.
  • Alternatively, a white LED, a red LED, a green LED, and a blue LED may emit light simultaneously, or the subject may be irradiated with white light from a laser or from a discharge lamp such as a xenon lamp to acquire images.
  • the processing device 4 includes an A / D conversion unit 41, an image processing unit 42, an input unit 43, a recording unit 44, and a control unit 45.
  • the A / D conversion unit 41 performs A / D conversion on the imaging signal that is an analog signal output from the imaging element 244 and outputs the imaging signal converted into a digital signal to the image processing unit 42.
  • the image processing unit 42 generates an in-vivo image displayed by the display device 5 based on the imaging signal input from the A / D conversion unit 41.
  • The image processing unit 42 includes a preprocessing unit 421, a synchronization unit 422, a region setting unit 423, a color misregistration evaluation value calculation unit 424 (blur evaluation value calculation unit), a contrast evaluation value calculation unit 425 (focus evaluation value calculation unit), a comprehensive evaluation value calculation unit 426, a freeze image selection unit 427, a post-processing unit 428, and a buffer 429 (storage unit).
  • the image evaluation apparatus of the present invention includes at least an area setting unit 423, a color misregistration evaluation value calculation unit 424, a contrast evaluation value calculation unit 425, a comprehensive evaluation value calculation unit 426, and a buffer 429.
  • the pre-processing unit 421 performs OB (Optical Black) clamp processing, gain correction processing, and white balance correction processing on the imaging signal input from the A / D conversion unit 41. After performing the signal processing described above, the preprocessing unit 421 outputs the processed imaging signal to the synchronization unit 422.
  • The synchronization unit 422 synchronizes the frame-sequential imaging signals input from the preprocessing unit 421. Specifically, it accumulates the R-component, G-component, and B-component imaging signals generated by the frame sequential method one frame at a time, and synchronizes the accumulated imaging signals of the respective color components as the image signal of one frame.
  • the synchronization unit 422 outputs the synchronized image signal to the region setting unit 423 and the buffer 429.
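The frame-sequential synchronization performed by the synchronization unit 422 can be sketched as follows. The `Synchronizer` class and its dict-of-fields representation are illustrative assumptions, not the device's actual data format: the point is only that R, G, and B fields arrive in sequence and are accumulated into one synchronized frame.

```python
class Synchronizer:
    """Accumulate R, G, B color fields into one synchronized RGB frame."""

    ORDER = ("R", "G", "B")

    def __init__(self):
        self._fields = {}

    def push(self, component, data):
        """Store one color field; return a full frame once all three arrive."""
        self._fields[component] = data
        if all(c in self._fields for c in self.ORDER):
            frame = dict(self._fields)
            self._fields.clear()  # ready to accumulate the next frame
            return frame
        return None

sync = Synchronizer()
partial = sync.push("R", [1])  # frame not yet complete
sync.push("G", [2])
frame = sync.push("B", [3])    # full RGB frame returned
```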
  • the region setting unit 423 sets an evaluation value calculation region for calculating a color misregistration evaluation value and a contrast evaluation value for the image signal output from the synchronization unit 422.
  • the region setting unit 423 outputs the set evaluation value calculation region to the color misregistration evaluation value calculation unit 424 and the contrast evaluation value calculation unit 425.
  • The color misregistration evaluation value calculation unit 424 obtains, in the set evaluation value calculation region, the correlation between the signals of the respective color components based on a detection signal containing the comparison results of the difference signals of the color components, or the amount of color misregistration (the difference between the color signals), normalizes it, and outputs the normalized value as the color misregistration evaluation value to the comprehensive evaluation value calculation unit 426.
  • The amount of color misregistration described above can be obtained using a known technique such as that of JP-A-2001-169300.
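As an illustration only: in frame-sequential imaging, motion between the R, G, and B exposures shows up as color shift, so a crude proxy for the color shift amount can compare the color-component signals directly. The actual detection uses known techniques such as JP-A-2001-169300; the function below is a hypothetical stand-in, not that method.

```python
def color_shift_amount(r, g, b):
    """Mean absolute difference between color-component signals.

    A large inter-component difference suggests the subject moved between
    the sequential R, G and B exposures, i.e. a large blur.
    """
    n = len(r)
    return sum(abs(r[i] - g[i]) + abs(g[i] - b[i]) for i in range(n)) / n

static = color_shift_amount([10, 10], [10, 10], [10, 10])  # no motion
moving = color_shift_amount([10, 50], [30, 10], [50, 30])  # motion between fields
```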
  • The contrast evaluation value calculation unit 425 calculates a contrast value in the set evaluation value calculation region, normalizes the calculated contrast value, and outputs it to the comprehensive evaluation value calculation unit 426 as a contrast evaluation value (focus evaluation value).
  • The contrast evaluation value calculation unit 425 calculates, for example, the contrast value of the evaluation value calculation region in the G-component imaging signal, normalizes the calculated contrast value, and obtains the contrast evaluation value.
  • For example, the contrast evaluation value calculation unit 425 performs high-pass filter processing on the pixel values generated by the pixels in the evaluation value calculation region and calculates the contrast value by summing the high-pass filter outputs.
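The "apply a high-pass filter and sum the outputs" computation can be sketched in one dimension as follows. The `[-1, 2, -1]` Laplacian kernel is an assumed example; the text does not specify which high-pass filter the device uses.

```python
def contrast_value(pixels):
    """Sum of absolute high-pass responses over a row of pixel values.

    Sharp edges produce large filter responses, so an in-focus region
    yields a larger total than a defocused one.
    """
    total = 0.0
    for i in range(1, len(pixels) - 1):
        # 1-D Laplacian high-pass: -p[i-1] + 2*p[i] - p[i+1]
        total += abs(-pixels[i - 1] + 2 * pixels[i] - pixels[i + 1])
    return total

sharp = contrast_value([0, 100, 0, 100, 0])    # strong edges
blurry = contrast_value([40, 50, 60, 50, 40])  # gentle gradient
```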
  • The color misregistration evaluation value calculation unit 424 and the contrast evaluation value calculation unit 425 output values (the color misregistration evaluation value and the contrast evaluation value) normalized so that their maximum and minimum values coincide. Specifically, the color misregistration evaluation value calculation unit 424 assigns each frame a normalized value from 0 to 1: a value close to 1 if the color misregistration in the frame is small, and close to 0 if it is large. Similarly, the contrast evaluation value calculation unit 425 assigns each frame a normalized value from 0 to 1: close to 1 if the contrast value of the frame is large, and close to 0 if it is small.
  • The comprehensive evaluation value calculation unit 426 calculates a comprehensive evaluation value based on the color misregistration evaluation value generated by the color misregistration evaluation value calculation unit 424 and the contrast evaluation value generated by the contrast evaluation value calculation unit 425.
  • The calculated comprehensive evaluation value is output to the freeze image selection unit 427.
  • The comprehensive evaluation value calculation unit 426 calculates the comprehensive evaluation value by weighting the color misregistration evaluation value and the contrast evaluation value.
  • The combination of the weights c1 and c2 can be set arbitrarily; the operator can set c1 and c2 via the input unit 43 according to the priority given to the color shift and contrast evaluations.
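A sketch of the weighted comprehensive evaluation, assuming equation (1) is a plain weighted sum with operator-settable weights c1 and c2 (the exact form of equation (1) is not reproduced in this text, so the formula below is an assumption consistent with the description):

```python
def overall_evaluation(contrast_eval, color_shift_eval, c1=0.5, c2=0.5):
    """Weighted sum of the two normalized (0-to-1) evaluation values.

    c1 weights the contrast (focus) evaluation value and c2 the color
    misregistration (blur) evaluation value; both names are assumed.
    """
    return c1 * contrast_eval + c2 * color_shift_eval

balanced = overall_evaluation(0.8, 0.6)                 # equal weights
focus_first = overall_evaluation(0.8, 0.6, c1=0.9, c2=0.1)  # prioritize focus
```

Raising c1 relative to c2 makes the selection favor sharply focused frames over frames with little color shift, mirroring the operator-settable priority described above.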
  • The freeze image selection unit 427 outputs the image signal output from the comprehensive evaluation value calculation unit 426 to the post-processing unit 428 and, when an image freeze instruction is input, selects the image signal to be frozen based on the comprehensive evaluation value.
  • the freeze image selection unit 427 selects an image signal having the highest overall evaluation value as a freeze target image and acquires it from the buffer 429.
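The selection of the freeze target image reduces to taking the frame with the highest comprehensive evaluation value among the buffered frames. The `(frame_id, score)` pair representation below is an assumption for illustration, not the buffer's actual record format.

```python
def select_freeze_frame(frames):
    """Pick the buffered frame with the highest comprehensive evaluation value."""
    return max(frames, key=lambda f: f[1])

# Frames held in the buffer as (frame_id, comprehensive_evaluation_value).
buffered = [(0, 0.55), (1, 0.91), (2, 0.73)]
best_id, best_score = select_freeze_frame(buffered)
```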
  • the freeze image selection unit 427 outputs the acquired freeze target image (image signal) to the post-processing unit 428.
  • The post-processing unit 428 performs tone conversion processing, color conversion processing, and contour enhancement processing on the image signal for live display (the synchronized image signal) acquired from the buffer 429, or on the image signal to be frozen output from the freeze image selection unit 427, and outputs the processed image signal to the display device 5.
  • The buffer 429 is realized using, for example, a ring buffer, and stores a fixed amount (a predetermined number of frames) of the imaging signals generated by the synchronization unit 422. When the buffer is full (the predetermined number of imaging signals is stored), it overwrites the oldest imaging signal with the latest one, thereby holding the latest predetermined number of frames in time series.
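The overwrite-oldest behavior of buffer 429 is exactly that of a bounded ring buffer. A minimal sketch using Python's `collections.deque`, whose `maxlen` option drops the oldest element on overflow (the `FrameBuffer` wrapper and capacity are illustrative):

```python
from collections import deque

class FrameBuffer:
    """Ring buffer holding only the latest `capacity` frames."""

    def __init__(self, capacity):
        self._frames = deque(maxlen=capacity)

    def store(self, frame):
        self._frames.append(frame)  # oldest frame is dropped when full

    def frames(self):
        return list(self._frames)

buf = FrameBuffer(capacity=3)
for frame_id in range(5):
    buf.store(frame_id)  # frames 0 and 1 are overwritten
```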
  • the input unit 43 receives input of various signals such as an operation instruction signal for instructing the operation of the endoscope system 1.
  • the recording unit 44 is realized by using a semiconductor memory such as a flash memory or a DRAM (Dynamic Random Access Memory).
  • the recording unit 44 records various programs for operating the endoscope system 1 and data including various parameters necessary for the operation of the endoscope system 1.
  • the recording unit 44 records identification information of the processing device 4.
  • the identification information includes unique information (ID) of the processing device 4, model year, specification information, and the like.
  • the control unit 45 is configured using a CPU or the like, and performs drive control of each component including the image sensor 244 and the light source device 3, input / output control of information with respect to each component, and the like.
  • The control unit 45 refers to control information data (for example, readout timing) for imaging control recorded in the recording unit 44 and transmits it to the image sensor 244 via a predetermined signal line included in the collective cable 245.
  • the display device 5 receives and displays the in-vivo image corresponding to the video signal generated by the processing device 4 via the video cable.
  • the display device 5 is configured using a monitor such as a liquid crystal or an organic EL (Electro Luminescence).
  • FIG. 3 is a flowchart for explaining an evaluation value calculation process by the image processing unit of the endoscope system according to the first embodiment.
  • When an imaging signal is input from the A/D conversion unit 41, the image processing unit 42 generates the in-vivo image to be displayed by the display device 5 based on the input imaging signal, and also calculates a comprehensive evaluation value based on the color shift and contrast evaluation values.
  • The following describes the process of selecting a freeze image and generating an image signal for freeze display when a freeze display instruction is issued.
  • The preprocessing unit 421 performs the signal processing described above (OB clamp processing and the like) as preprocessing on the imaging signal input from the A/D conversion unit 41 (step S101). After the signal processing, the preprocessing unit 421 outputs the processed image signal to the synchronization unit 422.
  • the synchronization unit 422 synchronizes the frame-sequential imaging signal with the imaging signal input from the preprocessing unit 421 (step S102).
  • the synchronization unit 422 outputs the synchronized image signal to the region setting unit 423 and the buffer 429.
  • the region setting unit 423 sets an evaluation value calculation region for calculating a color misregistration evaluation value and a contrast evaluation value for the image signal output from the synchronization unit 422 (step S103: region setting step).
  • an effective pixel region excluding the OB region is set as the evaluation value calculation region, and information regarding the set evaluation value calculation region is output to the color misregistration evaluation value calculation unit 424 and the contrast evaluation value calculation unit 425.
  • the evaluation value calculation area can be set according to the observation target of the operator such as a region of interest (ROI) in addition to the effective pixel area.
  • the color misregistration evaluation value calculation unit 424 detects a color misregistration between frames (images) in the set evaluation value calculation area, calculates a color misregistration evaluation value based on the detected color misregistration, and assigns a normalized value of 0 to 1 as the color misregistration evaluation value (step S104: blur evaluation value calculation step). The color misregistration evaluation value calculation unit 424 outputs the calculated color misregistration evaluation value to the comprehensive evaluation value calculation unit 426.
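The patent text does not specify how the inter-frame color misregistration is measured. Purely as an illustrative sketch (the function names, brute-force search, range, and normalization are assumptions, not the patent's method), the displacement between the color channels of a frame-sequential image can be estimated and mapped onto a 0-to-1 evaluation value:

```python
import numpy as np

def channel_shift(ref, tgt, max_shift=4):
    """Estimate the (dy, dx) displacement of tgt relative to ref by a
    brute-force search minimizing the mean absolute difference."""
    best, best_err = (0, 0), np.inf
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            shifted = np.roll(np.roll(tgt, dy, axis=0), dx, axis=1)
            err = np.abs(ref.astype(float) - shifted.astype(float)).mean()
            if err < best_err:
                best_err, best = err, (dy, dx)
    return best

def color_shift_evaluation(r, g, b, max_shift=4):
    """0-1 evaluation value: 1 means the R and B channels align with G
    (no color misregistration); the value falls toward 0 as the worst
    inter-channel displacement approaches the search limit."""
    shifts = [channel_shift(g, r, max_shift), channel_shift(g, b, max_shift)]
    worst = max(np.hypot(dy, dx) for dy, dx in shifts)
    limit = np.hypot(max_shift, max_shift)
    return float(max(0.0, 1.0 - worst / limit))
```

A shift of zero between channels yields an evaluation value of 1, matching the convention above that larger values indicate less blur.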
  • the contrast evaluation value calculation unit 425 calculates a contrast value in the set evaluation value calculation region, normalizes the calculated contrast value, and outputs it as a contrast evaluation value to the comprehensive evaluation value calculation unit 426 (step S105: Focus evaluation value calculation step). Note that the color misregistration evaluation value calculation process and the contrast evaluation value calculation process may be performed in reverse order or simultaneously.
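The contrast value computation is likewise not detailed in this excerpt. A common stand-in (an assumption for illustration, not the patent's definition) is the variance of a Laplacian response, squashed into a 0-to-1 range:

```python
import numpy as np

def contrast_evaluation(img, scale=1000.0):
    """Contrast evaluation value in [0, 1): variance of a 4-neighbour
    Laplacian response, normalized by the squashing map v / (v + scale)."""
    img = img.astype(float)
    lap = (-4.0 * img
           + np.roll(img, 1, axis=0) + np.roll(img, -1, axis=0)
           + np.roll(img, 1, axis=1) + np.roll(img, -1, axis=1))
    v = lap.var()
    return float(v / (v + scale))
```

A sharp (in-focus) image produces a larger Laplacian variance, and therefore a larger evaluation value, than a blurred one; the `scale` constant only controls where the squashing map saturates.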
  • the comprehensive evaluation value calculation unit 426 calculates a comprehensive evaluation value using the above-described equation (1), and outputs the calculated comprehensive evaluation value to the freeze image selection unit 427 (step S106: comprehensive evaluation value calculation step).
  • the freeze image selection unit 427 selects an image signal to be frozen based on the comprehensive evaluation value (step S107).
  • the freeze image selection unit 427 selects an image signal including a frame having the highest overall evaluation value from among the freeze candidate image signals stored in the buffer 429.
  • the freeze image selection unit 427 selects, for example, the image signal having the highest average of the comprehensive evaluation values of the three RGB components.
  • the freeze image selection unit 427 outputs the acquired freeze target image (image signal) to the post-processing unit 428.
  • the post-processing unit 428 performs the above-described signal processing (gradation conversion processing or the like) on the image signal to be frozen output from the freeze image selection unit 427, thereby generating a freeze image for display (step S108).
  • the post-processing unit 428 outputs the processed image signal to the display device 5 as an image signal for freeze display.
  • the color misregistration evaluation value calculation unit 424 and the contrast evaluation value calculation unit 425 calculate the color misregistration evaluation value and the contrast evaluation value, respectively, within the evaluation value calculation area set by the area setting unit 423.
  • the comprehensive evaluation value calculation unit 426 calculates the comprehensive evaluation value using Expression (1), and selects the image signal to be frozen based on the calculated comprehensive evaluation value. Therefore, it is possible to accurately evaluate the focus and blur of the image and extract the image to be frozen.
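Equation (1) itself is not reproduced in this excerpt. Purely as a hypothetical sketch, assuming the comprehensive evaluation value is a weighted combination of the two normalized evaluation values (the weight `w` and the function names are illustrative), the selection performed by the freeze image selection unit 427 could look like:

```python
def comprehensive_evaluation(focus_val, blur_val, w=0.5):
    """Hypothetical stand-in for equation (1): a weighted sum of the
    normalized contrast (focus) and color-misregistration (blur)
    evaluation values. The actual equation (1) is not shown here."""
    return w * focus_val + (1.0 - w) * blur_val

def select_freeze_frame(candidates):
    """Return the index of the buffered candidate (focus_val, blur_val)
    pair with the highest comprehensive evaluation value, mirroring the
    role of the freeze image selection unit 427."""
    return max(range(len(candidates)),
               key=lambda i: comprehensive_evaluation(*candidates[i]))
```

For example, given candidates `[(0.9, 0.2), (0.6, 0.7), (0.8, 0.9)]`, the third frame is selected because it scores well on both criteria at once, which is the point of combining the two values rather than using either alone.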
  • the area setting unit 423 has been described as setting the effective pixel area excluding the OB area as the evaluation value calculation area, but may be set to a preset area. Alternatively, a predetermined size centered on a pixel region having a high contrast value may be set as the evaluation value calculation region.
  • FIG. 4 is a schematic diagram for explaining an evaluation value calculation process by the image processing unit of the endoscope system according to the second embodiment.
  • in the second embodiment, the same components as those in the first embodiment are denoted by the same reference symbols.
  • in the first embodiment, the area setting unit 423 sets the effective pixel area excluding the OB area as the evaluation value calculation area; in the second embodiment, the evaluation value calculation area is divided into a plurality of small areas.
  • the display device 5 displays the in-vivo image generated by the post-processing unit 428 in the display area 100 as shown in FIG.
  • the area setting unit 423 sets the divided small areas 101 to 109, which are nine (3 ⁇ 3) areas, as evaluation value calculation areas for the displayed in-vivo image.
  • the small areas 101 to 109 are areas having a predetermined size (number of pixels) set in advance, and are recorded in the recording unit 44.
  • the area setting unit 423 reads out information related to the small area with reference to the recording unit 44 and sets the small area.
  • the area setting unit 423 selects an evaluation value calculation target small area for the set small areas 101 to 109.
  • the region setting unit 423 determines, for each pixel included in the small regions 101 to 109, whether the pixel value (luminance value) is equal to or less than the luminance value corresponding to halation (for example, the halation Hp shown in FIG. 4) and equal to or greater than the luminance value corresponding to blackout, that is, whether the pixel value generated by the determination target pixel exhibits neither halation nor blackout.
  • specifically, when the luminance value of the determination target pixel is P, the threshold of luminance values for halation is T_H, and the threshold of luminance values for blackout is T_B, the region setting unit 423 determines that the pixel exhibits neither halation nor blackout if T_B ≦ P ≦ T_H, that halation occurs at the pixel if T_H < P, and that blackout occurs at the pixel if P < T_B.
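The three-way classification above translates directly into code; the threshold values below are illustrative placeholders, not values from the patent:

```python
import numpy as np

def classify_pixels(lum, t_b=16, t_h=235):
    """Label each pixel of a luminance image:
    0 = usable (T_B <= P <= T_H), 1 = halation (T_H < P),
    -1 = blackout (P < T_B)."""
    labels = np.zeros(lum.shape, dtype=int)
    labels[lum > t_h] = 1
    labels[lum < t_b] = -1
    return labels
```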
  • the region setting unit 423 extracts the pixels in which halation occurs and the pixels in which blackout occurs, and calculates, for each of the set small regions 101 to 109, the halation area (number of pixels) and the blackout area (number of pixels).
  • the region setting unit 423 compares the calculated halation area with a threshold value related to the halation area, and compares the calculated black area with a threshold value related to the black area.
  • the region setting unit 423 excludes from the evaluation value calculation target any small region whose halation area or blackout area exceeds the corresponding threshold value.
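The selection described above, counting the halation and blackout areas per small region and dropping regions that exceed either threshold, can be sketched as follows (the region layout and all threshold values are assumed for illustration):

```python
import numpy as np

def select_regions(lum, regions, t_b=16, t_h=235,
                   max_halation=100, max_blackout=100):
    """Return the indices of small regions kept as evaluation targets.
    Each region is (y0, y1, x0, x1); a region is dropped when its
    halation area or blackout area (in pixels) exceeds its threshold."""
    kept = []
    for i, (y0, y1, x0, x1) in enumerate(regions):
        block = lum[y0:y1, x0:x1]
        halation_area = int((block > t_h).sum())
        blackout_area = int((block < t_b).sum())
        if halation_area <= max_halation and blackout_area <= max_blackout:
            kept.append(i)
    return kept
```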
  • the region setting unit 423 selects the small region as the evaluation value calculation target in this way, and outputs the selected evaluation value calculation target region to the color misregistration evaluation value calculation unit 424 and the contrast evaluation value calculation unit 425.
  • the color misregistration evaluation value calculation unit 424 and the contrast evaluation value calculation unit 425 calculate the color misregistration evaluation value and the contrast evaluation value in the selected small region, respectively.
  • in FIG. 4, the small areas selected for evaluation are indicated by solid lines, and the excluded small areas are indicated by broken lines.
  • FIG. 5 is a flowchart for explaining an evaluation value calculation process by the image processing unit of the endoscope system according to the second embodiment.
  • a process for selecting a freeze image and generating an image signal for freeze display when there is a freeze image display instruction will be described.
  • as in steps S101 and S102 described above, the preprocessing unit 421 performs signal processing (OB clamp processing or the like) (step S201), and the synchronization unit 422 performs the synchronization process (step S202).
  • the region setting unit 423 sets an evaluation value calculation region for calculating a color misregistration evaluation value and a contrast evaluation value for the image signal output from the synchronization unit 422 (step S203).
  • here, with reference to the recording unit 44, for example, nine (3 × 3) small areas 101 to 109 shown in FIG. 4 are set as evaluation value calculation areas.
  • the region setting unit 423 calculates the halation area and the blackout area as described above for each of the small regions 101 to 109 (step S204), and selects the small regions for evaluation value calculation by referring to the threshold values for the halation area and the blackout area (step S205).
  • the color misregistration evaluation value calculation unit 424 detects a color misregistration between frames (images) in the selected small regions, calculates a color misregistration evaluation value based on the detected color misregistration, and assigns a normalized value of 0 to 1 as the color misregistration evaluation value (step S206).
  • the color misregistration evaluation value calculation unit 424 outputs to the comprehensive evaluation value calculation unit 426, as a representative color misregistration evaluation value, the average value of the color misregistration evaluation values calculated for the respective small regions, or the maximum value or mode value among them.
  • the contrast evaluation value calculation unit 425 calculates a contrast value in each selected small region, normalizes the calculated contrast values, and outputs a representative contrast evaluation value to the comprehensive evaluation value calculation unit 426 (step S207).
  • the contrast evaluation value calculation unit 425 outputs to the comprehensive evaluation value calculation unit 426, as a representative contrast evaluation value, the average value of the contrast evaluation values calculated for the respective small regions, or the maximum value or mode value among them.
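The collapse of per-region evaluation values into a single representative value (average, maximum, or mode, as stated above) is straightforward:

```python
import statistics

def representative_value(values, method="mean"):
    """Collapse per-region evaluation values into one representative
    value; the text allows the average, maximum, or mode."""
    if method == "mean":
        return sum(values) / len(values)
    if method == "max":
        return max(values)
    if method == "mode":
        return statistics.mode(values)
    raise ValueError("unknown method: " + method)
```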
  • the comprehensive evaluation value calculation unit 426 calculates a comprehensive evaluation value using the above-described equation (1) based on the representative color misregistration evaluation value and the representative contrast evaluation value, and outputs the calculated comprehensive evaluation value, together with the image signal output from the color misregistration evaluation value calculation unit 424, to the freeze image selection unit 427 (step S208).
  • the freeze image selection unit 427 selects an image signal to be frozen based on the comprehensive evaluation value (step S209). Thereafter, the post-processing unit 428 performs the above-described signal processing (gradation conversion processing or the like) on the freeze target image signal output from the freeze image selection unit 427 (step S210). The post-processing unit 428 outputs the processed image signal to the display device 5 as an image signal for freeze display.
  • the comprehensive evaluation value calculation unit 426 calculates the comprehensive evaluation value using equation (1), and the image signal to be frozen is selected based on the calculated comprehensive evaluation value, so that the focus and blur of the image are accurately evaluated and the image to be frozen can be extracted.
  • according to the second embodiment, the evaluation target area is divided into a plurality of small areas, the color misregistration evaluation value and the contrast evaluation value are calculated for each small area, the comprehensive evaluation value calculation unit 426 calculates the comprehensive evaluation value, and the image signal to be frozen is selected based on the calculated comprehensive evaluation value, so that the image to be frozen can be extracted through more detailed image evaluation.
  • furthermore, the small areas to be evaluated are selected from the plurality of set small areas based on the halation area and the blackout area, and the color misregistration evaluation value and the contrast evaluation value are calculated for each selected small area; the comprehensive evaluation value calculation unit 426 then calculates the comprehensive evaluation value, and the image signal to be frozen is selected based on the calculated comprehensive evaluation value, so that the image to be frozen can be extracted through more accurate image evaluation.
  • the small areas 101 to 109 may be displayed on the display device 5.
  • the operator can confirm which part of the in-vivo image is being evaluated.
  • FIG. 6 is a schematic diagram for explaining an evaluation value calculation process by the image processing unit of the endoscope system according to the third embodiment.
  • in the third embodiment, the same components as those described above are denoted by the same reference symbols.
  • in the third embodiment, when a treatment tool such as biopsy forceps extends from the distal end portion 24 and the treatment tool appears in the in-vivo image, small regions are excluded from the evaluation value calculation target according to the arrangement of the treatment tool.
  • the region setting unit 423 excludes the small regions 104, 107, and 108 located on the treatment instrument 110 from the evaluation value calculation target.
  • specifically, after setting the small regions 101 to 109, the region setting unit 423 refers to the luminance value of each pixel in the image signal output from the synchronization unit 422, and detects the reflection of the treatment instrument 110 when the luminance value exceeds a threshold value at the preset insertion position of the treatment instrument 110.
  • when the reflection is detected, the area setting unit 423 performs a setting for excluding the corresponding small areas (the small areas 104, 107, and 108 in FIG. 6) from the evaluation value calculation target.
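A minimal sketch of this exclusion, assuming the preset insertion-position regions and the luminance threshold are known configuration values (both are illustrative here, not values from the patent):

```python
import numpy as np

def exclude_tool_regions(lum, regions, tool_region_ids, t_tool=200.0):
    """Keep small regions as evaluation targets, dropping any preset
    region on the treatment-tool insertion path whose mean luminance
    exceeds t_tool (a bright reflection suggests the tool is present).
    The threshold and region layout are illustrative assumptions."""
    excluded = set()
    for i in tool_region_ids:
        y0, y1, x0, x1 = regions[i]
        if lum[y0:y1, x0:x1].mean() > t_tool:
            excluded.add(i)
    return [i for i in range(len(regions)) if i not in excluded]
```

When no bright reflection is present in the preset regions, all small regions remain evaluation targets, matching the branch in which no treatment tool is detected.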
  • FIG. 7 is a flowchart for explaining evaluation value calculation processing by the image processing unit of the endoscope system according to the third embodiment.
  • a process for selecting a freeze image and generating an image signal for freeze display when there is a freeze image display instruction will be described.
  • as in steps S201 and S202 described above, the preprocessing unit 421 performs signal processing (OB clamp processing or the like) (step S301), and the synchronization unit 422 performs the synchronization process (step S302).
  • the region setting unit 423 sets an evaluation value calculation region for calculating a color misregistration evaluation value and a contrast evaluation value for the image signal output from the synchronization unit 422 (step S303).
  • nine (3 ⁇ 3) small areas 101 to 109 shown in FIG. 6 are set as evaluation value calculation areas.
  • after setting the small regions 101 to 109, the region setting unit 423 performs detection processing to determine whether or not a treatment tool is present in the in-vivo image (step S304). If no treatment tool is detected in the in-vivo image (step S304: No), the region setting unit 423 proceeds to step S306. On the other hand, when a treatment tool is detected in the in-vivo image (step S304: Yes), the region setting unit 423 proceeds to step S305 and performs the small region exclusion process.
  • in step S305, the region setting unit 423 performs a setting to exclude from the evaluation value calculation target the small regions set in advance according to the insertion position of the treatment instrument 110 (the small regions 104, 107, and 108 in FIG. 6).
  • in step S306, the color misregistration evaluation value calculation unit 424 detects a color misregistration between frames (images) in the small regions remaining after the exclusion in step S305 (the small regions 101 to 103, 105, 106, and 109 in the third embodiment), calculates a color misregistration evaluation value based on the detected color misregistration, and assigns a normalized value of 0 to 1 as the color misregistration evaluation value. The color misregistration evaluation value calculation unit 424 outputs the calculated color misregistration evaluation value to the comprehensive evaluation value calculation unit 426.
  • the contrast evaluation value calculation unit 425 calculates the contrast value in the small regions remaining after the exclusion in step S305, normalizes the calculated contrast value, and outputs it to the comprehensive evaluation value calculation unit 426 (step S307).
  • the comprehensive evaluation value calculation unit 426 calculates a comprehensive evaluation value using the above-described equation (1) based on the color misregistration evaluation value and the contrast evaluation value, and outputs the calculated comprehensive evaluation value, together with the image signal output from the color misregistration evaluation value calculation unit 424, to the freeze image selection unit 427 (step S308).
  • note that the comprehensive evaluation value calculated by the comprehensive evaluation value calculation unit 426 may be the average of the comprehensive evaluation values of the respective small regions, or the largest evaluation value among them may be employed.
  • the freeze image selection unit 427 selects an image signal to be frozen based on the comprehensive evaluation value (step S309). Thereafter, the post-processing unit 428 performs the above-described signal processing (gradation conversion processing or the like) on the freeze target image signal output from the freeze image selection unit 427 (step S310). The post-processing unit 428 outputs the processed image signal to the display device 5 as an image signal for freeze display.
  • according to the third embodiment, the comprehensive evaluation value calculation unit 426 calculates the comprehensive evaluation value using equation (1) based on the color misregistration evaluation value calculated by the color misregistration evaluation value calculation unit 424 and the contrast evaluation value calculated by the contrast evaluation value calculation unit 425, and the image signal to be frozen is selected based on the calculated comprehensive evaluation value, so that the focus and blur of the image are accurately evaluated and the image to be frozen can be extracted.
  • in addition, since the area setting unit 423 detects a treatment tool after setting a plurality of small areas and excludes the small areas corresponding to the insertion position of the treatment tool from the evaluation target area, the image evaluation can be performed accurately on the in-vivo image regardless of the presence or absence of the treatment tool, and the image to be frozen can be extracted.
  • FIG. 8 is a schematic diagram for explaining an evaluation value calculation process by the image processing unit of the endoscope system according to the fourth embodiment.
  • in the fourth embodiment, the same components as those described above are denoted by the same reference symbols.
  • in the third embodiment described above, the presence or absence of the treatment tool in the in-vivo image is detected; in the fourth embodiment, the reflection in the in-vivo image of the liquid fed from the distal end portion 24 is detected.
  • the region setting unit 423 excludes the small regions 105 and 109 located on the liquid W from the evaluation value calculation target.
  • specifically, the region setting unit 423 refers to the luminance value of each pixel in the image signal output from the synchronization unit 422, and detects the reflection of the liquid W when the luminance value exceeds a threshold value at the preset water supply angle.
  • when the reflection is detected, the region setting unit 423 performs a setting for excluding the small regions related to the water supply angle (the small regions 105 and 109 in FIG. 8) from the evaluation value calculation target.
  • the evaluation value calculation process is the same as that shown in FIG. 7, except that the treatment instrument detection in step S304 is replaced with liquid detection.
  • according to the fourth embodiment, since the region setting unit 423 sets a plurality of small regions and then, when liquid is detected, excludes the small regions related to the water supply angle from the evaluation calculation target region, the image evaluation can be performed accurately on the in-vivo image regardless of the presence or absence of liquid, and the image to be frozen can be extracted.
  • when a liquid appears in the in-vivo image, the body surface may be difficult to recognize due to the liquid. In this case, since the in-vivo image becomes unclear, acceptance of the freeze instruction may be prohibited.
  • FIG. 9 is a schematic diagram illustrating an evaluation value calculation process performed by the image processing unit of the endoscope system according to the fifth embodiment.
  • in the fifth embodiment, the same components as those described above are denoted by the same reference symbols.
  • in the fifth embodiment, a hood is attached to the distal end portion 24, for example, when performing magnified observation; the image in the arrangement area of the hood becomes dark (a blurred image).
  • the region setting unit 423 excludes the small regions 101, 103, 107, and 109 located on the hood 120 from the evaluation value calculation target.
  • specifically, the region setting unit 423 detects the attachment of the hood (the reflection of the hood 120) by calculating the difference in luminance values (determining the contrast value) during calibration at startup of the processing device 4.
  • when the reflection of the hood is detected, the area setting unit 423 performs a setting for excluding the small areas related to the hood arrangement area (reflection area) (the small areas 101, 103, 107, and 109 in FIG. 9) from the evaluation value calculation target.
  • the evaluation value calculation process is the same as that shown in FIG. 7, except that the determination in step S304 is replaced with a determination of whether or not a hood is detected.
  • according to the fifth embodiment, since the small areas related to the hood arrangement area are excluded from the evaluation calculation target area when the area setting unit 423 detects the reflection of the hood, the in-vivo image can be accurately evaluated and the image to be frozen can be extracted regardless of the presence or absence of the hood reflection.
  • in the third to fifth embodiments, the calculation of the halation area and the blackout area and the evaluation target region selection processing according to the second embodiment may be combined. For example, more accurate image evaluation is possible by combining the small area exclusion processing based on treatment tool detection with the evaluation target region selection processing based on halation and blackout detection.
  • in the above description, the small area itself is excluded when a treatment tool or the like is detected; alternatively, the presence of the treatment tool within a small area may be determined based on the luminance value or the contrast value, and the evaluation value may be calculated excluding only the corresponding region.
  • the treatment instrument is described as being detected based on the luminance value, but the detection may instead be performed using pattern matching, a marker, a color difference, or the like.
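As one example of the pattern-matching alternative mentioned above, a template of a known tool appearance can be located by exhaustive sum-of-absolute-differences matching (a simple illustrative method; the patent does not prescribe a specific matching algorithm):

```python
import numpy as np

def find_template(img, tmpl):
    """Locate a template (e.g. a known treatment-tool tip pattern) in a
    luminance image by exhaustive sum-of-absolute-differences matching.
    Returns the top-left (y, x) of the best match."""
    ih, iw = img.shape
    th, tw = tmpl.shape
    best, best_err = (0, 0), np.inf
    for y in range(ih - th + 1):
        for x in range(iw - tw + 1):
            err = np.abs(img[y:y + th, x:x + tw].astype(float)
                         - tmpl.astype(float)).sum()
            if err < best_err:
                best_err, best = err, (y, x)
    return best
```

The small regions overlapping the matched location would then be excluded from the evaluation value calculation target, in the same way as with luminance-based detection.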
  • the plurality of small areas may be obtained by dividing the display area of the in-vivo image into a grid. Further, the division of the small area of the image is not limited to the division number in the above-described embodiment, and may be any division number.
  • in the above description, the treatment tool, liquid, and hood protruding from the insertion portion of the endoscope are detected as foreign matters derived from the endoscope 2, and the small regions are set accordingly.
  • the color misregistration evaluation value and the contrast evaluation value are described as standardized values; however, the color misregistration amount and the contrast value themselves may be used as the evaluation values.
  • the image to be frozen is described as being selected based on the comprehensive evaluation value; however, the image selected in this way is not limited to use for freeze display.
  • the light source device 3 is described as being separate from the processing device 4; however, the light source device 3 and the processing device 4 may be integrated, and, for example, the illumination unit 31 and the illumination control unit 32 may be provided inside the processing device 4.
  • as described above, the image evaluation apparatus, the endoscope system, the operation method of the image evaluation apparatus, and the operation program of the image evaluation apparatus according to the present invention are useful for accurately evaluating the focus and blur of an image and extracting the image to be frozen.

Landscapes

  • Life Sciences & Earth Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Surgery (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Optics & Photonics (AREA)
  • Radiology & Medical Imaging (AREA)
  • Veterinary Medicine (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Pathology (AREA)
  • Biophysics (AREA)
  • Public Health (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Astronomy & Astrophysics (AREA)
  • General Physics & Mathematics (AREA)
  • Endoscopes (AREA)
  • Studio Devices (AREA)
  • Instruments For Viewing The Inside Of Hollow Bodies (AREA)

Abstract

The invention relates to an image evaluation device that acquires a signal including an image of a subject and performs focus evaluation and blur evaluation of the image. This image evaluation device comprises: an area setting unit that sets, in the image, an evaluation target area in which the focus evaluation and the blur evaluation are to be performed; a focus evaluation value calculation unit that calculates a focus evaluation value representing the degree of focus of the evaluation target area set by the area setting unit; a blur evaluation value calculation unit that calculates a blur evaluation value representing the amount of blur of the subject in the area set by the area setting unit; and a comprehensive evaluation value calculation unit that calculates a comprehensive evaluation value based on the focus evaluation value calculated by the focus evaluation value calculation unit and the blur evaluation value calculated by the blur evaluation value calculation unit.
PCT/JP2015/083140 2014-12-02 2015-11-25 Dispositif d'évaluation d'image, système d'endoscope, procédé et programme de commande d'un dispositif d'évaluation d'image WO2016088628A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2016533674A JPWO2016088628A1 (ja) 2014-12-02 2015-11-25 画像評価装置、内視鏡システム、画像評価装置の作動方法および画像評価装置の作動プログラム

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2014-244119 2014-12-02
JP2014244119 2014-12-02

Publications (1)

Publication Number Publication Date
WO2016088628A1 true WO2016088628A1 (fr) 2016-06-09

Family

ID=56091574

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2015/083140 WO2016088628A1 (fr) 2014-12-02 2015-11-25 Dispositif d'évaluation d'image, système d'endoscope, procédé et programme de commande d'un dispositif d'évaluation d'image

Country Status (2)

Country Link
JP (1) JPWO2016088628A1 (fr)
WO (1) WO2016088628A1 (fr)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2018038675A (ja) * 2016-09-08 2018-03-15 富士フイルム株式会社 内視鏡用光源装置及びその作動方法並びに内視鏡システム
WO2018211709A1 (fr) * 2017-05-19 2018-11-22 オリンパス株式会社 Dispositif de correction de flou, dispositif d'endoscope et procédé de correction de flou
JP2019162280A (ja) * 2018-03-20 2019-09-26 ソニー株式会社 内視鏡システム、制御方法、情報処理装置、およびプログラム
JP2020073047A (ja) * 2020-02-04 2020-05-14 富士フイルム株式会社 内視鏡用光源装置及び内視鏡システム

Citations (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH04212591A (ja) * 1990-10-12 1992-08-04 Olympus Optical Co Ltd 画像フリーズ用信号処理装置
JP2004294788A (ja) * 2003-03-27 2004-10-21 Fuji Photo Optical Co Ltd オートフォーカス機能を備えた電子内視鏡装置
JP2006149483A (ja) * 2004-11-25 2006-06-15 Olympus Corp 電子内視鏡装置及び色ずれ補正装置
JP2010224112A (ja) * 2009-03-23 2010-10-07 Sony Corp 撮像装置、撮像装置の制御方法およびプログラム
JP2010220755A (ja) * 2009-03-23 2010-10-07 Fujifilm Corp 内視鏡用プロセッサ装置、およびその駆動方法
JP2012125293A (ja) * 2010-12-13 2012-07-05 Olympus Corp 制御装置、内視鏡装置及びフォーカス制御方法
JP2013016924A (ja) * 2011-06-30 2013-01-24 Nikon Corp 画像評価装置
JP2013041117A (ja) * 2011-08-16 2013-02-28 Pentax Ricoh Imaging Co Ltd 撮像装置および距離情報取得方法
JP2013042371A (ja) * 2011-08-16 2013-02-28 Pentax Ricoh Imaging Co Ltd 撮像装置および距離情報取得方法
JP2013230319A (ja) * 2012-05-02 2013-11-14 Olympus Corp 内視鏡装置及び内視鏡装置の制御方法
WO2013180147A1 (fr) * 2012-05-31 2013-12-05 オリンパス株式会社 Dispositif d'endoscope
JP2014145808A (ja) * 2013-01-28 2014-08-14 Olympus Corp 撮像装置及び撮像装置の制御方法
JP2015061137A (ja) * 2013-09-17 2015-03-30 キヤノン株式会社 撮像装置、その制御方法、及びプログラム
WO2015044996A1 (fr) * 2013-09-24 2015-04-02 オリンパス株式会社 Dispositif d'endoscope et procédé de commande de dispositif d'endoscope


Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2018038675A (ja) * 2016-09-08 2018-03-15 富士フイルム株式会社 Light source device for endoscope, operating method thereof, and endoscope system
WO2018211709A1 (fr) * 2017-05-19 2018-11-22 オリンパス株式会社 Blur correction device, endoscope device, and blur correction method
US11196926B2 (en) 2017-05-19 2021-12-07 Olympus Corporation Blur correction device, endoscope apparatus, and blur correction method
JP2019162280A (ja) * 2018-03-20 2019-09-26 ソニー株式会社 Endoscope system, control method, information processing apparatus, and program
US11455722B2 (en) 2018-03-20 2022-09-27 Sony Corporation System with endoscope and image sensor and method for processing medical images
JP7159577B2 (ja) 2018-03-20 2022-10-25 ソニーグループ株式会社 Endoscope system, control method, information processing apparatus, and program
US11986156B2 (en) 2018-03-20 2024-05-21 Sony Group Corporation System with endoscope and image sensor and method for processing medical images
JP2020073047A (ja) * 2020-02-04 2020-05-14 富士フイルム株式会社 Light source device for endoscope and endoscope system
JP7130011B2 (ja) 2020-02-04 2022-09-02 富士フイルム株式会社 Light source device for endoscope and endoscope system

Also Published As

Publication number Publication date
JPWO2016088628A1 (ja) 2017-04-27

Similar Documents

Publication Publication Date Title
US9498153B2 (en) Endoscope apparatus and shake correction processing method
JPWO2012081617A1 (ja) Imaging apparatus
WO2016079831A1 (fr) Image processing device, image processing method, image processing program, and endoscope device
JP6109456B1 (ja) Image processing apparatus and imaging system
JP6049945B2 (ja) Imaging apparatus and processing apparatus
US20190082936A1 (en) Image processing apparatus
WO2016104386A1 (fr) Dimmer, imaging system, method for operating dimmer, and operating program for dimmer
US20210307587A1 (en) Endoscope system, image processing device, total processing time detection method, and processing device
WO2017022324A1 (fr) Image signal processing method, image signal processing device, and image signal processing program
WO2016084257A1 (fr) Endoscope apparatus
WO2016088628A1 (fr) Image evaluation device, endoscope system, and method and program for controlling image evaluation device
CN109152520B (zh) Image signal processing device, image signal processing method, and recording medium
JP6137892B2 (ja) Imaging system
US10462440B2 (en) Image processing apparatus
JP6346501B2 (ja) Endoscope apparatus
WO2017203996A1 (fr) Image signal processing device, image signal processing method, and image signal processing program
JP6937902B2 (ja) Endoscope system
JP5815162B2 (ja) Imaging apparatus
JP2017123997A (ja) Imaging system and processing device
JP7234320B2 (ja) Image processing apparatus and method for operating image processing apparatus
JP6801990B2 (ja) Image processing system and image processing apparatus
WO2017022323A1 (fr) Image signal processing method, image signal processing device, and image signal processing program
JP2017221276A (ja) Image processing apparatus

Legal Events

Date Code Title Description
ENP Entry into the national phase

Ref document number: 2016533674

Country of ref document: JP

Kind code of ref document: A

121 EP: The EPO has been informed by WIPO that EP was designated in this application

Ref document number: 15865445

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 EP: PCT application non-entry in European phase

Ref document number: 15865445

Country of ref document: EP

Kind code of ref document: A1