WO2013061939A1 - Endoscopic device and focus control method - Google Patents

Endoscopic device and focus control method Download PDF

Info

Publication number
WO2013061939A1
WO2013061939A1 (PCT/JP2012/077282)
Authority
WO
WIPO (PCT)
Prior art keywords
focus
target
unit
focus position
region
Prior art date
Application number
PCT/JP2012/077282
Other languages
French (fr)
Japanese (ja)
Inventor
恵仁 森田
Original Assignee
オリンパス株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by オリンパス株式会社 filed Critical オリンパス株式会社
Publication of WO2013061939A1 publication Critical patent/WO2013061939A1/en

Links

Images

Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B23/00Telescopes, e.g. binoculars; Periscopes; Instruments for viewing the inside of hollow bodies; Viewfinders; Optical aiming or sighting devices
    • G02B23/24Instruments or systems for viewing the inside of hollow bodies, e.g. fibrescopes
    • G02B23/2407Optical details
    • G02B23/2423Optical details of the distal end
    • G02B23/243Objectives for endoscopes
    • G02B23/2438Zoom objectives
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/00002Operational features of endoscopes
    • A61B1/00004Operational features of endoscopes characterised by electronic signal processing
    • A61B1/00009Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope
    • A61B1/000094Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope extracting biological structures
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/00163Optical arrangements
    • A61B1/00188Optical arrangements with focusing or zooming features
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B26/00Optical devices or arrangements for the control of light using movable or deformable optical elements
    • G02B26/007Optical devices or arrangements for the control of light using movable or deformable optical elements the movable or deformable optical element controlling the colour, i.e. a spectral characteristic, of the light
    • G02B26/008Optical devices or arrangements for the control of light using movable or deformable optical elements the movable or deformable optical element controlling the colour, i.e. a spectral characteristic, of the light in the form of devices for effecting sequential colour changes, e.g. colour wheels
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B7/00Mountings, adjusting means, or light-tight connections, for optical elements
    • G02B7/28Systems for automatic generation of focusing signals
    • G02B7/36Systems for automatic generation of focusing signals using image sharpness techniques, e.g. image processing techniques for generating autofocus signals

Definitions

  • the present invention relates to an endoscope apparatus, a focus control method, and the like.
  • in an imaging apparatus such as an endoscope, a pan-focus image is required so as not to interfere with the doctor's diagnosis.
  • the endoscope achieves such performance by using an optical system having a relatively large F number to increase the depth of field.
  • an image sensor with a high pixel count of about several hundred thousand pixels has come into use.
  • since the permissible circle of confusion shrinks along with the pixel pitch in such a high-pixel-count image sensor, the F number must be reduced, and the depth of field of the imaging apparatus becomes narrow.
  • an endoscope apparatus has been proposed that provides an in-focus object position driving unit for driving the in-focus object position of the objective optical system in the imaging unit of the endoscope and performs autofocus (hereinafter referred to as AF) on the subject.
  • according to some aspects of the present invention, an endoscope apparatus and a focus control method can be provided that easily and quickly perform a focusing operation on a region of interest by detecting the region of interest and then performing focus control based on discrete in-focus object position control.
  • one aspect of the present invention relates to an endoscope apparatus including: an imaging unit that acquires an image signal; a switching unit that switches an in-focus object position, determined by the state of the imaging unit, to any one of a plurality of discretely set target in-focus position candidates; and a focus control unit that selects one of the plurality of target in-focus position candidates as the target in-focus position and controls the switching unit so that the selected target in-focus position becomes the in-focus object position.
  • the focus control unit includes a target region detection unit that detects a region of interest on the subject from the image signal, and determines whether or not the detected region of interest is in focus; when the region of interest is determined to be in focus, the target in-focus position candidate corresponding to the timing of that determination is selected as the target in-focus position.
  • when the region of interest is determined not to be in focus, the switching unit is controlled so that the in-focus object position is switched to a target in-focus position candidate different from the one corresponding to the timing of that determination.
  • control is performed so that the in-focus object position corresponding to the state of the imaging unit is a target in-focus position selected from a plurality of discrete target in-focus position candidates.
  • a target focus position candidate used as the target focus position is selected based on the determination as to whether or not the region of interest detected from the image signal is in focus.
  • another aspect of the present invention relates to a focus control method for an objective optical system in which a plurality of target in-focus position candidates are discretely set: an image signal is acquired from an imaging unit, a region of interest on the subject is detected from the image signal, and it is determined whether or not the detected region of interest is in focus.
  • when the region of interest is determined to be in focus, the target in-focus position candidate corresponding to the timing of that determination is selected as the target in-focus position; when it is determined not to be in focus, the in-focus object position is switched to a target in-focus position candidate different from the one corresponding to the timing of that determination.
  • FIG. 1 is a configuration example of an endoscope apparatus according to the present embodiment.
  • FIG. 2 is a configuration example of a rotating color filter.
  • FIG. 3 shows an example of spectral characteristics of the rotating color filter.
  • FIG. 4 is a configuration example of the image processing unit.
  • FIG. 5 is a configuration example of the focus control unit.
  • FIG. 6 is a configuration example of a contrast value calculation unit.
  • FIG. 7 shows a configuration example of the attention area detection unit.
  • FIG. 8 shows an example of setting an evaluation area by dividing an area.
  • FIG. 9 shows another configuration example of the focus control unit.
  • FIG. 10 is a diagram explaining, at the time of screening, the relationship between the imaging unit and the subject, the point separated from the imaging unit by the best subject distance (corresponding to the in-focus object position), and the range in the depth direction of the object scene that is in focus.
  • FIG. 11 is a diagram explaining the same relationship between the imaging unit and the subject at the time of close-up observation.
  • FIG. 12 shows an example of four-focus switching.
  • FIG. 13 shows an example of bifocal switching.
  • FIG. 14 is a flowchart for explaining a focused object position switching process in the first embodiment.
  • FIG. 15 is a flowchart illustrating focused object position switching processing according to the second embodiment.
  • the depth of field tends to be further reduced by increasing the imaging magnification of the imaging unit or shortening the distance from the imaging unit to the subject.
  • in such a case, the focusing operation is very difficult for the operator, so it is desirable that the system perform the focusing operation automatically.
  • normal AF uses a mechanism that continuously changes the in-focus object position (for example, a mechanism that continuously moves the focus lens position). This gives a high degree of freedom in selecting the target in-focus position and allows flexible handling; on the other hand, the time required to settle on the target in-focus position becomes long, and in some cases the mechanism becomes complicated. If the mechanism is complicated, the imaging unit itself becomes large, which is undesirable in an endoscope apparatus whose imaging unit is inserted into a living body.
  • the in-focus object position is the relative position (object point) of the object with respect to a reference position when the system including the object, the optical system, the image plane, and the like is in a focused state.
  • in the present embodiment, the image plane coincides with the plane of the image sensor included in the imaging unit; therefore, when the plane of the image sensor is fixed and the state of the optical system is determined, the in-focus object position is also determined.
  • the target in-focus position is the position on which it is desired to focus (i.e., that becomes the focusing target) at a certain point in time.
  • it indicates the relative position (object point) of the object to be focused on at a certain point in time.
  • examples of the reference position include the position of the surface of the image sensor and the position of the tip of the optical system.
  • the present applicant proposes a method for discretely controlling the focused object position. For example, one of a plurality of target focus position candidates set discretely may be selected as the target focus position, and control may be performed so that the selected target focus position becomes the focus object position.
  • since processing (for example, contrast value calculation processing) only needs to be performed for the discrete candidates, the focusing operation can be executed at high speed.
  • the mechanism can be simplified compared with the case of continuous in-focus object position control, so downsizing of the imaging unit can be expected.
  • as described above, the mechanism can be simplified. Further, in the second embodiment, unlike the first embodiment, when it cannot be determined that any of the target in-focus position candidates is in focus (for example, when the contrast value is small), the determination result of the determination unit 325 is used to determine the target in-focus position.
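Because the in-focus object position can only take a few preset values, selecting a target reduces to a small finite search rather than a continuous sweep. The following Python sketch is illustrative only: the candidate labels and the contrast measurements are hypothetical stand-ins, not the patent's implementation.

```python
# Hypothetical sketch of discrete focus control: the focused object
# position takes one of a few preset candidates, so choosing a target
# is a finite search over those candidates.

CANDIDATES = ["A", "B", "C", "D"]  # discrete focus lens positions

def select_target_position(contrast_at):
    """Pick the candidate with the highest contrast value.

    contrast_at: mapping from candidate position to a contrast value
    measured with the lens at that position (stand-in for the real
    contrast calculation described later in this document).
    """
    return max(CANDIDATES, key=lambda pos: contrast_at[pos])

# Example: contrast values measured at each of the four positions.
measured = {"A": 10.0, "B": 45.5, "C": 30.2, "D": 5.1}
best = select_target_position(measured)
```

With only four candidates, the search cost is constant, which is one way to read the document's claim that the focusing operation can be executed at high speed.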
  • FIG. 1 shows a configuration example of an endoscope apparatus according to a first embodiment.
  • the endoscope apparatus includes a light source unit 100, an imaging unit 200, a control device 300 (processor unit), a display unit 400, and an external I / F unit 500.
  • the light source unit 100 includes a white light source 110, a light source stop 120, a light source stop driving unit 130 for driving the light source stop 120, and a rotating color filter 140 having a plurality of spectral transmittance filters.
  • the light source unit 100 also includes a rotation driving unit 150 that drives the rotation color filter 140 and a condenser lens 160 that condenses the light transmitted through the rotation color filter 140 onto the incident end face of the light guide fiber 210.
  • the light source aperture driving unit 130 adjusts the amount of light by opening and closing the light source aperture 120 based on a control signal from the control unit 340 of the control device 300.
  • FIG. 2 shows a detailed configuration example of the rotation color filter 140.
  • the rotating color filter 140 includes a red (hereinafter abbreviated as R) filter 701, a green (hereinafter abbreviated as G) filter 702, and a blue (hereinafter abbreviated as B) filter 703 of the three primary colors, and a rotary motor 704.
  • FIG. 3 shows an example of spectral characteristics of these color filters 701 to 703.
  • the rotation driving unit 150 rotates the rotation color filter 140 at a predetermined number of rotations in synchronization with the imaging period of the imaging element 260 based on a control signal from the control unit 340. For example, if the rotating color filter 140 is rotated 20 times per second, each color filter crosses incident white light at 1/60 second intervals. In this case, the image sensor 260 completes imaging and transfer of image signals at 1/60 second intervals.
  • the image sensor 260 is, for example, a monochrome single-plate image sensor, configured by a CCD or a CMOS image sensor, for example. That is, in the present embodiment, frame-sequential imaging is performed in which an image of each of the three primary color lights (R, G, and B) is captured at 1/60 second intervals.
  • the imaging unit 200 is formed to be elongate and bendable, for example, to enable insertion into a body cavity.
  • the imaging unit 200 includes a light guide fiber 210 for guiding the light collected by the light source unit 100, and an illumination lens 220 that diffuses the light guided to its tip by the light guide fiber 210 and irradiates the observation target with it.
  • the imaging unit 200 also includes an objective lens 230 that collects the reflected light returning from the observation target, a focus lens 240 for adjusting the in-focus object position, a switching unit 250 that switches the position of the focus lens 240 among discrete positions, and an image sensor 260 for detecting the collected reflected light.
  • the switching unit 250 is, for example, a VCM (Voice Coil Motor) and is connected to the focus lens 240.
  • the switching unit 250 adjusts the focused object position by switching the position of the focus lens 240 at a plurality of discrete positions.
  • FIG. 12 shows the relationship between the position of the focus lens 240 and the best subject distance corresponding to the focused object position at this time in the present embodiment.
  • the best subject distance is the distance in the depth direction from the imaging unit in the object scene.
  • more precisely, the best subject distance is the distance from the imaging unit 200 to the subject when the subject image is focused on the image sensor, that is, ideally, when the light rays emerging from one point on the subject converge to one point on the image sensor. In other words, the best subject distance is a distance corresponding to the distance from the imaging unit 200 to the in-focus object position.
  • the “subject distance” in the present embodiment refers to a distance from the imaging unit 200 to the subject, and is not limited to a distance from a point having optical characteristics such as a principal point and a focal point.
  • the focus lens 240 takes discrete positions A, B, C, and D, and switches the best subject distance corresponding to the focused object position at this time in four stages.
  • the points separated from the imaging unit 200 by the best subject distance correspond one-to-one with the focus lens positions A to D, in order from the point closest to the imaging unit 200 (hereinafter referred to as the near point) to the point farthest away (hereinafter referred to as the far point).
  • the endoscope apparatus achieves the depth of field required for endoscopic observation by switching the position of the focus lens 240; the combined in-focus range covers, for example, 2 to 70 mm from the imaging unit.
  • when the focus lens is at the position that focuses on the near point, as shown in FIG. 11, it is suitable for closely observing a subject with little depth.
  • when the point separated from the imaging unit by the best subject distance is on the far-point side, the depth of field becomes deep; therefore, as shown in FIG. 10, it is suitable for screening a luminal subject.
  • the control device 300 controls each part of the endoscope device and performs image processing.
  • the control device 300 includes an A / D conversion unit 310, a focus control unit 320, an image processing unit 330, and a control unit 340.
  • the image signal converted into a digital signal by the A / D conversion unit 310 is transferred to the image processing unit 330.
  • the image signal processed by the image processing unit 330 is transferred to the focus control unit 320 and the display unit 400.
  • the focus control unit 320 changes the position of the focus lens 240 by transferring a control signal to the switching unit 250.
  • the control unit 340 controls each unit of the endoscope apparatus. Specifically, the control unit 340 synchronizes the light source aperture driving unit 130, the focus control unit 320, and the image processing unit 330. Further, it is connected to the external I / F unit 500, and controls the focus control unit 320 and the image processing unit 330 based on the input from the external I / F unit 500.
  • the display unit 400 is a display device capable of displaying a moving image, and includes, for example, a CRT or a liquid crystal monitor.
  • the external I / F unit 500 is an interface for performing an input from an operator to the endoscope apparatus.
  • the external I / F unit 500 includes, for example, a power switch for turning on / off the power, a mode switching button for switching a photographing mode and other various modes.
  • the external I / F unit 500 transfers the input information to the control unit 340.
  • FIG. 4 shows a detailed configuration example of the image processing unit 330 according to the first embodiment.
  • the image processing unit 330 includes a preprocessing unit 331, a synchronization unit 332, and a postprocessing unit 333.
  • the A / D conversion unit 310 is connected to the preprocessing unit 331.
  • the preprocessing unit 331 is connected to the synchronization unit 332.
  • the synchronization unit 332 is connected to the post-processing unit 333 and the focus control unit 320.
  • the post-processing unit 333 is connected to the display unit 400.
  • the control unit 340 is bi-directionally connected to the pre-processing unit 331, the synchronization unit 332, and the post-processing unit 333, and performs these controls.
  • the preprocessing unit 331 performs OB clamp processing, gain correction processing, and WB correction processing on the image signal input from the A / D conversion unit 310, using the OB clamp value, gain correction value, and WB coefficient value stored in advance in the control unit 340.
  • the preprocessing unit 331 transfers the preprocessed image signal to the synchronization unit 332.
  • the synchronizer 332 synchronizes the frame sequential image signal with the image signal processed by the preprocessor 331 based on the control signal of the controller 340. Specifically, the synchronizer 332 accumulates image signals of each color light (R, G, or B) input in frame order one frame at a time, and simultaneously reads the accumulated image signals of each color light. The synchronization unit 332 transfers the synchronized image signal to the post-processing unit 333 and the focus control unit 320.
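The frame-sequential accumulation described above can be sketched in a few lines. This is a minimal illustration only, assuming frames arrive as (color, image) pairs and ignoring the hardware timing of the simultaneous readout.

```python
# Sketch of frame-sequential synchronization: single-channel frames
# arrive in R, G, B order; one frame of each color light is buffered,
# and a synchronized RGB image is emitted once all three are present.

def synchronize(frames):
    """frames: list of (color, image) pairs in frame-sequential order.
    Returns a list of dicts, each holding one full set of R, G, and B
    planes, in the order the sets were completed."""
    buffer = {}
    out = []
    for color, image in frames:
        buffer[color] = image          # accumulate one frame per color
        if len(buffer) == 3:           # R, G, and B all present
            out.append(dict(buffer))   # read them out simultaneously
            buffer = {}
    return out
```

At 1/60 s per color frame, each synchronized RGB set would correspond to 1/20 s of capture, matching the 20 rotations per second of the color filter mentioned above.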
  • the post-processing unit 333 performs tone conversion processing, color processing, and edge enhancement processing on the synchronized image signal, using the tone conversion coefficient, color conversion coefficient, and edge enhancement coefficient stored in advance in the control unit 340.
  • the post-processing unit 333 transfers the post-processed image signal to the display unit 400.
  • FIG. 5 shows a detailed configuration example of the focus control unit 320 in the first embodiment.
  • the focus control unit 320 includes an attention area detection unit 321 (abnormal part detection unit), an area setting unit 322, a contrast value calculation unit 323, and a switching control unit 324.
  • the image processing unit 330 is connected to the attention area detection unit 321 and the area setting unit 322.
  • the attention area detection unit 321 is connected to the area setting unit 322.
  • the region setting unit 322 is connected to the contrast value calculation unit 323.
  • the contrast value calculation unit 323 is connected to the switching control unit 324.
  • the switching control unit 324 is connected to the switching unit 250.
  • the control unit 340 is connected to the attention area detection unit 321, the area setting unit 322, the contrast value calculation unit 323, and the switching control unit 324, and performs these controls.
  • the attention area detection unit 321 detects the attention area of the subject from the image signal transferred from the image processing unit 330.
  • the attention area detection unit 321 transfers area information indicating the position of the attention area to the area setting unit 322.
  • when no attention area is detected, a control signal indicating that the attention area does not exist is transferred to the switching control unit 324.
  • the attention area may be an abnormal part representing a lesion part or the like.
  • the attention area detection unit 321 is realized as an abnormal part detection unit. The processing performed by the abnormal part detection unit will be described in detail later.
  • the region setting unit 322 sets an evaluation region for calculating a contrast value for the image signal transferred from the image processing unit 330 when the attention region is detected by the attention region detection unit 321.
  • the attention area may be set as the evaluation area as it is, or, as shown in FIG. 8, the image signal may be divided in advance into a plurality of areas and the area containing the largest part of the attention area may be set as the evaluation area. Thereafter, the region setting unit 322 transfers the set evaluation region information and the image signal to the contrast value calculation unit 323.
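The second strategy above (divide in advance, pick the block containing the most of the attention area) can be sketched as follows. The grid size and the binary-mask representation of the attention area are illustrative assumptions, not details taken from the patent.

```python
# Sketch: divide the image into a fixed grid of blocks and pick the
# block containing the most attention-area pixels as the evaluation
# area. mask is a 2D list of 0/1 values, 1 marking attention pixels.

def pick_evaluation_block(mask, rows, cols):
    """Return (block_row, block_col) of the grid block that contains
    the largest number of attention-area pixels."""
    h, w = len(mask), len(mask[0])
    bh, bw = h // rows, w // cols          # block height and width
    counts = {}
    for br in range(rows):
        for bc in range(cols):
            counts[(br, bc)] = sum(
                mask[y][x]
                for y in range(br * bh, (br + 1) * bh)
                for x in range(bc * bw, (bc + 1) * bw)
            )
    return max(counts, key=counts.get)

# 4x4 mask with attention pixels concentrated in the lower-right block
mask = [
    [0, 0, 0, 0],
    [0, 0, 0, 0],
    [0, 0, 1, 1],
    [0, 0, 1, 1],
]
```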
  • the contrast value calculation unit 323 calculates the contrast value of the evaluation area from the evaluation area information and the image signal.
  • the contrast value may be calculated for an arbitrary channel of the image signal transferred from the region setting unit 322, or a luminance signal may be generated from the pixel values of the three R, G, and B channels and the contrast value calculated from the pixel values of the generated luminance signal.
  • the contrast value calculation unit 323 may, for example, perform arbitrary high-pass filter processing on all the pixels included in the evaluation region and calculate the contrast value by summing the high-pass filter output values of all the pixels.
  • further, as shown in FIG. 6, the contrast value calculation unit 323 may be configured to include a bright spot removal unit 3231 in front of the high-frequency extraction unit 3232 that performs the high-pass filter processing.
  • the bright spot removal unit 3231 performs threshold processing on the pixel values of an arbitrary channel, or of the luminance signal, for all pixels included in the evaluation region, and determines that a pixel whose value is equal to or larger than the threshold is a bright spot.
  • for a pixel determined to be a bright spot, a control signal that forces the output value of the subsequent high-pass filter processing to 0 is transferred, which reduces the influence of bright spots on the contrast value. Thereafter, the contrast value calculation unit 323 transfers the contrast value of the evaluation region to the switching control unit 324.
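The contrast calculation with bright-spot removal can be sketched as below. The patent leaves the high-pass filter arbitrary; the 1-D discrete Laplacian and the threshold value here are illustrative assumptions, shown on a 1-D strip of pixels for brevity.

```python
# Sketch: pixels at or above a brightness threshold are treated as
# bright spots and their high-pass output is forced to 0; remaining
# pixels go through a high-pass filter, and the absolute filter
# responses are summed into the contrast value.

def contrast_value(row, bright_threshold=250):
    """Contrast of a 1-D strip of pixel values (stand-in for a 2-D
    evaluation region)."""
    total = 0.0
    for i in range(1, len(row) - 1):
        if row[i] >= bright_threshold:
            continue  # bright spot: filter output treated as 0
        # simple high-pass response (discrete Laplacian) at pixel i
        hp = row[i - 1] - 2 * row[i] + row[i + 1]
        total += abs(hp)
    return total
```

A flat strip yields zero contrast, an edge yields a positive value, and a saturated bright spot contributes nothing, which is the behavior the bright spot removal unit is described as achieving.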
  • the switching control unit 324 controls the switching unit 250 based on the contrast value calculated by the contrast value calculation unit 323.
  • first, the attention area detection unit 321 detects an attention area (abnormal part) (S101). When a control signal indicating that no attention area is present is transferred from the attention area detection unit 321, the switching control unit 324 transfers to the switching unit 250 a control signal for moving the focus lens to the position corresponding to the point with the longest subject distance from the imaging unit (farthest point) (S102). This is because the depth of field is deepest at the farthest point, so the in-focus range is widest.
  • when an attention area is detected, the region setting unit 322 sets an evaluation region corresponding to it (S103). Then, the contrast value calculation unit 323 calculates the contrast value of the set evaluation region (S104). The switching control unit 324 determines whether the calculated contrast value is larger than a threshold value Tcon (S105). When the contrast value is equal to or less than the threshold value Tcon, the current focus lens position and contrast value are stored in a memory (not shown) in the switching control unit, the switching start flag F is set to 1, and the switching process is performed (S106 to S108).
  • the position of the focus lens 240 is switched to the position corresponding to the point with the longest subject distance from the imaging unit (farthest point; position A in FIG. 12) (S107). Thereafter, the switching start flag is set to 0, the stored focus lens position and contrast value are erased, and the process ends.
  • the focus lens 240 is moved to the stored focus lens position (one of them, when there are a plurality of stored focus lens positions) (S108).
  • when the contrast value is larger than the threshold value Tcon, the switching control unit 324 sets the switching start flag F to 0 without moving the focus lens position, erases the stored focus lens position and contrast value, and ends the process (S109).
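One plausible reading of the S101 to S109 flow can be sketched as a loop over the discrete lens positions. The function and variable names are illustrative, and the fallback used here (the highest-contrast position tried) is only one of the fallbacks the text mentions.

```python
# Hedged sketch of the switching loop: try discrete lens positions in
# turn, stop at the first one whose contrast exceeds the threshold
# Tcon, and otherwise fall back to the highest-contrast position
# recorded along the way.

FARTHEST = "A"  # position with the deepest depth of field

def switch_focus(candidates, contrast_at, attention_detected, tcon):
    if not attention_detected:
        return FARTHEST              # S102: no attention region found
    best_pos, best_c = None, float("-inf")
    for pos in candidates:
        c = contrast_at[pos]         # S104: contrast at this position
        if c > tcon:
            return pos               # S105/S109: in focus, stop here
        if c > best_c:               # S106: remember best so far
            best_pos, best_c = pos, c
    return best_pos                  # fallback: highest stored contrast
```

Because the candidate set is tiny, the worst case is a handful of contrast evaluations, which is consistent with the document's claim that discrete control makes the focusing operation fast.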
  • in the above description, the position of the focus lens 240 is switched to the position corresponding to the point with the longest subject distance from the imaging unit (farthest point); however, the focus lens may instead be switched to a predetermined position designated by the surgeon, or to the focus lens position with the highest contrast value stored in the memory (not shown) in the switching control unit.
  • as described above, the focus lens position is set discretely, and each focus lens position corresponds to a target in-focus position candidate.
  • the endoscope apparatus of this embodiment selects, as the target in-focus position, one of the plurality of target in-focus position candidates discretely set for the objective optical system.
  • discrete target in-focus position candidates can thus be set by performing discrete focus lens position control. That is, the focus lens positions in FIG. 14 correspond to A to D in FIG. 12 or to E and F in FIG. 13.
  • FIG. 7 shows a detailed configuration example of the attention area detection unit 321 in the first embodiment.
  • the attention area detection unit 321 includes a brightness / color signal calculation unit 3211, a reference signal creation unit 3212, a difference calculation unit 3213, an area division unit 3214, a candidate region detection unit 3215, and a feature amount calculation unit 3216.
  • FIG. 7 is a configuration example assuming that the attention area detection unit 321 is realized as an abnormal part detection unit that detects an abnormal part as the attention area, and the following description likewise uses detection of an abnormal part as an example.
  • the present invention is not limited to this.
  • the image processing unit 330 is connected to a brightness / color signal calculation unit 3211 and an area division unit 3214.
  • the brightness / color signal calculation unit 3211 is connected to the reference signal creation unit 3212 and the difference calculation unit 3213.
  • the reference signal creation unit 3212 is connected to the difference calculation unit 3213.
  • the difference calculation unit 3213 is connected to the candidate area detection unit 3215.
  • the area dividing unit 3214 is connected to the candidate area detecting unit 3215.
  • the candidate area detection unit 3215 is connected to the feature amount calculation unit 3216 and the switching control unit 324.
  • the feature amount calculation unit 3216 is connected to the region setting unit 322.
  • the control unit 340 is bi-directionally connected to the brightness / color signal calculation unit 3211, the reference signal creation unit 3212, the difference calculation unit 3213, the region division unit 3214, the candidate region detection unit 3215, and the feature amount calculation unit 3216, and controls them.
  • the brightness / color signal calculation unit 3211 calculates a brightness signal and a color signal based on the image signal transferred from the image processing unit 330.
  • the color signal is a signal indicating the redness of the image signal.
  • the brightness signal Y and the color signal C are calculated from the image signals of the respective color lights included in the image signal using the following expressions (1) and (2).
  • R, G, and B indicate image signals of each color light included in the image signal. Further, a is a constant, and a value input in advance from the outside is used.
  • the calculated brightness signal and color signal are transferred to the reference signal creation unit 3212 and the difference calculation unit 3213.
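The actual expressions (1) and (2) are not reproduced in this text. The sketch below shows what such signals could look like, assuming a standard ITU-R BT.601 luminance weighting for the brightness signal Y and a simple redness ratio for the color signal C with the externally supplied constant a; both forms are assumptions, not the patent's equations.

```python
# Hedged sketch of a brightness signal and a redness-indicating color
# signal. The weights and the ratio form are illustrative assumptions;
# the patent's equations (1) and (2) are not given in this text.

def brightness_signal(r, g, b):
    # assumed ITU-R BT.601 luma weights (an assumption, not eq. (1))
    return 0.299 * r + 0.587 * g + 0.114 * b

def color_signal(r, g, b, a=1.0):
    # redness of the pixel: R relative to the other channels, scaled
    # by the externally supplied constant a (an assumption, not eq. (2))
    return a * r / (g + b + 1e-6)
```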
  • the reference signal creation unit 3212 creates a reference signal based on the brightness signal and the color signal transferred from the brightness / color signal calculation unit 3211. Specifically, low-pass filter processing is performed on the brightness signal and the color signal, respectively.
  • the low-pass filter for example, a known method such as an average value filter, a Gaussian filter, or an edge-preserving smoothing filter is used.
  • the brightness signal and the color signal subjected to the low-pass filter processing are transferred to the difference calculation unit 3213 as the brightness reference signal and the color reference signal, respectively.
  • in the above description, the reference signal is an image subjected to low-pass filter processing; however, the present invention is not limited to this, and an approximation function may be calculated for each of the brightness signal and the color signal, with the approximate values calculated from the approximation function used as the reference signal.
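Reference-signal creation by low-pass filtering can be sketched with an average-value (moving-mean) filter, one of the options named above, shown in 1-D for brevity; the filter radius is an illustrative assumption.

```python
# Sketch: the reference signal is a low-pass filtered copy of the
# brightness (or color) signal; here a simple moving average with
# edge clamping stands in for the average-value filter.

def reference_signal(signal, radius=1):
    """Return the moving-average (low-pass) copy of a 1-D signal."""
    n = len(signal)
    out = []
    for i in range(n):
        lo, hi = max(0, i - radius), min(n, i + radius + 1)
        out.append(sum(signal[lo:hi]) / (hi - lo))  # local mean
    return out
```

The difference between the original signal and this reference is then what the difference calculation unit evaluates: it is small for smooth regions and large where the signal deviates locally from its surroundings.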
  • the difference calculation unit 3213 calculates the difference between the brightness signal, the color signal, and each reference signal.
  • the calculated difference signals are transferred to the candidate area detection unit 3215 as a brightness difference signal and a color difference signal, respectively.
  • the region dividing unit 3214 divides the image signal transferred from the image processing unit 330 into local regions. Specifically, first, a contour pixel is detected from the image signal using a known contour tracking method. The area divided by the contour pixels is subjected to a known labeling process and transferred to the candidate area detection unit 3215.
  • the candidate area detection unit 3215 detects a candidate area for an abnormal part (an attention candidate area, which in a broad sense is a candidate for the attention area) based on the difference signal for each local area. Specifically, for each pixel of the brightness difference signal, a pixel whose difference is equal to or greater than a threshold value TmaxY (including that value) is first extracted as an outlier pixel. Excluding the outlier pixels, the variance σY of the brightness difference for each region is calculated. A region where the brightness variance σY is larger than the threshold value TsY is extracted as an abnormal brightness region (for example, corresponding to an uneven lesion).
  • Similarly, for the color difference signal, a pixel having a difference equal to or greater than the threshold value TmaxC (including that value) is extracted as an outlier pixel. Excluding the outlier pixels, the variance σC of the color difference for each region is calculated. A region where the variance σC is larger than the threshold value TsC is extracted as an abnormal color region (for example, corresponding to a redness / discolored lesion).
  • the region extracted as the abnormal brightness region or the abnormal color region is transferred to the feature amount calculation unit 3216 as a candidate region for the abnormal part. Further, a control signal indicating that an abnormal region is present is transferred to the switching control unit 324.
  • Otherwise, a control signal indicating that no abnormal part is present is transferred to the switching control unit 324.
  • the threshold values TmaxY, TsY, TmaxC, and TsC are constants whose values are input in advance from the outside. Further, the calculated brightness variance σY and color variance σC are transferred to the feature amount calculation unit 3216.
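The per-region detection described above (outlier removal at TmaxY/TmaxC, then a variance test against TsY/TsC) can be sketched as follows; the function name and the mask-based region representation are illustrative assumptions:

```python
import numpy as np

def abnormal_region(diff, region_mask, t_max, t_s):
    """Sketch of the candidate detection in unit 3215 for one local region.

    Pixels whose |difference| >= t_max are discarded as outlier pixels,
    the variance of the remaining differences in the region is computed,
    and the region is flagged as a candidate when that variance exceeds
    t_s. t_max / t_s stand for the externally supplied constants
    (TmaxY / TsY for brightness, TmaxC / TsC for color).
    """
    d = np.asarray(diff, dtype=float)[np.asarray(region_mask, dtype=bool)]
    inliers = d[np.abs(d) < t_max]   # exclude outlier pixels
    if inliers.size == 0:
        return False, 0.0
    var = inliers.var()              # variance sigma for this region
    return var > t_s, var
```

Running this once on the brightness difference signal and once on the color difference signal yields the abnormal brightness regions and abnormal color regions, respectively.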
  • the feature amount calculation unit 3216 calculates feature amounts for the plurality of candidate regions transferred from the candidate region detection unit 3215, and detects the abnormal part based on the calculated feature amounts. Specifically, the degree of abnormality Ei of each candidate region is calculated from its brightness variance σY and color variance σC using the following equation (3).
  • Here, the abnormal brightness region (uneven lesion) may be made easier to detect as the abnormal part than the abnormal color region (redness / discolored lesion). This is because the region the operator wants to pay more attention to is assumed to be an uneven lesion, which has a higher severity as a lesion than a redness / discolored lesion.
  • the area with the highest degree of abnormality Ei calculated is transferred to the area setting unit 322 as an abnormal part.
  • Here, the degree of abnormality Ei is a weighted addition value of the brightness variance σY and the color variance σC.
  • the feature amount is not limited thereto, and for example, the area of the candidate region may be obtained. In that case, the candidate area with the largest area is transferred to the area setting unit 322 as an abnormal part.
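The selection of the abnormal part by the degree of abnormality can be sketched as below. Since equation (3) is not reproduced in this excerpt, the exact weighted-sum form and the weight values αe > βe (biasing detection toward uneven lesions, as described above) are assumptions:

```python
def abnormality_degree(sigma_y, sigma_c, alpha_e=2.0, beta_e=1.0):
    """Sketch of equation (3): Ei as a weighted sum of the brightness
    variance and the color variance of candidate region i. The weight
    values are assumptions; the text only suggests weighting the
    brightness (unevenness) term more heavily, i.e. alpha_e > beta_e.
    """
    return alpha_e * sigma_y + beta_e * sigma_c

def pick_abnormal(candidates):
    """Return the index of the candidate region with the highest Ei,
    which is transferred to the region setting unit as the abnormal part.
    candidates: list of (sigma_y, sigma_c) pairs, one per region."""
    scores = [abnormality_degree(sy, sc) for sy, sc in candidates]
    return max(range(len(scores)), key=scores.__getitem__)
```

With the alternative area-based feature amount mentioned above, `abnormality_degree` would simply be replaced by the region's pixel count.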
  • In this embodiment, the focused object position is switched among discrete positions. This increases the speed at which the subject is brought into focus compared with an AF operation in which the focused object position is changed continuously. For this reason, an image focused on the operator's region of interest (abnormal part) can always be obtained.
  • As described above, the endoscope apparatus has an imaging unit 200 that acquires an image signal, a switching unit 250 that switches the focused object position, determined by the state of the imaging unit 200 (in a narrow sense, the state of an objective optical system including, for example, the focus lens 240), among a plurality of discrete target focus position candidates, and a focus control unit 320 that controls the switching unit 250 to perform focusing control.
  • the focus control unit 320 selects any one of a plurality of target focus position candidates as the target focus position, and outputs the selection result to the switching unit 250.
  • the focus control unit 320 includes an attention area detection section 321 that detects an attention area on the subject from the image signal.
  • the focus control unit 320 determines whether or not the attention area detected by the attention area detection unit 321 is in focus. When it is determined that the attention area is in focus, the target focus position candidate at the timing at which the determination was performed (for example, a position determined according to the position of the focus lens 240 at the time of the determination) is selected as the target focus position. If it is determined that the attention area is not in focus, the switching unit 250 is controlled to switch the focused object position.
  • Here, a target in-focus position candidate is a candidate for the target in-focus position; one of the plurality of target in-focus position candidates is selected as the target in-focus position, and focus control is performed accordingly. For example, the black circles shown in FIGS. 12A to 12D are points corresponding to the target in-focus position candidates, each away from the imaging unit by the best subject distance.
  • a method of setting a plurality of discrete target focus position candidates is arbitrary, but can be realized by, for example, driving the focus lens 240 discretely. Since the target focus position is determined by determining the position of the focus lens 240, the focus object position can also be controlled discretely by controlling the focus lens 240 discretely.
  • the attention area is an area whose observation priority for the user is relatively higher than that of other areas. For example, when the user is a doctor and desires treatment, it refers to an area in which a mucosal part or a lesion is shown. As another example, if what the doctor desires to observe is bubbles or stool, the attention area is an area in which the bubble or stool portion is shown. That is, the object the user should pay attention to depends on the purpose of observation, but in any case, an area whose observation priority for the user is relatively higher than that of other areas is the attention area.
  • In AF without discrete position control of the focus lens 240 (for example, contrast AF), the in-focus object position can be controlled continuously. Therefore, there are many target in-focus positions at which an AF evaluation value (for example, a contrast value) must be calculated to perform AF. As a result, the amount of calculation until focusing increases, and so does the time required.
  • In contrast, with discrete N-focus switching (where N is a relatively small number, for example 4 or 2), the number of AF evaluation value calculations can be suppressed to at most N, so the amount of calculation is small.
  • As a result, a focusing operation can be performed at high speed. If the attention area is in focus, the target focus position candidate at that time becomes the target focus position. That is, since the focusing operation is completed at that point, the number of focus determinations can in some cases be fewer than N. Further, the processing can be switched based on whether or not the attention area has been detected and, if detected, whether or not it is in focus. Therefore, the processing can be automatically switched on and off according to the detection / focusing state of the attention area. In other words, since the focusing operation can be performed appropriately without an instruction from the user (for example, operation of an AF start button), a system that is easy for the user to use can be realized.
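The discrete focusing loop described above can be sketched as follows. `contrast_at` is a hypothetical stand-in for switching the focused object position and evaluating the contrast of the attention area, and the fallback to the highest-contrast candidate mirrors the behaviour described later for the case where no candidate exceeds the threshold:

```python
def discrete_focus(candidates, contrast_at, threshold):
    """Sketch of the discrete focusing operation: visit each of the N
    target focus position candidates at most once, stopping as soon as
    the attention area is judged in focus (contrast above threshold)."""
    best_pos, best_val = None, float("-inf")
    for pos in candidates:          # at most N evaluations
        val = contrast_at(pos)
        if val > threshold:
            return pos              # in focus: the operation completes here
        if val > best_val:
            best_pos, best_val = pos, val
    # no candidate exceeded the threshold: fall back to the candidate
    # with the highest contrast value (one behaviour the text describes)
    return best_pos
```

Because `candidates` holds only the N discrete target focus position candidates, the loop bounds both the number of lens movements and the number of contrast evaluations.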
  • the attention area detection unit 321 may detect the attention area on the subject from the image signal acquired after switching, and the focus control unit 320 may determine whether or not the detected attention area is in focus.
  • the focus control unit 320 may include a contrast value calculation unit 323 that calculates the contrast value of the region corresponding to the region of interest. Then, the focus control unit 320 determines whether or not the attention area is focused based on the contrast value, and controls the switching unit 250.
  • the area (evaluation area) on the image whose contrast value is to be calculated is set in correspondence with the attention area.
  • the evaluation area may be matched with the attention area, but is not limited thereto.
  • the calculation may be facilitated by using a rectangular area including the attention area.
  • the in-focus determination based on the contrast value may be performed, for example, by comparing the contrast value with a given threshold value and determining that the area is in focus when the contrast value is greater than the threshold value.
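A minimal sketch of such a contrast-based in-focus determination follows. The concrete contrast metric (mean absolute difference of neighbouring pixels over a rectangular evaluation area) is an assumption; the text only requires some contrast value compared against a threshold:

```python
import numpy as np

def contrast_value(img, region_box):
    """Sketch of the contrast value calculation unit 323: a simple
    high-frequency measure over a rectangular evaluation area
    (y0, y1, x0, x1) enclosing the region of interest. The metric
    itself is an assumed choice."""
    y0, y1, x0, x1 = region_box
    roi = np.asarray(img, dtype=float)[y0:y1, x0:x1]
    dy = np.abs(np.diff(roi, axis=0)).mean() if roi.shape[0] > 1 else 0.0
    dx = np.abs(np.diff(roi, axis=1)).mean() if roi.shape[1] > 1 else 0.0
    return dx + dy

def is_focused(img, region_box, t_con):
    """In-focus when the contrast value exceeds the given threshold."""
    return contrast_value(img, region_box) > t_con
```

Using a rectangle enclosing the attention area, as the text suggests, keeps this evaluation cheap.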
  • When the focus control unit 320 determines that the region of interest is not in focus, it may control the switching unit 250 to switch the in-focus object position of the objective optical system.
  • In this way, when the attention area is not in focus, the focused object position is switched by controlling the switching unit 250. Specifically, a position different from the current position may be selected from the plurality of target focus position candidates, and the focus determination may be performed at the selected position. To avoid duplicate processing, a target focus position candidate for which the focus determination has not yet been performed in the current series of focusing operations (the processing from the previous completion of focusing to the next) is selected.
  • The focus control unit 320 may also select a predetermined target focus position candidate as the target focus position of the objective optical system.
  • As the predetermined candidate, a target focus position candidate corresponding to the point with the longest in-focus subject distance from the imaging unit may be used, or it may be set based on information input through an external interface unit (corresponding to the external I/F unit 500 in FIG. 1).
  • the in-focus subject distance refers to the distance from the imaging unit 200 to the subject when the subject image is focused on the image sensor 260. Therefore, if an optical condition is determined by selecting a certain target in-focus position candidate, the in-focus subject distance can also be obtained in association with the target in-focus position candidate.
  • the in-focus subject distance here may be a value having a certain width, but in a narrow sense it may indicate the above-mentioned best subject distance.
  • the focus control unit 320 may select, as the target focus position, the target focus position candidate having the highest contrast value when the plurality of contrast values respectively calculated for the plurality of target focus position candidates are all lower than a predetermined threshold value. In this way, even when no candidate yields a contrast value above the threshold, the target focus position can still be set to a given position. If the target focus position candidate with the highest contrast value is used, the target focus position can be determined as a position with less blur than the other positions, even though it cannot be said to be in focus.
  • Optically, the depth of field increases as the subject distance increases (strictly speaking, the depth of field changes not according to the distance from the imaging unit 200 to the subject but according to the distance from the front focal point to the subject, for example). Selecting the candidate with the longest in-focus subject distance therefore makes it possible to obtain a wider depth of field than at the other positions, increasing the possibility that the in-focus area on the image can be enlarged. If the depth of field is wide, manual focusing can be performed. Further, the predetermined target in-focus position candidates are not limited to the above two, and may be set arbitrarily based on, for example, an input value from an external device or the user.
  • the attention area detection unit 321 may include a reference signal generation unit 3212 that generates a reference signal from the image signal, a candidate area detection unit 3215 that detects an attention candidate area based on the image signal and the reference signal, and a feature amount calculation unit 3216 that calculates the feature amount of the attention candidate area. The attention area detection unit 321 then detects the attention area based on the feature amount.
  • the reference signal is a signal obtained based on the image signal; for example, it may be obtained by subjecting the image signal to filter processing (specifically, low-pass filter processing) to extract a given spatial frequency component.
  • the attention candidate region is a region that is a candidate for the attention region, and is detected based on the image signal and the reference signal. Specifically, it may be detected from the difference value between the image signal and the reference signal. In the above example, regions detected either from the difference between luminance signals or from the difference between color signals are treated as attention candidate regions.
  • In this way, an abnormal part such as a lesion can be detected. A lesion characterized by shape (unevenness) can be detected based on the difference between the luminance component of the image signal and the reference signal obtained from it (for example, by taking the variance of the difference). Likewise, a lesion characterized by color (redness / discolored color, etc.) can be detected based on the difference between the color component (color signal) of the image signal and the reference signal obtained from the color signal (for example, by taking the variance of the difference). Therefore, in this example, an area suspected of being a lesion due to features such as unevenness or redness / discoloration is detected as an attention candidate area. Since a plurality of attention candidate areas may be detected, which one is selected as the attention area is determined by the feature amount. By switching the feature amount calculation method, the characteristics of the area detected as the attention area can be switched.
  • the feature amount calculation unit 3216 may calculate the feature amount of the candidate region of interest based on the difference between the image signal and the reference signal. Then, the attention area detection unit 321 detects an attention candidate area having the largest feature amount as the attention area.
  • As the image signal used here, the same signal as that used for detection of the attention candidate region may be used, or a different signal may be used. That is, any of R, G, and B may be used as the image signal, Y and C in the above equations (1) and (2) may be used, or other image signals may be used.
  • the feature amount calculation unit 3216 may use a luminance signal difference and a color signal difference as the differences. Then, a value obtained by weighted addition of the luminance signal difference and the color signal difference may be calculated as the feature amount.
  • By using the luminance signal difference and the color signal difference as the differences between the image signal and the reference signal, both lesion types can be reflected: the luminance signal difference corresponds to an uneven lesion or the like, and the color signal difference corresponds to a redness / discolored lesion or the like.
  • Ei in the above equation (3) may be used to calculate the feature amount at this time.
  • either one of the weighting coefficients ⁇ e and ⁇ e may be zero. That is, the value obtained by weighted addition of the luminance signal difference and the color signal difference in the present embodiment includes a pattern of only the luminance signal difference and a pattern of only the color signal difference.
  • the feature amount calculation unit 3216 may calculate the feature amount using a value larger than the weight of the color signal difference as the weight of the luminance signal difference.
  • the feature amount calculation unit 3216 may calculate the feature amount of the attention candidate region based on the area of the attention candidate region. Then, the attention area detection unit 321 detects an attention candidate area having the largest feature amount as the attention area.
  • A large area means that objects to be noticed (for example, lesions or bubbles; what should be noticed changes depending on the detection method of the attention candidate region) are distributed over a wide range, so the priority of observation is considered to be high.
  • the attention area detection unit 321 may detect an abnormal part of the subject as the attention area.
  • Here, the abnormal part is, for example, a lesion part.
  • In-vivo observation with an endoscope apparatus is assumed to be used in the medical field, where the object to be observed is a lesion (such as a tumor or a dense blood vessel); therefore, the abnormal part is detected as the attention area.
  • the number of target in-focus position candidates to be switched as the in-focus object position by the switching unit 250 may be 4 or less (including that value).
  • If N is a large value, the load of the focusing processing (for example, the number of times the in-focus object position is switched, the number of contrast value calculations, and so on) increases, and the focusing operation cannot be performed at high speed compared with continuous focus switching. Therefore, in order to use the method of this embodiment effectively, the value of N needs to be small to some extent; specifically, N ≤ 4 may be satisfied.
  • the number of target focus position candidates is not limited to four as long as the focus operation can be performed at a higher speed than continuous focus switching.
  • For example, it may be determined from the condition on the in-focus range in the depth direction in the object field.
  • Consider the first to Nth target in-focus position candidates, corresponding to a plurality of points away from the imaging unit by the best subject distance, and the in-focus ranges in the depth direction in the corresponding first to Nth object fields. The target focus position candidates can be set so that the in-focus range in the depth direction in the i-th (1 ≤ i ≤ N−2) object field overlaps the in-focus range in the depth direction in the (i+1)-th object field, but does not overlap the in-focus range in the depth direction in the (i+2)-th object field. By doing so, the in-focus ranges in the depth direction overlap between adjacent target focus position candidates (a certain position may be within the in-focus range in the depth direction of both the i-th and (i+1)-th object fields). As a result, in the range of distances between the end point of the in-focus range in the depth direction in the first object field (for example, FIG. 12) and the end point of the in-focus range in the depth direction in the Nth object field (for example, the left end of the in-focus range in the depth direction in the object field at D in FIG. 12), an arbitrary position is included in at least one of the in-focus ranges in the depth direction in the first to Nth object fields.
  • Within this distance range, it is always possible to achieve focus by appropriately setting the focused object position, and it is not necessary to adjust the distance between the imaging unit 200 and the subject.
  • On the other hand, the in-focus range does not overlap with the in-focus range in the depth direction in the object field two positions away (the (i+2)-th object field). This is because the predetermined distance range can be covered by overlapping only adjacent ranges, so the number of target in-focus position candidates need not be increased further. If the in-focus range in the depth direction in the i-th object field overlapped the in-focus range in the depth direction in the (i+2)-th object field, an arbitrary position within the in-focus range in the depth direction in the (i+1)-th object field between them would be included in at least one of the in-focus ranges in the depth direction in the i-th and (i+2)-th object fields, so the advantage of setting the (i+1)-th target focus position candidate would not be great.
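The interval conditions above (adjacent in-focus ranges overlap; ranges two positions apart need not) can be checked with a short sketch, where each candidate's in-focus range in the depth direction is represented as a (near, far) distance pair sorted from nearest to farthest — the representation itself is an assumption for illustration:

```python
def ranges_cover(intervals):
    """True when consecutive in-focus ranges overlap, so every distance
    between the nearest end point and the farthest end point is covered
    by at least one target focus position candidate.
    intervals: list of (near, far) pairs sorted from nearest to farthest."""
    for (n1, f1), (n2, f2) in zip(intervals, intervals[1:]):
        if n2 > f1:        # gap between adjacent in-focus ranges
            return False
    return True

def candidate_redundant(intervals, i):
    """True when the (i+1)-th candidate adds nothing: the i-th and
    (i+2)-th in-focus ranges already overlap, as discussed above."""
    (_, f1), (n3, _) = intervals[i], intervals[i + 2]
    return n3 <= f1
```

With candidates set as the text prescribes, `ranges_cover` holds while `candidate_redundant` is false for every i, i.e. no candidate is wasted.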
  • Note that the width of the overlapping region may be limited (for example, by setting an upper limit value).
  • Second Embodiment: The endoscope apparatus according to the second embodiment has the same configuration as that of the first embodiment. Only the parts different from the first embodiment are described below.
  • the function of the switching unit 250 is different.
  • In the second embodiment, the focus lens position is switched in two stages: point E, corresponding to a point (far point) at which the best subject distance from the imaging unit is long, and point F, corresponding to a point (near point) at which the best subject distance from the imaging unit is short.
  • When the focus lens position is at point E, the depth of field is suitable for screening a luminal subject in the body cavity; when the focus lens position is at point F, the depth of field is suitable for closely approaching the subject for detailed examination.
  • FIG. 9 shows a detailed configuration example of the focus control unit 320 in the second embodiment.
  • Compared with the first embodiment, a determination unit 325 is added.
  • the image processing unit 330 is connected to the region setting unit 322, the attention region detection unit 321, and the determination unit 325.
  • the determination unit 325 is connected to the switching control unit 324.
  • Determining unit 325 performs determination related to focusing of the attention area.
  • the determination unit 325 may include an estimation unit 3251, and the estimation unit 3251 estimates rough distance information about the attention area transferred from the attention area detection unit 321. Then, the determination unit 325 determines whether the attention area is at the near point or the far point based on the estimation result in the estimation unit 3251. Specifically, first, the brightness Ye of the area corresponding to the attention area is calculated from the image signal. Next, the brightness Ye of the attention area is corrected by the light source aperture L and the gain correction value Gain. The following equation (4) is used for correction.
  • L is the opening of the light source aperture 120 obtained through the control unit 340
  • Gain is a gain correction value obtained through the control unit 340
  • ⁇ y and ⁇ y are weighting constants.
  • the determination unit 325 transfers the determination result to the switching control unit 324.
  • S201 to S206 are the same as S101 to S106 in FIG. 14, and S208 to S209 are the same as S108 to S109 in FIG.
  • In the second embodiment, the processing performed when the contrast value is equal to or lower than Tcon (including that value) at all focus lens positions differs from that of the first embodiment (the process of S107 in FIG. 14), and instead corresponds to S210 to S212.
  • Specifically, the determination unit 325 compares the brightness information Yc of the attention area with the threshold value Tr (S210). If Yc is larger than Tr, the switching control unit 324 moves the focus lens position to the position for focusing on the near point (S211). On the other hand, if Yc is equal to or smaller than Tr (including that value), the switching control unit 324 moves the focus lens position to the position for focusing on the far point (S212).
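The near/far decision of S210 to S212 can be sketched as follows. Since equation (4) is not reproduced in this excerpt, the normalisation of the attention-area brightness Ye by the light source aperture L and the gain correction value Gain, using the weighting constants αy and βy, is an assumed form:

```python
def corrected_brightness(Ye, L, gain, alpha_y=1.0, beta_y=1.0):
    """Sketch of equation (4) in an assumed form: the mean brightness Ye
    of the attention area, normalised by the light-source aperture L and
    the gain correction value, with weighting constants alpha_y / beta_y.
    The exact expression in the patent may differ."""
    return Ye / (alpha_y * L + beta_y * gain)

def choose_focus_point(Yc, t_r):
    """S210-S212: a bright (hence presumably close) attention area selects
    the near point F, otherwise the far point E."""
    return "near" if Yc > t_r else "far"
```

This reflects the simple heuristic stated below: a brighter attention area implies a shorter subject distance, so only a one-bit near/far estimate is needed for bifocal switching.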
  • As described above, in the second embodiment the focus control unit 320 includes the estimation unit 3251 that estimates distance information between the subject corresponding to the region of interest and the imaging unit 200. If the plurality of contrast values calculated for the plurality of target focus position candidates are all lower than the predetermined threshold, the focus control unit 320 selects the focused object position of the objective optical system based on the estimation result of the estimation unit 3251.
  • This makes it possible to obtain the target in-focus position to be selected by distance estimation. Note that in this embodiment a distance measuring sensor or a complicated image processing method is not used, because mounting a distance measuring sensor or the like would increase the size of the imaging unit 200, and complicated image processing would increase the processing load and make a high-speed focusing operation difficult.
  • the estimation unit 3251 may estimate the distance information with respect to the imaging unit 200 based on the brightness information of the attention area.
  • This enables distance estimation based on brightness information. The idea is simple: if the attention area is bright, the distance between the imaging unit 200 and the subject is short, and if it is dark, the distance is long. Therefore, the processing load can be very light. This is particularly effective in N-focus switching as in this embodiment (especially when N is small, such as N = 2). For example, in the case of bifocal switching, it is only necessary to determine whether the estimate by the estimation unit 3251 is near or far, so even if the estimation accuracy is low, as with estimation based on brightness information, selection of the target focus position is possible.
  • the switching unit 250 may switch between two target focus position candidates of the objective optical system: a first target focus position candidate having a long in-focus subject distance from the imaging unit, and a second target focus position candidate having a shorter in-focus subject distance than the first.
  • the focused subject distance is the same as that described above in the first embodiment, and may be the best subject distance in a narrow sense.
  • the position of the subject corresponding to the focused subject distance in the first target focus position candidate is the far point
  • the position of the subject corresponding to the focused subject distance in the second target focus position candidate is the near point.
  • The present invention is not limited to the first and second embodiments and their modifications as they are; the constituent elements can be modified and embodied without departing from the spirit of the invention.
  • Various inventions can be formed by appropriately combining a plurality of the constituent elements disclosed in the above-described first and second embodiments and modifications. For example, some constituent elements may be deleted from all the constituent elements described in the first and second embodiments and modifications. Furthermore, constituent elements described in different embodiments and modifications may be combined as appropriate.
  • a term described together with a different term having a broader meaning or the same meaning at least once in the specification or the drawings can be replaced with the different term anywhere in the specification or the drawings.

Landscapes

  • Health & Medical Sciences (AREA)
  • Physics & Mathematics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Surgery (AREA)
  • Optics & Photonics (AREA)
  • Engineering & Computer Science (AREA)
  • Medical Informatics (AREA)
  • Public Health (AREA)
  • Biophysics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Pathology (AREA)
  • Radiology & Medical Imaging (AREA)
  • Veterinary Medicine (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • General Health & Medical Sciences (AREA)
  • Molecular Biology (AREA)
  • Animal Behavior & Ethology (AREA)
  • Astronomy & Astrophysics (AREA)
  • Signal Processing (AREA)
  • General Physics & Mathematics (AREA)
  • Endoscopes (AREA)
  • Instruments For Viewing The Inside Of Hollow Bodies (AREA)

Abstract

An endoscopic device and focus control method are provided, wherein after detecting the target region, focus is controlled based on scattered focus subject positions, thereby making it possible to easily and rapidly perform the focusing operation on the target region. An endoscopic device includes an imaging unit (200) for capturing image signals; a switching unit (250) that switches the focus subject position to any one of a plurality of target focus position candidates set at scattered positions; and a focus controller (320) that selects any one of the plurality of target focus position candidates as the target focus position, controls the switching unit (250), and controls the focus. The focus controller (320) has a target region detector (321) and selects the target focus position candidate as the target focus position if it is determined that the detected target region is in focus, and controls the switching unit (250) and switches the focus subject position when the target focus position candidate is selected as the target focus position and it is determined that the detected target region is not in focus.

Description

Endoscope apparatus and focus control method
 The present invention relates to an endoscope apparatus, a focus control method, and the like.
 In an imaging apparatus such as an endoscope, a pan-focus image is required so as not to interfere with the doctor's diagnosis. For this reason, endoscopes achieve such performance by using an optical system with a relatively large F-number to increase the depth of field. In recent years, however, image sensors with high pixel counts of about several hundred thousand pixels have come to be used even in endoscope systems. Since the permissible circle of confusion shrinks along with the pixel pitch in such high-pixel-count image sensors, the F-number must be reduced, which narrows the depth of field of the imaging apparatus. For this reason, Patent Document 1 proposes an endoscope apparatus in which the imaging unit of the endoscope is provided with an in-focus object position driving unit that drives the in-focus object position of the objective optical system, and autofocus (hereinafter referred to as AF) is performed on the subject.
JP-A-8-106060
 However, in the above-mentioned Patent Document 1, the in-focus object position is driven so as to change continuously, which complicates the driving process. For example, if wobbling or the like is used, the lens must be moved back and forth in small increments, so a certain amount of time is required from the start of AF to its completion. In addition, since stopping and starting the AF operation is controlled by a switch provided on the operation unit, the operator must perform the AF stop/start operation in addition to conventional endoscope operations, which increases the complexity of the work.
 According to some aspects of the present invention, it is possible to provide an endoscope apparatus, a focus control method, and the like that detect a region of interest and then perform focus control based on discrete in-focus object position control, thereby performing the focusing operation on the region of interest easily and at high speed.
 One aspect of the present invention relates to an endoscope apparatus including: an imaging unit that acquires an image signal; a switching unit that switches an in-focus object position, determined by the state of the imaging unit, to any one of a plurality of discretely set target in-focus position candidates; and a focus control unit that selects one of the plurality of target in-focus position candidates as a target in-focus position and performs focus control by controlling the switching unit so that the selected target in-focus position becomes the in-focus object position. The focus control unit has a region-of-interest detection unit that detects a region of interest on the subject from the image signal, and determines whether the detected region of interest is in focus. When the region is determined to be in focus, the focus control unit selects, as the target in-focus position, the target in-focus position candidate corresponding to the timing at which the determination was made; when the region is determined not to be in focus, the focus control unit controls the switching unit to switch the in-focus object position to a target in-focus position candidate different from the one corresponding to the timing at which the determination was made.
 In this aspect of the present invention, control is performed so that the in-focus object position corresponding to the state of the imaging unit becomes the target in-focus position selected from a plurality of discretely set target in-focus position candidates. At that time, the target in-focus position candidate to be used as the target in-focus position is selected based on the determination of whether the region of interest detected from the image signal is in focus. This makes it possible to perform the focusing operation at high speed with simpler control than ordinary focus control (autofocus) that performs continuous in-focus object position control.
 Another aspect of the present invention relates to a focus control method for an objective optical system in which a plurality of target in-focus position candidates are discretely set, the method including: acquiring an image signal from an imaging unit; detecting a region of interest on the subject from the image signal; determining whether the detected region of interest is in focus; when the region is determined to be in focus, selecting the target in-focus position candidate corresponding to the timing at which the determination was made as the target in-focus position; and, when the region is determined not to be in focus, switching the in-focus object position to a target in-focus position candidate different from the one corresponding to the timing at which the determination was made.
FIG. 1 is a configuration example of the endoscope apparatus according to the present embodiment.
FIG. 2 is a configuration example of the rotating color filter.
FIG. 3 shows an example of the spectral characteristics of the rotating color filter.
FIG. 4 is a configuration example of the image processing unit.
FIG. 5 is a configuration example of the focus control unit.
FIG. 6 is a configuration example of the contrast value calculation unit.
FIG. 7 is a configuration example of the region-of-interest detection unit.
FIG. 8 shows an example of setting an evaluation region by dividing the image into regions.
FIG. 9 shows another configuration example of the focus control unit.
FIG. 10 illustrates, for screening, the relationship between the imaging unit and the subject, the point separated from the imaging unit by the best subject distance corresponding to the in-focus object position, and the range that is in focus in the depth direction of the object scene.
FIG. 11 illustrates the same relationships for close-up observation.
FIG. 12 shows an example of four-focus switching.
FIG. 13 shows an example of two-focus switching.
FIG. 14 is a flowchart explaining the in-focus object position switching process in the first embodiment.
FIG. 15 is a flowchart explaining the in-focus object position switching process in the second embodiment.
 The present embodiment will now be described. Note that the embodiment described below does not unduly limit the content of the invention described in the claims, and not all of the configurations described in this embodiment are necessarily essential constituent elements of the invention.
 1. Method of the Present Embodiment
 First, the method of the present embodiment will be described. As the number of pixels of the image sensor in an endoscope apparatus increases, the depth of field becomes narrower, making it difficult to bring a desired subject into focus.
 In particular, in an endoscope apparatus that performs magnified observation, the depth of field tends to become even narrower as the imaging magnification of the imaging unit increases or the distance from the imaging unit to the subject decreases, and manual focusing becomes very difficult.
 When manual focusing is difficult, it is conceivable to have the system perform the focusing operation automatically. However, ordinary AF uses a structure that switches the in-focus object position continuously (for example, a structure that moves the focus lens position continuously). This gives a high degree of freedom in selecting the target in-focus position and allows flexible operation, but on the other hand the time required to set the target in-focus position becomes long, and in some cases the mechanism becomes complicated. A complicated mechanism enlarges the imaging unit itself, which is undesirable in an endoscope apparatus whose imaging unit is inserted into a living body.
 Here, the in-focus object position is the position (object point) of the object relative to a reference position when the system including the object, the optical system, and the image plane is in focus. Specifically, when the image plane is set at a given position and the optical system is in a given state, it represents the position of the object at which the image formed on that image plane by the optical system is in focus. In the endoscope apparatus of the present embodiment, the image plane is assumed to coincide with the surface of the image sensor included in the imaging unit; therefore, when the sensor surface is fixed, the in-focus object position can be determined by determining the state of the optical system.
 Furthermore, the target in-focus position is the position on which focus is to be achieved (the focusing target) at a given point in time; that is, the position (object point), relative to the reference position, of the object to be focused on. Examples of the reference position include the position of the image sensor surface and the position of the tip of the optical system.
 The applicant therefore proposes a method of controlling the in-focus object position discretely. For example, one of a plurality of discretely set target in-focus position candidates is selected as the target in-focus position, and control is performed so that the selected target in-focus position becomes the in-focus object position. In this way, the processing required for the focusing operation (for example, contrast value calculation) only needs to be performed at the limited set of target in-focus position candidates, so the focusing operation can be executed at high speed. In particular, when the number of target in-focus position candidates is small, as in two-focus switching, the mechanism can be made simpler than with continuous in-focus object position control, and a smaller imaging unit can be expected.
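The discrete selection described above can be sketched as follows. This is an illustrative simplification, not the patent's implementation: the candidate list, the contrast function, and the fallback position are all stand-ins, and the fallback to a default position mirrors the first embodiment's behavior of returning to the farthest point when no candidate is judged in focus.

```python
def discrete_focus(candidates, contrast_at, threshold, default):
    """Pick a target in-focus position from discrete candidates.

    candidates  -- discretely set target in-focus position candidates
    contrast_at -- function: position -> contrast value of the region of interest
    threshold   -- contrast value above which the region is judged in focus
    default     -- position used when no candidate is in focus (e.g. farthest point)
    """
    for pos in candidates:
        if contrast_at(pos) > threshold:
            return pos          # in focus at this candidate: keep it
    return default              # no candidate judged in focus
```

Because only a handful of candidates are ever evaluated, the loop bounds the number of contrast computations, which is the source of the speed advantage over a continuous sweep.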
 A basic configuration and method are described below in the first embodiment, using four-focus switching as an example, as shown in FIG. 12. The second embodiment describes two-focus switching, as shown in FIG. 13; with two-focus switching, the mechanism can be simplified as described above. In addition, when it cannot be determined that the region is in focus at any of the target in-focus position candidates (for example, when the contrast value is small), the target in-focus position is determined using the determination result of the determination unit 325, unlike in the first embodiment.
 2. First Embodiment
 FIG. 1 shows a configuration example of the endoscope apparatus according to the first embodiment. The endoscope apparatus includes a light source unit 100, an imaging unit 200, a control device 300 (processor unit), a display unit 400, and an external I/F unit 500.
 The light source unit 100 includes a white light source 110, a light source aperture 120, a light source aperture driving unit 130 that drives the light source aperture 120, and a rotating color filter 140 having a plurality of filters with different spectral transmittances. The light source unit 100 also includes a rotation driving unit 150 that drives the rotating color filter 140 and a condenser lens 160 that focuses the light transmitted through the rotating color filter 140 onto the incident end face of a light guide fiber 210. The light source aperture driving unit 130 adjusts the amount of light by opening and closing the light source aperture 120 based on a control signal from the control unit 340 of the control device 300. FIG. 2 shows a detailed configuration example of the rotating color filter 140, which consists of a red (hereinafter R) filter 701, a green (hereinafter G) filter 702, and a blue (hereinafter B) filter 703 for the three primary colors, and a rotary motor 704. FIG. 3 shows an example of the spectral characteristics of the color filters 701 to 703. The rotation driving unit 150 rotates the rotating color filter 140 at a predetermined rotational speed in synchronization with the imaging period of the image sensor 260, based on a control signal from the control unit 340. For example, if the rotating color filter 140 rotates 20 times per second, each color filter crosses the incident white light at 1/60-second intervals, and the image sensor 260 completes capture and transfer of an image signal every 1/60 second. Here, the image sensor 260 is, for example, a monochrome single-chip image sensor, such as a CCD or CMOS image sensor. That is, in the present embodiment, frame-sequential imaging is performed in which an image of each of the three primary colors (R, G, or B) is captured at 1/60-second intervals.
 The imaging unit 200 is formed to be elongated and bendable, for example, to allow insertion into a body cavity. It includes the light guide fiber 210, which guides the light collected by the light source unit 100 to an illumination lens 220, and the illumination lens 220, which diffuses the light guided to the tip by the light guide fiber 210 and irradiates the observation target with it. The imaging unit 200 further includes an objective lens 230 that collects the reflected light returning from the observation target, a focus lens 240 for adjusting the in-focus object position, a switching unit 250 for switching the position of the focus lens 240 among discrete positions, and the image sensor 260 for detecting the collected reflected light. The switching unit 250 is, for example, a VCM (voice coil motor) and is connected to the focus lens 240.
 The switching unit 250 adjusts the in-focus object position by switching the position of the focus lens 240 among a plurality of discrete positions. FIG. 12 shows the relationship in the present embodiment between the position of the focus lens 240 and the best subject distance corresponding to the resulting in-focus object position. Here, the best subject distance is the distance in the depth direction from the imaging unit in the object scene: for example, the distance from the imaging unit 200 to the subject when the image of the subject is in focus on the image sensor (ideally, when rays emitted from one point on the subject converge to one point on the sensor). In other words, the best subject distance corresponds to the distance from the imaging unit 200 to the in-focus object position. Note that in the present embodiment, "subject distance" refers to the distance from the imaging unit 200 to the subject, and is not limited to the distance from a point with an optical meaning such as the principal point or the focal point.
 In the present embodiment, the focus lens 240 takes discrete positions A, B, C, and D, switching the best subject distance corresponding to the in-focus object position in four steps. The points separated from the imaging unit by the best subject distance correspond one-to-one, in order from the point nearest the imaging unit 200 (hereinafter, near point) to the farthest point (hereinafter, far point), to focus lens positions A to D. By switching the position of the focus lens 240, the depth of field required for endoscopic observation is achieved. Here, the net in-focus range in the depth direction, obtained by combining the in-focus ranges of the object scene (whose widths correspond to the depths of field) for focus lens positions A to D, is assumed to cover the range of 2 to 70 mm from the imaging unit. In general, when the point separated from the imaging unit by the best subject distance is on the near-point side, the depth of field becomes shallow, and even a slight movement of the subject easily falls outside the in-focus range in the depth direction. For this reason, a focus lens position focused on the near point is suitable for closely observing a subject without depth, as shown in FIG. 11. On the other hand, when that point is on the far-point side, the depth of field becomes deep, which is suitable for screening a luminal subject, as shown in FIG. 10.
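The requirement that the per-position in-focus ranges combine into one net range can be checked mechanically. The sketch below is illustrative only: the patent states the combined coverage (2 to 70 mm) but not the individual ranges, so the numbers for positions A to D are hypothetical values chosen to demonstrate the union check.

```python
ranges_mm = {           # focus lens position -> (near limit, far limit), hypothetical
    "A": (25.0, 70.0),  # far point: deep depth of field, good for screening
    "B": (12.0, 28.0),
    "C": (5.0, 13.0),
    "D": (2.0, 6.0),    # near point: shallow depth of field, close observation
}

def covers(ranges, lo, hi):
    """Check that the union of the in-focus intervals covers [lo, hi] mm."""
    reach = lo
    for near, far in sorted(ranges.values()):
        if near > reach:
            return False            # gap between adjacent in-focus ranges
        reach = max(reach, far)
    return reach >= hi
```

For the switching scheme to work, adjacent intervals must overlap as above; a gap would leave subject distances at which no focus lens position yields an in-focus image.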
 The control device 300 controls each part of the endoscope apparatus and performs image processing. It includes an A/D conversion unit 310, a focus control unit 320, an image processing unit 330, and a control unit 340.
 The image signal converted into a digital signal by the A/D conversion unit 310 is transferred to the image processing unit 330. The image signal processed by the image processing unit 330 is transferred to the focus control unit 320 and the display unit 400. The focus control unit 320 changes the position of the focus lens 240 by transferring a control signal to the switching unit 250. The control unit 340 controls each unit of the endoscope apparatus; specifically, it synchronizes the light source aperture driving unit 130, the focus control unit 320, and the image processing unit 330. The control unit 340 is also connected to the external I/F unit 500 and controls the focus control unit 320 and the image processing unit 330 based on input from the external I/F unit 500.
 The display unit 400 is a display device capable of displaying moving images, such as a CRT or a liquid crystal monitor.
 The external I/F unit 500 is an interface for receiving input from the operator to the endoscope apparatus. It includes, for example, a power switch for turning the power on and off and a mode switching button for switching the imaging mode and other various modes. The external I/F unit 500 transfers the input information to the control unit 340.
 FIG. 4 shows a detailed configuration example of the image processing unit 330 in the first embodiment. The image processing unit 330 includes a preprocessing unit 331, a synchronization unit 332, and a postprocessing unit 333. The A/D conversion unit 310 is connected to the preprocessing unit 331, which is connected to the synchronization unit 332. The synchronization unit 332 is connected to the postprocessing unit 333 and the focus control unit 320, and the postprocessing unit 333 is connected to the display unit 400. The control unit 340 is bidirectionally connected to the preprocessing unit 331, the synchronization unit 332, and the postprocessing unit 333, and controls them.
 The preprocessing unit 331 performs OB clamp processing, gain correction processing, and WB correction processing on the image signal input from the A/D conversion unit 310, using the OB clamp value, gain correction value, and WB coefficient value stored in advance in the control unit 340, and transfers the preprocessed image signal to the synchronization unit 332.
 The synchronization unit 332 synchronizes the frame-sequential image signals processed by the preprocessing unit 331, based on the control signal from the control unit 340. Specifically, it accumulates the image signals of each color light (R, G, or B) input in frame-sequential order, one frame at a time, and simultaneously reads out the accumulated image signals of the respective colors. The synchronization unit 332 transfers the synchronized image signal to the postprocessing unit 333 and the focus control unit 320.
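The accumulate-then-read-out behavior can be sketched as a small generator. This is a simplified illustration, not the patent's implementation: frames are modeled as `(color, data)` pairs, and real hardware would buffer pixel arrays rather than Python objects.

```python
def synchronize(frames):
    """Combine frame-sequential monochrome frames into color images.

    frames -- iterable of (color, frame_data) pairs in R, G, B capture order.
    Yields one {"R": ..., "G": ..., "B": ...} dict per complete color set,
    mimicking the simultaneous readout of the accumulated signals.
    """
    buffer = {}
    for color, data in frames:
        buffer[color] = data
        if len(buffer) == 3:        # one frame of each color accumulated
            yield dict(buffer)      # simultaneous readout of all three
            buffer.clear()
```

Because the sensor captures each color every 1/60 second, a complete synchronized color image becomes available every three captures (i.e., at 20 fps in this configuration).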
 The postprocessing unit 333 performs tone conversion processing, color processing, and edge enhancement processing on the synchronized image signal, using the tone conversion coefficients, color conversion coefficients, and edge enhancement coefficients stored in advance in the control unit 340, and transfers the postprocessed image signal to the display unit 400.
 FIG. 5 shows a detailed configuration example of the focus control unit 320 in the first embodiment. The focus control unit 320 includes a region-of-interest detection unit 321 (abnormal part detection unit), a region setting unit 322, a contrast value calculation unit 323, and a switching control unit 324.
 The image processing unit 330 is connected to the region-of-interest detection unit 321 and the region setting unit 322. The region-of-interest detection unit 321 is connected to the region setting unit 322, which is connected to the contrast value calculation unit 323, which in turn is connected to the switching control unit 324. The switching control unit 324 is connected to the switching unit 250. The control unit 340 is connected to the region-of-interest detection unit 321, the region setting unit 322, the contrast value calculation unit 323, and the switching control unit 324, and controls them.
 The region-of-interest detection unit 321 detects a region of interest on the subject from the image signal transferred from the image processing unit 330. When it detects a region of interest, it transfers region information indicating the position of the region of interest to the region setting unit 322; when it does not, it transfers a control signal indicating that no region of interest exists to the switching control unit 324. The region of interest may be an abnormal part representing a lesion or the like, in which case the region-of-interest detection unit 321 is realized as an abnormal part detection unit. The processing performed by the abnormal part detection unit is described in detail later.
 When the region-of-interest detection unit 321 detects a region of interest, the region setting unit 322 sets an evaluation region for calculating a contrast value on the image signal transferred from the image processing unit 330. For example, the region of interest may be used as the evaluation region as it is, or, as shown in FIG. 8, the image signal may be divided in advance into a plurality of regions and the region containing the largest part of the region of interest may be set as the evaluation region. The region setting unit 322 then transfers the set evaluation region information and the image signal to the contrast value calculation unit 323.
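The FIG. 8 style choice, where the pre-divided block overlapping the detected region of interest the most becomes the evaluation region, can be sketched as below. This is an assumed formulation: regions are modeled as axis-aligned rectangles `(x0, y0, x1, y1)`, and the function names are illustrative.

```python
def overlap_area(a, b):
    """Overlap area of two axis-aligned rectangles (x0, y0, x1, y1)."""
    w = min(a[2], b[2]) - max(a[0], b[0])
    h = min(a[3], b[3]) - max(a[1], b[1])
    return max(w, 0) * max(h, 0)    # zero when the rectangles do not intersect

def pick_evaluation_block(blocks, roi):
    """Return the pre-divided block that contains the most of the ROI."""
    return max(blocks, key=lambda blk: overlap_area(blk, roi))
```

Using a fixed grid of blocks keeps the evaluation-region geometry constant across frames, which simplifies the downstream contrast computation compared to using the raw, arbitrarily shaped region of interest.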
 The contrast value calculation unit 323 calculates the contrast value of the evaluation region from the evaluation region information and the image signal. In the present embodiment, the contrast value may be calculated for an arbitrary channel of the image signal transferred from the region setting unit 322; alternatively, a luminance signal may be generated from the pixel values of the three R, G, and B channels and the contrast value calculated from the pixel values of the generated luminance signal. For example, the contrast value calculation unit 323 may apply an arbitrary high-pass filter to all the pixels in the evaluation region and calculate the contrast value by summing the high-pass filter outputs of all the pixels. The contrast value calculation unit 323 may also be configured, as shown in FIG. 6, with a bright spot removal unit 3231 preceding the high-frequency extraction unit 3232 that performs the high-pass filtering. In that case, the bright spot removal unit 3231 applies threshold processing to the pixel values of an arbitrary channel or of the luminance signal for all pixels in the evaluation region, and judges any pixel whose value is at or above the threshold to be a bright spot. For pixels judged to be bright spots, a control signal forces the output of the subsequent high-pass filtering to 0, which reduces the influence of bright spots on the contrast value. The contrast value calculation unit 323 then transfers the contrast value of the evaluation region to the switching control unit 324.
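A minimal sketch of this contrast evaluation with bright spot removal follows. It is illustrative, not the patent's implementation: a simple horizontal difference stands in for "an arbitrary high-pass filter", and the luminance representation and threshold value are assumptions.

```python
def contrast_value(rows, bright_threshold):
    """Sum of high-pass responses over an evaluation region.

    rows             -- 2-D list of luminance values for the evaluation region
    bright_threshold -- pixels at or above this value are treated as bright
                        spots and their filter output is forced to 0
    """
    total = 0
    for row in rows:
        for x in range(1, len(row)):
            if row[x] >= bright_threshold or row[x - 1] >= bright_threshold:
                continue                          # bright spot: output forced to 0
            total += abs(row[x] - row[x - 1])     # high-pass (difference) output
    return total
```

Specular highlights on wet tissue produce very strong edges regardless of focus, so masking them out keeps the contrast value a meaningful focus measure.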
 The switching control unit 324 controls the switching unit 250 based on the contrast value calculated by the contrast value calculation unit 323.
An example of processing in the focus control unit 320 will be described with reference to FIG. 14. First, it is determined whether the attention area detection unit 321 has detected an attention area (in a narrow sense, an abnormal part) (S101). When a control signal indicating that no attention area exists is transferred from the attention area detection unit 321, the switching control unit 324 transfers to the switching unit 250 a control signal that moves the focus lens to the position corresponding to the point with the longest best subject distance from the imaging unit (the farthest point) (S102). This is because the depth of field is deepest at the farthest point, so the in-focus area is widest.
On the other hand, when a control signal indicating that an attention area exists is transferred from the attention area detection unit 321 (Yes in S101), the region setting unit 322 sets an evaluation region corresponding to the detected attention area (S103). The contrast value calculation unit 323 then calculates the contrast value of the set evaluation region (S104). The switching control unit 324 determines whether the calculated contrast value is larger than a threshold value Tcon (S105). When the contrast value is equal to or less than the threshold value Tcon, the current focus lens position and contrast value are stored in a memory (not shown) in the switching control unit, the switching start flag F is set to 1, and the process proceeds to the switching processing (S106 to S108). Here, it is determined whether contrast values have been stored for all focus lens positions, that is, whether the contrast value is equal to or less than the threshold value Tcon at every focus lens position (S106). If so, the position of the focus lens 240 is switched to the position corresponding to the point with the longest best subject distance from the imaging unit (the farthest point; position A in FIG. 12) (S107). Thereafter, the switching start flag is set to 0, the stored focus lens positions and contrast values are erased, and the process ends. On the other hand, when there is a focus lens position for which no contrast value has been stored (No in S106), the focus lens 240 is moved to that focus lens position (or to one of them when there are several) (S108). The process then returns to S104, and the contrast value at the focus lens position after the movement is calculated. When the contrast value is determined to be larger than the threshold value Tcon in S105, the switching control unit 324 leaves the focus lens position unchanged, sets the switching start flag F to 0, erases the stored focus lens positions and contrast values, and ends the process (S109).
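As a rough sketch of the S101 to S109 flow described above, the discrete switching logic can be expressed as follows. All names here (`tcon`, the position list, `farthest`) are illustrative assumptions, not part of the embodiment, and the fallback shown is the farthest-point variant of S107.

```python
# Illustrative sketch of the S101-S109 focus switching flow.
# `farthest` is assumed to be the position with the longest best subject
# distance (position A in FIG. 12); all names here are hypothetical.

def control_focus(attention_detected, positions, contrast_at, tcon, farthest):
    """Return the focus lens position selected by the discrete switching flow."""
    if not attention_detected:                 # S101: No
        return farthest                        # S102: deepest depth of field
    tried = {}                                 # memory in the switching control unit
    for pos in positions:                      # S104/S105 for each candidate position
        c = contrast_at(pos)
        if c > tcon:                           # S105: in focus -> stay here (S109)
            return pos
        tried[pos] = c                         # S106-S108: store and try the next one
    # Every candidate was at or below Tcon (S107): fall back to the farthest point
    return farthest

# Toy usage: the contrast first exceeds the threshold at position 2.
contrasts = {0: 10.0, 1: 40.0, 2: 90.0, 3: 30.0}
print(control_focus(True, [0, 1, 2, 3], contrasts.get, 50.0, 0))  # 2
```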
When the contrast value is equal to or less than the threshold value Tcon at all focus lens positions, the position of the focus lens 240 is switched to the position corresponding to the point with the longest best subject distance from the imaging unit (the farthest point) (S107). However, the present invention is not limited to this; the focus lens may instead be switched to a predetermined position designated by the surgeon, or to the focus lens position with the highest contrast value stored in the memory (not shown) in the switching control unit. In the description of FIG. 14, the focus lens positions are set discretely, and the focus lens positions and the target in-focus position candidates have a one-to-one correspondence. The endoscope apparatus of this embodiment selects one of a plurality of discretely set target in-focus position candidates as the target in-focus position of the objective optical system. Since determining either the focus lens position or the target in-focus position determines the other, discrete target in-focus position candidates can be set by performing discrete focus lens position control. That is, the focus lens positions in FIG. 14 correspond to A to D in FIG. 12 and to E and F in FIG. 13.
FIG. 7 shows a detailed configuration example of the attention area detection unit 321 in the first embodiment. The attention area detection unit 321 includes a brightness/color signal calculation unit 3211, a reference signal creation unit 3212, a difference calculation unit 3213, a region division unit 3214, a candidate region detection unit 3215, and a feature amount calculation unit 3216. FIG. 7 is a configuration example that assumes the attention area detection unit 321 is implemented as an abnormal part detection unit that detects an abnormal part as the attention area, and the following description likewise uses the example of detecting an abnormal part. However, the configuration is not limited to this; for example, the configuration of FIG. 7 may also be used to detect areas other than abnormal parts.
The image processing unit 330 is connected to the brightness/color signal calculation unit 3211 and the region division unit 3214. The brightness/color signal calculation unit 3211 is connected to the reference signal creation unit 3212 and the difference calculation unit 3213. The reference signal creation unit 3212 is connected to the difference calculation unit 3213. The difference calculation unit 3213 is connected to the candidate region detection unit 3215. The region division unit 3214 is connected to the candidate region detection unit 3215. The candidate region detection unit 3215 is connected to the feature amount calculation unit 3216 and the switching control unit 324. The feature amount calculation unit 3216 is connected to the region setting unit 322. The control unit 340 is bidirectionally connected to the brightness/color signal calculation unit 3211, the reference signal creation unit 3212, the difference calculation unit 3213, the region division unit 3214, the candidate region detection unit 3215, and the feature amount calculation unit 3216, and controls them.
The brightness/color signal calculation unit 3211 calculates a brightness signal and a color signal based on the image signal transferred from the image processing unit 330. Here, the color signal is a signal indicating the strength of the redness of the image signal. Specifically, the brightness signal Y and the color signal C are calculated from the image signals of the respective color components included in the image signal using the following expressions (1) and (2).
Y = 0.299 × R + 0.587 × G + 0.114 × B ・・・・・(1)
(Expression (2), which defines the color signal C from R, G, B and the constant a, is given as an image in the original publication.)
Here, R, G, and B denote the image signals of the respective color components included in the image signal. The value a is a constant, and a value input in advance from the outside is used. The calculated brightness signal and color signal are transferred to the reference signal creation unit 3212 and the difference calculation unit 3213.
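A minimal sketch of expression (1) follows; expression (2) is not reproduced here because it is published only as an image. The weights in (1) are the standard ITU-R BT.601 luma coefficients.

```python
# Minimal sketch of expression (1): the brightness signal Y computed
# per pixel from the R, G, B image signals (ITU-R BT.601 luma weights).
def brightness_signal(r, g, b):
    return 0.299 * r + 0.587 * g + 0.114 * b

print(round(brightness_signal(255, 255, 255), 6))  # a pure white pixel -> 255.0
```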
The reference signal creation unit 3212 creates reference signals based on the brightness signal and the color signal transferred from the brightness/color signal calculation unit 3211. Specifically, low-pass filter processing is performed on each of the brightness signal and the color signal. As the low-pass filter, a known method such as an averaging filter, a Gaussian filter, or an edge-preserving smoothing filter is used. The low-pass-filtered brightness signal and color signal are transferred to the difference calculation unit 3213 as the brightness reference signal and the color reference signal, respectively. Here, the reference signals are images subjected to low-pass filter processing, but the reference signals are not limited to this; for example, an approximation function may be calculated for each of the brightness signal and the color signal, and the approximate values calculated from the approximation functions may be used as the reference signals.
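As a hedged sketch of reference signal creation, the following applies a simple 3×3 averaging (mean) low-pass filter, one of the known methods named above, to a 2-D signal. The border handling (clamping) is an illustrative choice, not something the embodiment specifies.

```python
# Hedged sketch: 3x3 averaging low-pass filter producing a reference
# signal from a 2-D brightness or color signal. Border pixels are
# handled by clamping coordinates to the image edge (an assumption).
def average_filter_3x3(signal):
    h, w = len(signal), len(signal[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            acc = 0.0
            for dy in (-1, 0, 1):
                for dx in (-1, 0, 1):
                    yy = min(max(y + dy, 0), h - 1)
                    xx = min(max(x + dx, 0), w - 1)
                    acc += signal[yy][xx]
            out[y][x] = acc / 9.0
    return out

flat = [[5.0] * 4 for _ in range(4)]
print(average_filter_3x3(flat)[1][1])  # a flat signal is unchanged -> 5.0
```

The difference between the original signal and this smoothed reference then isolates local deviations, which is what the subsequent difference calculation uses.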
The difference calculation unit 3213 calculates the differences between the brightness signal and the color signal and their respective reference signals. The calculated difference signals are transferred to the candidate region detection unit 3215 as a brightness difference signal and a color difference signal, respectively.
The region division unit 3214 divides the image signal transferred from the image processing unit 330 into local regions. Specifically, contour pixels are first detected from the image signal using a known contour tracking method. The regions delimited by the contour pixels are subjected to known labeling processing and transferred to the candidate region detection unit 3215.
The candidate region detection unit 3215 detects candidate regions for the abnormal part (in a broad sense, attention candidate regions, that is, candidate regions for the attention area) based on the difference signals for each local region. Specifically, for each pixel of the brightness difference signal, pixels whose difference is equal to or greater than a threshold value TmaxY are first extracted as outlier pixels. Excluding the outlier pixels, the variance σy of the brightness difference is calculated for each region. Regions where the brightness variance σy is larger than a threshold value TsY are extracted as abnormal brightness regions (corresponding, for example, to uneven lesions). Subsequently, for each pixel of the color difference signal, pixels whose difference is equal to or greater than a threshold value Tmaxc are extracted as outlier pixels. Excluding the outlier pixels, the variance σc of the color difference is calculated for each region. Regions where the variance σc is larger than a threshold value Tsc are extracted as abnormal color regions (corresponding, for example, to redness or faded-color lesions). The regions extracted as abnormal brightness regions or abnormal color regions are transferred to the feature amount calculation unit 3216 as candidate regions for the abnormal part. Further, a control signal indicating that an abnormal part region exists is transferred to the switching control unit 324. On the other hand, when neither an abnormal brightness region nor an abnormal color region exists, a control signal indicating that no abnormal part exists is transferred to the switching control unit 324. Here, the threshold values TmaxY, TsY, Tmaxc, and Tsc are constants, and values input in advance from the outside are used. The calculated brightness variance σy and color variance σc are also transferred to the feature amount calculation unit 3216.
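A hedged sketch of the per-region statistic used by the candidate region detection unit follows: pixels whose difference is at or above the outlier threshold (TmaxY or Tmaxc in the text) are excluded, and the variance of the remaining difference values is computed. The function name and the treatment of an all-outlier region are illustrative assumptions.

```python
# Hedged sketch of the outlier-excluded, per-region variance of a
# difference signal; `t_max` plays the role of TmaxY / Tmaxc.
def region_variance(diffs, t_max):
    kept = [d for d in diffs if d < t_max]   # drop outlier pixels
    if not kept:                             # all-outlier region (assumed choice)
        return 0.0
    mean = sum(kept) / len(kept)
    return sum((d - mean) ** 2 for d in kept) / len(kept)

# A region with strong local variation (excluding the 100.0 outlier)
# yields a large variance; a flat region yields zero.
print(region_variance([1.0, 9.0, 1.0, 9.0, 100.0], 50.0))  # 16.0
print(region_variance([5.0, 5.0, 5.0, 5.0], 50.0))         # 0.0
```

Comparing this variance against TsY (or Tsc) then decides whether the region is extracted as an abnormal brightness (or color) region.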
The feature amount calculation unit 3216 calculates a feature amount for each of the candidate regions transferred from the candidate region detection unit 3215, and detects the abnormal part based on the calculated feature amounts. Specifically, the degree of abnormality Ei of each candidate region is calculated from its brightness variance σy and color variance σc using the following expression (3).
Ei = αe × σy + βe × σc ・・・・・(3)
Here, αe and βe are constant terms, and values input in advance from the outside are used. By setting αe > βe, abnormal brightness regions (uneven lesions) may be made easier to detect as the abnormal part than abnormal redness regions (redness or faded-color lesions). This is because the region to which the surgeon wants to pay closer attention is assumed to be an uneven lesion, which is more severe as a lesion than a redness or faded-color lesion. The region with the highest calculated degree of abnormality Ei is transferred to the region setting unit 322 as the abnormal part. Here, the degree of abnormality Ei is a weighted sum of the brightness variance σy and the color variance σc, but the feature amount is not limited to this; for example, the area of each candidate region may be obtained instead. In that case, the candidate region with the largest area is transferred to the region setting unit 322 as the abnormal part.
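The scoring and selection step above can be sketched as follows. The tuple layout and region identifiers are illustrative assumptions; only the formula Ei = αe·σy + βe·σc and the choice of the highest-scoring region come from the text.

```python
# Hedged sketch of expression (3) and selection of the abnormal part:
# score each candidate region as Ei = alpha_e * sigma_y + beta_e * sigma_c
# and return the identifier of the region with the highest score.
def select_abnormal_region(candidates, alpha_e, beta_e):
    """candidates: list of (region_id, sigma_y, sigma_c) tuples."""
    def score(c):
        _, sigma_y, sigma_c = c
        return alpha_e * sigma_y + beta_e * sigma_c
    return max(candidates, key=score)[0]

regions = [("r1", 2.0, 8.0), ("r2", 6.0, 1.0)]
# With alpha_e = 2.0 > beta_e = 0.5, brightness variance dominates:
# r1 scores 8.0, r2 scores 12.5, so r2 is selected.
print(select_abnormal_region(regions, 2.0, 0.5))  # r2
```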
According to this embodiment, the in-focus object position is switched among discrete positions. This makes focusing on the subject faster than an AF operation that varies the in-focus object position continuously. As a result, an image that is focused on the surgeon's region of interest (the abnormal part) can always be obtained.
In the embodiment described above, as shown in FIG. 1, the endoscope apparatus includes an imaging unit 200 that acquires an image signal; a switching unit 250 that switches the in-focus object position, which is determined by the state of the imaging unit 200 (in a narrow sense, the state of the objective optical system, including for example the focus lens 240), among a plurality of discretely set target in-focus position candidates; and a focus control unit 320 that controls the switching unit 250 to perform focus control. Here, the focus control unit 320 selects one of the plurality of target in-focus position candidates as the target in-focus position and outputs the selection result to the switching unit 250. As shown in FIG. 5, the focus control unit 320 also includes an attention area detection unit 321 that detects an attention area on the subject from the image signal. The focus control unit 320 then determines whether the attention area detected by the attention area detection unit 321 is in focus. When it is determined to be in focus, the target in-focus position candidate at the timing of that determination processing (for example, a position determined according to the position of the focus lens 240 at the time of the determination processing) is selected as the target in-focus position. When it is determined not to be in focus, the switching unit 250 is controlled to switch the in-focus object position.
Here, a target in-focus position candidate is a candidate for the target in-focus position; in this embodiment, focus control is performed by selecting one of a plurality of target in-focus position candidates as the target in-focus position. For example, the points shown as black circles at A to D in FIG. 12 are the points, separated from the imaging unit by the respective best subject distances, that correspond to the target in-focus position candidates. The method of setting the plurality of discrete target in-focus position candidates is arbitrary; it can be realized, for example, by driving the focus lens 240 discretely. Since the target in-focus position is determined by determining the position of the focus lens 240, the in-focus object position can also be controlled discretely by controlling the focus lens 240 discretely.
The attention area is an area whose observation priority for the user is relatively higher than that of other areas. For example, when the user is a doctor and desires treatment, it refers to an area showing a mucous membrane or a lesion. As another example, if what the doctor wants to observe is bubbles or stool, the attention area is an area showing those bubbles or that stool. That is, although the object to which the user should pay attention differs depending on the purpose of observation, in any case an area whose observation priority for the user is relatively higher than that of other areas during that observation becomes the attention area.
Thus, when an attention area is detected, it is possible to focus on the attention area at high speed by performing discrete in-focus object position control (discrete position control of the focus lens 240). In ordinary AF (for example, contrast AF), the in-focus object position can be controlled continuously, so there are many target in-focus positions for which an AF evaluation value (for example, a contrast value) must be calculated; the amount of computation until focus is achieved therefore becomes large, and the required time becomes long. In contrast, with discrete N-focus switching (where N is a fairly small number, for example 4 or 2), the number of AF evaluation value calculations is limited to at most N, so the focusing operation can be performed at high speed because the amount of computation is small. If the attention area is already in focus, the target in-focus position candidate at that time becomes the target in-focus position; that is, the focusing operation is completed at that point, so in some cases the focus determination processing can be performed fewer than N times. Furthermore, the processing can be switched based on whether an attention area has been detected and, if so, whether that attention area is in focus. The processing can therefore be switched on and off automatically according to the detection and focus state of the attention area. In other words, the focusing operation can be performed appropriately without an instruction from the user (for example, operating an AF start button), so a system that is easy for the user to use can be realized.
When it is determined that the attention area is not in focus and the in-focus object position is switched to a target in-focus position candidate different from the one corresponding to the timing of that determination, the attention area detection unit 321 may detect the attention area on the subject from the image signal after the switching, and the focus control unit 320 may determine whether the detected attention area is in focus.
As shown in FIG. 5, the focus control unit 320 may also include a contrast value calculation unit 323 that calculates the contrast value of the region corresponding to the attention area. The focus control unit 320 then determines whether the attention area is in focus based on the contrast value, and controls the switching unit 250.
This makes it possible to determine whether the attention area is in focus based on the contrast value. In that case, since the target of the in-focus determination is the attention area, the region on the image for which the contrast value is calculated (the evaluation region) is set so as to correspond to the attention area. The evaluation region may coincide with the attention area, but is not limited to this; for example, a rectangular region containing the attention area may be used to simplify the computation. The in-focus determination based on the contrast value may, for example, be made by comparing the contrast value with a given threshold value and determining that the area is in focus when the contrast value is larger than the threshold value.
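One possible form of the rectangular evaluation region mentioned above is the smallest axis-aligned rectangle containing every pixel of the detected attention area. This is a hedged sketch of that simplification; the coordinate convention and function name are assumptions.

```python
# Hedged sketch: set the evaluation region as the axis-aligned bounding
# rectangle of the attention-area pixels, simplifying later contrast
# computation over the region.
def bounding_evaluation_region(attention_pixels):
    """attention_pixels: iterable of (x, y) pixel coordinates."""
    xs = [p[0] for p in attention_pixels]
    ys = [p[1] for p in attention_pixels]
    return (min(xs), min(ys), max(xs), max(ys))  # (x0, y0, x1, y1)

print(bounding_evaluation_region([(4, 7), (9, 3), (6, 5)]))  # (4, 3, 9, 7)
```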
When the contrast value is lower than the predetermined threshold value, the focus control unit 320 may determine that the attention area is not in focus, and may control the switching unit 250 to switch the in-focus object position of the objective optical system.
Thus, when the contrast value is smaller than the threshold value, it can be determined that the attention area is not in focus. In that case, the in-focus object position is switched by controlling the switching unit 250. Specifically, a position different from the current one is selected from among the plurality of target in-focus position candidates, and the in-focus determination is performed at the selected position. To avoid redundant processing, a target in-focus position candidate for which the in-focus determination has not yet been performed within one focusing operation (the processing from the completion of the previous focusing to the completion of the next) is selected.
When the plurality of contrast values calculated for the respective target in-focus position candidates are all lower than the predetermined threshold value, the focus control unit 320 may select a predetermined target in-focus position candidate as the target in-focus position of the objective optical system.
Here, as the predetermined target in-focus position candidate, the target in-focus position candidate corresponding to the point with the longest in-focus subject distance from the imaging unit may be used, or the candidate may be set based on input information entered from outside via an interface unit (corresponding to the external I/F unit 500 in FIG. 1). The in-focus subject distance refers to the distance from the imaging unit 200 to the subject when the image of the subject is in focus on the image sensor 260. Therefore, once the optical conditions are determined by selecting a certain target in-focus position candidate, the in-focus subject distance can also be obtained in association with that candidate. However, even if the rays emitted from one point on the subject do not converge to one point on the image sensor, the image is regarded as in focus as long as the rays fall within the permissible circle of confusion, so the in-focus subject distance is a value with a certain range. The in-focus subject distance here may be such a value with a range, but in a narrow sense it may refer to the best subject distance described above.
Also, when the plurality of contrast values calculated for the respective target in-focus position candidates are all lower than the predetermined threshold value, the focus control unit 320 may use the target in-focus position candidate with the highest contrast value.
Thus, even when all the contrast values are equal to or below the threshold value, that is, even when it is determined that the attention area is not in focus at any of the target in-focus position candidates, the target in-focus position can be set to a given position. When the target in-focus position candidate with the highest contrast value is used, a position with less blur than the other positions can be made the target in-focus position, even though it cannot be said to be in focus. When the target in-focus position corresponding to the point with the longest in-focus subject distance from the imaging unit (that is, focusing on a far point) is selected, optically the depth of field becomes wider as the subject distance becomes longer (to be precise, the depth of field varies not with the distance from the imaging unit 200 to the subject but, for example, with the distance from the front focal point to the subject), so the depth of field can be made wider than at other positions, increasing the likelihood that a large area of the image is in focus. Moreover, if the depth of field is wide, manual focusing can also be performed. The predetermined target in-focus position candidate is not limited to these two; it may be set arbitrarily based, for example, on an input value from an external device or the user.
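The two fallback variants just described can be sketched together, with the caveat that all names here are illustrative and the variant actually used would be fixed in advance (or set via the external I/F unit).

```python
# Hedged sketch of the fallback when every candidate is at or below the
# threshold Tcon: choose either the farthest-point candidate (widest
# depth of field) or the candidate with the highest contrast value.
def fallback_position(contrast_by_pos, tcon, farthest, use_max_contrast):
    if any(c > tcon for c in contrast_by_pos.values()):
        raise ValueError("fallback applies only when all values are <= Tcon")
    if use_max_contrast:
        return max(contrast_by_pos, key=contrast_by_pos.get)  # least blurred
    return farthest                                           # widest depth of field

contrasts = {"A": 12.0, "B": 30.0, "C": 18.0}  # all below Tcon = 50
print(fallback_position(contrasts, 50.0, "A", use_max_contrast=True))   # B
print(fallback_position(contrasts, 50.0, "A", use_max_contrast=False))  # A
```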
As shown in FIG. 7, the attention area detection unit 321 may also include a reference signal creation unit 3212 that creates a reference signal from the image signal, a candidate region detection unit 3215 that detects attention candidate regions based on the image signal and the reference signal, and a feature amount calculation unit 3216 that calculates the feature amounts of the attention candidate regions. The attention area detection unit 321 then detects the attention area based on the feature amounts.
Here, the reference signal is a signal obtained based on the image signal; for example, it may be a signal obtained by applying filter processing (specifically, low-pass filter processing or the like) to the image signal to extract a given spatial frequency component. An attention candidate region is a region that is a candidate for the attention area, and is detected based on the image signal and the reference signal. Specifically, it may be detected from the difference value between the image signal and the reference signal; in the example above, both the regions detected based on the difference of the brightness signal and those detected based on the difference of the color signal are treated as attention candidate regions.
 This makes it possible to create the reference signal from the image signal and then detect attention candidate regions based on the image signal and the reference signal. The feature amount of each attention candidate region is then calculated, and the attention region is detected based on the calculated feature amounts. Taking the difference between the image signal and the reference signal makes it possible to detect an abnormal part such as a lesion. For example, based on the difference between the brightness component (brightness signal) of the image signal and the reference signal obtained from that brightness signal (for example, by taking the variance of the difference), a lesion characterized by its shape (unevenness or the like) can be detected. Likewise, based on the difference between the color component (color signal) of the image signal and the reference signal obtained from that color signal (for example, by taking the variance of the difference), a lesion characterized by its color (redness, fading, or the like) can be detected. In this example, therefore, a region suspected of being a lesion because of features such as unevenness, redness, or fading is detected as an attention candidate region. Since a plurality of attention candidate regions may be detected, which of them becomes the attention region is determined by the feature amount values. Switching the feature amount calculation method makes it possible to switch the characteristics of the region detected as the attention region.
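The candidate detection described above — derive a reference signal by low-pass filtering the image signal, then flag where the difference from the reference is large — can be sketched as follows. This is a minimal 1-D illustration; the kernel size, the threshold, and the use of a moving-average filter are assumptions for demonstration, not values from the patent.

```python
import numpy as np

def detect_candidate_regions(signal, kernel_size=3, threshold=10.0):
    """Low-pass the image signal to build a reference signal, then mark
    samples whose absolute difference from the reference exceeds the
    threshold as attention candidate regions."""
    kernel = np.ones(kernel_size) / kernel_size
    reference = np.convolve(signal, kernel, mode="same")  # low-pass reference
    diff = np.abs(signal - reference)                     # difference signal
    return diff > threshold                               # candidate mask

# A flat signal with one sharp bump: the bump stands out against the
# low-pass reference, while the flat interior does not.
sig = np.array([100.0] * 8 + [160.0] + [100.0] * 8)
mask = detect_candidate_regions(sig, threshold=10.0)
```

In practice the same idea would be applied in 2-D, separately to the luminance signal and the color signal, yielding the two kinds of candidate regions the text describes.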
 The feature amount calculation unit 3216 may also calculate the feature amount of an attention candidate region based on the difference between the image signal and the reference signal. The attention region detection unit 321 then detects the attention candidate region having the largest feature amount as the attention region.
 This makes it possible to calculate the feature amount based on the difference between the image signal and the reference signal. The image signal and the reference signal obtained from it may be the same signals used for detecting the attention candidate regions, or different ones. That is, any of R, G, and B may be used as the image signal, Y and C in the above equations (1) and (2) may be used, or other image signals may be used.
 The feature amount calculation unit 3216 may also use a luminance signal difference and a color signal difference as the differences, and may calculate, as the feature amount, a value obtained by weighted addition of the luminance signal difference and the color signal difference.
 This makes it possible to use the luminance signal difference and the color signal difference as the difference between the image signal and the reference signal. As described above, the luminance signal difference corresponds to uneven (raised or depressed) lesions, and the color signal difference corresponds to reddened or faded lesions. For the feature amount calculation in this case, Ei in the above equation (3) may be used, for example. Either one of the weighting coefficients αe and βe may be zero; that is, the weighted sum of the luminance signal difference and the color signal difference in this embodiment includes the pattern of the luminance signal difference alone and the pattern of the color signal difference alone.
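A rough sketch of a feature amount in the spirit of equation (3) follows: a weighted sum E_i of a luminance-difference measure and a color-difference measure. The use of the variance of the differences within the region and the specific weight values are assumptions; the text only fixes the weighted-sum form and allows either weight to be zero.

```python
def feature_amount(lum_diffs, col_diffs, alpha_e=0.7, beta_e=0.3):
    """E_i = alpha_e * (luminance-difference measure)
           + beta_e  * (color-difference measure),
    here using the variance of each difference list over the candidate
    region as the measure (an illustrative choice)."""
    def variance(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)
    return alpha_e * variance(lum_diffs) + beta_e * variance(col_diffs)

# With alpha_e > beta_e, a region strong in luminance difference (an
# uneven lesion) scores higher than one equally strong in color
# difference (a reddened/faded lesion), matching the priority below.
bumpy = feature_amount([0, 10, 0, 10], [1, 1, 1, 1])
reddish = feature_amount([1, 1, 1, 1], [0, 10, 0, 10])
```

Setting `beta_e = 0` (or `alpha_e = 0`) recovers the luminance-only or color-only patterns the text mentions.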
 The feature amount calculation unit 3216 may also calculate the feature amount using a weight for the luminance signal difference that is larger than the weight for the color signal difference.
 This corresponds to setting αe > βe in the above equation (3), so that a region judged noteworthy from the brightness signal difference (corresponding to an uneven lesion) is more readily detected as the attention region than a region judged noteworthy from the color signal difference (corresponding to a reddened or faded lesion). In general, uneven lesions tend to be more severe as lesions than reddened or faded lesions, so the more severe lesion can be made easier to detect as the attention region. Since this embodiment performs processing that focuses on the detected attention region, the more severe region is more easily brought into focus, enabling smooth observation by the user (doctor).
 The feature amount calculation unit 3216 may also calculate the feature amount of an attention candidate region based on the area of that region. The attention region detection unit 321 then detects the attention candidate region having the largest feature amount as the attention region.
 This makes it possible to detect, among the detected attention candidate regions, the one with the largest area as the attention region. A large area means that the object to be noted (for example, a lesion or bubbles; what counts as a noteworthy object depends on the candidate detection method) is distributed over a correspondingly wide range, so its observation priority for the user can be considered high.
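When the feature amount is the region's area, selecting the attention region reduces to picking the candidate that covers the most pixels. A minimal sketch, assuming each candidate region is represented as a set of pixel coordinates (the representation is an assumption):

```python
def select_attention_region(candidate_masks):
    """Return the attention candidate region with the largest area,
    where each candidate is a set of (x, y) pixel coordinates."""
    return max(candidate_masks, key=len)

# Three hypothetical candidate regions; the 3-pixel one wins.
regions = [{(0, 0), (0, 1)}, {(5, 5), (5, 6), (6, 5)}, {(9, 9)}]
best = select_attention_region(regions)
```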
 The attention region detection unit 321 may also detect an abnormal part of the subject as the attention region.
 This makes it possible to detect an abnormal part (for example, a lesion) as the attention region. In-vivo observation with an endoscope apparatus is expected to be used in the medical field, where the object to be observed (to be discovered) is often a lesion (a tumor, a dense concentration of blood vessels, or the like). Detecting the abnormal part as the attention region therefore makes it possible to focus appropriately on a region with a high observation priority.
 The number of target focus position candidates to which the switching unit 250 can switch the in-focus object position may be four or fewer (including four).
 This makes it possible to keep the number of target focus position candidates small. Even with an objective optical system that performs discrete N-focus switching, a large N increases the load of the focusing process (for example, the number of in-focus object position switches and of contrast value calculations), and the focusing operation may then no longer be faster than continuous focus switching. To use the method of this embodiment effectively, the value of N therefore needs to be reasonably small; specifically, N ≤ 4 may be used.
 Note that the number of target focus position candidates is not specifically limited to four; it suffices that the focusing operation can be performed faster than with continuous focus switching. The number may, for example, be determined from conditions on the in-focus depth ranges in the object scene. As one example, consider first to N-th target focus position candidates, ordered from the near side, corresponding to points separated from the imaging unit by their respective best subject distances, together with the corresponding first to N-th in-focus depth ranges in the object scene. The target focus position candidates may then be set so that the in-focus depth range of the i-th candidate (1 ≤ i ≤ N−2) overlaps that of the (i+1)-th candidate but does not overlap that of the (i+2)-th candidate. In this way the in-focus depth ranges of adjacent candidates overlap (meaning that some position is contained both in the i-th and in the (i+1)-th in-focus depth range), so within the distance range between the end point of the first in-focus depth range (for example, the right end of the range at A in FIG. 12) and the end point of the N-th in-focus depth range (for example, the left end of the range at D in FIG. 12), any position is contained in at least one of the first to N-th in-focus depth ranges. That is, within this distance range focus can always be achieved by appropriately setting the in-focus object position, without adjusting the distance between the imaging unit 200 and the subject. On the other hand, the in-focus depth range of a candidate is not allowed to overlap that of the candidate two positions away: since overlapping adjacent ranges already covers the required distance range, there is no need to increase the number of candidates further. If the i-th and (i+2)-th in-focus depth ranges did overlap, any position within the (i+1)-th in-focus depth range between them would be contained in at least one of the i-th and (i+2)-th ranges, so there would be little benefit in providing the (i+1)-th target focus position candidate.
 In this way, the number of target focus position candidates can be suppressed while satisfying the condition that no gaps are left in the in-focus depth ranges in the object scene. Also, since reducing the overlap between the in-focus depth ranges of adjacent candidates allows the number of candidates to be suppressed efficiently, the overlap may be restricted (for example, an upper limit may be set on the ratio of the allowed overlap to the width of an in-focus depth range). If adjusting the distance between the imaging unit 200 and the subject is permitted, the in-focus depth ranges of adjacent candidates need not overlap at all, which reduces the number of candidates further. In that case, however, positions lying between the in-focus depth ranges of adjacent candidates cannot be brought into focus no matter how the in-focus object position is controlled. If such gaps are too wide the system becomes hard for the user to use, so their width may also be restricted (for example, by setting an upper limit).
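The spacing condition described above — neighbouring in-focus depth ranges must overlap (no gaps), while ranges two apart must not (no redundant candidate) — can be checked mechanically. A sketch follows; the numeric ranges are illustrative distances, not values from the patent.

```python
def valid_candidate_spacing(depth_ranges):
    """depth_ranges: list of (near, far) in-focus depth ranges for the
    target focus position candidates, ordered from near to far.
    Returns True when every adjacent pair overlaps and no pair two
    positions apart overlaps."""
    def overlaps(a, b):
        return a[0] < b[1] and b[0] < a[1]
    n = len(depth_ranges)
    for i in range(n - 1):
        if not overlaps(depth_ranges[i], depth_ranges[i + 1]):
            return False  # gap between neighbours: some distance unfocusable
    for i in range(n - 2):
        if overlaps(depth_ranges[i], depth_ranges[i + 2]):
            return False  # candidate i+1 would be redundant
    return True

# Four candidates whose depth ranges chain together without double coverage.
ok = valid_candidate_spacing([(2, 5), (4, 9), (8, 16), (15, 30)])
```

If distance adjustment by the user is permitted, the first check could be dropped, matching the relaxed variant mentioned in the text.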
 3. Second Embodiment
 The configuration of the endoscope apparatus according to the second embodiment is the same as that of the first embodiment. Only the parts that differ from the first embodiment are described below.
 First, the function of the switching unit 250 differs. In the second embodiment, as shown in FIG. 13, the focus lens position is switched between two stages: point E, corresponding to a long best subject distance from the imaging unit (the far point), and point F, corresponding to a short best subject distance from the imaging unit (the near point). When the focus lens is at point E, the depth of field is suited to screening a luminal subject inside a body cavity; when it is at point F, the depth of field is suited to close, detailed examination of the subject.
 The configuration of the focus control unit 320 also differs. FIG. 9 shows a detailed configuration example of the focus control unit 320 in the second embodiment; a determination unit 325 is added to the configuration of the first embodiment.
 The image processing unit 330 is connected to the region setting unit 322, the attention region detection unit 321, and the determination unit 325. The determination unit 325 is connected to the switching control unit 324.
 The determination unit 325 makes determinations related to focusing on the attention region. Specifically, the determination unit 325 may include an estimation unit 3251, which estimates rough distance information to the attention region transferred from the attention region detection unit 321. Based on the estimation result of the estimation unit 3251, the determination unit 325 determines whether the attention region is at the near point or at the far point. Specifically, the brightness Ye of the region of the image signal corresponding to the attention region is first calculated. Next, the brightness Ye of the attention region is corrected using the opening L of the light source aperture and the gain correction value Gain, according to the following equation (4).
 [Equation (4): the corrected brightness Yc is obtained from the attention region brightness Ye, the light source aperture opening L, and the gain correction value Gain, weighted by the constants αy and βy]
 Here, L is the opening of the light source aperture 120 obtained via the control unit 340, Gain is the gain correction value obtained via the control unit 340, and αy and βy are weighting constants.
 If the corrected brightness Yc of the attention region is larger than a threshold Tr, the attention region is determined to be at the near point. If Yc is equal to or less than Tr (including that value), the attention region is determined to be at the far point. The determination unit 325 transfers the determination result to the switching control unit 324.
 An example of the processing in the focus control unit 320 is described with reference to FIG. 15. S201 to S206 are the same as S101 to S106 in FIG. 14, and S208 to S209 are the same as S108 to S109 in FIG. 14, so detailed description is omitted. This embodiment differs from the first embodiment in the processing performed when the contrast value is equal to or less than Tcon at every focus lens position (S107 in FIG. 14 in the first embodiment), which becomes S210 to S212 in FIG. 15. Specifically, when the contrast value is equal to or less than the threshold Tcon at every focus lens position, the determination unit 325 compares the brightness information Yc of the attention region with the threshold Tr (S210). If Yc is larger than Tr, the switching control unit 324 moves the focus lens to the position that focuses on the near point (S211); if Yc is equal to or less than Tr (including that value), the switching control unit 324 moves the focus lens to the position that focuses on the far point (S212).
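The low-contrast fallback (S210 to S212) can be sketched as follows. Since the exact form of equation (4) is not reproduced in this text, the brightness correction below — dividing Ye by a weighted combination of the aperture opening L and the gain — is an assumed normalization, chosen only to illustrate the idea that a region that stays bright after discounting the illumination is taken to be near; the threshold and weights are likewise illustrative.

```python
def fallback_focus_position(ye, aperture_l, gain, tr=0.5,
                            alpha_y=1.0, beta_y=1.0):
    """When no focus lens position yields enough contrast, decide near
    vs. far from the attention region's brightness.
    ye: brightness Ye of the attention region,
    aperture_l: light source aperture opening L,
    gain: gain correction value, tr: threshold Tr.
    The correction formula is an assumption standing in for equation (4)."""
    yc = ye / (alpha_y * aperture_l + beta_y * gain)  # assumed correction
    return "near" if yc > tr else "far"               # S210 -> S211/S212

# A region that stays bright despite the illumination level reads as near.
choice = fallback_focus_position(ye=200.0, aperture_l=100.0, gain=50.0)
```

The switching control unit would then move the focus lens to the near-point or far-point position according to the returned choice.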
 In this embodiment, when the contrast of the detected attention region is low, whether the attention region is at the far point or the near point is determined using the brightness of the subject. A focused image can therefore be obtained even for an attention region with low contrast.
 In this embodiment described above, the focus control unit 320 includes the estimation unit 3251, which estimates distance information between the subject corresponding to the attention region and the imaging unit 200. When the contrast values calculated for the plural target focus position candidates are all lower than a predetermined threshold, the in-focus object position of the objective optical system is selected based on the estimation result of the estimation unit 3251.
 This makes it possible to select an appropriate target focus position even when all contrast values are at or below the threshold, that is, even when none of the target focus position candidates is judged to be in focus on the attention region. When the subject in the image contains few edge components, the image signal has little high-frequency content, so the contrast value can be low even when the subject is in focus. In that case, as described in the first embodiment, selecting the candidate with the relatively highest contrast value as the target focus position gives no guarantee of focus. Selecting the candidate that focuses on the far point widens the depth of field, but takes no account of whether the attention region is in focus. The target focus position to be selected is therefore obtained by distance estimation. Note, however, that neither a ranging sensor nor complex image processing is assumed here: mounting a ranging sensor or the like would enlarge the imaging unit 200, and complex image processing would increase the processing load and make high-speed focusing difficult.
 The estimation unit 3251 may also estimate the distance information to the imaging unit 200 based on brightness information of the attention region.
 This enables distance estimation based on brightness information, on the simple principle that a bright subject is close to the imaging unit 200 and a dark subject is far from it. The processing load can therefore be made very light. This is particularly effective for N-focus switching as in this embodiment (especially when N is small, such as N = 2): with two-focus switching, the estimation unit 3251 need only decide between near and far, so the target focus position can be selected even with low estimation accuracy such as that of brightness-based estimation.
 The switching unit 250 may also switch between two target focus position candidates of the objective optical system: a first target focus position candidate with a long in-focus subject distance from the imaging unit, and a second target focus position candidate whose in-focus subject distance from the imaging unit is shorter than that of the first target focus position candidate.
 Here, the in-focus subject distance is the same as described above in the first embodiment, and in a narrow sense may be the best subject distance. The subject position corresponding to the in-focus subject distance of the first target focus position candidate is called the far point, and the subject position corresponding to the in-focus subject distance of the second target focus position candidate is called the near point.
 This makes it possible to perform two-focus switching as the discrete in-focus object position switching. Since the switching unit 250 only has to switch between two points, its configuration can be simplified compared with continuous in-focus object position switching or with discrete switching among many target focus position candidates. Fewer focusing operations are also needed (for example, at most two contrast value calculations), so the time until the focusing operation completes can be shortened. Furthermore, the user only needs to know whether the current in-focus object position focuses on the far point or on the near point, so the system is easy to operate and the interface can be kept simple.
 The two embodiments 1 and 2 to which the present invention is applied and their modifications have been described above, but the present invention is not limited to these embodiments and modifications as they are; in practice, the constituent elements can be modified and embodied without departing from the spirit of the invention. Various inventions can be formed by appropriately combining the plurality of constituent elements disclosed in the embodiments 1 and 2 and the modifications described above. For example, some constituent elements may be deleted from all the constituent elements described in the embodiments 1 and 2 and the modifications. Constituent elements described in different embodiments or modifications may also be combined as appropriate. In addition, a term that appears at least once in the specification or drawings together with a different term of broader or equivalent meaning can be replaced by that different term anywhere in the specification or drawings. Various modifications and applications are thus possible without departing from the spirit of the invention.
100 light source unit, 110 white light source, 130 light source aperture drive unit,
140 rotation color filter, 150 rotation drive unit, 160 condenser lens,
200 imaging unit, 210 light guide fiber, 220 illumination lens,
230 objective lens, 240 focus lens, 250 switching unit,
260 imaging device, 300 control device, 310 A / D converter,
320 focus control unit, 321 attention region detection unit, 322 region setting unit,
323 contrast value calculation unit, 324 switching control unit, 325 determination unit,
330 Image processing unit, 331 Pre-processing unit, 332 Synchronization unit, 333 Post-processing unit,
340 control unit, 400 display unit, 500 external I / F unit, 3211 color signal calculation unit,
3212 reference signal creation unit, 3213 difference calculation unit, 3214 region division unit,
3215 candidate area detection unit, 3216 feature quantity calculation unit, 3231 bright spot removal unit,
3232 High frequency extraction unit, 3251 estimation unit

Claims (19)

  1.  An endoscope apparatus comprising:
     an imaging unit that acquires an image signal;
     a switching unit that switches an in-focus object position, determined by a state of the imaging unit, to one of a plurality of discretely set target focus position candidates; and
     a focus control unit that selects one of the plurality of target focus position candidates as a target focus position and performs focus control by controlling the switching unit so that the selected target focus position becomes the in-focus object position,
     wherein the focus control unit includes an attention region detection unit that detects an attention region on a subject from the image signal, and
     the focus control unit determines whether or not the detected attention region is in focus; when the attention region is determined to be in focus, selects, as the target focus position, the target focus position candidate corresponding to the timing at which the determination was made; and when the attention region is determined not to be in focus, controls the switching unit to switch the in-focus object position to a target focus position candidate different from the target focus position candidate corresponding to the timing at which the determination was made.
  2.  The endoscope apparatus according to claim 1, wherein
     the focus control unit includes a contrast value calculation unit that calculates a contrast value of a region corresponding to the attention region, and
     the focus control unit determines, based on the calculated contrast value, whether or not the attention region is in focus, and when the attention region is determined to be in focus, selects, as the target focus position, the target focus position candidate corresponding to the timing at which the determination was made.
  3.  The endoscope apparatus according to claim 2, wherein
     when the contrast value is lower than a predetermined threshold, the focus control unit determines that the attention region is not in focus and controls the switching unit to switch the in-focus object position to a target focus position candidate different from the target focus position candidate corresponding to the timing at which the determination was made.
  4.  The endoscope apparatus according to claim 3, wherein
     when the plurality of contrast values respectively calculated for the plurality of target focus position candidates are all lower than a predetermined threshold, the focus control unit selects a predetermined target focus position candidate as the target focus position.
  5.  The endoscope apparatus according to claim 3, wherein
     when the plurality of contrast values respectively calculated for the plurality of target focus position candidates are all lower than a predetermined threshold, the focus control unit uses, as the target focus position, the target focus position candidate with the highest contrast value.
  6.  The endoscope apparatus according to claim 4, wherein
     the focus control unit uses, as the predetermined target focus position candidate, the one of the plurality of target focus position candidates with the longest in-focus subject distance from the imaging unit.
  7.  The endoscope apparatus according to claim 4, further comprising:
     an interface unit that accepts input information from outside,
     wherein the focus control unit sets the predetermined target focus position candidate based on the input information input from outside via the interface unit.
  8.  The endoscope apparatus according to claim 2, wherein
     the focus control unit includes an estimation unit that estimates distance information between the subject corresponding to the attention region and the imaging unit, and
     when the plurality of contrast values respectively calculated for the plurality of target focus position candidates are all lower than a predetermined threshold, the focus control unit selects the target focus position based on an estimation result of the estimation unit.
  9.  The endoscope apparatus as defined in claim 8,
     wherein the estimation unit
     estimates the distance information with respect to the imaging unit based on brightness information about the attention area.
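One plausible reading of claims 8 and 9: since an endoscope's light source sits at the scope tip, a brighter attention area is typically closer to the imaging unit, so mean brightness can stand in for distance. The inverse-square fall-off below is an assumption for illustration, not a formula from the patent.

```python
def estimate_relative_distance(attention_area_pixels):
    """Return a relative distance score for an attention area.

    attention_area_pixels: flat list of brightness values (hypothetical
    input format). Larger return value = farther from the scope tip.
    """
    mean_brightness = sum(attention_area_pixels) / len(attention_area_pixels)
    if mean_brightness <= 0:
        return float("inf")  # no reflected light: treat as arbitrarily far
    # Illumination falls off roughly with the square of distance, so
    # distance ~ 1 / sqrt(brightness), up to an unknown scale factor.
    return 1.0 / (mean_brightness ** 0.5)
```

Only the ordering of the scores matters here; an absolute calibration would need knowledge of the light source and subject reflectance.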
  10.  The endoscope apparatus as defined in claim 1,
     wherein the switching unit
     switches between two target in-focus position candidates: a first target in-focus position candidate having a long in-focus object distance from the imaging unit, and a second target in-focus position candidate having an in-focus object distance from the imaging unit that is shorter than that of the first target in-focus position candidate.
  11.  The endoscope apparatus as defined in claim 1,
     wherein the attention area detection unit includes:
     a reference signal generation unit that generates a reference signal from the image signal;
     a candidate area detection unit that detects an attention candidate area based on the image signal and the reference signal; and
     a feature quantity calculation unit that calculates a feature quantity of the attention candidate area,
     the attention area detection unit detecting the attention area based on the feature quantity.
  12.  The endoscope apparatus as defined in claim 11,
     wherein the feature quantity calculation unit
     calculates the feature quantity of the attention candidate area based on a difference between the image signal and the reference signal, and
     the attention area detection unit
     detects the attention candidate area having the largest feature quantity as the attention area.
  13.  The endoscope apparatus as defined in claim 12,
     wherein the feature quantity calculation unit
     uses a luminance signal difference and a color signal difference as the difference, and calculates a value obtained by the weighted addition of the luminance signal difference and the color signal difference as the feature quantity.
  14.  The endoscope apparatus as defined in claim 13,
     wherein the feature quantity calculation unit
     calculates the feature quantity using a weight applied to the luminance signal difference that is larger than the weight applied to the color signal difference.
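The weighted addition in claims 13 and 14 can be written out directly. The specific weight values below are placeholders for illustration; the patent only requires that the luminance weight exceed the color weight.

```python
def feature_amount(y_diff, c_diff, w_y=0.7, w_c=0.3):
    """Feature quantity of an attention candidate area.

    y_diff: luminance signal difference between image and reference signals.
    c_diff: color signal difference between image and reference signals.
    Per claim 14, the luminance weight w_y is larger than the color weight w_c.
    """
    return w_y * abs(y_diff) + w_c * abs(c_diff)
```

Because w_y > w_c, a given luminance deviation contributes more to the feature quantity than an equal color deviation, and per claim 12 the candidate area with the largest feature quantity is detected as the attention area.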
  15.  The endoscope apparatus as defined in claim 11,
     wherein the feature quantity calculation unit
     calculates the feature quantity of the attention candidate area based on the area of the attention candidate area, and
     the attention area detection unit
     detects the attention candidate area having the largest feature quantity as the attention area.
  16.  The endoscope apparatus as defined in claim 11,
     wherein the reference signal generation unit
     generates the reference signal by performing a filtering process on the image signal.
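One plausible reading of claims 11 and 16 (illustrative only; the patent does not specify the filter): the reference signal is a low-pass-filtered copy of the image signal, so that local deviations from it can flag attention candidate areas, as in claim 12.

```python
def reference_signal(image_row, radius=2):
    """Moving-average low-pass filter over a 1-D image signal.

    image_row: list of pixel values; radius: half-width of the averaging
    window (both hypothetical parameters for this sketch).
    """
    n = len(image_row)
    out = []
    for i in range(n):
        lo, hi = max(0, i - radius), min(n, i + radius + 1)
        out.append(sum(image_row[lo:hi]) / (hi - lo))  # window mean
    return out
```

A flat region passes through unchanged, while a localized spike is smoothed away, so the difference `image - reference` is large exactly where the image deviates from its surroundings.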
  17.  The endoscope apparatus as defined in claim 1,
     wherein the attention area detection unit
     detects an abnormal part of the subject as the attention area.
  18.  The endoscope apparatus as defined in claim 1,
     wherein the number of the plurality of target in-focus position candidates that are discretely set is four or less.
  19.  A focus control method for an objective optical system in which a plurality of target in-focus position candidates are discretely set, the method comprising:
     acquiring an image signal from an imaging unit;
     detecting an attention area on a subject from the image signal;
     determining whether or not the detected attention area is in focus;
     selecting, as a target in-focus position, the target in-focus position candidate corresponding to the timing at which the determination was performed, when it is determined that the attention area is in focus; and
     switching an in-focus object position to a target in-focus position candidate that differs from the target in-focus position candidate corresponding to the timing at which the determination was performed, when it is determined that the attention area is not in focus.
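The steps of the claim-19 method can be sketched as one iteration of a control loop. This is a minimal reading under stated assumptions: `acquire_image`, `detect_attention_area`, `is_in_focus`, and `move_to` are hypothetical callables standing in for the hardware-dependent steps, not names from the patent.

```python
def focus_control_step(candidates, current_index, acquire_image,
                       detect_attention_area, is_in_focus, move_to):
    """One iteration of the claim-19 loop over discrete focus candidates.

    Returns (target_position, new_index): target_position is the selected
    target in-focus position, or None if the lens was switched instead.
    """
    image = acquire_image()                 # image signal from the imaging unit
    area = detect_attention_area(image)     # attention area on the subject
    if is_in_focus(image, area):
        # In focus: the candidate active at this timing becomes the target.
        return candidates[current_index], current_index
    # Not in focus: switch the in-focus object position to a different candidate.
    new_index = (current_index + 1) % len(candidates)
    move_to(candidates[new_index])
    return None, new_index
```

With the two-point configuration of claim 10 (`len(candidates) == 2`), an out-of-focus judgment simply toggles between the near and far candidates until the attention area is judged in focus.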
PCT/JP2012/077282 2011-10-26 2012-10-23 Endoscopic device and focus control method WO2013061939A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2011235247 2011-10-26
JP2011-235247 2011-10-26

Publications (1)

Publication Number Publication Date
WO2013061939A1 true WO2013061939A1 (en) 2013-05-02

Family

ID=48167769

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2012/077282 WO2013061939A1 (en) 2011-10-26 2012-10-23 Endoscopic device and focus control method

Country Status (1)

Country Link
WO (1) WO2013061939A1 (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2018116225A (en) * 2017-01-20 2018-07-26 キヤノン株式会社 Focus adjustment device, control method therefor, program, and imaging apparatus
CN110062596A (en) * 2016-12-20 2019-07-26 奥林巴斯株式会社 The working method of automatic focal point control device, endoscope apparatus and automatic focal point control device
WO2021149141A1 (en) * 2020-01-21 2021-07-29 オリンパス株式会社 Focus control device, endoscope system, and operation method for focus control device

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS55166609A (en) * 1979-06-12 1980-12-25 Olympus Optical Co Ltd Method of focusing of optical system having lighting device
JP2002153421A (en) * 2000-08-23 2002-05-28 Toshiba Corp Endoscope system
JP2006246491A (en) * 2006-03-10 2006-09-14 Olympus Corp Video microscope
JP2006288432A (en) * 2005-04-05 2006-10-26 Olympus Medical Systems Corp Electronic endoscope
JP2008307229A (en) * 2007-06-14 2008-12-25 Olympus Corp Image processing device and image processing program
JP2011193983A (en) * 2010-03-18 2011-10-06 Olympus Corp Endoscope system, imaging apparatus, and control method


Similar Documents

Publication Publication Date Title
JP6013020B2 (en) Endoscope apparatus and method for operating endoscope apparatus
JP5953187B2 (en) Focus control device, endoscope system, and focus control method
JP5948076B2 (en) Focus control device, endoscope device and focus control method
JP6137921B2 (en) Image processing apparatus, image processing method, and program
US9154745B2 (en) Endscope apparatus and program
JP6049518B2 (en) Image processing apparatus, endoscope apparatus, program, and operation method of image processing apparatus
US20160128545A1 (en) Endoscope apparatus and method for controlling endoscope apparatus
JP5951211B2 (en) Focus control device and endoscope device
JP5698476B2 (en) ENDOSCOPE SYSTEM, ENDOSCOPE SYSTEM OPERATING METHOD, AND IMAGING DEVICE
JP5973708B2 (en) Imaging apparatus and endoscope apparatus
CN107005646B (en) Focus control device, endoscope device, and control method for focus control device
US20120120305A1 (en) Imaging apparatus, program, and focus control method
JP6453905B2 (en) FOCUS CONTROL DEVICE, ENDOSCOPE DEVICE, AND FOCUS CONTROL DEVICE CONTROL METHOD
JP5857160B2 (en) ENDOSCOPE IMAGING SYSTEM AND ENDOSCOPE IMAGING SYSTEM OPERATING METHOD
JP2012245157A (en) Endoscope apparatus
JP2011147707A (en) Imaging device, endoscope system, and control method of imaging device
JP6533284B2 (en) Focus control device, imaging device, endoscope system, control method of focus control device
JP5996218B2 (en) Endoscope apparatus and method for operating endoscope apparatus
JP6120491B2 (en) Endoscope apparatus and focus control method for endoscope apparatus
JP2013043007A (en) Focal position controller, endoscope, and focal position control method
WO2013061939A1 (en) Endoscopic device and focus control method
JP2013076823A (en) Image processing apparatus, endoscope system, image processing method, and program
JP6177387B2 (en) Endoscope device focus control device, endoscope device, and operation method of endoscope control device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 12843943

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 12843943

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP