US7869704B2 - Focus adjusting device, image pickup apparatus, and focus adjustment method - Google Patents


Info

Publication number
US7869704B2
Authority
US
United States
Prior art keywords
focus
focus detection
detection area
image
focused
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related, expires
Application number
US11/970,386
Other versions
US20080193115A1 (en)
Inventor
Masaaki Uenishi
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Canon Inc
Original Assignee
Canon Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Canon Inc filed Critical Canon Inc
Assigned to Canon Kabushiki Kaisha (assignment of assignors interest; see document for details). Assignor: Masaaki Uenishi.
Publication of US20080193115A1
Priority claimed by continuation application US12/968,109 (published as US8208803B2)
Application granted
Publication of US7869704B2
Legal status: Expired - Fee Related (adjusted expiration)


Classifications

    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 7/00 Mountings, adjusting means, or light-tight connections, for optical elements
    • G02B 7/02 Mountings, adjusting means, or light-tight connections, for optical elements for lenses
    • G02B 7/04 Mountings, adjusting means, or light-tight connections, for optical elements for lenses with mechanism for focusing or varying magnification
    • G02B 7/08 Mountings, adjusting means, or light-tight connections, for optical elements for lenses with mechanism for focusing or varying magnification adapted to co-operate with a remote control mechanism
    • G PHYSICS
    • G03 PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B 13/00 Viewfinders; Focusing aids for cameras; Means for focusing for cameras; Autofocus systems for cameras
    • G03B 13/32 Means for focusing
    • G03B 13/34 Power focusing
    • G03B 13/36 Autofocus systems
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/60 Control of cameras or camera modules
    • H04N 23/61 Control of cameras or camera modules based on recognised objects
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/60 Control of cameras or camera modules
    • H04N 23/61 Control of cameras or camera modules based on recognised objects
    • H04N 23/611 Control of cameras or camera modules based on recognised objects where the recognised objects include parts of the human body
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/60 Control of cameras or camera modules
    • H04N 23/63 Control of cameras or camera modules by using electronic viewfinders
    • H04N 23/633 Control of cameras or camera modules by using electronic viewfinders for displaying additional information relating to control or operation of the camera
    • H04N 23/635 Region indicators; Field of view indicators
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/60 Control of cameras or camera modules
    • H04N 23/67 Focus control based on electronic image sensor signals
    • H04N 23/675 Focus control based on electronic image sensor signals comprising setting of focusing regions
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/60 Control of cameras or camera modules
    • H04N 23/67 Focus control based on electronic image sensor signals
    • H04N 23/672 Focus control based on electronic image sensor signals based on the phase difference signals

Definitions

  • The present invention relates to a focus adjusting device, an image pickup apparatus, and a focus adjustment method. More particularly, the present invention relates to autofocusing used in, for example, an electronic still camera or a video camera.
  • An autofocusing (AF) method that makes use of an object detection function, which detects, for example, a moving body or a human face, is well known.
  • In such a method, a focus detection area is set at the position of the detected main object, and focusing is controlled so that the main object is brought into focus.
  • When the main object cannot be detected, an area where it is likely to exist, such as the center of the screen, is assumed, and the focus detection area is set there.
  • When this AF method is used to keep the main object in focus at all times at the set focus detection area, as in a continuous AF operation or an AF operation during movie recording, detection of the main object may succeed at some times and fail at others. In such a case, the position of the focus detection area changes frequently between the main-object area and the central area of the screen.
  • As a result, the focus position moves back and forth between the main object and a background object, causing troublesome flickering of the screen.
  • As a countermeasure, a camera provided with a line-of-sight detecting function, in which the size of an area selected by line-of-sight detection is changed in accordance with the reliability of the line-of-sight detection output, has been proposed (refer to Japanese Patent Laid-Open No. 6-308373).
  • When the reliability of the line-of-sight detection output is low, the focus detection area is enlarged so that the main object falls within it.
  • As an autofocus controlling device using a moving-object detecting unit, a device has been proposed that calculates, for each of a set of previously divided AF areas, the difference between the AF evaluation values of image frames captured at different times. By calculating this difference, an AF area including a moving object can be extracted, and that AF area is selected as the area to be focused. (Refer to Japanese Patent Laid-Open No. 2000-188713.)
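The prior-art moving-object extraction described above can be sketched in a few lines. Everything here is illustrative: the 3 × 3 grid, the gradient-style evaluation function, and the function names are assumptions, not the scheme of the cited publication.

```python
def af_evaluation(tile):
    """Contrast-style AF evaluation value for one AF area: the sum of
    absolute horizontal pixel differences (higher means sharper)."""
    return sum(abs(row[i + 1] - row[i]) for row in tile for i in range(len(row) - 1))

def moving_object_area(prev_frame, curr_frame, rows=3, cols=3):
    """Return (row, col) of the pre-divided AF area whose evaluation value
    changed most between two frames taken at different times."""
    h, w = len(curr_frame), len(curr_frame[0])
    th, tw = h // rows, w // cols

    def tile(frame, r, c):
        # Crop one AF area out of the full frame.
        return [line[c * tw:(c + 1) * tw] for line in frame[r * th:(r + 1) * th]]

    best, best_diff = (0, 0), -1
    for r in range(rows):
        for c in range(cols):
            d = abs(af_evaluation(tile(curr_frame, r, c)) -
                    af_evaluation(tile(prev_frame, r, c)))
            if d > best_diff:
                best, best_diff = (r, c), d
    return best
```

The area returned here would then be used as the area to be focused, as the publication describes.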
  • Japanese Patent Laid-Open No. 6-308373 and Japanese Patent Laid-Open No. 2004-138970 discuss methods that merely enlarge the focus detection area when the main object cannot be detected (that is, when the reliability of the detection of the main object is low). These methods cannot reliably place the main object in the focus detection area unless, for example, the area is spread over the entire screen. In addition, when the focus detection area is simply enlarged, the proportion of the background within it increases. As a result, the background may be brought into focus.
  • the present invention provides a focus adjusting device that can continue focusing a main object without performing unnecessary setting of focus detection areas, an image pickup apparatus, and a focus adjustment method.
  • the present invention makes it possible to continue focusing a main object without performing unnecessary setting of focus detection areas.
  • According to an aspect of the present invention, a detecting unit detects, from a picked-up image, an object image to be focused. When the detecting unit, having been able to detect the object image, is no longer able to detect it after a first focus detection area has been set,
  • a setting unit, which sets focus detection areas for detecting the focused state of an image pickup optical system, sets a second focus detection area, in addition to the first focus detection area, in accordance with the position of the first focus detection area. Then, on the basis of the signal outputs at the set first and second focus detection areas, the image pickup optical system is moved to adjust the focus.
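The arrangement summarized above can be sketched as follows. The `Frame` type, the default 3 × 3 layout, and the step spacing are illustrative assumptions; the claims do not fix a particular geometry for the additional areas.

```python
from dataclasses import dataclass

@dataclass
class Frame:
    x: int     # center x of a focus detection area, in pixels
    y: int     # center y
    size: int  # side length of the (square) area

def set_detection_areas(last_face_frame, face_detected, n=3, m=3):
    """While the object is detected, a single (first) focus detection area
    follows it; once detection is lost, additional (second) areas are laid
    out around the position of the first area."""
    if face_detected:
        return [last_face_frame]
    s = last_face_frame.size
    return [Frame(last_face_frame.x + (c - n // 2) * s,
                  last_face_frame.y + (r - m // 2) * s,
                  s)
            for r in range(m) for c in range(n)]
```

Focus adjustment would then use the signal outputs from all returned areas, per the summary above.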
  • FIG. 1 is a block diagram of a structure of an electronic camera to which the present invention is applied.
  • FIG. 2 is a flowchart illustrating the operation of the electronic camera to which the present invention is applied.
  • FIGS. 3A and 3B are flowcharts of a subroutine of continuous AF operations in FIG. 2 .
  • FIGS. 4A and 4B are flowcharts of a subroutine of setting an AF frame during the continuous AF operation in FIGS. 3A and 3B .
  • FIGS. 5A to 5D illustrate a method of setting an AF frame during the continuous AF operation in FIGS. 4A and 4B .
  • FIG. 6 is a flowchart of a subroutine of an AF operation in FIG. 2 .
  • FIG. 7 is a flowchart of a subroutine of a photographing operation in FIG. 2 .
  • FIGS. 8A and 8B illustrate a problem in setting an AF frame in a related art.
  • FIG. 1 is a block diagram of a structure of an electronic camera, which is an image pickup apparatus according to an embodiment of the present invention.
  • Reference numeral 101 denotes a photographing lens including a zoom mechanism
  • reference numeral 102 denotes an aperture-and-shutter that controls light quantity.
  • Reference numeral 103 denotes an automatic exposure control (AE) processing unit
  • reference numeral 104 denotes a focus lens that performs focus adjustment on an image pickup element (described later).
  • The photographing lens 101, the aperture-and-shutter 102, and the focus lens 104 are collectively called an image pickup optical system.
  • Reference numeral 105 denotes an AF processing unit
  • reference numeral 106 denotes a stroboscope
  • reference numeral 107 denotes a flash pre-emission (EF) processing unit.
  • Reference numeral 108 denotes the image pickup element serving as a light detecting unit or a photoelectric converting unit that converts reflected light from an object into an electrical signal (signal output) and outputs a picked up image.
  • Reference numeral 109 denotes an A/D converting unit including a correlated double sampling (CDS) circuit that reduces output noise of the image pickup element 108 or a nonlinear amplifying circuit that performs nonlinear amplification prior to A/D conversion.
  • Reference numeral 110 denotes an image processing unit
  • reference numeral 111 denotes a white balance (WB) processing unit
  • reference numeral 112 denotes a format converting unit
  • Reference numeral 113 denotes a high-speed internal memory (such as a random access memory; hereunder referred to as “DRAM”)
  • reference numeral 114 denotes an image recording unit including a recording medium, such as a memory card, and an interface thereof.
  • Reference numeral 115 denotes a system controlling unit that controls a system, such as a photographing sequence
  • reference numeral 116 denotes an image display memory (hereunder referred to as “VRAM”).
  • Reference numeral 117 denotes an operation display unit that, in addition to displaying images, assisting operations with its display, and displaying the state of the camera, displays the photographing screen and an index corresponding to the focus detection area when a photographing operation is performed.
  • Reference numeral 118 denotes an operating unit for operating the camera from the outside.
  • Reference numeral 119 denotes a photographing mode switch that performs a setting operation, such as turning on and off a face detection mode.
  • Reference numeral 120 denotes a main switch for turning on a power supply of the system
  • reference numeral 121 denotes a switch (hereunder referred to as "SW1") for performing a photographing standby operation (such as AF or AE), and
  • reference numeral 122 denotes a photographing switch (hereunder referred to as "SW2") for performing a photographing operation after SW1 is operated.
  • Reference numeral 123 denotes a moving-image switch (hereunder referred to as "moving-image SW") that starts or ends the photographing of a moving image.
  • Reference numeral 124 denotes a face detection module that detects a face as an object image using an image signal processed at the image processing unit 110 , and that sends to the system controlling unit 115 information (position, size, and reliability) of one detected face or a plurality of detected faces and the order of priority determined by the face information.
  • The method of detecting a face is not described in detail here because it is not the central feature of the present invention.
  • The DRAM 113 is used as a high-speed buffer serving as a temporary image storage unit, or as a working memory for, for example, compressing or expanding an image.
  • Examples of the operating unit 118 include a menu switch, a zoom lever, and an operation-mode change-over switch.
  • The menu switch is used to make various settings, such as settings of the photographing functions of the electronic camera and settings used when reproducing an image.
  • The zoom lever instructs a zooming operation of the photographing lens.
  • The change-over switch switches between operation modes, namely a photographing mode and a reproduction mode.
  • FIG. 2 is a flowchart illustrating the operation of the electronic camera.
  • In Step S201, the state of the main switch 120, whose function is to turn on the power supply of the system, is detected. If it is turned on, the process proceeds to Step S202; if it is turned off, Step S201 is repeated.
  • In Step S202, the remaining capacity of the image recording unit 114 is examined. If the remaining capacity is zero, the process proceeds to Step S203; otherwise, the process proceeds to Step S204.
  • In Step S203, a warning is given that the remaining capacity of the image recording unit 114 is zero, and the process returns to Step S201.
  • The warning can be given by displaying it on the operation display unit 117, by generating a warning sound from a sound outputting unit (not shown), or by a combination thereof.
  • In Step S204, whether or not the AF mode is a face detection mode is examined. If it is, the process proceeds to Step S205; if not, to Step S208. In Step S205, face detection processing is executed in the face detection module 124 to obtain face information and the face priority order, and the process proceeds to Step S206.
  • In Step S206, whether or not a face was detected by the face detection processing of Step S205 is examined. If one was, the process proceeds to Step S207; if not, to Step S208. During this time, an image generated on the basis of the picked-up image obtained by the image pickup element 108 is sequentially written to the VRAM 116 and displayed by the operation display unit 117, which thus functions as an electronic viewfinder. In Step S207, the operation display unit 117 displays frames at the positions of at most Z faces, in order of priority, among the detected faces.
  • In Step S208, the moving-image SW 123 is examined. If it is turned on, the process proceeds to Step S209; if not, to Step S213. In Step S209, recording of the moving image is started, and the process proceeds to Step S210. In Step S210, a continuous AF operation is performed in accordance with the flowcharts of FIGS. 3A and 3B.
  • In Step S211, the moving-image SW 123 is examined again. If it is turned on, the process proceeds to Step S212; if not, the process returns to Step S210.
  • Also in Step S211, the remaining capacity of the image recording unit 114 is examined. If it is zero, the process proceeds to Step S212; otherwise, the process returns to Step S210. In Step S212, the recording of the moving image is ended, and the process returns to Step S201.
  • In Step S213, whether the AF mode is a continuous AF mode or a single AF mode is examined. If it is a continuous AF mode, the process proceeds to Step S214; if it is a single AF mode, to Step S215. In Step S214, a continuous AF operation is performed in accordance with the flowcharts of FIGS. 3A and 3B (described later).
  • In Step S215, the state of SW1 is examined. If it is turned on, the process proceeds to Step S216; if not, the process returns to Step S201.
  • In Step S216, AE processing is performed at the AE processing unit 103 using an output of the image processing unit 110.
  • In Step S217, an AF operation is performed in accordance with the flowchart of FIG. 6 (described later).
  • In Step S218, the state of SW2 is examined. If it is turned on, the process proceeds to Step S219; if not, to Step S220.
  • In Step S219, a photographing operation is performed in accordance with the flowchart of FIG. 7 (described later).
  • In Step S220, the state of SW1 is examined. If it is not turned on, the process returns to Step S201. If it is turned on, the process returns to Step S218, locking the focusing operation until SW2 is turned on or SW1 is turned off.
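Steps S215 through S220 amount to a small focus-lock state machine, which could be sketched like this. The function names and callback interface are illustrative assumptions, not the patent's implementation.

```python
def shutter_sequence(sw1, sw2, do_ae, do_af, shoot):
    """Sketch of Steps S215-S220: when SW1 is on, run AE (S216) and AF (S217),
    then keep the focus locked, looping until SW2 triggers the photographing
    operation (S219) or SW1 is released (S220). Returns True if a shot was
    taken, False if the loop returned to standby."""
    if not sw1():                 # Step S215
        return False              # back to Step S201
    do_ae()                       # Step S216
    do_af()                       # Step S217
    while True:
        if sw2():                 # Step S218
            shoot()               # Step S219
            return True
        if not sw1():             # Step S220: SW1 released, unlock focus
            return False
```

In the camera itself the switch reads would poll hardware; here they are plain callables so the control flow can be exercised directly.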
  • In Step S301, a focus detection area (hereunder referred to as an "AF frame") is set in accordance with the flowcharts of FIGS. 4A and 4B (described later), and the process proceeds to Step S302.
  • In Step S302, a focus evaluation value (also called an "AF evaluation value") is obtained.
  • In Step S303, whether or not a face is detected is examined. If a face is detected, the process proceeds to Step S305; if not, to Step S304.
  • In Step S304, whether or not a reference obtaining flag, used to determine whether a reference evaluation value (reference value) should be obtained and used in the flowcharts of FIGS. 4A and 4B, is true is examined. If it is true, the process proceeds to Step S305; if it is false, to Step S306.
  • In Step S305, the focus evaluation value is stored, as the reference evaluation value, in a computing memory (not shown) built into the system controlling unit 115, and the reference obtaining flag is made false.
  • In Step S306, whether or not a peak detection flag is true is examined. If it is true, the process proceeds to Step S323; if it is false, to Step S307.
  • In Step S307, the current position of the focus lens 104 is obtained.
  • In Step S308, "1" is added to an obtaining counter, which counts the number of times the focus evaluation value and the current position of the focus lens 104 are obtained. The obtaining counter is set to 0 beforehand in an initialization operation (not shown).
  • In Step S309, whether or not the value of the obtaining counter is 1 is examined. If it is 1, the process proceeds to Step S312; if not, to Step S310.
  • In Step S310, whether or not the present focus evaluation value is greater than the previous focus evaluation value is determined. If it is greater, the process proceeds to Step S311; if not, to Step S318.
  • In Step S311, "1" is added to an increment counter, which counts increases of the present focus evaluation value over the previous one. The increment counter is set to 0 beforehand.
  • In Step S312, the present focus evaluation value is stored, as the maximum focus evaluation value, in the computing memory (not shown) built into the system controlling unit 115.
  • In Step S313, the current position of the focus lens 104 is stored, as the peak position of the focus evaluation value, in the same computing memory.
  • In Step S314, the current focus evaluation value is stored, as the previous focus evaluation value, in the same computing memory.
  • In Step S315, whether or not the current position of the focus lens 104 is at an end of the focus detection range is examined. If it is, the process proceeds to Step S316; if not, to Step S317.
  • In Step S316, the direction of movement of the focus lens 104 is reversed.
  • In Step S317, the focus lens 104 is moved by a predetermined amount.
  • In Step S318, whether or not the difference between the maximum focus evaluation value and the current focus evaluation value is greater than a predetermined value is examined. If it is greater, the process proceeds to Step S319; if not, to Step S314. In Step S319, whether or not the value of the increment counter is greater than 0 is examined. If it is not greater than 0, the process proceeds to Step S314; if it is greater than 0, to Step S320.
  • In Step S320, the focus lens 104 is moved to the peak position, stored in Step S313, where the focus evaluation value is a maximum.
  • In Step S321, the peak detection flag is made true.
  • In Step S322, the obtaining counter is reset to 0.
  • In Step S323, a focus stop flag, indicating that the focusing operation has stopped as a result of detecting a peak, is made true, and the process proceeds to Step S324.
  • In Step S324, whether or not the present focus evaluation value has varied from the maximum focus evaluation value by a proportion greater than or equal to a predetermined proportion is examined. If it has, the process proceeds to Step S326; if the variation is smaller, to Step S325. In Step S325, the position of the focus lens 104 is maintained as it is.
  • In Step S326, since the position of the focus lens where the focus evaluation value is a maximum must be re-determined, the peak detection flag and the focus stop flag are both made false, and the maximum focus evaluation value and the peak position are reset.
  • In Step S327, the increment counter is reset, ending the subroutine.
  • In this way, the focus lens is driven so that a focused state in the AF frame is maintained at all times.
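One iteration of the hill climb in Steps S306 through S322 can be sketched as below. Lens driving (Steps S315 to S317) is left to the caller, and the state dictionary, parameter names, and threshold form are illustrative assumptions.

```python
def hill_climb_step(st, value, pos, drop_thresh):
    """One continuous-AF iteration: track the maximum focus evaluation value
    and its lens position; once the value has risen at least once (increment
    counter > 0) and then fallen more than drop_thresh below the maximum,
    report the stored position as the peak. Returns the peak position, or
    None while still climbing. st holds: count, prev, max, peak, increments."""
    st['count'] += 1                                   # Step S308
    if st['count'] == 1 or value > st['prev']:         # Steps S309-S310
        if st['count'] > 1:
            st['increments'] += 1                      # Step S311
        st['max'], st['peak'] = value, pos             # Steps S312-S313
    elif st['max'] - value > drop_thresh and st['increments'] > 0:
        return st['peak']                              # Steps S318-S320: peak passed
    st['prev'] = value                                 # Step S314
    return None                                        # caller moves the lens (S315-S317)
```

Feeding it a rising-then-falling series of evaluation values returns the lens position where the value peaked, which is where Step S320 would move the focus lens.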
  • FIGS. 4A and 4B are flowcharts of the subroutine of setting continuous AF frames (Step S 301 ) in the flowchart in FIG. 3A .
  • FIGS. 5A to 5D illustrate a method of setting AF frames in the flowcharts of FIGS. 4A and 4B .
  • In Step S401, whether or not a face is detected is examined. If a face is detected, the process proceeds to Step S402; if not, to Step S404.
  • In Step S402, as shown in FIG. 5A, one AF frame is set, as the AF frame corresponding to the main object, at the position of the face having the highest priority in the most recent face detection processing result (hereunder referred to as the "main face"). Then, the process proceeds to Step S403.
  • The center of this AF frame in the image may correspond to the center of the detected main face or to a point midway between the eyes, and the frame has a size WID, as shown in FIG. 5A.
  • In Step S403, a face detection flag indicating that a face is detected is made true, a face tracking flag (described later) is made false, and a face NG counter, which counts the number of times a focus evaluation value is obtained after a face can no longer be detected, is cleared. The process then proceeds to Step S302 of FIG. 3A.
  • What has been detected as the main object to be focused can be indicated to the photographer by superposing a display of the detected face area upon the image generated on the basis of the picked-up image obtained by the image pickup element 108.
  • Here, a display indicating the area of the face refers to a display of the coordinates of a rectangular area that surrounds the detected face area and follows its size, serving as an identifying area indicating the position of the main object to be focused.
  • In identifying the face area at this time, for example, the eyes, nose, mouth, and outline of the face may be detected, and the area of a person's face determined from the relative positions of these parts.
  • The area surrounding the face area is not limited to a rectangle; it may be an ellipse, a shape following the outline of the detected face, or the like.
  • The "display indicating the area of the face" also refers to a display corresponding to an AF frame, and may be an index.
  • In this way, the AF frame is always set at the position of the main face, which is the main object, and the continuous AF operation is performed there, so that the main face can be kept in focus continuously.
  • In Step S404, whether or not the face detection flag is true is examined. If it is true, the process proceeds to Step S406; if it is false, to Step S405. In Step S405, the AF frame is set at the center of the screen, and the process proceeds to Step S302 in FIG. 3A.
  • That is, when no face has been detected, the AF frame is set at the central position, where the main object is likely to exist.
  • At this time, the area display superposed upon the image generated on the basis of the picked-up image obtained by the image pickup element 108 is switched to a display of the area corresponding to this AF frame.
  • In Step S406, whether or not the focus stop flag is true is examined. If it is true, the process proceeds to Step S422; if it is false, to Step S407.
  • Steps S407 to S421 relate to an operation for detecting movement of the face and to a face tracking operation, in which the AF frame whose focus evaluation value is substantially equal to the reference evaluation value after the movement is determined to be the movement-destination AF frame. These operations are performed when the detection of the face is unsuccessful (that is, no good (NG)).
  • In the face tracking operation, the focus evaluation value of the AF frame set from the most recent face detection result is compared with the reference evaluation value used for face movement detection, which is re-obtained in Step S304 of FIG. 3A whenever the face detection state is OK or the reference obtaining flag is true. The tracking is performed while the face tracking flag is true.
  • In Step S407, whether or not the face tracking flag is true is examined. If it is true, the process proceeds to Step S410; if it is false, to Step S408.
  • In Step S408, N × M AF frames are set with reference to the AF frame set from the most recent face detection result (the reference AF frame), and in Step S409 the face tracking flag is made true.
  • At this time, an area corresponding to the reference AF frame is superposed upon the image generated on the basis of the picked-up image obtained by the image pickup element 108 and displayed on the operation display unit 117.
  • This display makes it possible to indicate to the photographer where the camera is focusing.
  • In Step S410, whether or not a change counter, which counts the number of changes in the focus evaluation value in Step S412 (described below), has reached a predetermined value is examined. If it has, the process proceeds to Step S414; if not, to Step S411. In Step S411, whether or not the focus evaluation value has changed from the reference evaluation value by a proportion greater than or equal to a predetermined proportion (set as an evaluation-value change threshold) is examined. If it has, the process proceeds to Step S412; if not, to Step S413.
  • In Step S412, "1" is added to the change counter, which counts the number of times the focus evaluation value changes continuously from the reference evaluation value by a proportion greater than or equal to the predetermined proportion.
  • In Step S413, since the focus evaluation value has not changed by a proportion greater than or equal to the predetermined proportion, the change counter is cleared.
  • Steps S414 to S421 are performed when, in Steps S410 and S411, the focus evaluation value is determined to have changed from the reference evaluation value the required number of successive times, that is, when the face is determined to have moved.
  • These steps relate to determining, from the plurality of set AF frames, that the AF frame whose focus evaluation value is substantially equal to the reference evaluation value is the movement-destination AF frame.
  • In Step S414, a very small range centered on the current position of the focus lens is scanned. That is, the focus evaluation value corresponding to the focused state is obtained by capturing images while moving the focus lens.
  • In Step S415, the focused state is determined for every set AF frame on the basis of whether or not a peak greater than or equal to a predetermined value exists among the focus evaluation values obtained by scanning.
  • In Step S416, whether or not there exists an AF frame in which a focused state is achieved among the focused-state determination results for the plurality of AF frames is examined. If such an AF frame exists, the process proceeds to Step S417; if not, to Step S405.
  • In Step S417, the AF frame whose peak focus evaluation value obtained by scanning shows the smallest proportion of change with respect to the reference evaluation value is determined to be the most probable face AF frame and is selected.
  • In Step S418, whether or not the proportion of change of the focus evaluation value of the selected AF frame with respect to the reference evaluation value is within a predetermined proportion (set as a face-evaluation-value determining threshold) is examined. If it is within the predetermined proportion, the process proceeds to Step S419; if not, to Step S405.
  • In Step S419, the N × M AF frames are reset with reference to the selected AF frame position, as in Step S408.
  • In Step S420, the reference obtaining flag is made true, and in Step S421 the change counter is cleared.
  • In Step S422, the setting of the AF frames is maintained, and the process proceeds to Step S423.
  • In Step S423, "1" is added to the face NG counter, which counts the number of times the focus evaluation value is obtained after the face can no longer be detected, and the process proceeds to Step S424.
  • In Step S424, whether or not the value of the face NG counter is greater than or equal to a predetermined value is examined. If it is, it is determined that the main object is no longer within the screen, and the process proceeds to Step S405 to set the AF frame at the center of the screen.
  • The threshold value of the change counter in Step S410 in FIG. 4B may be changed.
  • Likewise, the evaluation-value change threshold in Step S411, the face-evaluation-value determining threshold in Step S418, and the threshold value of the face NG counter in Step S424 may be changed.
  • In Step S601, whether or not the continuous AF operation is being performed is examined. If the continuous AF operation is being performed, the process proceeds to Step S602. If not, the process proceeds to Step S604.
  • In Step S602, whether or not the peak detection flag in the continuous AF operation in FIGS. 3A and 3B is true is examined. If it is true, the process proceeds to Step S603. If not, the process proceeds to Step S604.
  • In Step S603, since a state close to the focused state may already have been achieved by the continuous AF operation, a controlling operation is performed so that scanning is performed over a range narrower than the entire focus detection range, with the current focus lens position as center. In contrast, in Step S604, a controlling operation is performed so that scanning is performed over the entire focus detection range.
  • In Step S605, an AF evaluation value computing section in the AF processing unit 105 is used to determine a focused state on the basis of whether or not there exists a peak value greater than or equal to the predetermined value in the focus evaluation values obtained by scanning.
  • In Step S606, whether or not the focused-state determination result in Step S605 indicates a focused state is examined. If it does, the process proceeds to Step S607. If not, the process proceeds to Step S608.
  • In Step S607, a controlling operation is performed so that the focus lens 104 is moved to the position it occupied when a focused-state position determining section in the AF processing unit 105 extracted a peak value greater than or equal to the predetermined value, that is, the focus lens 104 is moved to the focused-state position.
  • In addition, a controlling operation is performed so that the focused focus detection area is displayed on the operation display unit 117. In Step S608, since the entire focus detection range must still be scanned when only the narrow range has been scanned, whether or not the scanning of the entire focus detection range is completed is examined. If it is completed, the process proceeds to Step S609. If not, the process proceeds to Step S604. In Step S609, since no peak value greater than or equal to the predetermined value could be determined, the focused state is not achieved. Therefore, the focus lens 104 is moved to a previously set position called a fixed point.
  • In Step S701, the brightness of the object is measured.
  • In Step S702, exposure is performed on the image pickup element 108 in accordance with the brightness of the object measured in Step S701.
  • An image formed on a surface of the image pickup element by the exposure is subjected to photoelectric conversion and becomes an analog signal.
  • In Step S703, the analog signal is transmitted to the A/D converting unit 109 and is converted into a digital signal after pre-processing, such as noise reduction and nonlinear processing, of the output of the image pickup element 108.
  • In Step S704, the image processing unit 110 processes the output signal from the A/D converting unit 109 into a suitable output image signal.
  • In Step S705, image format conversion, such as conversion to the JPEG format, is performed on the output image signal, and the resulting signal is transmitted to and stored in the recording unit 114 in Step S706.
  • In the embodiments described above, the main object to be photographed by the photographer is a human being, and the area for focus detection is an area where a face is detected.
  • However, the present invention is not limited thereto.
  • In Step S424 in FIG. 4A, when face detection fails a number of times greater than or equal to the predetermined number while the setting of the plurality of focus detection areas is repeated, the tracking of the face is stopped, and the focus detection areas are set at the center of the screen.
  • However, the present invention is not limited thereto. That is, it is also possible to stop tracking the face and set the focus detection areas at the center of the screen when the face has not been detected for a predetermined time.
  • In Step S408 in FIG. 4B, N×M frames are set with reference to a previously set AF-frame position.
  • However, the number of AF frames that are set can also be made less than N×M.
  • In the embodiments described above, the face of an object is detected on the basis of a picked up image. Then, on the basis of the detected face area, a main object is determined and a focus detection area corresponding to the main object is set. Then, on the basis of the picked up image in the focus detection area, a focus evaluation value is calculated to perform focus adjustment.
  • However, the present invention is not limited thereto. For example, it is possible to perform focus adjustment by detecting a defocusing amount of the focus lens in a focus detection area of a phase-difference detecting sensor corresponding to an area where the face is detected.
  • In this case, the defocusing amounts of the focus lens are detected in the focus detection areas of the phase-difference detecting sensor corresponding to the main-object position where the face was previously successfully detected and to positions in the vicinity thereof. Then, focus adjustment is performed on the basis of the detection results. By this, it is possible to continuously focus the main object without performing unnecessary setting of focus detection areas.
  • N×M frames are set with reference to the AF frame where the face was most recently successfully detected.
  • Alternatively, an AF frame differing from the AF frame where the face was successfully detected may be set along with that AF frame, and focus adjustment may be performed on the basis of the result of detection in the focus detection area corresponding to this position.
  • An area corresponding to an AF frame serving as a reference is superposed upon and displayed at the image displayed by the operation display unit 117.
  • Alternatively, the areas corresponding to all of the frames can be superposed upon and displayed at the image displayed by the operation display unit 117, or the areas need not be displayed while the face is being tracked.
  • The focusing is controlled by setting the main focus detection area at the position of the main object that was successfully detected.
  • In addition, a minimum number of focus detection areas is set. Then, the movement of the main object and its destination are determined from the focus evaluation value of each focus detection area, and the main focus detection area and the other focus detection areas are repeatedly reset on the basis of the movement destination. This makes it possible to continuously focus the main object without performing unnecessary setting of focus detection areas. Further, since there is no need to perform unnecessary setting of focus detection areas, unnecessary computation time can be reduced.
  • a focus adjusting device comprising a detecting unit configured to detect an image of an object to be focused, from a picked up image; a setting unit configured to set a focus detection area for when a focused state of an image pickup optical system is detected; and a focus adjusting unit configured to adjust a focus by moving the image pickup optical system on the basis of a signal output at the focus detection area, wherein, when the detecting unit is capable of detecting the object image to be focused, but is no longer capable of performing the detection after a first focus detection area is set, the setting unit sets, in addition to the first focus detection area, at least one second focus detection area in accordance with the position of the first focus detection area.
  • a further embodiment of the present invention provides a focus adjustment method of a focus adjusting device, the method comprising detecting an image of an object to be focused, from a picked up image; setting a focus detection area for when a focused state of an image pickup optical system is detected; and adjusting a focus by moving the image pickup optical system on the basis of a signal output at the focus detection area, wherein, when the detecting step is capable of detecting the object image to be focused, but is no longer capable of performing the detection after a first focus detection area is set, the setting step sets, in addition to the set first focus detection area, a second focus detection area in accordance with the position of the first focus detection area.
  • a further embodiment of the present invention provides a focus adjusting device comprising a detecting unit configured to detect an image of an object to be focused, from a picked up image; a setting unit configured to set a focus detection area for when a focused state of an image pickup optical system is detected; and a focus adjusting unit configured to adjust a focus by moving the image pickup optical system on the basis of a signal output at the focus detection area, wherein the setting unit sets a first focus detection area corresponding to the object image that is detected by the detecting unit and that is to be focused, or both the previously set first focus detection area and a second focus detection area that is in accordance with the position of the first focus detection area.
  • a further embodiment of the present invention provides a focus adjustment method of a focus adjusting device, the method comprising detecting an image of an object to be focused, from a picked up image; setting a focus detection area for when a focused state of an image pickup optical system is detected; and adjusting a focus by moving the image pickup optical system on the basis of a signal output at the focus detection area, wherein the setting step is used to set a first focus detection area corresponding to the object image that is detected by the detecting step and that is to be focused, or both the previously set first focus detection area and a second focus detection area that is in accordance with the position of the first focus detection area.
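The frame-selection logic of Steps S415 to S418 above can be sketched as follows. This is an illustrative outline only, not code from the patent; the function name, the list of per-frame peak values, and the threshold parameters are assumptions made for the sketch.

```python
# Sketch of Steps S415-S418: pick the AF frame whose peak focus evaluation
# value changed least relative to the reference evaluation value, and accept
# it only if that change stays within the face-evaluation-value threshold.
# All names here are illustrative assumptions.

def select_face_frame(peaks, reference, focus_threshold, face_threshold):
    """Return the index of the most probable face AF frame, or None."""
    # Steps S415/S416: a frame is "focused" if it has a peak >= threshold.
    focused = [i for i, p in enumerate(peaks)
               if p is not None and p >= focus_threshold]
    if not focused:
        return None
    # Step S417: smallest proportion of change w.r.t. the reference value.
    best = min(focused, key=lambda i: abs(peaks[i] - reference) / reference)
    # Step S418: accept only if the change is within the face threshold.
    if abs(peaks[best] - reference) / reference <= face_threshold:
        return best
    return None
```

Selecting by the smallest change from the reference value reflects the assumption, stated above, that the face most likely moved to the frame whose evaluation value stayed closest to the value obtained when the face was last detected.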

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Studio Devices (AREA)
  • Automatic Focus Adjustment (AREA)
  • Focusing (AREA)

Abstract

A focus adjusting device in which, when a detecting unit that detects an object image to be focused from a picked up image is capable of detecting the object image to be focused, a setting unit that sets a focus detection area when a focused state of an image pickup optical system is detected sets a second focus detection area after setting a first focus detection area and in accordance with the position of the first focus detection area, corresponding to the object image that is detected by the detecting unit and that is to be focused. Then, on the basis of signal outputs at the set focus detection area, the image pickup optical system is driven to perform focus adjustment.

Description

BACKGROUND OF THE INVENTION
1. Field of the Invention
The present invention relates to a focus adjusting device, an image pickup apparatus, and a focus adjustment method. More particularly, the present invention relates to autofocusing used in, for example, an electronic still camera or a video camera.
2. Description of the Related Art
In an image pickup apparatus (typically, a digital camera), an autofocusing (AF) method that makes use of an object detection function, such as detection of a moving body or a human face, is well known. In this AF method, a main object is first detected from a picked up image, and focusing is controlled so that the detected main object is brought into focus.
In the AF method, when the main object is successfully detected, a focus detection area is set at the position of the detected main object, and the focusing is controlled so that the main object is brought into focus. However, when the main object cannot be detected, the focus detection area is set at an area where the main object might exist, such as the center of the screen.
When the AF method is used to keep the main object in focus at all times at the set focus detection area, as in, for example, a continuous AF operation or an AF operation during movie recording, the main object may alternately be detected and lost. In such a case, the position of the focus detection area is frequently switched between the main-object area and the central area of the screen. When the main object does not exist at the central area of the screen, as shown in FIGS. 8A and 8B, the focus position then moves back and forth between the main object and a background object, resulting in troublesome flickering of the screen. To overcome this problem, it is necessary, for example, to set the focus detection area at the position of a previously detected main object even if the main object cannot currently be detected.
As a related detecting unit that detects a main object and tries to solve the aforementioned problem, a camera provided with a line-of-sight detecting function (in which the size of an area selected by line-of-sight detection is changed in accordance with the reliability of an output of the line-of-sight detection) has been proposed (refer to Japanese Patent Laid-Open No. 6-308373). According to this document, when the reliability of the output of the line-of-sight detection is low, the size of the focus detection area is made large, so that the main object is set within the focus detection area.
In addition, a main object detecting unit that makes a focus detection area large when the main object cannot be detected is proposed in Japanese Patent Laid-Open No. 2004-138970.
Further, as an autofocusing controlling device using a moving-object detecting unit, a device that calculates the difference between AF evaluation values for image frames having different times for respective AF areas that have been previously divided and set is proposed. By calculating this difference, it is possible to extract an AF area including a moving object, so that the AF area is focused as an area to be focused. (Refer to Japanese Patent Laid-Open No. 2000-188713.)
However, Japanese Patent Laid-Open No. 6-308373 and Japanese Patent Laid-Open No. 2004-138970 discuss methods that only increase the focus detection area when the main object cannot be detected (that is, when the reliability of the detection of the main object is low). These methods do not allow the main object to be reliably set in the focus detection area unless, for example, the focus detection area is spread over the entire screen. In addition, when the focus detection area is merely increased, the proportion of the background in the focus detection area is increased. As a result, the background may be brought into focus.
In the case where a moving-object detecting unit, such as that discussed in Japanese Patent Laid-Open No. 2000-188713, is used, when the main object is a moving body, it is possible to always detect the main object and set the focus detection area accordingly. However, since a plurality of focus detection areas must be set over the entire screen, calculating the differences takes time in proportion to the number of focus detection areas. Therefore, time is wasted on the unnecessary focus detection areas. In addition, the main object cannot be detected when it does not move.
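The related-art difference approach discussed above can be illustrated with a minimal sketch. This is an assumed outline of the general technique described in Japanese Patent Laid-Open No. 2000-188713, not code from that document; the function and variable names are illustrative.

```python
# Sketch of the related-art moving-object extraction: the screen is divided
# into AF areas, and the area whose AF evaluation value changed most between
# two image frames of different times is treated as containing the mover.
# Note the per-area difference computation, whose cost grows with the number
# of areas, which is the drawback pointed out above.

def moving_object_area(prev_values, curr_values):
    """Return the index of the AF area with the largest evaluation-value
    change between two image frames."""
    diffs = [abs(c - p) for p, c in zip(prev_values, curr_values)]
    return max(range(len(diffs)), key=diffs.__getitem__)
```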
SUMMARY OF THE INVENTION
The present invention provides a focus adjusting device that can continue focusing a main object without performing unnecessary setting of focus detection areas, an image pickup apparatus, and a focus adjustment method.
The present invention makes it possible to continue focusing a main object without performing unnecessary setting of focus detection areas.
According to an aspect of the present invention, it is possible to carry out the following. That is, when a detecting unit that detects, from a picked up image, an object image to be focused is capable of detecting the object image to be focused, but is no longer capable of detecting the object image after setting a first focus detection area, a setting unit that sets a focus detection area for detecting a focused state of an image pickup optical system sets a second focus detection area, in addition to the first focus detection area, in accordance with the position of the first focus detection area. Then, on the basis of signal outputs at the set first and second focus detection areas, the image pickup optical system is moved to adjust the focus.
Further features of the present invention will become apparent from the following description of exemplary embodiments with reference to the attached drawings.
BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 is a block diagram of a structure of an electronic camera to which the present invention is applied.
FIG. 2 is a flowchart illustrating the operation of the electronic camera to which the present invention is applied.
FIGS. 3A and 3B are flowcharts of a subroutine of continuous AF operations in FIG. 2.
FIGS. 4A and 4B are flowcharts of a subroutine of setting an AF frame during the continuous AF operation in FIGS. 3A and 3B.
FIGS. 5A to 5D illustrate a method of setting an AF frame during the continuous AF operation in FIGS. 4A and 4B.
FIG. 6 is a flowchart of a subroutine of an AF operation in FIG. 2.
FIG. 7 is a flowchart of a subroutine of a photographing operation in FIG. 2.
FIGS. 8A and 8B illustrate a problem in setting an AF frame in a related art.
DESCRIPTION OF THE EMBODIMENTS
Embodiments of the present invention will now be described with reference to the drawings.
Description of Apparatus
FIG. 1 is a block diagram of a structure of an electronic camera, which is an image pickup apparatus according to an embodiment of the present invention. Reference numeral 101 denotes a photographing lens including a zoom mechanism, and reference numeral 102 denotes an aperture-and-shutter that controls light quantity. Reference numeral 103 denotes an automatic exposure control (AE) processing unit, and reference numeral 104 denotes a focus lens that performs focus adjustment on an image pickup element (described later). The photographing lens 101, the aperture-and-shutter 102, and the focus lens 104 are also called an image pickup optical system.
Reference numeral 105 denotes an AF processing unit, reference numeral 106 denotes a stroboscope, and reference numeral 107 denotes a flash pre-emission (EF) processing unit. Reference numeral 108 denotes the image pickup element serving as a light detecting unit or a photoelectric converting unit that converts reflected light from an object into an electrical signal (signal output) and outputs a picked up image. Reference numeral 109 denotes an A/D converting unit including a correlated double sampling (CDS) circuit that reduces output noise of the image pickup element 108 or a nonlinear amplifying circuit that performs nonlinear amplification prior to A/D conversion.
Reference numeral 110 denotes an image processing unit, reference numeral 111 denotes a white balance (WB) processing unit, and reference numeral 112 denotes a format converting unit. Reference numeral 113 denotes a high-speed internal memory (such as a random access memory; hereunder referred to as “DRAM”), reference numeral 114 denotes an image recording unit including a recording medium, such as a memory card, and an interface thereof. Reference numeral 115 denotes a system controlling unit that controls a system, such as a photographing sequence, and reference numeral 116 denotes an image display memory (hereunder referred to as “VRAM”).
Reference numeral 117 denotes an operation display unit that displays an index corresponding to a focus detection area and a photographing screen when performing a photographing operation, in addition to displaying an image, performing a displaying operation for assisting an operation, and displaying the state of the camera. Reference numeral 118 denotes an operating unit for operating the camera from the outside. Reference numeral 119 denotes a photographing mode switch that performs a setting operation, such as turning on and off a face detection mode.
Reference numeral 120 denotes a main switch for turning on a power supply of the system, reference numeral 121 denotes a switch (hereunder referred to as “SW1”) for performing a photographing standby operation (such as AF or AE), and reference numeral 122 denotes a photographing switch (hereunder referred to as “SW2”) for performing a photographing operation after operating the SW1. Reference numeral 123 denotes a moving-image switch (hereafter referred to as “moving-image SW”) that starts or ends the photographing operation of a moving image. Reference numeral 124 denotes a face detection module that detects a face as an object image using an image signal processed at the image processing unit 110, and that sends to the system controlling unit 115 information (position, size, and reliability) of one detected face or a plurality of detected faces and the order of priority determined by the face information. The method of detecting a face will not be described in detail below because it is not the central feature of the present invention.
The DRAM 113 is used as a high-speed buffer serving as a temporary image storage unit, or as a working memory for, for example, compressing or expanding an image. Examples of the operating unit 118 include a menu switch, a zoom lever, and an operation mode change-over switch. The menu switch is used for various setting operations, such as setting the photographing function of the electronic camera and settings performed when reproducing an image. The zoom lever instructs a zooming operation of the photographing lens. The change-over switch switches between operation modes, that is, a photographing mode and a reproduction mode.
Flow of Operation of Apparatus
An operation according to an embodiment of the present invention will now be described in detail with reference to FIG. 2. FIG. 2 is a flowchart illustrating the operation of the electronic camera.
First, in Step S201, the main switch 120 is detected. If it is turned on, the process proceeds to Step S202. Here, the function of the main switch 120 is to turn on the power supply of the system. If the main switch 120 is turned off, Step S201 is repeated. In Step S202, the remaining capacity of the image recording unit 114 is examined. If the remaining capacity is zero, the process proceeds to Step S203. If it is not zero, the process proceeds to Step S204. In Step S203, a warning is given that the remaining capacity at the image recording unit 114 is zero, and the process returns to Step S201. The warning can be achieved by displaying it on the operation display unit 117, generating a warning sound from a sound outputting unit (not shown), or a combination thereof.
In Step S204, whether or not an AF mode is a face detection mode is examined. If it is a face detection mode, the process proceeds to Step S205. If it is not a face detection mode, the process proceeds to Step S208. In Step S205, a face detection processing operation is executed in the face detection module 124, to obtain face information and face priority order. Then, the process proceeds to Step S206.
In Step S206, whether or not a face is detected in the face detection processing operation of Step S205 is examined. If a face is detected, the process proceeds to Step S207. If not, the process proceeds to Step S208. At this time, an image generated on the basis of a picked up image obtained by the image pickup element 108 is sequentially displayed. By sequentially displaying on the operation display unit 117 the image sequentially written to the VRAM 116, the operation display unit 117 functions as an electronic finder. In Step S207, frames are displayed on the operation display unit 117 at the positions of at most Z of the detected faces, in order of priority. That is, only Z frames (Z being a design value) are displayed, so that the display does not become troublesome. For example, in a group photograph of tens of people, if frames were displayed for all of the faces, the screen would become covered with frames, making it difficult to see the image. To prevent this, the number of frames displayed at face positions is limited. Then, the process proceeds to Step S208.
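The frame-limiting behavior of Step S207 can be sketched as follows. The `Face` record and the default value of Z are assumptions for illustration; the patent only specifies that the face detection module reports position, size, reliability, and a priority order, and that at most Z frames are displayed.

```python
# Illustrative sketch of Step S207: display frames for at most Z faces,
# chosen by the priority order reported by the face detection module 124.
from collections import namedtuple

# Assumed record of the face information described in the text.
Face = namedtuple("Face", "position size reliability priority")

def frames_to_display(faces, z=3):
    """Return the positions of the Z highest-priority detected faces."""
    # Assumption: a lower priority value means a higher priority.
    ranked = sorted(faces, key=lambda f: f.priority)
    return [f.position for f in ranked[:z]]
```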
In Step S208, the moving-image SW 123 is detected. If it is turned on, the process proceeds to Step S209. If it is not turned on, the process proceeds to Step S213. In Step S209, recording of the moving image is started, and the process proceeds to Step S210. In Step S210, a continuous AF operation is performed in accordance with the flowcharts of FIGS. 3A and 3B. In Step S211, the moving-image SW 123 is detected and the remaining capacity of the image recording unit 114 is examined. If the switch is turned on or the remaining capacity is zero, the process proceeds to Step S212. Otherwise, the process returns to Step S210. In Step S212, the recording of the moving image is ended, and the process returns to Step S201.
In Step S213, whether the AF mode is a continuous AF mode or a single AF mode is examined. If it is a continuous AF mode, the process proceeds to Step S214. If it is a single AF mode, the process proceeds to Step S215. In Step S214, a continuous AF operation is performed in accordance with the flowcharts of FIGS. 3A and 3B (described later).
In Step S215, the state of the SW1 is examined. If it is turned on, the process proceeds to Step S216. If it is not turned on, the process returns to Step S201. In Step S216, at the AE processing unit 103, AE processing is performed from an output of the image processing unit 110.
In Step S217, an AF operation is performed in accordance with the flowchart of FIG. 6 (described later). In Step S218, the state of SW2 is examined. If it is turned on, the process proceeds to Step S219. If it is not turned on, the process proceeds to Step S220. In Step S219, a photographing operation is performed in accordance with the flowchart of FIG. 7 (described later). In Step S220, the state of the SW1 is examined. If it is not turned on, the process returns to Step S201. If it is turned on, the process returns to Step S218, to lock the focusing operation until the SW2 is turned on or the SW1 is turned off.
Continuous AF Subroutine
The subroutine of the continuous AF operations of Step S210 and Step S214 in the flowchart of FIG. 2 will now be described with reference to the flowcharts of FIGS. 3A and 3B.
First, in Step S301, a focus detection area (hereunder referred to as "AF frame") is set in accordance with the flowcharts of FIGS. 4A and 4B (described later), and the process proceeds to Step S302. In Step S302, a focus evaluation value (also called "AF evaluation value") is obtained. In Step S303, whether or not a face is detected is examined. If a face is detected, the process proceeds to Step S305. If not, the process proceeds to Step S304. In Step S304, a reference obtaining flag, which is used in the flowcharts of FIGS. 4A and 4B (described later) to determine whether or not a reference evaluation value (reference value) has been obtained, is examined. If it is true, the process proceeds to Step S305. If it is false, the process proceeds to Step S306. In Step S305, the focus evaluation value is stored, as a reference evaluation value, in a computing memory (not shown) built in the system controlling unit 115. In addition, the reference obtaining flag is made false.
In Step S306, whether or not a peak detection flag is true is examined. If it is true, the process proceeds to Step S323. If it is false, the process proceeds to Step S307. In Step S307, the current position of the focus lens 104 is obtained. In Step S308, “1” is added to an obtaining counter for counting the number of times the focus evaluation value is obtained and the number of times the current position of the focus lens 104 is obtained. In an initialization operation (not shown), the obtaining counter is previously set at 0. In Step S309, whether or not the value of the obtaining counter is 1 is examined. If it is 1, the process proceeds to Step S312. If it is not 1, the process proceeds to Step S310.
In Step S310, whether or not the present focus evaluation value is greater than the previous focus evaluation value is determined. If the present evaluation value is greater than the previous focus evaluation value, the process proceeds to Step S311. If it is not, the process proceeds to Step S318.
In Step S311, “1” is added to an increment counter that counts increments of the present evaluation value from the previous focus evaluation value. In an initialization operation (not shown), the increment counter is previously set at 0.
In Step S312, the present focus evaluation value is stored, as a maximum value of the focus evaluation value, in the computing memory (not shown) built in the system controlling unit 115. In Step S313, the current position of the focus lens 104 is stored, as a peak position of the focus evaluation value, in the computing memory (not shown) built in the system controlling unit 115. In Step S314, the current focus evaluation value is stored, as the previous focus evaluation value, in the computing memory (not shown) built in the system controlling unit 115. In Step S315, whether or not the current position of the focus lens 104 is at an end of a focus detection range is examined. If it is at an end of the focus detection range, the process proceeds to Step S316. If it is not at an end of the focus detection range, the process proceeds to Step S317. In Step S316, the direction of movement of the focus lens 104 is reversed. In Step S317, the focus lens 104 is moved by a predetermined amount.
In Step S318, whether or not the difference between the maximum value of the focus evaluation value and the current focus evaluation value is greater than a predetermined value is examined. If the difference is greater than the predetermined value, the process proceeds to Step S319. If not, the process proceeds to Step S314. In Step S319, whether or not the value of the increment counter is greater than 0 is examined. If it is not greater than 0, the process proceeds to Step S314. If it is greater than 0, the process proceeds to Step S320. Here, if, after the focus evaluation value has continually increased, the difference between the maximum value of the focus evaluation value and the current focus evaluation value becomes greater than the predetermined value, that is, the value falls from the maximum by the predetermined amount, the maximum value is determined to correspond to a focus peak position. In Step S320, the focus lens 104 is moved to the peak position, stored in Step S313, where the focus evaluation value is a maximum. In Step S321, the peak detection flag is made true. In Step S322, the obtaining counter is set at 0.
In Step S323, a focus stop flag, which indicates that the focusing operation has been stopped as a result of detecting a peak, is made true, and the process proceeds to Step S324. In Step S324, whether or not the present focus evaluation value has varied with respect to the maximum value of the focus evaluation value by a proportion greater than or equal to a predetermined proportion is examined. If it has, the process proceeds to Step S326. If it has varied by a smaller proportion, the process proceeds to Step S325. In Step S325, the position of the focus lens 104 is maintained as it is. In Step S326, since the position of the focus lens where the focus evaluation value is a maximum must be re-determined, the peak detection flag and the focus stop flag are both made false, and the maximum value of the focus evaluation value and the peak position are reset. In Step S327, the increment counter is reset to end the subroutine.
As described above, in the continuous AF operation, the focus lens is driven so that a focused state in the AF frame is achieved at all times.
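The peak search of Steps S312 through S327 can be sketched as a simple hill climb. The following Python sketch is illustrative only; the state keys, the drop threshold, and the return convention are assumptions and are not taken from the patent:

```python
def new_af_state():
    """Fresh search state (names are illustrative, not from the patent)."""
    return {"max_value": float("-inf"), "prev_value": float("-inf"),
            "increment_count": 0, "peak_pos": None}

def continuous_af_step(state, eval_value, lens_pos, drop_threshold):
    """One hill-climb iteration over Steps S312-S320: remember the best
    evaluation value and its lens position, count successive increases,
    and declare a peak once the value has fallen from the maximum by
    more than drop_threshold. Returns the peak position, or None to
    keep scanning."""
    if eval_value > state["max_value"]:            # Steps S312-S313
        state["max_value"] = eval_value
        state["peak_pos"] = lens_pos
    if eval_value > state["prev_value"]:           # count successive rises
        state["increment_count"] += 1
    state["prev_value"] = eval_value               # Step S314
    dropped = state["max_value"] - eval_value > drop_threshold  # Step S318
    if dropped and state["increment_count"] > 0:   # Step S319
        return state["peak_pos"]                   # Step S320: move here
    return None
```

With evaluation values 1, 3, 5, 4, 2 sampled at lens positions 0 through 4 and a drop threshold of 2, the peak at position 2 is reported on the final sample.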
Subroutine of Setting AF Frames
A subroutine of setting continuous AF frames of Step S301 in the flowchart of FIG. 3A will now be described with reference to the flowcharts of FIGS. 4A and 4B and the illustrations of FIGS. 5A to 5D. FIGS. 4A and 4B are flowcharts of the subroutine of setting continuous AF frames (Step S301) in the flowchart in FIG. 3A. FIGS. 5A to 5D illustrate a method of setting AF frames in the flowcharts of FIGS. 4A and 4B.
First, in Step S401, whether or not a face is detected is examined. If a face is detected, the process proceeds to Step S402. If a face is not detected, the process proceeds to Step S404. In Step S402, as shown in FIG. 5A, one AF frame is set, as an AF frame corresponding to a main object, at the position of the face having the highest priority in the most recent face detection processing result (hereunder referred to as the "main face"). Then, the process proceeds to Step S403. Here, the center of the AF frame in an image may correspond to the center of the detected main face or to a point midway between the eyes. "WID," the size shown in FIG. 5A, is determined on the basis of the size of the detected main face. In Step S403, a face detection flag indicating that a face is detected is made true. In addition, a face tracking flag (described later) is made false. Thereafter, a face NG counter, which counts the number of times a focus evaluation value is obtained from the state in which a face can no longer be detected, is cleared, and the process proceeds to Step S302 of FIG. 3A. At this time, what is detected as the main object to be focused can be indicated to the photographer by superposing a display of the detected face area on an image generated on the basis of a picked up image obtained by the image pickup element 108. Here, "a display indicating the area of the face" refers to a display indicating the coordinates of a rectangular area that surrounds the face area and conforms to the size of the detected face area, as an identifying area indicating the position of the main object to be focused. In identifying the face area at this time, for example, a pair of eyes, a nose, a mouth, and the lines of the face may be detected, and the area of the face of a person determined from the relative positions of these parts.
However, the area surrounding the face area is not limited to a rectangular area, so that it may be an elliptical area, a shape extending along the lines of the detected face, etc. The “display indicating the area of the face” also refers to a display corresponding to an AF frame, and may be an index.
Accordingly, during the period in which the face is being detected, the AF frame is always set at the position of the main face, which is the main object, to perform the continuous AF operation, so that the main face can be continuously focused.
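The placement of the single AF frame on the main face in Step S402 can be sketched as follows. The `scale` factor relating the frame size WID to the detected face size is an assumption; the patent states only that WID is determined on the basis of the face size:

```python
def af_frame_from_face(face_center, face_size, scale=1.0):
    """Sketch of Step S402: one square AF frame of side WID, centered on
    the detected main face (or midway between the eyes). The scale
    factor is a hypothetical parameter, not specified in the patent."""
    wid = face_size * scale
    x, y = face_center
    # Frame stored as (left, top, width, height), centered on the face.
    return (x - wid / 2, y - wid / 2, wid, wid)
```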
In Step S404, whether or not the face detection flag is true is examined. If the face detection flag is true, the process proceeds to Step S406. If the face detection flag is false, the process proceeds to Step S405. In Step S405, the AF frame is set at the center of a screen. The process then proceeds to Step S302 in FIG. 3A.
Accordingly, when a face is not detected, or when the position of the face cannot be estimated, the AF frame is set at the center position where the main object might exist. In addition, the display of the area that is superposed upon the image that is generated on the basis of the picked up image obtained by the image pickup element 108 is switched to a display of an area corresponding to the AF frame.
In Step S406, whether or not the focus stop flag is true is examined. If the focus stop flag is true, the process proceeds to Step S422. If it is false, the process proceeds to Step S407.
The operations in the following Steps S407 to S421 relate to detecting movement of the face and to a face tracking operation (in which the AF frame indicating a focus evaluation value substantially equal to the reference evaluation value after the movement of the face is detected is determined as the movement-destination AF frame). These operations are performed when face detection is unsuccessful (that is, no good (NG)). The focus evaluation value of the AF frame set from the most recent face detection result is compared with the reference evaluation value used for face movement detection, which is re-obtained in Step S304 of FIG. 3A when the face detection state is OK or the reference obtaining flag is true, and the face tracking operation is performed accordingly. While the face tracking operation is performed, the face tracking flag is true.
In Step S407, whether or not the face tracking flag is true is examined. If the face tracking flag is true, the process proceeds to Step S410. If it is false, the process proceeds to Step S408. In Step S408, with reference to the previously set AF-frame position, that is, the AF frame set from the most recent result of face detection, N×M frames are set. Then, the process proceeds to Step S409. For example, when N=3 and M=3, as shown in FIG. 5B, 3×3=9 AF frames are set. The AF frame used to determine a change in the focus evaluation value in Step S411 (described later) corresponds to the AF frame set at the previously set AF-frame position. In Step S409, the face tracking flag is made true. When the face tracking flag is true, N×M frames are set. However, instead of the areas corresponding to all the frames, only the area corresponding to the reference AF frame is superposed on the image that is generated on the basis of the picked up image obtained by the image pickup element 108 and displayed on the operation display unit 117. This display indicates to the photographer where the camera is focusing.
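The tiling of N×M tracking frames around the reference frame in Step S408 can be sketched as below. The tuple layout and grid indexing are assumptions for illustration:

```python
def set_tracking_frames(ref_frame, n=3, m=3):
    """Sketch of Step S408: N x M AF frames tiled around the reference
    frame (the frame from the last successful face detection). Frames
    are (left, top, width, height) tuples; this representation is an
    assumption, not from the patent."""
    left, top, w, h = ref_frame
    frames = []
    for row in range(m):
        for col in range(n):
            # Offset so the reference frame sits at the grid center.
            frames.append((left + (col - n // 2) * w,
                           top + (row - m // 2) * h, w, h))
    return frames
```

For N=M=3 this yields nine frames, with the original reference frame in the middle of the grid.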
In Step S410, whether or not the value of a change counter, which counts the number of changes in the focus evaluation value in Step S412 (described below), is equal to a predetermined value is examined. If the value of the change counter is equal to the predetermined value, the process proceeds to Step S414. If it is not, the process proceeds to Step S411. In Step S411, whether or not the focus evaluation value has changed from the reference evaluation value by a proportion greater than or equal to a predetermined proportion (which is set as an evaluation-value change threshold value) is examined. If it has, the process proceeds to Step S412. If not, the process proceeds to Step S413. In Step S412, "1" is added to the change counter, which counts the number of times the focus evaluation value changes continuously from the reference evaluation value by a proportion greater than or equal to the predetermined proportion. In Step S413, since the focus evaluation value has not changed by a proportion greater than or equal to the predetermined proportion, the change counter is cleared.
In this way, when the position of the main object changes as shown in, for example, FIGS. 5B and 5C, the focus evaluation value of the AF frame set at the previously set AF-frame position changes, making it possible to determine that the main face, which is the main object, has moved.
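The change-counter logic of Steps S410 through S413 can be sketched as follows. The 20% threshold is an assumed value; the patent leaves the evaluation-value change threshold unspecified:

```python
def update_change_counter(counter, eval_value, reference, threshold=0.2):
    """Sketch of Steps S411-S413: count successive cycles in which the
    focus evaluation value of the reference AF frame deviates from the
    reference value by at least `threshold` (a proportion); the counter
    is cleared as soon as the deviation falls below the threshold."""
    changed = abs(eval_value - reference) / reference >= threshold
    return counter + 1 if changed else 0
```

Once the returned counter reaches the predetermined value checked in Step S410, the face is judged to have moved and the micro-scan of Step S414 begins.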
The operations in the following Steps S414 to S421 are performed when, in Steps S410 and S411, it is determined that the focus evaluation value is changed with respect to the reference evaluation value and that the face moves by a number of times corresponding to the number of successive changes. In addition, these operations are related to an operation that is performed when, from a plurality of set AF frames, it is determined that the AF frame indicating a focus evaluation value that is substantially equal to the reference evaluation value is the movement-destination AF frame.
In Step S414, a very small range is scanned with the current position of the focus lens as center. That is, the focus evaluation value corresponding to a focused state is obtained by obtaining images while moving the focus lens. In Step S415, the focused state is determined for every set AF frame on the basis of whether or not a peak value greater than or equal to a predetermined value exists in the focus evaluation values obtained by scanning. In Step S416, whether or not there exists an AF frame in which a focused state is achieved among the focused-state determination results of the plurality of AF frames is examined. If such an AF frame exists, the process proceeds to Step S417. If not, the process proceeds to Step S405. In Step S417, the AF frame whose peak focus evaluation value obtained by scanning exhibits the smallest proportion of change with respect to the reference evaluation value is determined to be the most probable face AF frame and is selected. In Step S418, whether or not the proportion of change of the focus evaluation value of the selected AF frame with respect to the reference evaluation value is within a predetermined proportion (which is set as a face-evaluation-value determining threshold value) is examined. If the proportion of change is within the predetermined proportion, the process proceeds to Step S419. If not, the process proceeds to Step S405. In Step S419, with reference to the selected AF-frame position, the N×M AF frames are reset as in Step S408. In Step S420, the reference obtaining flag is made true, and in Step S421 the change counter is cleared.
In this way, for example, when the position of the main object changes from the position shown in FIG. 5B to the position shown in FIG. 5C, the movement of the face and the movement-destination frame to which the face moves are determined, so that the AF frames can be reset as shown in FIG. 5D. By this operation, even after the face can no longer be detected, it is possible to continue setting the AF frames at the position of the face, so that the focusing of the face can be continued. In addition, it is not necessary to provide a plurality of frames over the entire screen. That is, the frames only need to be provided at and in the vicinity of the face, so that computation time is not wasted.
In Step S422, the setting of the AF frames is maintained, and the process proceeds to Step S423. In Step S423, “1” is added to the face NG counter (which counts the number of times the focus evaluation value is obtained from the state in which the face can no longer be detected), and the process proceeds to Step S424. In Step S424, whether or not the value of the face NG counter is greater than or equal to a predetermined value is examined. If it is greater than or equal to the predetermined value, it is determined that the main object is not within the screen, and the process proceeds to Step S405 to set the AF frames at the center of the screen.
Accordingly, if, after the face has been detected, face detection fails a number of times greater than or equal to a predetermined number of times, it is determined that the face is no longer within the screen, and the tracking of the face is stopped. In the continuous AF operation during recording of the moving image in Step S210 in FIG. 2 and the continuous AF operation prior to picking up a still image in Step S214 in FIG. 2, the threshold value of the change counter in Step S410 in FIG. 4B may be changed. In addition, the evaluation-value change threshold value in Step S411, the face-evaluation-value determining threshold value in Step S418, and the threshold value of the face NG counter in Step S424 may be changed.
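The face NG counter logic of Steps S422 through S424 can be sketched as below; the limit of 10 cycles is an assumed value, since the patent specifies only "a predetermined value":

```python
def next_af_setting(ng_count, limit=10):
    """Sketch of Steps S422-S424: increment the face NG counter each
    evaluation cycle in which the face is not detected; keep the
    current tracking frames until the counter reaches the limit, then
    give up and fall back to a single frame at the screen center."""
    ng_count += 1
    if ng_count >= limit:
        return "center", 0      # Step S405: reset to screen center
    return "keep", ng_count     # Step S422: maintain current frames
```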
Subroutine of AF Operation
A subroutine of the AF operation in S217 in the flowchart of FIG. 2 will now be described with reference to the flowchart of FIG. 6.
First, in Step S601, whether or not the continuous AF operation is being performed is examined. If the continuous AF operation is being performed, the process proceeds to Step S602. If not, the process proceeds to Step S604. In Step S602, whether or not the peak detection flag in the continuous AF operation in FIGS. 3A and 3B is true is examined. If it is true, the process proceeds to Step S603. If not, the process proceeds to Step S604. In Step S603, a state close to the focused state may be achieved due to the continuous AF operation. Therefore, a controlling operation is performed so that scanning is performed over a range that is narrower than the entire focus detection range with the current focus lens position as center. In contrast, in Step S604, a controlling operation is performed so that scanning is performed over the entire focus detection range.
In Step S605, an AF evaluation value computing section in the AF processing unit 105 is used to determine a focused state on the basis of whether or not there exists a peak value greater than or equal to the predetermined value in the focus evaluation values obtained by scanning. In Step S606, whether or not the focused-state determination result in Step S605 indicates a focused state is examined. If it does, the process proceeds to Step S607. If not, the process proceeds to Step S608. In Step S607, a controlling operation is performed so that the focus lens 104 is moved to the position it occupied when a focused-state position determining section in the AF processing unit 105 extracted a peak value greater than or equal to the predetermined value, that is, to the focused-state position. In addition, a controlling operation is performed so that the focused focus detection area is displayed on the operation display unit 117. In Step S608, since the entire focus detection range must still be scanned when only the narrow range has been scanned, whether or not the scanning of the entire focus detection range is completed is examined. If it is completed, the process proceeds to Step S609. If not, the process proceeds to Step S604. In Step S609, since no peak value greater than or equal to the predetermined value could be determined, the focused state is not achieved. Therefore, the focus lens 104 is moved to a previously set position called a fixed point.
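The scan-range decision of Steps S602 through S604 can be sketched as below. The lens-position units, the full range of 0 to 1000, and the narrow window width are all assumptions for illustration:

```python
def af_scan_range(peak_detected, lens_pos, full_range=(0, 1000), narrow=50):
    """Sketch of Steps S602-S604: when the continuous AF operation has
    already found a peak (peak detection flag true), scan only a narrow
    window around the current lens position; otherwise scan the entire
    focus detection range. The window is clamped to the full range."""
    lo, hi = full_range
    if peak_detected:
        return (max(lo, lens_pos - narrow), min(hi, lens_pos + narrow))
    return full_range
```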
Subroutine of Photographing Operation
A subroutine of the photographing operation in Step S219 in the flowchart of FIG. 2 will now be described with reference to the flowchart of FIG. 7.
First, in Step S701, the brightness of the object is measured. In Step S702, exposure is performed on the image pickup element 108 in accordance with the brightness of the object measured in Step S701. An image formed on a surface of the image pickup element by the exposure is subjected to photoelectric conversion and becomes an analog signal. In Step S703, the analog signal is transmitted to the A/D converting unit 109, and is converted into a digital signal after pre-processing such as nonlinear processing or noise reduction of an output of the image pickup element 108. Then, in Step S704, the image processing unit 110 processes an output signal from the A/D converting unit 109 into a suitable output image signal. Then, in Step S705, image format conversion, such as conversion to a JPEG format, is performed on the output image signal, and the resulting output image signal is transmitted to and stored in the recording unit 114 in Step S706.
Although, in the foregoing description, the main object to be photographed by the photographer is a human being, and an area for focus detection is an area where the face is detected, the present invention is not limited thereto. For example, it is possible to cut out an object image from the background, and detect this as a main object, to set an area corresponding to the position of the object as a focus detection area.
In Step S424 in FIG. 4A, when face detection has failed a number of times greater than or equal to the predetermined number of times and the setting of the plurality of focus detection areas has been repeated, the tracking of the face is stopped and the focus detection areas are set at the center of the screen. However, the present invention is not limited thereto. For example, tracking may instead be stopped, and the focus detection areas set at the center of the screen, when the face remains undetected after a predetermined time has elapsed.
In Step S408 in FIG. 4B, N×M frames are set with reference to a previously set AF-frame position. However, when the previously set AF-frame position is so close to an edge of the screen that the full N×M frames cannot be set, fewer than N×M frames may be set.
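This edge case can be sketched by discarding any frame falling outside the screen, so that fewer than N×M frames remain near an image border. The tuple layout and screen dimensions are assumptions for illustration:

```python
def clamp_frames_to_screen(frames, width, height):
    """Sketch of the edge case noted for Step S408: frames that fall
    partly outside the screen are simply discarded, so fewer than
    N x M frames may remain near an image border. Each frame is a
    (left, top, width, height) tuple."""
    return [(l, t, w, h) for (l, t, w, h) in frames
            if l >= 0 and t >= 0 and l + w <= width and t + h <= height]
```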
In the foregoing description, the face of an object is detected on the basis of a picked up image. Then, on the basis of a detected face area, a main object is determined to set a focus detection area corresponding to the main object. Then, on the basis of the picked up image in the focus detection area, a focus evaluation value is calculated to perform focus adjustment. However, the present invention is not limited thereto. For example, it is possible to detect a defocusing amount of the focus lens in a focus detection area of a phase-difference detecting sensor corresponding to an area where the face is detected, to perform focus adjustment. Even in this case, when the face, or the main object, can no longer be detected, the defocusing amounts of the focus lens in focus detection areas of the phase-difference detecting sensor, corresponding to the main-object position when the face is previously successfully detected and the position of the vicinity thereof, are detected. Then, focus adjustment is performed on the basis of the detection results. By this, it is possible to continuously focus the main object without performing unnecessary setting of focus detection areas.
In the foregoing description, when face detection is unsuccessfully performed (that is, no good (NG)), N×M frames are set with reference to the AF frame where the most recent face detection was successful. However, a large number of frames is not required; it suffices to set, along with the AF frame where the face was successfully detected, at least one AF frame differing from it, and to perform focus adjustment on the basis of the detection result in the focus detection area corresponding to this position.
When the face tracking flag is true, instead of the areas corresponding to all the frames, only the area corresponding to the AF frame serving as a reference is superposed on the image displayed by the operation display unit 117. However, the areas corresponding to all the frames may instead be superposed on the displayed image, or no areas may be displayed while the face is being tracked.
In the embodiment described above, even if the main object can no longer be successfully detected, focusing is controlled by setting the main focus detection area at the main object position that was last successfully detected. In addition, with reference to the main focus detection area, a minimum number of focus detection areas is set. Then, the movement of the main object and its destination are determined from the focus evaluation value of each focus detection area, and the main focus detection area and the other focus detection areas are repeatedly reset on the basis of the movement destination. This makes it possible to continuously focus the main object without unnecessary setting of focus detection areas. Further, since unnecessary focus detection areas need not be set, unnecessary computation time can be reduced.
Other Embodiments
According to a further embodiment of the present invention there is provided a focus adjusting device comprising a detecting unit configured to detect an image of an object to be focused, from a picked up image; a setting unit configured to set a focus detection area for when a focused state of an image pickup optical system is detected; and a focus adjusting unit configured to adjust a focus by moving the image pickup optical system on the basis of a signal output at the focus detection area, wherein, when the detecting unit is capable of detecting the object image to be focused, but is no longer capable of performing the detection after a first focus detection area is set, the setting unit sets, in addition to the first focus detection area, at least one second focus detection area in accordance with the position of the first focus detection area.
A further embodiment of the present invention provides a focus adjustment method of a focus adjusting device, the method comprising detecting an image of an object to be focused, from a picked up image; setting a focus detection area for when a focused state of an image pickup optical system is detected; and adjusting a focus by moving the image pickup optical system on the basis of a signal output at the focus detection area, wherein, when the detecting step is capable of detecting the object image to be focused, but is no longer capable of performing the detection after a first focus detection area is set, the setting step sets, in addition to the set first focus detection area, a second focus detection area in accordance with the position of the first focus detection area.
A further embodiment of the present invention provides a focus adjusting device comprising a detecting unit configured to detect an image of an object to be focused, from a picked up image; a setting unit configured to set a focus detection area for when a focused state of an image pickup optical system is detected; and a focus adjusting unit configured to adjust a focus by moving the image pickup optical system on the basis of a signal output at the focus detection area, wherein the setting unit sets a first focus detection area corresponding to the object image that is detected by the detecting unit and that is to be focused, or both the previously set first focus detection area and a second focus detection area that is in accordance with the position of the first focus detection area.
A further embodiment of the present invention provides a focus adjustment method of a focus adjusting device, the method comprising detecting an image of an object to be focused, from a picked up image; setting a focus detection area for when a focused state of an image pickup optical system is detected; and adjusting a focus by moving the image pickup optical system on the basis of a signal output at the focus detection area, wherein the setting step is used to set a first focus detection area corresponding to the object image that is detected by the detecting step and that is to be focused, or both the previously set first focus detection area and a second focus detection area that is in accordance with the position of the first focus detection area.
While the present invention has been described with reference to exemplary embodiments, it is to be understood that the invention is not limited to the disclosed exemplary embodiments. The scope of the following claims is to be accorded the broadest interpretation so as to encompass all modifications, equivalent structures and functions.
This application claims the benefit of Japanese Application No. 2007-029355 filed Feb. 8, 2007, which is hereby incorporated by reference herein in its entirety.

Claims (9)

1. A focus adjusting device comprising:
a detecting unit configured to detect a focusing target from a captured image;
a setting unit configured to set, in response to the detection of the focusing target by the detecting unit, a focus detection area for detecting a focusing condition of a focusing lens; and
a focus adjusting unit configured to perform focus adjustment by moving the focusing lens based on an image of the focus detection area,
wherein the setting unit is operable, when the focusing target is no longer detected after a first focus detection area has been set in response to the detection of the focusing target by the detecting unit, to set at least one second focus detection area in accordance with the position of the first focus detection area,
wherein a focused state that is obtained in the focus detection area when the detecting unit is capable of detecting the focusing target is set as a reference value, and the focus adjusting unit is configured to drive the focusing lens on the basis of a comparison between the reference value and a focused state obtained at the first focus detection area and a comparison between the reference value and a focused state obtained at the second focus detection area.
2. A focus adjusting device according to claim 1, further comprising a determining unit configured to, when the values of the focused states obtained at the first and second focus detection areas vary with respect to the reference value by a predetermined proportion, determine that the object image has moved, detect the focused states in the first and second focus detection areas while driving the focusing lens, and determine a focus detection area where the object exists from the first and second focus detection areas on the basis of the reference value and result of the detection.
3. A focus adjusting device according to claim 1, wherein the setting unit is operable, when the focusing target is no longer detected after a first focus detection area has been set in response to the detection of the focusing target by the detecting unit, to set at least one second focus detection area, in addition to the first focus detection area.
4. A focus adjusting device according to claim 3, further comprising a display unit configured to display an index at the focus detection area, wherein, when the focusing target is no longer detected after a first focus detection area has been set in response to the detection of the focusing target by the detecting unit, the display unit displays the index.
5. A focus adjusting device according to claim 1, wherein the detecting unit is operable to detect at least one of the position, size, and reliability of an object corresponding to the object image to be focused.
6. A focus adjusting device according to claim 1, wherein the first focus detection area is equal in size to the or each second focus detection area.
7. A focus adjusting device according to claim 1, wherein the focus adjusting unit is operable to inhibit driving of the focusing lens when the detecting unit is no longer capable of detecting the focusing target, and either the first and second focus detection areas have been set a number of times that is greater than or equal to a predetermined number of times or a predetermined amount of time has elapsed.
8. A focus adjusting device comprising:
a detecting unit configured to detect, from an image picked up by an image pickup optical system, an image of an object to be focused;
a setting unit configured to set a focus detection area when the image pickup optical system is in a focused state; and
a focus adjusting unit configured to adjust a focus by driving the image pickup optical system on the basis of a signal output at the focus detection area,
wherein the setting unit is operable to set at least one second focus detection area after setting a first focus detection area and in accordance with the position of the first focus detection area, corresponding to the object image that is detected by the detecting unit and that is to be focused,
wherein a focused state that is obtained in the focus detection area when the detecting unit is capable of detecting the object image to be focused is set as a reference value, and the focus adjusting unit is configured to drive the image pickup optical system on the basis of a comparison between the reference value and a focused state obtained at the first focus detection area and a comparison between the reference value and a focused state obtained at the second focus detection area.
9. A focus adjusting device according to claim 8, further comprising a determining unit configured to, when the values of the focused states obtained at the first and second focus detection areas vary with respect to the reference value by a predetermined proportion, determine that the object image has moved, detect the focused states in the first and second focus detection areas while driving the image pickup optical system, and determine a focus detection area where the object exists from the first and second focus detection areas on the basis of the reference value and result of the detection.
US11/970,386 2007-02-08 2008-01-07 Focus adjusting device, image pickup apparatus, and focus adjustment method Expired - Fee Related US7869704B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US12/968,109 US8208803B2 (en) 2007-02-08 2010-12-14 Focus adjusting device, image pickup apparatus, and focus adjustment method

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2007-029355 2007-02-08
JP2007029355A JP5188071B2 (en) 2007-02-08 2007-02-08 Focus adjustment device, imaging device, and focus adjustment method

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US12/968,109 Continuation US8208803B2 (en) 2007-02-08 2010-12-14 Focus adjusting device, image pickup apparatus, and focus adjustment method

Publications (2)

Publication Number Publication Date
US20080193115A1 US20080193115A1 (en) 2008-08-14
US7869704B2 true US7869704B2 (en) 2011-01-11

Family

ID=39323586

Family Applications (2)

Application Number Title Priority Date Filing Date
US11/970,386 Expired - Fee Related US7869704B2 (en) 2007-02-08 2008-01-07 Focus adjusting device, image pickup apparatus, and focus adjustment method
US12/968,109 Active US8208803B2 (en) 2007-02-08 2010-12-14 Focus adjusting device, image pickup apparatus, and focus adjustment method

Family Applications After (1)

Application Number Title Priority Date Filing Date
US12/968,109 Active US8208803B2 (en) 2007-02-08 2010-12-14 Focus adjusting device, image pickup apparatus, and focus adjustment method

Country Status (4)

Country Link
US (2) US7869704B2 (en)
EP (1) EP1956831B1 (en)
JP (1) JP5188071B2 (en)
CN (1) CN101241222B (en)

Families Citing this family (37)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4429328B2 (en) * 2007-02-09 2010-03-10 キヤノン株式会社 Automatic focusing device, control method therefor, and imaging device
JP5106064B2 (en) * 2007-11-27 2012-12-26 キヤノン株式会社 Imaging device and lens unit
JP2009139807A (en) * 2007-12-10 2009-06-25 Sony Corp Imaging device
US8237807B2 (en) 2008-07-24 2012-08-07 Apple Inc. Image capturing device with touch screen for adjusting camera settings
JP5171468B2 (en) * 2008-08-06 2013-03-27 キヤノン株式会社 Imaging device and imaging device control method
JP5305817B2 (en) * 2008-10-01 2013-10-02 キヤノン株式会社 Automatic focusing device, automatic focusing method, and imaging device
JP5517435B2 (en) * 2008-10-22 2014-06-11 キヤノン株式会社 Automatic focusing device, automatic focusing method, and imaging device
US8525916B2 (en) 2008-10-30 2013-09-03 Panasonic Corporation Imaging apparatus using different driving methods according to estimation results
JP2010113129A (en) * 2008-11-06 2010-05-20 Nikon Corp Image tracking device, focusing device, and image capturing apparatus
JP2010113130A (en) * 2008-11-06 2010-05-20 Nikon Corp Focus detecting device, imaging apparatus, focus detecting method
JP2010134309A (en) * 2008-12-08 2010-06-17 Renesas Electronics Corp Autofocus device, autofocus method and imaging apparatus
JP5339954B2 (en) * 2009-02-17 2013-11-13 キヤノン株式会社 Focus adjustment device and focus adjustment method
JP5300520B2 (en) * 2009-02-17 2013-09-25 キヤノン株式会社 Focus adjustment device and focus adjustment method
US8717490B2 (en) * 2009-06-19 2014-05-06 Casio Computer Co., Ltd Imaging apparatus, focusing method, and computer-readable recording medium recording program
JP5421691B2 (en) * 2009-08-18 2014-02-19 キヤノン株式会社 Focus adjustment device and focus adjustment method
JP4906893B2 (en) * 2009-08-18 2012-03-28 キヤノン株式会社 Imaging device and imaging device control method
JP2011043633A (en) * 2009-08-20 2011-03-03 Canon Inc Focus adjustment device and focus adjustment method
JP5427024B2 (en) * 2009-12-25 2014-02-26 キヤノン株式会社 Imaging device, imaging device control method, and program
JP5787634B2 (en) * 2010-08-09 2015-09-30 キヤノン株式会社 Imaging device
CN102905066B (en) * 2011-07-27 2015-12-09 康佳集团股份有限公司 Realize method and the system thereof of automatic camera
JP5438805B2 (en) * 2012-07-23 2014-03-12 キヤノン株式会社 Automatic focusing device, automatic focusing method, and imaging device
TWI519840B (en) 2012-11-22 2016-02-01 原相科技股份有限公司 Method for automatically focusing on specific movable object, photographic apparatus including automatic focus function, and computer readable storage media for storing automatic focus function program
CN103856708B (en) * 2012-12-03 2018-06-29 原相科技股份有限公司 The method and photographic device of auto-focusing
KR101979803B1 (en) * 2013-01-02 2019-05-17 삼성전자주식회사 High speed continuous shooting Digital photographing apparatus and method for controlling the same
JP5949591B2 (en) * 2013-02-13 2016-07-06 ソニー株式会社 Imaging apparatus, control method, and program
JP5790751B2 (en) * 2013-12-17 2015-10-07 株式会社ニコン Imaging device
JP6281409B2 (en) * 2014-05-26 2018-02-21 富士通株式会社 Display control method, information processing program, and information processing apparatus
JP6501536B2 (en) 2015-02-02 2019-04-17 キヤノン株式会社 Imaging device, control method therefor, program, storage medium
JP2016206352A (en) * 2015-04-20 2016-12-08 キヤノン株式会社 Focus adjustment device and its control method, its program, its recording medium and imaging device
JP6576171B2 (en) * 2015-09-02 2019-09-18 キヤノン株式会社 Video processing apparatus, video processing method, and program
US10491804B2 (en) * 2016-03-29 2019-11-26 Huawei Technologies Co., Ltd. Focus window determining method, apparatus, and device
CN106126053B (en) * 2016-05-27 2019-08-27 努比亚技术有限公司 Mobile terminal control device and method
KR20180059306A (en) * 2016-11-25 2018-06-04 삼성전자주식회사 Device comprising antenna and control method of the same
CN106846399B (en) * 2017-01-16 2021-01-08 浙江大学 Method and device for acquiring visual gravity center of image
CN107613204B (en) * 2017-09-28 2020-08-28 努比亚技术有限公司 Focusing area adjusting method, terminal and computer storage medium
KR102645340B1 (en) * 2018-02-23 2024-03-08 삼성전자주식회사 Electronic device and method for recording thereof
US10798292B1 (en) 2019-05-31 2020-10-06 Microsoft Technology Licensing, Llc Techniques to set focus in camera in a mixed-reality environment with hand gesture interaction

Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0263510A2 (en) 1986-10-08 1988-04-13 Canon Kabushiki Kaisha Automatic focusing device
JPH06308373A (en) 1993-04-22 1994-11-04 Canon Inc Camera with line of sight detecting function
US5659814A (en) * 1993-11-24 1997-08-19 Nikon Corporation Camera
US5739857A (en) 1990-02-08 1998-04-14 Canon Kabushiki Kaisha Image pickup device with settable image detecting region
US5995767A (en) * 1996-12-27 1999-11-30 Lg Electronics Inc. Method for controlling focusing areas of a camera and an apparatus for performing the same
JP2000188713A (en) 1998-12-22 2000-07-04 Ricoh Co Ltd Automatic focus controller and method for determining its focusing
JP2004138970A (en) 2002-10-21 2004-05-13 Sharp Corp Autofocus camera and photographing method
US20050013601A1 (en) 1999-11-16 2005-01-20 Masataka Ide Distance-measuring device installed in camera
US20050162540A1 (en) * 2004-01-27 2005-07-28 Fujinon Corporation Autofocus system
US20060066744A1 (en) 2004-09-29 2006-03-30 Stavely Donald J Implementing autofocus in an image capture device while compensating for movement
US20060285842A1 (en) * 2005-06-21 2006-12-21 Masataka Ide Camera having focusing device
US7433586B2 (en) * 2004-08-18 2008-10-07 Casio Computer Co., Ltd. Camera with an auto-focus function

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005215750A (en) * 2004-01-27 2005-08-11 Canon Inc Face detecting device and face detecting method
US7809259B2 (en) * 2004-07-26 2010-10-05 Panasonic Corporation Optical disk and optical disk device
JP4581730B2 (en) * 2005-02-15 2010-11-17 株式会社ニコン Digital camera
US20060182433A1 (en) * 2005-02-15 2006-08-17 Nikon Corporation Electronic camera
JP4096950B2 (en) * 2005-02-24 2008-06-04 船井電機株式会社 Imaging apparatus and automatic imaging method
JP4642542B2 (en) 2005-05-09 2011-03-02 キヤノン株式会社 Focus adjustment device, imaging device, and control method thereof
JP5188071B2 (en) * 2007-02-08 2013-04-24 キヤノン株式会社 Focus adjustment device, imaging device, and focus adjustment method

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
JP2004-138970 Machine Translation available at JPO website. *

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110081141A1 (en) * 2007-02-08 2011-04-07 Canon Kabushiki Kaisha Focus adjusting device, image pickup apparatus, and focus adjustment method
US8208803B2 (en) * 2007-02-08 2012-06-26 Canon Kabushiki Kaisha Focus adjusting device, image pickup apparatus, and focus adjustment method
US20090028394A1 (en) * 2007-07-24 2009-01-29 Nikon Corporation Imaging device, image detecting method and focus adjusting method
US8542941B2 (en) * 2007-07-24 2013-09-24 Nikon Corporation Imaging device, image detecting method and focus adjusting method
US20090109322A1 (en) * 2007-10-30 2009-04-30 Nikon Corporation Image recognition device, focus adjustment device and imaging device
US8368800B2 (en) * 2007-10-30 2013-02-05 Nikon Corporation Image recognition device, focus adjustment device and imaging device
US8860872B2 (en) 2010-01-22 2014-10-14 Canon Kabushiki Kaisha Automatic focusing apparatus with cyclic pattern determination
US9411128B2 (en) 2010-01-22 2016-08-09 Canon Kabushiki Kaisha Automatic focusing apparatus with cyclic pattern determination
US8724981B2 (en) * 2010-06-15 2014-05-13 Ricoh Company, Limited Imaging apparatus, focus position detecting method, and computer program product

Also Published As

Publication number Publication date
CN101241222B (en) 2013-06-05
US20110081141A1 (en) 2011-04-07
JP2008197153A (en) 2008-08-28
CN101241222A (en) 2008-08-13
EP1956831A2 (en) 2008-08-13
JP5188071B2 (en) 2013-04-24
US20080193115A1 (en) 2008-08-14
EP1956831A3 (en) 2010-02-10
EP1956831B1 (en) 2013-10-09
US8208803B2 (en) 2012-06-26

Similar Documents

Publication Publication Date Title
US7869704B2 (en) Focus adjusting device, image pickup apparatus, and focus adjustment method
US7469098B2 (en) Optical apparatus
US7747159B2 (en) Focusing device and image-capturing device provided with the same
US8446519B2 (en) Focus control apparatus and optical apparatus
US8279323B2 (en) Image capturing apparatus and control method for the same
US8184192B2 (en) Imaging apparatus that performs an object region detection processing and method for controlling the imaging apparatus
US8107806B2 (en) Focus adjustment apparatus and focus adjustment method
US20190086768A1 (en) Automatic focusing apparatus and control method therefor
US8494354B2 (en) Focus adjusting apparatus and focus adjusting method
JP2010015024A (en) Image pickup apparatus, control method thereof, program and storage medium
US8731437B2 (en) Focus adjusting apparatus and focus adjusting method
JP2010139666A (en) Imaging device
US20120033127A1 (en) Image capture apparatus
US9357124B2 (en) Focusing control device and controlling method of the same
JP2011048265A (en) Focus detection device and focus detection method
JP4502376B2 (en) Focus control device and photographing device
JP4957461B2 (en) Imaging apparatus and imaging method
JPH11211974A (en) Image pickup device
JP2006157604A (en) Camera apparatus and automatic photographing control program
JP2005227447A (en) Autofocus camera
JP2010034827A (en) Imaging apparatus and control method of imaging apparatus
JP2010060932A (en) Automatic focusing device and method
JP2012022274A (en) Device and method for controlling focal point

Legal Events

Date Code Title Description
AS Assignment

Owner name: CANON KABUSHIKI KAISHA, JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:UENISHI, MASAAKI;REEL/FRAME:020419/0983

Effective date: 20071220

FEPP Fee payment procedure

Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

FPAY Fee payment

Year of fee payment: 4

FEPP Fee payment procedure

Free format text: MAINTENANCE FEE REMINDER MAILED (ORIGINAL EVENT CODE: REM.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

LAPS Lapse for failure to pay maintenance fees

Free format text: PATENT EXPIRED FOR FAILURE TO PAY MAINTENANCE FEES (ORIGINAL EVENT CODE: EXP.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

STCH Information on status: patent discontinuation

Free format text: PATENT EXPIRED DUE TO NONPAYMENT OF MAINTENANCE FEES UNDER 37 CFR 1.362

FP Lapsed due to failure to pay maintenance fee

Effective date: 20190111