JP2006211139A - Imaging apparatus - Google Patents

Imaging apparatus

Info

Publication number
JP2006211139A
Authority
JP
Japan
Prior art keywords
face
face detection
means
detected
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
JP2005018603A
Other languages
Japanese (ja)
Inventor
Hitoshi Hongo
Yohei Ishii
Masahiko Yamada
Original Assignee
Sanyo Electric Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sanyo Electric Co Ltd
Priority to JP2005018603A priority Critical patent/JP2006211139A/en
Publication of JP2006211139A publication Critical patent/JP2006211139A/en
Pending legal-status Critical Current

Abstract

PROBLEM TO BE SOLVED: To provide an imaging apparatus capable of automatically detecting the face of the person the photographer intends to photograph and performing automatic focusing control, automatic exposure control, automatic white balance adjustment, and the like on that person.
SOLUTION: The imaging apparatus includes imaging means for capturing an image; a display device for displaying the image captured by the imaging means; means for adding a mark indicating a face detection point to the image data input from the imaging means and displaying it on the display device; face detection means for detecting, from the image data input from the imaging means, the face area of a person overlapping the face detection point; and means for performing at least one of automatic focusing control, automatic exposure control, and automatic white balance adjustment using at least a part of the face area detected by the face detection means as an adjustment target area.
[Selection] Figure 3

Description

  The present invention relates to an imaging apparatus such as a digital camera.

  In a conventional imaging apparatus having an automatic focusing function, automatic focusing is generally performed using one or more areas at the center of the screen as the ranging area. However, it is not easy to keep a moving person's face accurately aligned with the ranging area while the shutter button is half-pressed.

  Japanese Patent Laid-Open No. 2003-107335 discloses a technique that uses means for detecting a person's face to focus on the face within the screen, so that the person's face is brought into focus regardless of where the person is located.

However, when a plurality of faces are detected on the screen, the photographer must select one face from among them, which is troublesome. The above publication also discloses a technique that, when a plurality of faces are detected on the screen, automatically selects the person at the center of the screen or the person whose face appears largest; even so, the user needs to set a mode for deciding whether the correct face is to be selected automatically.
JP 2003-107335 A

  An object of the present invention is to provide an imaging apparatus that can automatically detect the face of the person the photographer intends to photograph and perform automatic focusing control, automatic exposure control, automatic white balance adjustment, and the like on that person.

  According to the first aspect of the present invention, there are provided imaging means for capturing an image; a display device for displaying the image captured by the imaging means; means for adding a mark indicating a face detection point to the image data input from the imaging means and displaying it on the display device; face detection means for detecting, from the image data input from the imaging means, the face area of a person overlapping the face detection point; and means for performing at least one of automatic focusing control, automatic exposure control, and automatic white balance adjustment using at least a part of the face area detected by the face detection means as an adjustment target area.

  The invention described in claim 2 is characterized in that, in the invention described in claim 1, means is provided for erasing the mark indicating the face detection point when a face area is detected by the face detection means.

  The invention described in claim 3 is characterized in that, in the invention described in claim 2, the center of the detected face area is set as the face detection point when a face area is detected by the face detection means.

  According to the present invention, the face of the person the photographer intends to photograph can be detected automatically, and automatic focusing control, automatic exposure control, automatic white balance adjustment, and the like can be performed on that person.

  Hereinafter, embodiments of the present invention applied to a digital camera will be described with reference to the drawings.

[1] Explanation of the configuration of the digital camera

  FIG. 1 shows the configuration of a digital camera.

  The digital camera includes a control unit 10 including a CPU, a ROM, a RAM, and the like. An imaging unit 1, focus / exposure adjustment unit 2, ranging unit 3, operation unit 4, face detection unit 5, display unit 6, and nonvolatile memory 7 are connected to the control unit 10.

  The imaging unit 1 includes an imaging element, an optical system, a zoom mechanism, a diaphragm mechanism, and the like. The distance measuring unit 3 measures the distance to the area that is the target of focus adjustment (the adjustment target area). The focus/exposure adjustment unit 2 performs focus adjustment by controlling the lens mechanism based on the distance measured by the distance measuring unit 3. The focus/exposure adjustment unit 2 also performs exposure adjustment by controlling the diaphragm mechanism and the shutter speed so that the adjustment target area has a suitable brightness. Note that focus adjustment may instead be performed based on the contrast of the captured image; that is, the focus position may be varied and the position that gives the highest contrast in the adjustment target area may be selected. In this case, the distance measuring unit 3 is unnecessary.
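  As an illustration of the contrast-based alternative, the following is a minimal sketch of contrast autofocus, assuming hypothetical camera callbacks (set_focus_position, capture_frame) and a simple squared-gradient contrast score over the adjustment target area; it is a sketch under those assumptions, not the patent's concrete implementation.

    import numpy as np

    def region_contrast(image: np.ndarray, region: tuple) -> float:
        # Contrast score of the adjustment target area: sum of squared
        # differences between horizontally adjacent pixels.
        x, y, w, h = region
        roi = image[y:y + h, x:x + w].astype(np.float64)
        return float(np.sum(np.diff(roi, axis=1) ** 2))

    def contrast_autofocus(capture_frame, set_focus_position, focus_positions, region):
        # Step the lens through candidate focus positions and keep the one
        # whose captured frame has the highest contrast inside the region.
        best_pos, best_score = None, -1.0
        for pos in focus_positions:
            set_focus_position(pos)   # hypothetical lens-control callback
            frame = capture_frame()   # hypothetical frame-grab callback
            score = region_contrast(frame, region)
            if score > best_score:
                best_pos, best_score = pos, score
        set_focus_position(best_pos)
        return best_pos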

  The operation unit 4 includes various buttons such as a shutter button. The face detection unit 5 detects the face area of a person within a face search area (described later) in the input image. The input image and the like are displayed on the display unit 6. The nonvolatile memory 7 holds captured images.

  This digital camera is provided with a portrait mode for imaging a person. When the portrait mode is set, the digital camera automatically detects the face of the person the photographer intends to shoot and automatically adjusts the focus and exposure using at least a part of the detected face area as the adjustment target area.

[2] Explanation of face detection method

  The face detection method used in this embodiment will now be described. When photographing in the portrait mode, a “face detection point” is set at the center of the display screen of the display unit 6. As shown in FIG. 2, a mark (face detection point mark) M indicating the position of the “face detection point” is displayed on the display screen of the display unit 6 so that the photographer can see where the “face detection point” is. In the example of FIG. 2, a cross mark is used as the face detection point mark M; an X mark, a circle mark, a dot mark, brackets, or the like could also be used.
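  For illustration only, the following is a minimal sketch of compositing a cross-shaped face detection point mark onto a preview frame at the screen center, assuming a grayscale NumPy image; the mark size and pixel value are arbitrary choices, not values specified by the embodiment.

    import numpy as np

    def draw_cross_mark(frame: np.ndarray, point: tuple, arm: int = 10, value: int = 255) -> np.ndarray:
        # Overlay a cross-shaped face detection point mark M at point (x, y).
        out = frame.copy()
        x, y = point
        h, w = out.shape[:2]
        out[y, max(0, x - arm):min(w, x + arm + 1)] = value  # horizontal arm
        out[max(0, y - arm):min(h, y + arm + 1), x] = value  # vertical arm
        return out

    # By default the face detection point is the center of the display screen:
    # preview = draw_cross_mark(frame, (frame.shape[1] // 2, frame.shape[0] // 2))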

  As shown in FIG. 3, the photographer adjusts the orientation of the digital camera so that the face detection point mark M overlaps the face of the person to be photographed in the input image displayed on the display unit 6. In FIG. 3, a indicates the face of person A, and b indicates the face of person B. The face detection unit 5 detects only the face aligned with the face detection point mark.

  As the face detection method, a known method (for example, the method described in Reference A) is used.

  Reference A: P. Viola and M. Jones, “Robust Real-Time Object Detection”, IEEE ICCV Workshop on Statistical and Computational Theories of Vision, July 2001.

  An outline of a general face detection method is as follows. Various edge features are extracted from face areas containing the eyes, nose, mouth, face contour, and so on, and a face classifier is constructed by learning, with statistical methods, the feature quantities that are effective for discriminating whether a region is a face.

  To detect a face in an input image, the same feature quantities are extracted while a window of the face size normalized at the time of learning (the face matching frame) is raster-scanned from the edge of the input image, and the classifier determines whether each window is a face. The feature quantities include, for example, horizontal edges, vertical edges, right-diagonal edges, and left-diagonal edges.

  If no face is detected, the input image is reduced at a fixed ratio, and the reduced image is searched by raster scanning in the same manner as described above. By repeating this process, a face of any size can be found in the image.
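  The following is a minimal sketch of this general raster-scan, multi-scale search, assuming a hypothetical classify_window stand-in for the learned face classifier and arbitrary values for the frame size, scan stride, reduction ratio, and reduction count; it illustrates the procedure rather than the patent's concrete implementation.

    import numpy as np

    FRAME = 24          # side of the face matching frame normalized at learning time (assumed)
    STEP = 4            # raster-scan stride in pixels (assumed)
    RATIO = 0.8         # fixed image reduction ratio per pass (assumed)
    MAX_REDUCTIONS = 8  # predetermined number of reductions (assumed)

    def classify_window(window: np.ndarray) -> bool:
        # Stand-in for the learned face classifier operating on edge features.
        raise NotImplementedError

    def shrink(image: np.ndarray, ratio: float) -> np.ndarray:
        # Crude nearest-neighbour reduction; a real camera would resample properly.
        h, w = image.shape[:2]
        ys = (np.arange(int(h * ratio)) / ratio).astype(int)
        xs = (np.arange(int(w * ratio)) / ratio).astype(int)
        return image[ys][:, xs]

    def find_face(image: np.ndarray):
        # Return (x, y, size) of the first detected face in original-image coordinates.
        scale = 1.0
        for _ in range(MAX_REDUCTIONS + 1):
            h, w = image.shape[:2]
            for y in range(0, h - FRAME + 1, STEP):
                for x in range(0, w - FRAME + 1, STEP):
                    if classify_window(image[y:y + FRAME, x:x + FRAME]):
                        return int(x / scale), int(y / scale), int(FRAME / scale)
            image = shrink(image, RATIO)  # a smaller image makes larger faces fit the frame
            scale *= RATIO
        return None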

  In the present embodiment, the face search area S is limited as shown in FIG. 3 so that only the face aligned with the face detection point mark is detected. That is, a face search area S centered on the face detection point is set. Then, while the face search area S is raster-scanned with the face size normalized at the time of learning (the face matching frame), feature quantities are extracted and the face classifier determines whether each window is a face. By providing a face detection point and limiting the face search area in this way, only the face overlapping the face detection point can be detected. In the example of FIG. 3, there are two faces a and b in the image, but the face b is excluded from the search target, so only the face a, which overlaps the face detection point, is detected.

  The size of the face search area S is set larger than the face matching frame F, in consideration of the difficulty for the photographer of aligning the center of the face exactly with the center of the face detection point mark, as shown in FIG. 4. However, if the face search area S is too large, a plurality of faces may be detected, so its size is set so that two faces cannot fit within it. Here, the size of the face search area S is set to 1.5 times the face matching frame F.
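  As a minimal sketch of this restricted search, the code below scans only a search area of 1.5 times the face matching frame centered on the face detection point; classify_window again stands in for the learned classifier, and the frame size and scan stride are assumed values. When no face is found at this scale, the image would be reduced while the search area is kept, as in the multi-scale sketch above.

    import numpy as np

    FRAME = 24                 # face matching frame F (assumed size)
    SEARCH = int(FRAME * 1.5)  # face search area S: 1.5 times the matching frame

    def classify_window(window: np.ndarray) -> bool:
        # Stand-in for the learned face classifier.
        raise NotImplementedError

    def search_around_point(image: np.ndarray, detection_point: tuple, step: int = 2):
        # Raster-scan only the face search area S centered on the face detection
        # point, so that only a face overlapping the detection point can be found.
        px, py = detection_point
        h, w = image.shape[:2]
        x0, y0 = max(0, px - SEARCH // 2), max(0, py - SEARCH // 2)
        x1, y1 = min(w, px + SEARCH // 2), min(h, py + SEARCH // 2)
        for y in range(y0, y1 - FRAME + 1, step):
            for x in range(x0, x1 - FRAME + 1, step):
                if classify_window(image[y:y + FRAME, x:x + FRAME]):
                    return x, y, FRAME
        return None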

  By the way, a face cannot be detected unless the size of the face in the image is substantially the same as the size of the face matching frame F, but the size of the face in the image is not constant. Therefore, when a face cannot be detected, the size of the face search area S is kept as it is, and the image is reduced at a certain rate to search for a face. By repeating such processing, a face of an arbitrary size is detected from the face search area S. The image reduction ratio and the number of reductions are determined in advance.

  Note that reducing the image while keeping the size of the face search region S is equivalent to increasing the size of the face search region S while keeping the image as it is.

  FIG. 5 shows the relative relationship between the face search area and the face size. In FIG. 5, in order to express this relationship in an easy-to-understand manner, the size of the face search area S is increased instead of the image being reduced. S1 to S4 represent face search areas. A small face is searched for in a small area near the face detection point; to detect a larger face, the face search area is expanded.

  When a face is detected, the face detection point mark (cross mark) M is erased and a rectangular mark (face area display mark) W surrounding the detected face is displayed, as shown in FIG. 6, so that the photographer can recognize the detection.

  The subject is also expected to move. Therefore, in this embodiment, after a face is detected, the face detection point is moved to the center of the detected face area, and the face is searched for in the same manner with that position as the reference. The position of the face detection point may also be determined by predicting the movement of the subject. In this way, the subject can be tracked by detecting the face continuously. Alternatively, a known tracking method may be used to track the subject (see Reference B), or the color information of the detected face area may be acquired and the subject tracked based on that color information (see Reference C).
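  The following is a minimal sketch of the simplest of these options: moving the face detection point to the center of the detected face area each frame and reusing the restricted search around it. Here search_around_point is the hypothetical function from the earlier sketch; the Reference B and Reference C tracking variants are not shown.

    def track_face(frames, initial_point, search_around_point):
        # After each detection, move the face detection point to the center of the
        # detected face area and search around it in the next frame.
        point = initial_point
        for frame in frames:
            hit = search_around_point(frame, point)  # hypothetical restricted search
            if hit is not None:
                x, y, size = hit
                point = (x + size // 2, y + size // 2)  # new face detection point
            yield point, hit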

  Reference B: "An iterative image registration technique with an application to stereo vision" Proc. Of the international Joint Conference on Artificial Intelligence, pp.674-679, 1981.

  Reference C: “Proposal of a Modified HSV Color System Effective for Face Region Extraction”, Journal of the Institute of Television Engineers of Japan, Vol. 49, No. 6, pp. 787-797 (1995).

[3] Explanation of processing procedure

  FIG. 7 shows a focus and exposure adjustment processing procedure based on face detection executed by the digital camera when the portrait mode is set.

  An image captured by the imaging unit 1 is input (step S1). It is then determined whether the flag F is reset (F = 0) or set (F = 1). The flag F is reset when the portrait mode is set and is also reset when the shutter button is pressed in the portrait mode; it is set when a face is detected in the face detection process of step S5, described later.

  Immediately after the portrait mode is set, the flag F is reset, so the process proceeds to step S3, in which the face detection point mark M is combined with the input image and displayed on the display unit 6. The process then proceeds to step S5. The photographer adjusts the orientation of the digital camera so that the face detection point mark M overlaps the face of the person to be photographed.

  In step S5, a face search area centered on the face detection point is set, and face detection processing is performed in the face search area as described above.

  If no face can be detected (NO in step S6), the process returns to step S1 and an image captured by the imaging unit 1 is input. If a face can be detected, the flag F is set (F = 1) (step S7), the face detection point mark M is erased (if M is being displayed), and a face area display mark W surrounding the detected face is displayed (step S8). Furthermore, focus adjustment and exposure adjustment are performed using at least a part of the detected face area as the adjustment target area (step S9).

  Next, the center of the detected face area is set as the face detection point (step S10), and the process returns to step S1. When the process returns to step S1 after a face has been detected, the flag F is set, so when an image captured by the imaging unit 1 is input, the image is displayed as it is on the display unit 6 (step S4). Face detection processing is then performed on the input image using a face search area centered on the face detection point set in step S10 (step S5).

  If the shutter button is pressed while this processing is being performed, the flag F is reset (F = 0). Therefore, when the process thereafter returns to step S1 and an image captured by the imaging unit 1 is input, the face detection point mark is combined with the input image and displayed on the display unit 6, and face detection processing is performed on the input image using a face search area centered on the face detection point (step S5).
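  As a minimal sketch of this flow (steps S1 to S10 of FIG. 7), the loop below uses a flag F as described, with hypothetical callbacks (capture, display, draw_mark, detect_face, adjust_focus_exposure, shutter_pressed) standing in for the camera's actual units; it is an illustration, not the flowchart itself.

    def portrait_mode_loop(capture, display, draw_mark, detect_face,
                           adjust_focus_exposure, shutter_pressed, screen_center):
        # F = 0: show the face detection point mark and search around it.
        # F = 1: keep searching around the previously detected face.
        flag_f = 0             # reset when the portrait mode is set
        point = screen_center  # face detection point starts at the screen center
        while True:
            frame = capture()                              # S1: input captured image
            if shutter_pressed():
                flag_f, point = 0, screen_center           # shutter press resets F
            if flag_f == 0:
                display(draw_mark(frame, point))           # S3: show mark M
            else:
                display(frame)                             # S4: show the image as it is
            face = detect_face(frame, point)               # S5: search around the point
            if face is None:                               # S6: no face detected
                continue
            flag_f = 1                                     # S7: set the flag
            # S8: erase mark M and show face area display mark W (handled by display side)
            adjust_focus_exposure(face)                    # S9: adjust on the face area
            x, y, size = face
            point = (x + size // 2, y + size // 2)         # S10: new face detection point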

  In step S9, focus adjustment and exposure adjustment are performed using the detected face area as an adjustment target area, but only one of the adjustments may be performed. Further, the white balance may be adjusted using at least a part of the detected face area as an adjustment target area.

  In the above embodiment, focus adjustment and exposure adjustment are performed every time it is determined in step S6 that a face has been detected. However, if a face was detected in the previous frame and the distance from the camera to the face detected in the previous frame does not differ from the distance from the camera to the face detected in the current frame, focus adjustment may be omitted. Whether these two distances differ can be determined based on whether the size of the face detected in the previous frame differs from the size of the face detected in the current frame. The face size can be obtained from the relative relationship between the face search area and the face size at the time the face was detected.

  Similarly, if a face was detected in the previous frame and the brightness of the face area detected in the previous frame does not differ from the brightness of the face area detected in the current frame, exposure adjustment may be omitted.
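  The following is a minimal sketch of these two checks, using the face size as a proxy for distance and the mean level of the face area as its brightness; the tolerance values are illustrative assumptions, not values given by the embodiment.

    import numpy as np

    SIZE_TOL = 0.1        # assumed: relative size change that implies a distance change
    BRIGHTNESS_TOL = 8.0  # assumed: mean-level change (8-bit) that warrants re-exposure

    def needs_refocus(prev_size: int, curr_size: int) -> bool:
        # Face size is a proxy for subject distance; refocus only if it changed.
        return abs(curr_size - prev_size) / prev_size > SIZE_TOL

    def face_brightness(image: np.ndarray, region: tuple) -> float:
        # Mean brightness of the detected face area.
        x, y, w, h = region
        return float(np.mean(image[y:y + h, x:x + w]))

    def needs_reexposure(prev_brightness: float, curr_brightness: float) -> bool:
        # Re-expose only if the face-area brightness changed appreciably.
        return abs(curr_brightness - prev_brightness) > BRIGHTNESS_TOL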

  In the above embodiment, focus and exposure adjustment based on face detection is always performed when the portrait mode is set. However, a switch for switching a face detection mode may be provided, and focus and exposure adjustment based on face detection may be performed only when this switch is turned on. Alternatively, regardless of the shooting mode such as the portrait mode, focus and exposure adjustment based on face detection may be performed when the shutter button is half-pressed. In this case, if no face can be detected even after face detection has been performed on a plurality of frames, the focus and exposure adjustment based on face detection is terminated. When no face has been detected and the shutter button is pressed, shooting is performed according to the conventionally provided “automatic scene determination function”.

  The “automatic scene determination function” refers to a function that automatically determines each scene such as portrait, landscape, night view, etc., and automatically sets an aperture and shutter speed suitable for the determination result.

FIG. 1 is a block diagram showing the configuration of a digital camera.
FIG. 2 is a schematic diagram showing the face detection point mark M.
FIG. 3 is a schematic diagram showing an example of the displayed image when the orientation of the digital camera is adjusted so that the face detection point mark M overlaps the face of the person to be photographed in the input image displayed on the display unit 6.
FIG. 4 is a schematic diagram showing the face search area S.
FIG. 5 is a schematic diagram showing the relative relationship between the face search area and the face size.
FIG. 6 is a schematic diagram showing that, when a face is detected, the face detection point mark M is erased and a rectangular mark (face area display mark) W surrounding the detected face is displayed.
FIG. 7 is a flowchart showing the focus and exposure adjustment processing procedure based on face detection executed by the digital camera when the portrait mode is set.

Explanation of symbols

DESCRIPTION OF SYMBOLS: 10 control unit, 1 imaging unit, 2 focus/exposure adjustment unit, 3 distance measuring unit, 4 operation unit, 5 face detection unit, 6 display unit, 7 nonvolatile memory

Claims (3)

  1. Imaging means for capturing an image;
    A display device for displaying an image captured by the imaging means;
    Means for adding a mark indicating a face detection point to the image data input from the imaging means and causing the display device to display the mark;
    Face detection means for detecting, from the image data input from the imaging means, a face area of a person overlapping the face detection point; and means for performing at least one of automatic focusing control, automatic exposure control, and automatic white balance adjustment using at least a part of the face area detected by the face detection means as an adjustment target area,
    An imaging apparatus comprising:
  2. The imaging apparatus according to claim 1, further comprising means for erasing the mark indicating the face detection point when a face area is detected by the face detection means.
  3. The imaging apparatus according to claim 2, further comprising means for setting a center of the detected face area as a face detection point when the face area is detected by the face detecting means.
JP2005018603A 2005-01-26 2005-01-26 Imaging apparatus Pending JP2006211139A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2005018603A JP2006211139A (en) 2005-01-26 2005-01-26 Imaging apparatus

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2005018603A JP2006211139A (en) 2005-01-26 2005-01-26 Imaging apparatus

Publications (1)

Publication Number Publication Date
JP2006211139A true JP2006211139A (en) 2006-08-10

Family

ID=36967521

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2005018603A Pending JP2006211139A (en) 2005-01-26 2005-01-26 Imaging apparatus

Country Status (1)

Country Link
JP (1) JP2006211139A (en)

Cited By (24)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2008029503A1 (en) * 2006-09-04 2008-03-13 Nikon Corporation Camera
WO2008032860A1 (en) * 2006-09-13 2008-03-20 Ricoh Company, Ltd. Imaging device and subject detection method
JP2008098954A (en) * 2006-10-12 2008-04-24 Nec Corp Imaging apparatus, method, and program
JP2008109336A (en) * 2006-10-25 2008-05-08 Matsushita Electric Ind Co Ltd Image processor and imaging apparatus
JP2008164839A (en) * 2006-12-27 2008-07-17 Fujifilm Corp Photographing apparatus and focusing method
JP2008187591A (en) * 2007-01-31 2008-08-14 Fujifilm Corp Imaging apparatus and imaging method
JP2008193278A (en) * 2007-02-02 2008-08-21 Ricoh Co Ltd Image pickup device, image pickup method, and program for computer to perform the same method
JP2008311920A (en) * 2007-06-14 2008-12-25 Fujifilm Corp Imaging apparatus
JP2009017124A (en) * 2007-07-03 2009-01-22 Canon Inc Imaging apparatus, and its control method
JP2009077321A (en) * 2007-09-25 2009-04-09 Casio Comput Co Ltd Image recording device and image recording processing method
JP2009088749A (en) * 2007-09-28 2009-04-23 Casio Comput Co Ltd Imaging apparatus, image photographing method by scenario, and program
JP2009100450A (en) * 2007-09-28 2009-05-07 Casio Comput Co Ltd Image capture device and program
JP2009147605A (en) * 2007-12-13 2009-07-02 Casio Comput Co Ltd Imaging apparatus, imaging method, and program
JP2010028354A (en) * 2008-07-17 2010-02-04 Canon Inc Imaging apparatus, imaging method, program, and recording medium
JP2010243731A (en) * 2009-04-03 2010-10-28 Fujifilm Corp Autofocus system
US8134604B2 (en) 2006-11-13 2012-03-13 Sanyo Electric Co., Ltd. Camera shake correction device, camera shake correction method and imaging device
CN102438105A (en) * 2007-06-14 2012-05-02 富士胶片株式会社 Photographing apparatus
US8368764B2 (en) 2007-01-17 2013-02-05 Samsung Electronics Co., Ltd. Digital photographing apparatus and method for controlling the same
US8432475B2 (en) 2008-03-03 2013-04-30 Sanyo Electric Co., Ltd. Imaging device
US8526761B2 (en) 2009-12-17 2013-09-03 Sanyo Electric Co., Ltd. Image processing apparatus and image sensing apparatus
KR101396333B1 (en) 2007-10-02 2014-05-26 삼성전자주식회사 Apparatus for processing digital image and method controlling thereof
KR101406799B1 (en) 2007-10-02 2014-06-12 삼성전자주식회사 Digital image processing apparatus displaying the face recognition mark and the method of controlling the same
KR101427651B1 (en) * 2007-11-07 2014-08-07 삼성전자주식회사 Apparatus and method for adjusting face detection processing speed in digital image processing device
US8929598B2 (en) 2011-06-29 2015-01-06 Olympus Imaging Corp. Tracking apparatus, tracking method, and storage medium to store tracking program

Cited By (33)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2008029503A1 (en) * 2006-09-04 2008-03-13 Nikon Corporation Camera
JP2008061157A (en) * 2006-09-04 2008-03-13 Nikon Corp Camera
US8538252B2 (en) 2006-09-04 2013-09-17 Nikon Corporation Camera
WO2008032860A1 (en) * 2006-09-13 2008-03-20 Ricoh Company, Ltd. Imaging device and subject detection method
JP2008099246A (en) * 2006-09-13 2008-04-24 Ricoh Co Ltd Imaging device and subject detection method
US8830346B2 (en) 2006-09-13 2014-09-09 Ricoh Company, Ltd. Imaging device and subject detection method
KR101117851B1 (en) * 2006-09-13 2012-03-20 가부시키가이샤 리코 Imaging device and subject detection method
US8358350B2 (en) 2006-09-13 2013-01-22 Ricoh Company, Ltd. Imaging device and subject detection method
JP2008098954A (en) * 2006-10-12 2008-04-24 Nec Corp Imaging apparatus, method, and program
JP2008109336A (en) * 2006-10-25 2008-05-08 Matsushita Electric Ind Co Ltd Image processor and imaging apparatus
US8134604B2 (en) 2006-11-13 2012-03-13 Sanyo Electric Co., Ltd. Camera shake correction device, camera shake correction method and imaging device
JP2008164839A (en) * 2006-12-27 2008-07-17 Fujifilm Corp Photographing apparatus and focusing method
US8368764B2 (en) 2007-01-17 2013-02-05 Samsung Electronics Co., Ltd. Digital photographing apparatus and method for controlling the same
JP2008187591A (en) * 2007-01-31 2008-08-14 Fujifilm Corp Imaging apparatus and imaging method
JP2008193278A (en) * 2007-02-02 2008-08-21 Ricoh Co Ltd Image pickup device, image pickup method, and program for computer to perform the same method
JP2008311920A (en) * 2007-06-14 2008-12-25 Fujifilm Corp Imaging apparatus
CN102438099A (en) * 2007-06-14 2012-05-02 富士胶片株式会社 Digital image pickup apparatus
US9065998B2 (en) 2007-06-14 2015-06-23 Fujifilm Corporation Photographing apparatus provided with an object detection function
CN102438105A (en) * 2007-06-14 2012-05-02 富士胶片株式会社 Photographing apparatus
US9131138B2 (en) 2007-06-14 2015-09-08 Fujifilm Corporation Photographing apparatus
JP2009017124A (en) * 2007-07-03 2009-01-22 Canon Inc Imaging apparatus, and its control method
JP2009077321A (en) * 2007-09-25 2009-04-09 Casio Comput Co Ltd Image recording device and image recording processing method
JP2009100450A (en) * 2007-09-28 2009-05-07 Casio Comput Co Ltd Image capture device and program
JP2009088749A (en) * 2007-09-28 2009-04-23 Casio Comput Co Ltd Imaging apparatus, image photographing method by scenario, and program
KR101406799B1 (en) 2007-10-02 2014-06-12 삼성전자주식회사 Digital image processing apparatus displaying the face recognition mark and the method of controlling the same
KR101396333B1 (en) 2007-10-02 2014-05-26 삼성전자주식회사 Apparatus for processing digital image and method controlling thereof
KR101427651B1 (en) * 2007-11-07 2014-08-07 삼성전자주식회사 Apparatus and method for adjusting face detection processing speed in digital image processing device
JP2009147605A (en) * 2007-12-13 2009-07-02 Casio Comput Co Ltd Imaging apparatus, imaging method, and program
US8432475B2 (en) 2008-03-03 2013-04-30 Sanyo Electric Co., Ltd. Imaging device
JP2010028354A (en) * 2008-07-17 2010-02-04 Canon Inc Imaging apparatus, imaging method, program, and recording medium
JP2010243731A (en) * 2009-04-03 2010-10-28 Fujifilm Corp Autofocus system
US8526761B2 (en) 2009-12-17 2013-09-03 Sanyo Electric Co., Ltd. Image processing apparatus and image sensing apparatus
US8929598B2 (en) 2011-06-29 2015-01-06 Olympus Imaging Corp. Tracking apparatus, tracking method, and storage medium to store tracking program

Similar Documents

Publication Publication Date Title
US9681040B2 (en) Face tracking for controlling imaging parameters
KR101142316B1 (en) Image selection device and method for selecting image
JP5090474B2 (en) Electronic camera and image processing method
KR100820850B1 (en) Image processing apparatus and image processing method
KR100839772B1 (en) Object decision device and imaging device
JP5098259B2 (en) Camera
TWI549501B (en) An imaging device, and a control method thereof
US7672580B2 (en) Imaging apparatus and method for controlling display device
JP5188071B2 (en) Focus adjustment device, imaging device, and focus adjustment method
KR101503333B1 (en) Image capturing apparatus and control method thereof
US7791668B2 (en) Digital camera
JP5088118B2 (en) Focus adjustment device
US7460782B2 (en) Picture composition guide
JP3888996B2 (en) Zoom method for small digital camera
JP4254873B2 (en) Image processing apparatus, image processing method, imaging apparatus, and computer program
JP4553346B2 (en) Focus adjustment device and focus adjustment method
US8605942B2 (en) Subject tracking apparatus, imaging apparatus and subject tracking method
KR100890949B1 (en) Electronic device and method in an electronic device for processing image data
JP6106921B2 (en) Imaging apparatus, imaging method, and imaging program
JP5268433B2 (en) Imaging device and imaging device control method
US5745175A (en) Method and system for providing automatic focus control for a still digital camera
US7660519B2 (en) Autofocus apparatus
US8581996B2 (en) Imaging device
JP4539729B2 (en) Image processing apparatus, camera apparatus, image processing method, and program
JP5335302B2 (en) Focus detection apparatus and control method thereof

Legal Events

Date Code Title Description
A621 Written request for application examination

Free format text: JAPANESE INTERMEDIATE CODE: A621

Effective date: 20070801

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20090716

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20090913

A02 Decision of refusal

Free format text: JAPANESE INTERMEDIATE CODE: A02

Effective date: 20091015