JP2006227080A - Electronic camera - Google Patents

Electronic camera

Info

Publication number
JP2006227080A
Authority
JP
Japan
Prior art keywords
area
face
focus
focus detection
electronic camera
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
JP2005037675A
Other languages
Japanese (ja)
Other versions
JP4639837B2 (en)
Inventor
Takumi Kawahara
巧 河原
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nikon Corp
Original Assignee
Nikon Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nikon Corp filed Critical Nikon Corp
Priority to JP2005037675A priority Critical patent/JP4639837B2/en
Priority to US11/345,393 priority patent/US20060182433A1/en
Publication of JP2006227080A publication Critical patent/JP2006227080A/en
Priority to US12/289,747 priority patent/US7881601B2/en
Application granted granted Critical
Publication of JP4639837B2 publication Critical patent/JP4639837B2/en
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical

Abstract

PROBLEM TO BE SOLVED: To provide an electronic camera that stably focuses on a human subject by face recognition.
SOLUTION: The electronic camera has: an image sensor that photoelectrically converts a subject image formed by a photographing optical system to generate an image signal of the shooting screen; a face recognition unit that detects a face area in the shooting screen based on the image signal; a focus detection area designation unit that sets, as a designated area, a focus detection area including the contour of the face area from among a group of focus detection areas arranged within the shooting screen; and a focusing control unit that calculates a focus evaluation value of the subject image based on the image signal corresponding to the designated area and detects, as the in-focus position, the position of the photographing optical system at which the focus evaluation value is maximized.
COPYRIGHT: (C)2006,JPO&NCIPI

Description

The present invention relates to an electronic camera that performs focus detection based on the result of face recognition.

In recent years, electronic cameras that electronically record a subject image with an image sensor have spread rapidly, and cameras of this kind with a face recognition function are also known. For example, Patent Document 1 discloses an electronic camera having a face recognition function that performs focusing control based on the eyes of the subject.
[Patent Document 1] JP 2001-215403 A

However, with the electronic camera of Patent Document 1 it is difficult to focus on the subject's eyes when, for example, the person has their eyes closed or is wearing glasses, so there was still room for improvement in the stability of the focusing operation.
The present invention solves this problem of the prior art, and its object is to provide an electronic camera that can stably focus on a human subject by means of face recognition.

An electronic camera according to a first aspect of the invention includes: an image sensor that photoelectrically converts a subject image formed by a photographing optical system to generate an image signal of the shooting screen; a face recognition unit that detects a face area within the shooting screen based on the image signal; a focus detection area designation unit that sets, as a designated area, a focus detection area including the contour of the face area from among a group of focus detection areas arranged within the shooting screen; and a focusing control unit that calculates a focus evaluation value of the subject image based on the image signal corresponding to the designated area and detects, as the in-focus position, the position of the photographing optical system at which the focus evaluation value is maximized.

According to a second aspect, in the first aspect, the focus detection area designation unit sets a subset of the plural focus detection areas including the contour of the face area as the designated area.
According to a third aspect, in the second aspect, the focus detection area designation unit sets a focus detection area that overlaps the contour on the upper side or the lateral side of the face area as the designated area.

According to a fourth aspect, in any one of the first to third aspects, when the in-focus position cannot be detected in a focus detection area including the contour of the face area, the focus detection area designation unit changes the designated area to a focus detection area located below the face area.
According to a fifth aspect, in the third or fourth aspect, the face recognition unit detects the orientation of the face based on the positional relationship of facial parts within the face area, and the focus detection area designation unit changes the position of the focus detection area used as the designated area according to the face orientation.

According to a sixth aspect, in the third or fourth aspect, the camera further includes a posture detection unit that detects the shooting posture of the electronic camera, and the focus detection area designation unit changes the position of the focus detection area used as the designated area according to the shooting posture.
According to a seventh aspect, in any one of the first to sixth aspects, the camera further includes an electronic finder unit that displays a finder image of the shooting screen based on the image signal and, when the in-focus position cannot be detected in the designated area, displays an out-of-focus indication associated with the face area of the finder image.

(Operation)
In the first aspect, the focus evaluation value of the subject image is calculated from the image signal of focus detection areas (the designated area) that include the contour of the face area, and a so-called contrast detection focusing operation is performed. Because the focus evaluation value also uses information from the high-contrast contour of the face, focusing accuracy on the detected subject's face is improved. In addition, detecting the contour of the face area is comparatively easy, so the focusing accuracy is less likely to be affected by the subject's facial expression.

In the second aspect, the focusing operation is performed with the image signal of only some of the focus detection areas that include the contour of the face area. In addition to substantially the same effect as the first aspect, the load of computing the focus evaluation value on the focusing control unit is reduced, so the focusing control unit can be simplified and the focusing operation made even faster.
In the third aspect, when the designated area is set as in the second aspect, priority is given to focus detection areas overlapping the upper or lateral contour of the face area. That is, the lower side of the face area, where contrast is low because of the skin color of the neck, is excluded, and places on the upper or lateral side of the face area where contrast is high are used as the designated area, so high focusing accuracy is maintained in the second aspect as well.

In the fourth aspect, when the in-focus position cannot be detected in the focus detection areas including the contour of the face area, the focusing operation is performed in focus detection areas below the face area, where the person's body is located. Even when the in-focus position cannot be detected in the face area, the probability of focusing on the subject therefore increases.
In the fifth aspect, the orientation of the face is detected from the positional relationship of the facial parts in the face area and the position of the designated area is changed accordingly, so stable focusing accuracy is maintained regardless of changes in shooting posture such as between landscape and portrait orientation.

In the sixth aspect, the position of the designated area is changed based on a posture detection unit that detects the shooting posture of the electronic camera, so stable focusing accuracy is maintained regardless of changes in shooting posture such as between landscape and portrait orientation.
In the seventh aspect, when the in-focus position cannot be detected in the designated area, an out-of-focus indication associated with the face area of the finder image is displayed on the electronic finder unit, so the user can easily judge from the electronic finder display whether the subject is in focus.

According to the present invention, the focus evaluation value is calculated using information from the face contour, which has high contrast and is easy to detect, so the focusing accuracy on the detected subject's face can be improved.

(Description of the First Embodiment)
FIG. 1 is a block diagram showing an outline of the electronic camera of the first embodiment. The camera includes a photographing lens 11, a lens driving mechanism 12, an image sensor 13, an analog signal processing unit 14, an A/D conversion unit 15, an image processing unit 16, a compression/decompression processing unit 17, a memory 18, a card I/F 19, a monitor I/F 20 and a liquid crystal monitor 21, an operation unit 22, a CPU 23, and a data bus 24. The image processing unit 16, compression/decompression processing unit 17, memory 18, card I/F 19, monitor I/F 20, and CPU 23 are connected to one another via the data bus 24.

The photographing lens 11 consists of a plurality of lens groups including a focusing lens for adjusting the in-focus position, and its position along the optical axis is adjusted by the lens driving mechanism 12.
The image sensor 13 is disposed on the image-space side of the photographing lens 11. On its light receiving surface (the surface facing the photographing lens 11), light receiving pixels that photoelectrically convert the subject image into an analog image signal are arranged two-dimensionally. The output of the image sensor 13 is connected to the analog signal processing unit 14.

The image sensor 13 also exposes the subject at predetermined intervals even when the shutter is not released, and outputs an analog image signal (through image signal) by thinning readout or the like. The through image signal is used for the AF calculation, AE calculation, and face recognition performed by the CPU 23, and for the generation of the finder moving image by the image processing unit 16. The image sensor 13 of the first embodiment may use either a charge sequential transfer architecture (such as a CCD) or an XY address architecture (such as a CMOS sensor).

The analog signal processing unit 14 consists of a CDS circuit that performs correlated double sampling, a gain circuit that amplifies the analog image signal, a clamp circuit that clamps the input waveform to a constant voltage level, and the like. The A/D conversion unit 15 converts the analog image signal output from the analog signal processing unit 14 into a digital image signal. The output of the A/D conversion unit 15 is connected to both the image processing unit 16 and the CPU 23.

The image processing unit 16 applies image processing (defective pixel correction, gamma correction, interpolation, color conversion, edge enhancement, and so on) to the digital image signal obtained at release to generate captured image data, and sequentially generates finder images from the digital image signal (through image signal) when the shutter is not released.
In addition, based on face recognition information described later, the image processing unit 16 superimposes a rectangular AF frame indicating the face area targeted for AF on the finder image (see FIG. 4). Based on out-of-focus information described later, it also uses this AF frame to display an out-of-focus indication on the finder image, for example by blinking the AF frame or by changing its color from that used normally. As described later, when the AF calculation is executed twice and focus cannot be achieved either time, the image processing unit 16 shows different out-of-focus indications for the first and second attempts.
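
As a rough, non-authoritative sketch of the overlay behaviour described above, the Python fragment below draws a rectangular AF frame over the face area of a finder image and, when out-of-focus information is present, changes its colour and blinks it. The frame colours, the blink rule, and the array representation of the finder image are assumptions for illustration only.

```python
import numpy as np

WHITE = (255, 255, 255)   # normal AF frame colour (assumed)
RED = (255, 0, 0)         # colour used for the out-of-focus indication (assumed)

def draw_af_frame(finder_rgb, face_box, cannot_focus=False, frame_index=0):
    """Overlay a rectangular AF frame on a finder image (H x W x 3 uint8 array).

    face_box is (top, left, bottom, right) in pixels.  When focus could not be
    achieved the frame changes colour and blinks (drawn only on even frames).
    """
    top, left, bottom, right = face_box
    if cannot_focus and frame_index % 2 == 1:
        return finder_rgb                       # blink: leave this frame unmarked
    colour = RED if cannot_focus else WHITE
    out = finder_rgb.copy()
    out[top, left:right + 1] = colour           # top edge
    out[bottom, left:right + 1] = colour        # bottom edge
    out[top:bottom + 1, left] = colour          # left edge
    out[top:bottom + 1, right] = colour         # right edge
    return out
```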

The compression/decompression processing unit 17 compresses the processed captured image data in JPEG format and decompresses JPEG compressed image data to restore it. The memory 18 consists of SDRAM or the like and has enough capacity to record image data for a plurality of frames; image data is stored there temporarily before and after processing by the image processing unit 16.

The card I/F 19 has a connector for attaching a recording medium 25. The recording medium 25 is a known semiconductor memory or the like, on which the captured image data is finally recorded. The captured image data generated in the first embodiment conforms to the Exif (Exchangeable image file format for digital still cameras) standard, and the image data body is recorded in association with attached information about the shot (shooting information and the like).

A liquid crystal monitor 21 is connected to the monitor I/F 20 and is disposed mainly on the rear face of the electronic camera. During shooting, the finder images sequentially output from the image processing unit 16 are displayed on it as a moving image; the liquid crystal monitor 21 also displays a playback screen for captured image data, setting screens for changing the various settings of the camera, and so on.
The operation unit 22 includes input buttons for switching among the camera's operating modes (shooting mode, playback mode, and so on) and entering settings, a release button, and the like.

The CPU 23 controls the operation of each part of the electronic camera according to a sequence program stored in a ROM (not shown). For example, the CPU 23 performs the AE calculation, the white balance gain calculation, and the like based on the through image signal, and generates the attached information of the captured image data based on the Exif standard at release. In the first embodiment in particular, the CPU 23 has the following functions.

First, the CPU 23 applies known face recognition processing to the through image signal and detects a person's face area within the shooting screen, then generates face recognition information indicating the position of the face area in the shooting screen. In the first embodiment, the CPU 23 also detects the vertical orientation of the face during face recognition, based on the positional relationship of facial parts (eyes, nose, mouth, and so on).
As examples of face recognition processing, JP 8-63597 A discloses a method of extracting the contour of a skin-colored region based on color and detecting a face by its degree of matching with a prepared face contour template; a method of further finding eye candidate regions and detecting a face using the degree of matching with an eye template; and a method of computing a feature value defined from the two-dimensional Fourier transform of a face candidate region found with the face contour template and the two-dimensional Fourier transform of a prepared face template image including eyes, nose, mouth, and so on, and detecting a face by thresholding this feature value.
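
The patent relies on these known face recognition methods rather than defining one. Purely as an informal illustration of the first technique cited above (extracting a skin-coloured region based on colour), the sketch below thresholds a common skin-colour range in YCbCr and returns the bounding box of the skin pixels. The threshold values and the crude bounding-box step are assumptions, and the template matching described in the cited publication is omitted.

```python
import numpy as np

def detect_face_area(rgb):
    """Very rough face-area detector: skin-colour mask in YCbCr, then its bounding box.

    rgb: H x W x 3 uint8 image.  Returns (top, left, bottom, right) or None.
    The Cb/Cr thresholds are a common heuristic, not values taken from the patent.
    """
    r = rgb[..., 0].astype(np.float32)
    g = rgb[..., 1].astype(np.float32)
    b = rgb[..., 2].astype(np.float32)
    cb = 128.0 - 0.168736 * r - 0.331264 * g + 0.5 * b
    cr = 128.0 + 0.5 * r - 0.418688 * g - 0.081312 * b
    mask = (cb > 77) & (cb < 127) & (cr > 133) & (cr < 173)
    if not mask.any():
        return None
    rows = np.where(mask.any(axis=1))[0]
    cols = np.where(mask.any(axis=0))[0]
    return int(rows[0]), int(cols[0]), int(rows[-1]), int(cols[-1])
```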

Second, the CPU 23 executes a contrast detection AF calculation based on the through image signal of a designated area within the shooting screen. The designated area is selected by the CPU 23, based on the face recognition information, from among a plurality of focus detection areas (a focus detection area group) arranged regularly within the shooting screen. In the first embodiment, all focus detection areas located within a rectangular region surrounding the contour of the face area constitute the designated area, which is set to coincide with the AF frame described above. Since it is rare for the designated area to coincide exactly with the face area, the designated area also includes the peripheral portion adjacent to the face area, so high contrast arises along the contour of the face area (see FIG. 3).

In the first embodiment, when the in-focus position cannot be detected by the first AF calculation (in the designated area including the face area), the CPU 23 changes the designated area to focus detection areas located below the face area (areas where the subject's body is likely to be). This change is based on the vertical orientation of the face that the CPU 23 detected from the facial parts.
The contrast detection AF calculation is based on the principle that the degree of blur of an image is correlated with its contrast, and the contrast of the image is maximized at the in-focus position. Specifically, the CPU 23 first extracts high-frequency components in a predetermined band from the through image signal corresponding to the designated area with a band-pass filter. It then integrates the absolute values of the high-frequency components to generate a focus evaluation value for the subject image within the designated area. This focus evaluation value reaches its maximum when the contrast becomes maximum at the in-focus position.
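
A minimal sketch of this focus evaluation value, assuming the luminance of the designated area is available as a 2-D array: a simple band-pass kernel stands in for the camera's filter, and the absolute responses are summed. The specific kernel is an assumption; the patent only specifies extracting a predetermined high-frequency band and integrating its absolute value.

```python
import numpy as np

# Simple band-pass kernel used as a stand-in for the camera's filter (assumed).
BPF_KERNEL = np.array([-1.0, 0.0, 2.0, 0.0, -1.0])

def focus_evaluation_value(luma_patch):
    """Contrast-based focus evaluation value of one designated area.

    luma_patch: 2-D array of luminance samples taken from the through image signal.
    The returned value peaks when the subject inside the patch is sharpest.
    """
    total = 0.0
    for row in np.asarray(luma_patch, dtype=np.float64):
        highpass = np.convolve(row, BPF_KERNEL, mode="valid")   # extract a high-frequency band
        total += float(np.abs(highpass).sum())                  # integrate its absolute value
    return total
```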

Next, the CPU 23 moves the focusing lens in a predetermined direction and compares the focus evaluation values before and after the movement. If the value after the movement is larger, contrast is judged to be increasing, and the CPU 23 moves the focusing lens further in the same direction and repeats the calculation. If the value after the movement is smaller, contrast is decreasing, so the CPU 23 moves the focusing lens in the opposite direction and repeats the calculation. By repeating this process the CPU 23 searches for the peak of the focus evaluation value (the in-focus position); this operation is generally called "hill climbing". If the in-focus position cannot be detected in the designated area, the CPU 23 outputs out-of-focus information to the image processing unit 16.

The shooting operation of the first embodiment is described below with reference to the flowchart in FIG. 2.
Step S101: The CPU 23 has the image sensor 13 generate through image signals at predetermined intervals. The image processing unit 16 generates finder images from the through image signals, and the CPU 23 displays them as a moving image on the liquid crystal monitor 21, so the user can frame the subject with the displayed finder image.

Step S102: The CPU 23 determines whether the release button has been half-pressed. If it has (YES), the flow proceeds to S103; if there is no input to the release button (NO), the CPU 23 waits for the half-press.
Step S103: The CPU 23 detects the face area of the subject in the shooting screen based on the through image signal and, if a face area is present, generates face recognition information.

Step S104: The CPU 23 determines whether a face area was detected in S103. If one was detected (YES), the flow proceeds to S105; otherwise (NO) it proceeds to S108.
Step S105: Based on the face recognition information (S103), the CPU 23 sets the focus detection areas within the rectangular region surrounding the contour of the face area as the designated area (see FIG. 3) and performs the AF calculation by hill climbing using the through image signal of the designated area. During this AF calculation, the image processing unit 16 superimposes the AF frame on the face area in the finder image (see FIG. 4).

Step S106: The CPU 23 determines whether the in-focus position was detected in the designated area (S105). If it was (YES), the flow proceeds to S109; if it was not (NO), the CPU 23 generates out-of-focus information and proceeds to S107.
Step S107: The CPU 23 changes the designated area to the focus detection areas located below the face area, executes the AF calculation again with the changed designated area, and then proceeds to S109. If the in-focus position cannot be detected by this second AF calculation either, the CPU 23 again generates out-of-focus information. During the AF calculation of S107, the image processing unit 16 shows an out-of-focus indication with the AF frame of the finder image based on the first or second out-of-focus information.

Step S108: In this case there is no person in the shooting screen, or the face of the subject cannot be detected, so the CPU 23 selects a focus detection area by the normal procedure and executes the AF calculation.
Step S109: When the user fully presses the release button, the CPU 23 shoots the subject and generates captured image data. At this time the CPU 23 uses the Exif MakerNote tag to record attached information such as whether face recognition was performed and the position of the designated area used for the AF calculation in the captured image data.

Step S110: The CPU 23 determines whether the user has input an instruction to end shooting. If so (YES), the CPU 23 stops generating through image signals and ends shooting; if not (NO), the flow returns to S102 and the CPU 23 repeats the sequence. This concludes the description of the shooting operation of the first embodiment.
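
Tying the steps together, the following sketch mirrors the branch structure of FIG. 2 (S103 to S108) in hedged form; every callable it takes is a hypothetical stand-in for the operation described in the corresponding step, not an interface defined by the patent.

```python
def autofocus_on_half_press(through_image, grid, detect_face, designate_face_areas,
                            areas_below_face, af_search, default_area):
    """Two-pass, face-aware AF decision corresponding to steps S103-S108 of FIG. 2.

    detect_face(image)              -> face rectangle or None            (S103)
    designate_face_areas(grid, box) -> designated area around the face   (S105)
    areas_below_face(grid, box)     -> designated area below the face    (S107)
    af_search(area)                 -> True when an in-focus position is found
    Returns (in_focus, designated_area, failed_passes); failed_passes lets the
    finder show the matching out-of-focus indication.
    """
    face_box = detect_face(through_image)                   # S103
    if face_box is None:                                     # S104 "NO": normal AF (S108)
        return af_search(default_area), default_area, []
    first_area = designate_face_areas(grid, face_box)        # S105
    if af_search(first_area):                                # S106 "YES"
        return True, first_area, []
    second_area = areas_below_face(grid, face_box)           # S107: the body sits below the face
    if af_search(second_area):
        return True, second_area, ["first"]
    return False, second_area, ["first", "second"]           # both passes failed
```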

The effects of the first embodiment are as follows.
(1) In the first embodiment, the CPU 23 performs the AF calculation in a designated area that includes the contour of the face area, so a person in the shooting screen can easily be brought into focus. In particular, high contrast arises along the contour of the face area within the designated area; compared with detecting focus using only the low-contrast portions inside the face area, the contrast peak is easier to find and the focusing accuracy on the subject's face improves. In addition, since detecting the contour of the face area is comparatively easy, the focusing accuracy is less likely to be affected by the subject's facial expression.

(2) In the first embodiment, when the in-focus position cannot be detected by the first AF calculation with the face area as the designated area, the CPU 23 performs a second AF calculation in a designated area where the person's body is located (S107). Even when the face area cannot be brought into focus, the subject can therefore be focused with high probability. The CPU 23 also estimates the position of the body from the orientation of the face when setting the second designated area, so the second AF calculation maintains stable focusing accuracy regardless of whether the camera is held in landscape or portrait orientation.

(3) In the first embodiment, the AF frame is superimposed on the face area in the finder image during the AF calculation (S105), so the user can easily tell from the finder image on the liquid crystal monitor 21 which person is the AF target. During the second AF calculation, an out-of-focus indication using the AF frame is shown on the finder image (S107). Because the indication differs between the first and second failures, the user can judge relatively easily from the state of the AF frame whether the person is in focus.

(4) In the first embodiment, attached information such as whether face recognition was performed and the position of the designated area used for the AF calculation is recorded with the captured image data. By referring to this attached information with a viewer such as a personal computer, the user can later review the conditions at the time of shooting.
(Description of the Second Embodiment)
FIG. 5 is a block diagram showing an outline of the electronic camera of the second embodiment. In the descriptions of the following embodiments, components common to the first embodiment are given the same reference numerals and their description is omitted.

The second embodiment is a modification of the first embodiment and differs in that a posture sensor 26 is connected to the CPU 23. The posture sensor 26 detects four shooting postures: the camera held level in the normal position, a portrait position with the right-hand side of the camera up, a portrait position with the left-hand side up, and an inverted (upside-down) position. When the in-focus position cannot be detected in the designated area including the face area, the CPU 23 changes the designated area to focus detection areas located below the face area based on the output of the posture sensor 26.
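
As a loose illustration, and only an assumption about how the posture output might be used, the mapping below translates the four detected postures into an image-space direction for "below the face" and shifts the face rectangle accordingly; the posture names and the sign convention are invented for this sketch.

```python
# Image-space direction of "below the face" for each detected shooting posture.
# The posture names and the (d_row, d_col) sign convention are assumptions.
BELOW_FACE_DIRECTION = {
    "normal":        (+1, 0),   # camera level: the body is further down the frame
    "right_side_up": (0, -1),   # right-hand side up: the body lies toward the left edge
    "left_side_up":  (0, +1),   # left-hand side up: the body lies toward the right edge
    "upside_down":   (-1, 0),   # inverted camera: the body lies toward the top of the frame
}

def area_below_face(face_box, posture, offset):
    """Shift the face rectangle by `offset` pixels in the posture-dependent
    'below the face' direction to obtain the second designated area."""
    d_row, d_col = BELOW_FACE_DIRECTION[posture]
    top, left, bottom, right = face_box
    return (top + d_row * offset, left + d_col * offset,
            bottom + d_row * offset, right + d_col * offset)
```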

The second embodiment provides substantially the same effects as the first embodiment. In addition, because the position of the designated area is changed based on the output of the posture sensor 26, the computation load on the CPU 23 for detecting the vertical orientation of the face can be reduced.
(Description of the Third Embodiment)
FIG. 6 is a flowchart showing the shooting operation of the third embodiment. Except for S205, each step of the third embodiment corresponds to a step of the first embodiment, so duplicate description is omitted. The block diagram of the electronic camera of the third embodiment is the same as that of the first or second embodiment and is therefore not shown.

Step S205: Based on the face recognition information (S203), the CPU 23 sets only some of the focus detection areas that overlap the contour of the face area as the designated area (see FIG. 7), and performs the AF calculation by hill climbing using the through image signal of the designated area.
Because S205 also uses the contour of the face area as the designated area, the focusing accuracy on the subject's face improves as in the first embodiment. However, in focus detection areas overlapping the lower contour of the face area, contrast is likely to be low because of the skin color of the neck, so in S205 the CPU 23 preferably sets focus detection areas overlapping the upper or lateral contour of the face area as the designated area. When the upper or lateral focus detection areas are used, the designated area is selected based on the vertical orientation of the face detected by the CPU 23 or on the output of the posture sensor 26.
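
A possible reading of this subset selection, sketched under the assumption that each focus detection area is a (top, left, bottom, right) rectangle as in the earlier grid sketch: keep the areas that straddle the face contour but skip any that extend below its lower edge (the neck side). The geometry test is a simplification chosen to match the preference described above.

```python
def designate_contour_subset(areas, face_box):
    """Third-embodiment sketch: from the focus detection areas (each a
    (top, left, bottom, right) tuple) keep only those that straddle the face
    contour on its upper or lateral sides; areas on the lower (neck) side and
    areas lying wholly inside the face are skipped."""
    f_top, f_left, f_bottom, f_right = face_box
    selected = []
    for top, left, bottom, right in areas:
        overlaps = right >= f_left and left <= f_right and bottom >= f_top and top <= f_bottom
        inside = top >= f_top and bottom <= f_bottom and left >= f_left and right <= f_right
        if not overlaps or inside:
            continue                 # must cross the contour, not sit entirely inside the face
        if bottom > f_bottom:
            continue                 # extends past the lower contour (neck side): skip
        selected.append((top, left, bottom, right))
    return selected
```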

In the third embodiment the designated area is smaller than in the first embodiment, so the amount of computation in the AF calculation is also smaller. The circuit configuration of the CPU 23 can therefore be simplified and the AF calculation made even faster.
(Correspondence between the Claims and the Embodiments)
The correspondence between the claims and the embodiments is shown here. This correspondence is an interpretation given only for reference and does not limit the technical scope of the present invention.

The first embodiment corresponds to claims 1, 4, 5, and 7; the second embodiment to claims 1, 4, and 6; and the third embodiment to claims 1, 2, 3, 5, and 6.
The photographing lens 11 corresponds to the "photographing optical system", the image sensor 13 to the "image sensor", and the functions of the CPU 23 to the "face recognition unit", "focus detection area designation unit", and "focusing control unit". The posture sensor 26 corresponds to the "posture detection unit", and the "electronic finder unit" corresponds mainly to the image processing unit 16, the monitor I/F 20, and the liquid crystal monitor 21.

(Supplementary Items of the Embodiments)
Although the present invention has been described with the above embodiments, the technical scope of the present invention is not limited to them.
In the above embodiments, when focus cannot be achieved in the designated area including the face area, the second AF calculation may be performed regardless of the face recognition result, for example in the focus detection area at the center of the shooting screen. In the third embodiment, when focus cannot be achieved in the first designated area, the second AF calculation may be performed in other focus detection areas of the face area. Furthermore, the out-of-focus indication may be shown only when focus cannot be achieved the second time.

FIG. 1 is a block diagram showing an outline of the electronic camera of the first embodiment.
FIG. 2 is a flowchart showing the shooting operation in the first embodiment.
FIG. 3 shows the position of the designated area in the first embodiment.
FIG. 4 shows the finder image during face recognition in the first embodiment.
FIG. 5 is a block diagram showing an outline of the electronic camera of the second embodiment.
FIG. 6 is a flowchart showing the shooting operation in the third embodiment.
FIG. 7 shows the position of the designated area in the third embodiment.

Explanation of Symbols

11 Photographing lens
12 Lens driving mechanism
13 Image sensor
14 Analog signal processing unit
15 A/D conversion unit
16 Image processing unit
17 Compression/decompression processing unit
18 Memory
19 Card I/F
20 Monitor I/F
21 Liquid crystal monitor
22 Operation unit
23 CPU
24 Data bus
25 Recording medium
26 Posture sensor

Claims (7)

1. An electronic camera comprising:
an image sensor that photoelectrically converts a subject image formed by a photographing optical system to generate an image signal of the shooting screen;
a face recognition unit that detects a face area within the shooting screen based on the image signal;
a focus detection area designation unit that sets, as a designated area, a focus detection area including the contour of the face area from among a group of focus detection areas arranged within the shooting screen; and
a focusing control unit that calculates a focus evaluation value of the subject image based on the image signal corresponding to the designated area and detects, as an in-focus position, the position of the photographing optical system at which the focus evaluation value is maximized.
2. The electronic camera according to claim 1, wherein the focus detection area designation unit sets a subset of the plural focus detection areas including the contour of the face area as the designated area.
3. The electronic camera according to claim 2, wherein the focus detection area designation unit sets a focus detection area that overlaps the contour on the upper side or the lateral side of the face area as the designated area.
4. The electronic camera according to any one of claims 1 to 3, wherein the focus detection area designation unit changes the designated area to a focus detection area located below the face area when the in-focus position cannot be detected in a focus detection area including the contour of the face area.
5. The electronic camera according to claim 3 or 4, wherein the face recognition unit detects the orientation of the face based on the positional relationship of facial parts in the face area, and the focus detection area designation unit changes the position of the focus detection area used as the designated area according to the face orientation.
6. The electronic camera according to claim 3 or 4, further comprising a posture detection unit that detects the shooting posture of the electronic camera, wherein the focus detection area designation unit changes the position of the focus detection area used as the designated area according to the shooting posture.
7. The electronic camera according to any one of claims 1 to 6, further comprising an electronic finder unit that displays a finder image of the shooting screen based on the image signal and, when the in-focus position cannot be detected in the designated area, displays an out-of-focus indication associated with the face area of the finder image.





JP2005037675A 2005-02-15 2005-02-15 Electronic camera Expired - Fee Related JP4639837B2 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
JP2005037675A JP4639837B2 (en) 2005-02-15 2005-02-15 Electronic camera
US11/345,393 US20060182433A1 (en) 2005-02-15 2006-02-02 Electronic camera
US12/289,747 US7881601B2 (en) 2005-02-15 2008-11-03 Electronic camera

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2005037675A JP4639837B2 (en) 2005-02-15 2005-02-15 Electronic camera

Publications (2)

Publication Number Publication Date
JP2006227080A true JP2006227080A (en) 2006-08-31
JP4639837B2 JP4639837B2 (en) 2011-02-23

Family

ID=36988535

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2005037675A Expired - Fee Related JP4639837B2 (en) 2005-02-15 2005-02-15 Electronic camera

Country Status (1)

Country Link
JP (1) JP4639837B2 (en)

Cited By (49)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2008061157A (en) * 2006-09-04 2008-03-13 Nikon Corp Camera
JP2008064980A (en) * 2006-09-06 2008-03-21 Canon Inc Optical device and method for controlling optical device
JP2008099038A (en) * 2006-10-12 2008-04-24 Nikon Corp Digital camera
JP2008118387A (en) * 2006-11-02 2008-05-22 Canon Inc Imaging device
JP2008180906A (en) * 2007-01-24 2008-08-07 Fujifilm Corp Photographing device and focusing control method
WO2009008541A1 (en) * 2007-07-10 2009-01-15 Canon Kabushiki Kaisha Focus control appratus, image sensing apparatus, and control method therefor
JP2009025381A (en) * 2007-07-17 2009-02-05 Nikon Corp Digital camera
JP2009037263A (en) * 2008-11-04 2009-02-19 Canon Inc Point adjusting device, imaging device, control method for focus adjusting device, program, and recording medium
EP2037320A1 (en) 2007-09-14 2009-03-18 Sony Corporation Imaging apparatus, imaging apparatus control method, and computer program
JP2009098317A (en) * 2007-10-16 2009-05-07 Nec Electronics Corp Autofocus control circuit, autofocus control method and image pickup apparatus
JP2009224913A (en) * 2008-03-14 2009-10-01 Canon Inc Image device and its control method
JP2009251464A (en) * 2008-04-09 2009-10-29 Canon Inc Imaging apparatus and control method therefor
EP2141537A1 (en) * 2008-07-04 2010-01-06 Canon Kabushiki Kaisha Image pickup apparatus, method of controlling the same, computer program code, and storage medium
JP2010026009A (en) * 2008-07-15 2010-02-04 Nikon Corp Focal point detecting device and camera
JP2010078764A (en) * 2008-09-25 2010-04-08 Nikon Corp Focus detection device and method
US7747159B2 (en) 2007-02-09 2010-06-29 Canon Kabushiki Kaisha Focusing device and image-capturing device provided with the same
JP2010186004A (en) * 2009-02-12 2010-08-26 Sony Corp Imaging apparatus, method for controlling the same, and program
JP2010286772A (en) * 2009-06-15 2010-12-24 Casio Computer Co Ltd Imaging apparatus, focusing method and program
JP2011002814A (en) * 2009-05-19 2011-01-06 Nikon Corp Camera
JP2011002690A (en) * 2009-06-19 2011-01-06 Casio Computer Co Ltd Imaging apparatus, focusing method and program
JP2011048032A (en) * 2009-08-26 2011-03-10 Canon Inc Imaging apparatus
US7929042B2 (en) 2006-09-22 2011-04-19 Sony Corporation Imaging apparatus, control method of imaging apparatus, and computer program
US7945152B2 (en) 2006-05-10 2011-05-17 Canon Kabushiki Kaisha Focus adjustment method, focus adjustment apparatus, and control method thereof
JP2011137886A (en) * 2009-12-25 2011-07-14 Canon Inc Imaging apparatus, method for controlling the imaging apparatus and program
US8026975B2 (en) 2008-04-07 2011-09-27 Canon Kabushiki Kaisha Image capturing apparatus and control method therefor
US8063943B2 (en) 2007-01-17 2011-11-22 Samsung Electronics Co., Ltd. Digital photographing apparatus, method for controlling the same, and a recording medium for storing a program to implement the method
JP2011242796A (en) * 2011-07-22 2011-12-01 Casio Comput Co Ltd Imaging device, focusing method and program
US8077252B2 (en) 2007-08-27 2011-12-13 Sanyo Electric Co., Ltd. Electronic camera that adjusts a distance from an optical lens to an imaging surface so as to search the focal point
US8111321B2 (en) 2008-02-05 2012-02-07 Ricoh Company, Ltd. Imaging device and method for its image processing, with face region and focus degree information
US8248516B2 (en) 2008-06-30 2012-08-21 Canon Kabushiki Kaisha Focus detection apparatus and control method therefor
US8259214B2 (en) 2008-07-04 2012-09-04 Canon Kabushiki Kaisha Image pickup apparatus and auto-focus detection method
US8279323B2 (en) 2008-06-30 2012-10-02 Canon Kabushiki Kaisha Image capturing apparatus and control method for the same
JP2012234191A (en) * 2012-07-09 2012-11-29 Canon Inc Imaging apparatus
US8330849B2 (en) 2008-10-22 2012-12-11 Canon Kabushiki Kaisha Auto focusing apparatus and auto focusing method, and image sensing apparatus for stable focus detection
US8345147B2 (en) 2007-05-18 2013-01-01 Sony Corporation Image pickup apparatus
US8368764B2 (en) 2007-01-17 2013-02-05 Samsung Electronics Co., Ltd. Digital photographing apparatus and method for controlling the same
JP2013047833A (en) * 2009-02-18 2013-03-07 Panasonic Corp Imaging device
JP2013122611A (en) * 2013-02-01 2013-06-20 Casio Comput Co Ltd Imaging apparatus, focusing method, and program
US8477194B2 (en) 2009-01-07 2013-07-02 Canon Kabushiki Kaisha Image capturing apparatus, control method thereof, and program
US8502912B2 (en) 2008-02-19 2013-08-06 Canon Kabushiki Kaisha Focusing apparatus and method for controlling the same
US8570431B2 (en) 2009-02-03 2013-10-29 Fujitsu Mobile Communications Limited Mobile electronic device having camera
US8717490B2 (en) 2009-06-19 2014-05-06 Casio Computer Co., Ltd Imaging apparatus, focusing method, and computer-readable recording medium recording program
KR101406799B1 (en) 2007-10-02 2014-06-12 삼성전자주식회사 Digital image processing apparatus displaying the face recognition mark and the method of controlling the same
JP2014238554A (en) * 2013-06-10 2014-12-18 キヤノン株式会社 Imaging device and imaging method
KR101523694B1 (en) * 2008-04-02 2015-05-28 리코 이메징 가부시키가이샤 Photographic apparatus
US9088709B2 (en) 2012-12-19 2015-07-21 Canon Kabushiki Kaisha Image processing apparatus and method for controlling the same, and image pickup apparatus
US9148557B2 (en) 2010-07-16 2015-09-29 Canon Kabushiki Kaisha Focus adjustment apparatus and method, and image capturing apparatus and control method thereof
JP2016090838A (en) * 2014-11-06 2016-05-23 株式会社ソシオネクスト Image processor and image processing method
US9402020B2 (en) 2012-11-16 2016-07-26 Canon Kabushiki Kaisha Focus detection apparatus and control method for the same

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH04133576A (en) * 1990-09-25 1992-05-07 Canon Inc Image pickup device
JPH07270671A (en) * 1994-03-31 1995-10-20 Nikon Corp Automatic focusing camera
JP2003107335A (en) * 2001-09-28 2003-04-09 Ricoh Co Ltd Image pickup device, automatic focusing method, and program for making computer execute the method
JP2004117776A (en) * 2002-09-26 2004-04-15 Fuji Photo Film Co Ltd Digital camera

Cited By (64)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8145049B2 (en) 2006-05-10 2012-03-27 Canon Kabushiki Kaisha Focus adjustment method, focus adjustment apparatus, and control method thereof
US7945152B2 (en) 2006-05-10 2011-05-17 Canon Kabushiki Kaisha Focus adjustment method, focus adjustment apparatus, and control method thereof
JP2008061157A (en) * 2006-09-04 2008-03-13 Nikon Corp Camera
US8538252B2 (en) 2006-09-04 2013-09-17 Nikon Corporation Camera
WO2008029503A1 (en) * 2006-09-04 2008-03-13 Nikon Corporation Camera
JP2008064980A (en) * 2006-09-06 2008-03-21 Canon Inc Optical device and method for controlling optical device
US7929042B2 (en) 2006-09-22 2011-04-19 Sony Corporation Imaging apparatus, control method of imaging apparatus, and computer program
JP2008099038A (en) * 2006-10-12 2008-04-24 Nikon Corp Digital camera
US7973824B2 (en) 2006-10-12 2011-07-05 Nikon Corporation Digital camera that uses object detection information at the time of shooting for processing image data after acquisition of an image
US8379103B2 (en) 2006-10-12 2013-02-19 Nikon Corporation Digital camera that uses object detection information at the time of shooting for processing image data after acquisition of an image
JP2008118387A (en) * 2006-11-02 2008-05-22 Canon Inc Imaging device
US8368764B2 (en) 2007-01-17 2013-02-05 Samsung Electronics Co., Ltd. Digital photographing apparatus and method for controlling the same
KR101323735B1 (en) * 2007-01-17 2013-10-30 삼성전자주식회사 Digital photographing apparatus, method for controlling the same, and recording medium storing program to implement the method
US8063943B2 (en) 2007-01-17 2011-11-22 Samsung Electronics Co., Ltd. Digital photographing apparatus, method for controlling the same, and a recording medium for storing a program to implement the method
JP2008180906A (en) * 2007-01-24 2008-08-07 Fujifilm Corp Photographing device and focusing control method
JP4697606B2 (en) * 2007-01-24 2011-06-08 富士フイルム株式会社 Imaging apparatus and focus control method
US7747159B2 (en) 2007-02-09 2010-06-29 Canon Kabushiki Kaisha Focusing device and image-capturing device provided with the same
US8345147B2 (en) 2007-05-18 2013-01-01 Sony Corporation Image pickup apparatus
US8279324B2 (en) 2007-07-10 2012-10-02 Canon Kabushiki Kaisha Focus control apparatus, image sensing apparatus, and control method therefor
WO2009008541A1 (en) * 2007-07-10 2009-01-15 Canon Kabushiki Kaisha Focus control appratus, image sensing apparatus, and control method therefor
JP2009025381A (en) * 2007-07-17 2009-02-05 Nikon Corp Digital camera
US8077252B2 (en) 2007-08-27 2011-12-13 Sanyo Electric Co., Ltd. Electronic camera that adjusts a distance from an optical lens to an imaging surface so as to search the focal point
US8068164B2 (en) 2007-09-14 2011-11-29 Sony Corporation Face recognition auto focus apparatus for a moving image
EP2037320A1 (en) 2007-09-14 2009-03-18 Sony Corporation Imaging apparatus, imaging apparatus control method, and computer program
KR101406799B1 (en) 2007-10-02 2014-06-12 삼성전자주식회사 Digital image processing apparatus displaying the face recognition mark and the method of controlling the same
KR101008864B1 (en) * 2007-10-16 2011-01-17 르네사스 일렉트로닉스 가부시키가이샤 Autofocus control circuit, autofocus control method and image pickup apparatus
JP2009098317A (en) * 2007-10-16 2009-05-07 Nec Electronics Corp Autofocus control circuit, autofocus control method and image pickup apparatus
US8111321B2 (en) 2008-02-05 2012-02-07 Ricoh Company, Ltd. Imaging device and method for its image processing, with face region and focus degree information
US8502912B2 (en) 2008-02-19 2013-08-06 Canon Kabushiki Kaisha Focusing apparatus and method for controlling the same
JP2009224913A (en) * 2008-03-14 2009-10-01 Canon Inc Image device and its control method
KR101523694B1 (en) * 2008-04-02 2015-05-28 리코 이메징 가부시키가이샤 Photographic apparatus
US8026975B2 (en) 2008-04-07 2011-09-27 Canon Kabushiki Kaisha Image capturing apparatus and control method therefor
US8289439B2 (en) 2008-04-09 2012-10-16 Canon Kabushiki Kaisha Image capturing apparatus and control method therefor
JP2009251464A (en) * 2008-04-09 2009-10-29 Canon Inc Imaging apparatus and control method therefor
US8248516B2 (en) 2008-06-30 2012-08-21 Canon Kabushiki Kaisha Focus detection apparatus and control method therefor
US8279323B2 (en) 2008-06-30 2012-10-02 Canon Kabushiki Kaisha Image capturing apparatus and control method for the same
EP2141537A1 (en) * 2008-07-04 2010-01-06 Canon Kabushiki Kaisha Image pickup apparatus, method of controlling the same, computer program code, and storage medium
US8259214B2 (en) 2008-07-04 2012-09-04 Canon Kabushiki Kaisha Image pickup apparatus and auto-focus detection method
US9188837B2 (en) 2008-07-04 2015-11-17 Canon Kabushiki Kaisha Image pickup apparatus, method of controlling the same, and storage medium
JP2010026009A (en) * 2008-07-15 2010-02-04 Nikon Corp Focal point detecting device and camera
JP2010078764A (en) * 2008-09-25 2010-04-08 Nikon Corp Focus detection device and method
US8330849B2 (en) 2008-10-22 2012-12-11 Canon Kabushiki Kaisha Auto focusing apparatus and auto focusing method, and image sensing apparatus for stable focus detection
JP2009037263A (en) * 2008-11-04 2009-02-19 Canon Inc Focus adjusting device, imaging device, control method for focus adjusting device, program, and recording medium
US8477194B2 (en) 2009-01-07 2013-07-02 Canon Kabushiki Kaisha Image capturing apparatus, control method thereof, and program
US8570431B2 (en) 2009-02-03 2013-10-29 Fujitsu Mobile Communications Limited Mobile electronic device having camera
JP2010186004A (en) * 2009-02-12 2010-08-26 Sony Corp Imaging apparatus, method for controlling the same, and program
US8390730B2 (en) 2009-02-12 2013-03-05 Sony Corporation Image capturing apparatus, control method thereof, and program
JP2013047833A (en) * 2009-02-18 2013-03-07 Panasonic Corp Imaging device
JP2011002814A (en) * 2009-05-19 2011-01-06 Nikon Corp Camera
JP2010286772A (en) * 2009-06-15 2010-12-24 Casio Computer Co Ltd Imaging apparatus, focusing method and program
US8451366B2 (en) 2009-06-15 2013-05-28 Casio Computer Co., Ltd Image capturing device with automatic focus function
JP2011002690A (en) * 2009-06-19 2011-01-06 Casio Computer Co Ltd Imaging apparatus, focusing method and program
US8717490B2 (en) 2009-06-19 2014-05-06 Casio Computer Co., Ltd Imaging apparatus, focusing method, and computer-readable recording medium recording program
US8553133B2 (en) 2009-08-26 2013-10-08 Canon Kabushiki Kaisha Focusing apparatus
JP2011048032A (en) * 2009-08-26 2011-03-10 Canon Inc Imaging apparatus
JP2011137886A (en) * 2009-12-25 2011-07-14 Canon Inc Imaging apparatus, method for controlling the imaging apparatus and program
US9148557B2 (en) 2010-07-16 2015-09-29 Canon Kabushiki Kaisha Focus adjustment apparatus and method, and image capturing apparatus and control method thereof
JP2011242796A (en) * 2011-07-22 2011-12-01 Casio Comput Co Ltd Imaging device, focusing method and program
JP2012234191A (en) * 2012-07-09 2012-11-29 Canon Inc Imaging apparatus
US9402020B2 (en) 2012-11-16 2016-07-26 Canon Kabushiki Kaisha Focus detection apparatus and control method for the same
US9088709B2 (en) 2012-12-19 2015-07-21 Canon Kabushiki Kaisha Image processing apparatus and method for controlling the same, and image pickup apparatus
JP2013122611A (en) * 2013-02-01 2013-06-20 Casio Comput Co Ltd Imaging apparatus, focusing method, and program
JP2014238554A (en) * 2013-06-10 2014-12-18 キヤノン株式会社 Imaging device and imaging method
JP2016090838A (en) * 2014-11-06 2016-05-23 株式会社ソシオネクスト Image processor and image processing method

Also Published As

Publication number Publication date
JP4639837B2 (en) 2011-02-23

Similar Documents

Publication Publication Date Title
JP4639837B2 (en) Electronic camera
JP4674471B2 (en) Digital camera
US7881601B2 (en) Electronic camera
JP4665718B2 (en) Imaging device
JP4914045B2 (en) Imaging device and imaging device control method
JP4935302B2 (en) Electronic camera and program
US20110234838A1 (en) Image processor, electronic camera, and image processing program
TW201119365A (en) Image selection device and method for selecting image
JP2004180298A (en) Camera system provided with eye monitoring function
KR101728042B1 (en) Digital photographing apparatus and control method thereof
JP4421151B2 (en) Digital camera imaging device
JP4127521B2 (en) Digital camera and control method thereof
JP2007336411A (en) Imaging apparatus, auto-bracketing photographing method, and program
JP2009123081A (en) Face detection method and photographing apparatus
JP2008092299A (en) Electronic camera
JP5407373B2 (en) Imaging apparatus and program
JP4953770B2 (en) Imaging device
JP4632417B2 (en) Imaging apparatus and control method thereof
JP6024135B2 (en) Subject tracking display control device, subject tracking display control method and program
JP2011107550A (en) Imaging apparatus
JP7444604B2 (en) Image processing device and method, and imaging device
JP2009252069A (en) Image processor, imaging device, image processing method and program
JP2007208355A (en) Photographing device, method, and program
JP4810440B2 (en) Imaging device, its control method, program, and storage medium
JP2008028924A (en) Imaging apparatus and control method thereof

Legal Events

Date Code Title Description
A621 Written request for application examination

Free format text: JAPANESE INTERMEDIATE CODE: A621

Effective date: 20080214

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20100622

A521 Request for written amendment filed

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20100823

TRDD Decision of grant or rejection written
A01 Written decision to grant a patent or to grant a registration (utility model)

Free format text: JAPANESE INTERMEDIATE CODE: A01

Effective date: 20101102

A61 First payment of annual fees (during grant procedure)

Free format text: JAPANESE INTERMEDIATE CODE: A61

Effective date: 20101115

R150 Certificate of patent or registration of utility model

Free format text: JAPANESE INTERMEDIATE CODE: R150

Ref document number: 4639837

Country of ref document: JP

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20131210

Year of fee payment: 3

R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250

LAPS Cancellation because of no payment of annual fees