JP2009025381A - Digital camera - Google Patents

Digital camera Download PDF

Info

Publication number
JP2009025381A
Authority
JP
Japan
Prior art keywords
focus
face
positions
divided areas
focusing
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
JP2007185835A
Other languages
Japanese (ja)
Other versions
JP5446076B2 (en)
Inventor
Toshiaki Maeda
敏彰 前田
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nikon Corp
Original Assignee
Nikon Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nikon Corp filed Critical Nikon Corp
Priority to JP2007185835A priority Critical patent/JP5446076B2/en
Publication of JP2009025381A publication Critical patent/JP2009025381A/en
Application granted granted Critical
Publication of JP5446076B2 publication Critical patent/JP5446076B2/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Images

Landscapes

  • Automatic Focus Adjustment (AREA)
  • Studio Devices (AREA)
  • Focusing (AREA)

Abstract

PROBLEM TO BE SOLVED: To prevent false focusing by eliminating the influence of subjects unintended by the photographer that are included in the focus detection area.

SOLUTION: An image formed by a photographing lens is captured by an imaging device and image information is output; a specified subject (a person's face) is recognized based on the image information, and the region of the specified subject (face detection area 26) in the captured image is detected. The region of the specified subject (face detection area 26) is divided into a plurality of areas (face areas 26a to 26d), the in-focus position of the photographing lens is detected in each of the divided areas (face areas 26a to 26d), a final in-focus position is determined based on the plurality of in-focus positions detected in the divided areas (face areas 26a to 26d), and focus adjustment of the photographing lens is performed with the final in-focus position as the target drive position.

COPYRIGHT: (C)2009,JPO&INPIT

Description

The present invention relates to a digital camera.

A contrast-detection automatic focus adjustment device (hereinafter simply "contrast AF") calculates a focus evaluation value while moving the focusing lens little by little at a predetermined interval (hereinafter referred to as an AF (auto focus) search) and performs focus adjustment by treating the position at which the focus evaluation value is maximized as the in-focus position. A known digital camera of this type analyzes the captured image to recognize a human face, estimates the shooting distance to the face region based on the size of the face region, and determines the lens position used as the starting point of the AF search based on this estimated shooting distance (see, for example, Patent Document 1).

Prior art documents related to the invention of this application include the following.
JP 2006-201282 A (Patent Document 1)

In general, a human face has low contrast, whereas a landscape contains many high-frequency components and therefore has high contrast. Because contrast AF depends heavily on the level of contrast within the focus detection area, when taking a commemorative photograph at a scenic spot or the like, if a high-frequency part of the background is included in the focus detection area, the focus may be pulled toward the background even though the person's face has been captured within the focus detection area.

(1) The invention of claim 1 comprises: imaging means for capturing an image formed by a photographing optical system and outputting image information; subject detection means for recognizing a specific subject based on the image information and detecting the region of the specific subject in the image; focus position detection means for dividing the region of the specific subject into a plurality of areas and detecting the in-focus position of the photographing optical system in each of the plurality of divided areas; focus position determination means for determining a final in-focus position based on the plurality of in-focus positions detected in the plurality of divided areas; and focus adjustment means for performing focus adjustment of the photographing optical system with the final in-focus position as the target drive position.
(2) In the digital camera of claim 2, when the difference between the most infinity-side in-focus position and the closest-side in-focus position among the plurality of in-focus positions is less than a predetermined value, the focus position determination means sets the closest-side in-focus position among the plurality of in-focus positions, or the average of the plurality of in-focus positions, as the final in-focus position.
(3) In the digital camera of claim 3, when the difference between the most infinity-side in-focus position and the closest-side in-focus position among the plurality of in-focus positions is equal to or greater than a predetermined value, the focus position determination means determines the final in-focus position based on the in-focus positions of the remaining divided areas, excluding those divided areas whose in-focus position differs from the median of the in-focus positions of the plurality of divided areas by a predetermined value or more.
(4) In the digital camera of claim 4, the focus position determination means sets, as the final in-focus position, the closest-side in-focus position among the in-focus positions of the remaining divided areas (excluding those divided areas whose in-focus position differs from the median of the in-focus positions of the plurality of divided areas by a predetermined value or more), or the average of the in-focus positions of the remaining divided areas.

According to the present invention, false focusing can be prevented by eliminating the influence of subjects unintended by the photographer that are included in the focus detection area, and the camera can focus accurately on a specific subject.

FIG. 1 shows the configuration of a digital camera according to an embodiment. The photographing lens 1 includes a focusing lens and forms a subject image on the imaging surface of the image sensor 2. The focusing lens of the photographing lens 1 is driven by a focus motor 10. A stepping motor is typically used as the focus motor 10 and is pulse-driven under open-loop control by the CPU 9.

The image sensor 2 outputs a subject image signal corresponding to the light intensity distribution of the subject image formed on its imaging surface. The analog signal processing unit 3 includes a CDS circuit, an AGC circuit, a color separation circuit, and the like (not shown), and processes the subject image signal output from the image sensor 2. The CDS circuit applies correlated double sampling (CDS) to the subject image signal, and the AGC circuit adjusts its level. The A/D converter 4 converts the processed subject image signal output from the analog signal processing unit 3 into a digital signal. The digital signal processing unit 5 includes signal processing circuits such as a γ correction circuit and luminance and color-difference signal generation circuits, and applies various kinds of processing to the subject image signal.

The CPU 9 includes an AE calculation unit, an AF calculation unit, an AWB calculation unit, and the like, and performs sequence control of the entire camera, exposure calculation control, focus detection and focus adjustment control, white balance control, and so on. In conventional AF, as shown in FIG. 2, focus detection areas 21 to 25 are set at five locations: the center of the shooting screen and above, below, to the left of, and to the right of it. The focus evaluation value of the photographing lens 1 is calculated in these focus detection areas 21 to 25, and based on the calculation result the focus motor 10 drives the focusing lens of the photographing lens 1 to adjust the focus.

The buffer memory 12 is a frame memory capable of storing data for a plurality of frames captured by the image sensor 2, and stores image data that has undergone the series of processing in the digital signal processing unit 5. When the face recognition function has been selected by the user, the face recognition calculation unit 11 applies face recognition processing to a still image for through-image display held in the buffer memory 12; if a face is detected in the still image, it outputs the position and size coordinates of the recognized face detection area to the CPU 9. FIG. 3 shows an example of the face detection area 26.

The CPU 9 receives the position and size of the face detection area and, as shown in FIG. 3, sets the face detection area 26 as the focus evaluation value calculation area for contrast AF, that is, as the focus detection area. When the position and size information of the face detection area 26 is updated, or when the user half-presses the release button, the focus evaluation value calculation sequence of contrast AF is started. Contrast AF exploits the correlation between the degree of image blur and contrast, performing focusing by using the fact that image contrast is maximized when the image is in focus. The level of contrast can be evaluated by defining an evaluation function that extracts the high-frequency component of the subject image signal and judging the magnitude of the AF evaluation value obtained from that evaluation function.
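As an illustration only (not part of the patent text), such an evaluation function is commonly implemented as a high-frequency energy measure over the focus detection area. The Python sketch below uses a simple horizontal gradient energy; the actual evaluation function used by the camera, and the names used here, are assumptions.

import numpy as np

def focus_evaluation_value(gray_image, area):
    """Sum of squared horizontal differences inside `area`.

    gray_image: 2-D numpy array of luminance values.
    area: (top, left, height, width) of the focus detection area.
    A sharper (better focused) image yields a larger value because it
    contains more high-frequency content.
    """
    top, left, height, width = area
    roi = gray_image[top:top + height, left:left + width].astype(np.float64)
    # The horizontal first difference acts as a simple high-pass filter.
    diff = roi[:, 1:] - roi[:, :-1]
    return float(np.sum(diff ** 2))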

When searching for the peak of the focus evaluation value, the focusing lens is normally shifted by a predetermined amount toward the infinity side or the close side, and the focus evaluation value calculated at the new position is compared with the value before the movement. If the evaluation value after the movement is larger, the degree of resolution (degree of focus) is considered to be increasing, so the focusing lens is moved further in the same direction and the same calculation and comparison are performed. If the evaluation value after the movement is smaller, the degree of focus is considered to be decreasing, so the focusing lens is moved in the opposite direction and the same calculation and comparison are performed. By repeating this process, the so-called "hill-climbing AF" searches for the position at which the focus evaluation value is maximized, that is, the in-focus position.
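A minimal sketch of this hill-climbing search, assuming a read_evaluation() function that returns the focus evaluation value at the current lens position and a move_lens(pulses) driver for the stepping motor; the step size, the stopping rule, and the drive interface are illustrative simplifications, not the patent's implementation.

def hill_climb_af(read_evaluation, move_lens, step=1, max_steps=200):
    """Search for the lens position that maximizes the focus evaluation value.

    read_evaluation(): focus evaluation value at the current lens position.
    move_lens(pulses): drive the focusing lens by `pulses` motor pulses
                       (the sign gives the direction; here positive is
                       assumed to be toward the close side).
    Returns the lens displacement, in pulses, at which the peak was found.
    """
    position = 0
    best_position = 0
    previous = best = read_evaluation()
    direction = +1
    reversals = 0
    for _ in range(max_steps):
        move_lens(direction * step)
        position += direction * step
        current = read_evaluation()
        if current > best:
            best, best_position = current, position
        if current < previous:
            # The value fell, so reverse direction; after the second
            # reversal the peak has been bracketed and we can stop.
            direction = -direction
            reversals += 1
            if reversals >= 2:
                break
        previous = current
    move_lens(best_position - position)  # return to the best position found
    return best_position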

Unlike the AF search described above, a so-called "full-range search" may also be performed, in which the focusing lens is moved from a predetermined position, for example from the infinity end to the close end, focus evaluation values are acquired at predetermined intervals, and the maximum evaluation value is searched for over the entire range.
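For completeness, the full-range search can be sketched just as simply: step the lens across the whole focusing range at a fixed pulse interval and keep the position giving the largest evaluation value. The move_to driver, the pulse convention, and the interval below are assumptions for illustration, not part of the patent.

def full_range_search(read_evaluation, move_to, far_end=0, near_end=1000, interval=4):
    """Sample the focus evaluation value from the infinity end (far_end)
    to the close end (near_end) at a fixed pulse interval and return the
    lens position (pulse count) giving the maximum value."""
    best_value = float("-inf")
    best_position = far_end
    for position in range(far_end, near_end + 1, interval):
        move_to(position)              # absolute drive to this pulse position
        value = read_evaluation()
        if value > best_value:
            best_value, best_position = value, position
    return best_position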

When contrast AF is performed using the detected face detection area as the focus detection area 26 (see FIG. 3), a characteristic curve of the focus evaluation value versus focusing lens position such as the solid line in FIG. 4 may be obtained. As described above, the background subject contains many high-frequency components, so the characteristic curve of the focus evaluation value for the background alone is the broken-line curve in FIG. 4. On the other hand, because a human face has little contrast variation (spatial frequency variation), the characteristic curve of the focus evaluation value for the person alone is the dash-dot-line curve in FIG. 4. However, as shown in FIG. 3, when a low-contrast human face and a high-contrast background overlap within the focus detection area 26, that is, when near and far subjects compete, the resulting focus evaluation value characteristic curve may, depending on the shooting conditions, become the solid line in FIG. 4, a combination of the background characteristic curve (broken line) and the person characteristic curve (dash-dot line).

If a focus search is performed based on such a focus evaluation value characteristic curve, the lens is driven with lens position L3 as the target drive position rather than the position corresponding to the person who is the intended main subject (lens position L2 in FIG. 4). The result is an out-of-focus photograph in which the person is not correctly in focus, even though the focus detection area was set by the face detection function.

The in-focus position may also fail to be detected correctly for other reasons. Depending on the configuration of the optical system, the focal length may vary considerably within the focus adjustment range. This occurs, for example, when the actual focal length of the photographing optical system differs greatly between the in-focus state at the infinity end and the in-focus state at the close end, that is, when a lens barrel is used in which the focal length becomes shorter (more wide-angle) the further the photographing optical system is extended toward the subject to shift the focal point to the close side, and this variation is comparatively large.

FIG. 5 shows what happens in this case. The focus evaluation value characteristic curve should be the broken line in the figure, but it may instead become the solid line. This phenomenon arises in particular because a human face has low contrast: during the search, the focusing lens moves closer than the person, the effective focal length becomes shorter (wider), and the background (a high-frequency subject) near the person's face intrudes into the area. In this case, the position L2, at which the image is blurred toward the close side, is mistaken for the in-focus position instead of the correct in-focus position L1.

To deal with these problems, in the embodiment the detected face detection area is divided into a plurality of face areas and a focus evaluation value characteristic curve is obtained for each divided face area. This is based on the premise that the in-focus positions of the divided face areas should not differ greatly. The plurality of calculated in-focus positions are then examined statistically, and any value that deviates greatly is treated as a case in which the correct in-focus position could not be detected owing to the phenomena described above, and is excluded from the candidates for determining the in-focus position.

FIG. 6 shows an example in which the face detection area 26 (see FIG. 3) is divided horizontally into four face areas 26a to 26d. In this example, five focus evaluation values are obtained from the face detection area 26: the focus evaluation value of each of the face areas 26a to 26d obtained by dividing the face detection area into four, and the focus evaluation value of the region combining these four face areas 26a to 26d, that is, of the face detection area 26 itself.
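To make the division concrete, the following sketch splits a face detection area into four vertical strips and evaluates each strip plus the whole area, giving the five focus evaluation values mentioned above. It reuses the illustrative focus_evaluation_value helper from the earlier sketch; the (top, left, height, width) rectangle representation is an assumption.

def split_face_area(area, n=4):
    """Split a face detection area (top, left, height, width) into
    `n` vertical strips of equal width (FIG. 6 uses n = 4)."""
    top, left, height, width = area
    strip_width = width // n
    return [(top, left + i * strip_width, height, strip_width) for i in range(n)]

def evaluate_face_areas(gray_image, face_area):
    """Return the focus evaluation values of the four divided face areas
    26a to 26d plus the whole face detection area 26 (five values in total)."""
    areas = split_face_area(face_area) + [face_area]
    return [focus_evaluation_value(gray_image, a) for a in areas]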

The determination process shown in FIG. 7 is applied to the in-focus positions obtained from the focus evaluation values of the face areas 26a to 26d and the face detection area 26. First, the in-focus position of each of the face areas 26a to 26d and the face detection area 26, that is, the position of maximum contrast, is obtained. In step 1 of FIG. 7, the difference between the in-focus positions of the face area on the most infinity side and the face area on the closest side is calculated. This may be expressed as a distance from the image plane, but since the focusing lens in this embodiment is driven and controlled by a stepping motor, the difference between relative pulse counts from a reference position may be used instead. It is then determined whether this difference between the in-focus positions of the infinity-side face area and the closest-side face area is greater than a preset threshold th1.

If, in step 1, the difference between the in-focus positions of the infinity-side face area and the closest-side face area is equal to or greater than the threshold th1, the problematic phenomenon described above is assumed to have occurred and the process proceeds to step 2; if it is less than the threshold th1, the process proceeds to step 3. When the difference in in-focus position is equal to or greater than th1, the in-focus positions vary greatly from face area to face area even though the face of a single person was divided and an in-focus position search was performed in each face area. It is therefore necessary to apply statistical processing or the like to eliminate the influence of face areas whose in-focus position deviates greatly from the others.

In step 2, the difference δ_n between the in-focus position of each of the face areas 26a to 26d and the face detection area 26 and the median of those in-focus positions is obtained from the following equation:

δ_n = |AF_n − median[AF]|   …(1)

In equation (1), AF_n is the in-focus position of each of the face areas 26a to 26d and the face detection area 26, and median[AF] is the median of the in-focus positions of the face areas 26a to 26d and the face detection area 26.

Any area for which the difference δ_n is greater than the predetermined threshold th2 is regarded as a face area whose in-focus position deviates greatly from the in-focus positions of the other face areas and is excluded from the in-focus position selection candidates, and the closest-side in-focus position among the in-focus positions of the remaining face areas is chosen as the target drive position. Alternatively, the average of the in-focus positions of the remaining face areas may be used as the target drive position. The thresholds th1 and th2 are preferably on the order of the depth of field, but they are held as table values so that they can be varied according to the actual focal length (zoom position) of the photographing optical system and various shooting conditions.

On the other hand, when the difference between the in-focus position of the face area on the most infinity side and that of the face area on the closest side is less than the threshold th1, the variation among the in-focus positions of the face areas is small. In step 3, therefore, the in-focus positions of the face areas are searched as before and the closest-side in-focus position is chosen as the target drive position. Alternatively, the average of the plurality of in-focus positions may be used as the target drive position.
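Putting the judgment of FIG. 7 together, the sketch below takes the in-focus positions found for the face areas 26a to 26d and the face detection area 26, expressed as drive-pulse counts from a reference position, compares their spread against th1, applies the median test of equation (1) with th2 when the spread is large, and returns the target drive position. The assumption that a larger pulse count means a closer in-focus position, and the optional averaging switch, are illustrative choices rather than the patent's specification.

from statistics import median

def decide_target_position(focus_positions, th1, th2, use_average=False):
    """Decide the final target drive position from the in-focus positions
    of the divided face areas (and the whole face detection area).

    focus_positions: in-focus positions as pulse counts from a reference
                     position; a larger value is assumed to be closer.
    th1: threshold on the spread between the closest-side and
         infinity-side positions (step 1 of FIG. 7).
    th2: threshold on the deviation from the median (step 2, equation (1)).
    """
    if max(focus_positions) - min(focus_positions) < th1:
        # Step 3: variation is small; use the positions as they are.
        candidates = focus_positions
    else:
        # Step 2: drop positions that deviate from the median by th2 or more.
        med = median(focus_positions)
        candidates = [p for p in focus_positions if abs(p - med) < th2]
        if not candidates:            # safety net, not in the patent text
            candidates = focus_positions
    if use_average:
        return sum(candidates) / len(candidates)
    return max(candidates)            # closest-side position under the assumed convention

For example, decide_target_position([120, 118, 121, 240, 119], th1=10, th2=15) would exclude the outlying value 240 and return 121, the closest-side position among the remaining areas.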

In step 4, the focus motor 10 moves the focusing lens to the determined target drive position to perform focus adjustment.

In the embodiment described above, the face detection area is divided into four parts, but the number of divisions of the face detection area is not limited to four. Also, in the embodiment described above, face areas whose in-focus position differs from the median of the in-focus positions by a predetermined threshold or more are excluded from the in-focus position selection candidates; however, the method of determining which face areas to exclude is not limited to this median-based method, and a similar effect can be obtained using the average or other statistical methods.

In the embodiment described above, a person's face is detected as the specific subject, and the region of the specific subject in the captured image, that is, the person's face region, is recognized. However, the specific subject is not limited to a person's face; by storing patterns of the main subjects to be photographed, various subjects can be used as the specific subject.

As described above, according to the embodiment, an image formed by the photographing lens 1 is captured by the image sensor 2 and image information is output, a specific subject (a person's face) is recognized based on this image information, and the region of the specific subject (face detection area 26) in the captured image is detected. The region of the specific subject (face detection area 26) is then divided into a plurality of areas (face areas 26a to 26d), the in-focus position of the photographing lens 1 is detected in each of the divided areas (face areas 26a to 26d), a final in-focus position is determined based on the plurality of in-focus positions detected in the divided areas (face areas 26a to 26d), and focus adjustment of the photographing lens 1 is performed with the final in-focus position as the target drive position. As a result, even when a specific subject such as a person is captured in the focus detection area, the problem of the focus being pulled toward a high-contrast subject such as the background included in the focus detection area, so that the specific subject cannot be focused on accurately, is prevented, and the camera can focus accurately on the specific subject.

Also, according to the embodiment, when the difference between the most infinity-side in-focus position and the closest-side in-focus position among the plurality of in-focus positions detected in the plurality of divided areas (face areas 26a to 26d) is equal to or greater than the predetermined value th1, the final in-focus position (target drive position) is determined based on the in-focus positions of the remaining divided areas, excluding divided areas whose in-focus position differs from the median of the in-focus positions of the plurality of divided areas (face areas 26a to 26d) by the predetermined value th2 or more. Subjects unintended by the photographer that are included in the focus detection area can therefore be excluded, and the camera can focus accurately on the specific subject.

FIG. 1 is a diagram showing the configuration of an embodiment.
FIG. 2 is a diagram showing an example of the arrangement of conventional focus detection areas.
FIG. 3 is a diagram showing an example of setting a focus detection area using the face recognition function of the embodiment.
FIG. 4 is a diagram showing characteristic curves of the focus evaluation value versus focusing lens position when a low-contrast human face and a high-contrast background are present in the focus detection area.
FIG. 5 is a diagram showing characteristic curves of the focus evaluation value versus focusing lens position when the focal length of the photographing optical system changes.
FIG. 6 is a diagram showing an example of dividing the focus detection area by face recognition in the embodiment.
FIG. 7 is a flowchart showing the target drive position determination process for the focusing lens in the embodiment.

Explanation of symbols

1 Photographing lens
2 Image sensor
9 CPU
10 Focus motor
11 Face recognition calculation unit

Claims (4)

1. A digital camera comprising:
imaging means for capturing an image formed by a photographing optical system and outputting image information;
subject detection means for recognizing a specific subject based on the image information and detecting a region of the specific subject in the image;
focus position detection means for dividing the region of the specific subject into a plurality of areas and detecting an in-focus position of the photographing optical system in each of the plurality of divided areas;
focus position determination means for determining a final in-focus position based on a plurality of in-focus positions detected in the plurality of divided areas; and
focus adjustment means for performing focus adjustment of the photographing optical system with the final in-focus position as a target drive position.
2. The digital camera according to claim 1, wherein, when a difference between the most infinity-side in-focus position and the closest-side in-focus position among the plurality of in-focus positions is less than a predetermined value, the focus position determination means sets the closest-side in-focus position among the plurality of in-focus positions, or the average of the plurality of in-focus positions, as the final in-focus position.
3. The digital camera according to claim 1 or 2, wherein, when the difference between the most infinity-side in-focus position and the closest-side in-focus position among the plurality of in-focus positions is equal to or greater than a predetermined value, the focus position determination means determines the final in-focus position based on the in-focus positions of the remaining divided areas, excluding divided areas whose in-focus position differs from the median of the in-focus positions of the plurality of divided areas by a predetermined value or more.
4. The digital camera according to claim 3, wherein the focus position determination means sets, as the final in-focus position, the closest-side in-focus position among the in-focus positions of the remaining divided areas excluding divided areas whose in-focus position differs from the median of the in-focus positions of the plurality of divided areas by a predetermined value or more, or the average of the in-focus positions of the remaining divided areas.
JP2007185835A 2007-07-17 2007-07-17 Digital camera Active JP5446076B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2007185835A JP5446076B2 (en) 2007-07-17 2007-07-17 Digital camera

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2007185835A JP5446076B2 (en) 2007-07-17 2007-07-17 Digital camera

Publications (2)

Publication Number Publication Date
JP2009025381A true JP2009025381A (en) 2009-02-05
JP5446076B2 JP5446076B2 (en) 2014-03-19

Family

ID=40397268

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2007185835A Active JP5446076B2 (en) 2007-07-17 2007-07-17 Digital camera

Country Status (1)

Country Link
JP (1) JP5446076B2 (en)

Cited By (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010286772A (en) * 2009-06-15 2010-12-24 Casio Computer Co Ltd Imaging apparatus, focusing method and program
JP2011002690A (en) * 2009-06-19 2011-01-06 Casio Computer Co Ltd Imaging apparatus, focusing method and program
JP2011242796A (en) * 2011-07-22 2011-12-01 Casio Comput Co Ltd Imaging device, focusing method and program
JP2012022334A (en) * 2011-09-22 2012-02-02 Sony Corp Imaging device and control method and program for imaging device
JP2013122611A (en) * 2013-02-01 2013-06-20 Casio Comput Co Ltd Imaging apparatus, focusing method, and program
US8717490B2 (en) 2009-06-19 2014-05-06 Casio Computer Co., Ltd Imaging apparatus, focusing method, and computer-readable recording medium recording program
JP2014132321A (en) * 2013-01-07 2014-07-17 Canon Inc Focus adjustment device and method
JP2015118337A (en) * 2013-12-19 2015-06-25 キヤノン株式会社 Image capturing device, control method therefor, program, and storage medium
US9389895B2 (en) 2009-12-17 2016-07-12 Microsoft Technology Licensing, Llc Virtual storage target offload techniques
CN106454289A (en) * 2016-11-29 2017-02-22 广东欧珀移动通信有限公司 Control method, control device and electronic device
CN106507069A (en) * 2016-11-29 2017-03-15 广东欧珀移动通信有限公司 Control method, control device and electronic installation
US10382675B2 (en) 2016-11-29 2019-08-13 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Image processing method and apparatus, and electronic device including a simulation true-color image
US10382709B2 (en) 2016-11-29 2019-08-13 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Image processing method and apparatus, and electronic device
US10432905B2 (en) 2016-11-29 2019-10-01 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Method and apparatus for obtaining high resolution image, and electronic device for same
CN113572957A (en) * 2021-06-26 2021-10-29 荣耀终端有限公司 Shooting focusing method and related equipment

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0698239A (en) * 1992-09-10 1994-04-08 Canon Inc Automatic focusing controller
JPH09200597A (en) * 1996-01-17 1997-07-31 Olympus Optical Co Ltd Automatic focusing detector
JP2003241067A (en) * 2002-02-18 2003-08-27 Minolta Co Ltd Image pickup unit
JP2006201282A (en) * 2005-01-18 2006-08-03 Nikon Corp Digital camera
JP2006227080A (en) * 2005-02-15 2006-08-31 Nikon Corp Electronic camera

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0698239A (en) * 1992-09-10 1994-04-08 Canon Inc Automatic focusing controller
JPH09200597A (en) * 1996-01-17 1997-07-31 Olympus Optical Co Ltd Automatic focusing detector
JP2003241067A (en) * 2002-02-18 2003-08-27 Minolta Co Ltd Image pickup unit
JP2006201282A (en) * 2005-01-18 2006-08-03 Nikon Corp Digital camera
JP2006227080A (en) * 2005-02-15 2006-08-31 Nikon Corp Electronic camera

Cited By (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010286772A (en) * 2009-06-15 2010-12-24 Casio Computer Co Ltd Imaging apparatus, focusing method and program
JP2011002690A (en) * 2009-06-19 2011-01-06 Casio Computer Co Ltd Imaging apparatus, focusing method and program
US8717490B2 (en) 2009-06-19 2014-05-06 Casio Computer Co., Ltd Imaging apparatus, focusing method, and computer-readable recording medium recording program
TWI468770B (en) * 2009-06-19 2015-01-11 Casio Computer Co Ltd Imaging apparatus, focusing method, and computer-readable recording medium recording program
US10248334B2 (en) 2009-12-17 2019-04-02 Microsoft Technology Licensing, Llc Virtual storage target offload techniques
US9389895B2 (en) 2009-12-17 2016-07-12 Microsoft Technology Licensing, Llc Virtual storage target offload techniques
JP2011242796A (en) * 2011-07-22 2011-12-01 Casio Comput Co Ltd Imaging device, focusing method and program
JP2012022334A (en) * 2011-09-22 2012-02-02 Sony Corp Imaging device and control method and program for imaging device
JP2014132321A (en) * 2013-01-07 2014-07-17 Canon Inc Focus adjustment device and method
JP2013122611A (en) * 2013-02-01 2013-06-20 Casio Comput Co Ltd Imaging apparatus, focusing method, and program
JP2015118337A (en) * 2013-12-19 2015-06-25 キヤノン株式会社 Image capturing device, control method therefor, program, and storage medium
CN106454289A (en) * 2016-11-29 2017-02-22 广东欧珀移动通信有限公司 Control method, control device and electronic device
US10110809B2 (en) 2016-11-29 2018-10-23 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Control method and apparatus, and electronic device
CN106507069A (en) * 2016-11-29 2017-03-15 广东欧珀移动通信有限公司 Control method, control device and electronic installation
US10348962B2 (en) 2016-11-29 2019-07-09 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Image processing method and apparatus, and electronic device
US10382675B2 (en) 2016-11-29 2019-08-13 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Image processing method and apparatus, and electronic device including a simulation true-color image
US10382709B2 (en) 2016-11-29 2019-08-13 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Image processing method and apparatus, and electronic device
US10432905B2 (en) 2016-11-29 2019-10-01 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Method and apparatus for obtaining high resolution image, and electronic device for same
CN113572957A (en) * 2021-06-26 2021-10-29 荣耀终端有限公司 Shooting focusing method and related equipment
CN113572957B (en) * 2021-06-26 2022-08-05 荣耀终端有限公司 Shooting focusing method and related equipment

Also Published As

Publication number Publication date
JP5446076B2 (en) 2014-03-19

Similar Documents

Publication Publication Date Title
JP5446076B2 (en) Digital camera
US7929042B2 (en) Imaging apparatus, control method of imaging apparatus, and computer program
JP4674471B2 (en) Digital camera
US7801432B2 (en) Imaging apparatus and method for controlling the same
JP4858849B2 (en) Imaging apparatus and program thereof
US9826140B2 (en) Image capturing apparatus and control method thereof
US9160922B2 (en) Subject detection device and control method for the same, imaging apparatus, and storage medium
JP4364078B2 (en) Imaging method and imaging apparatus
JP5780756B2 (en) Focus adjustment apparatus and method
TWI436142B (en) Method and apparatus for auto-focusing in image sensor
JP2007108412A (en) Autofocus device and its program
JP6116277B2 (en) Imaging apparatus and control method thereof
US8823863B2 (en) Image capturing apparatus and control method therefor
JP4552997B2 (en) Imaging apparatus and program
JP2011150281A (en) Imaging apparatus, method for controlling the imaging apparatus, and computer program
JP2007129310A (en) Imaging apparatus
JP2009009072A (en) Dynamic focus zone for camera
JP2009265239A (en) Focus detecting apparatus, focus detection method, and camera
JP2007133301A (en) Autofocus camera
JP2010156851A (en) Focus adjustment device and method
JP2007328360A (en) Automatic focusing camera and photographing method
JP2008176113A (en) Focus detecting device and camera
JP2007328213A (en) Imaging apparatus, imaging apparatus control method, and computer program
JP4902946B2 (en) Auto focus camera
JP2008051871A (en) Automatic focusing device

Legal Events

Date Code Title Description
A621 Written request for application examination

Free format text: JAPANESE INTERMEDIATE CODE: A621

Effective date: 20100715

A977 Report on retrieval

Free format text: JAPANESE INTERMEDIATE CODE: A971007

Effective date: 20110706

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20110712

A521 Request for written amendment filed

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20110912

RD02 Notification of acceptance of power of attorney

Free format text: JAPANESE INTERMEDIATE CODE: A7422

Effective date: 20110912

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20120522

A521 Request for written amendment filed

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20120723

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20130205

A521 Request for written amendment filed

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20130408

TRDD Decision of grant or rejection written
A01 Written decision to grant a patent or to grant a registration (utility model)

Free format text: JAPANESE INTERMEDIATE CODE: A01

Effective date: 20131203

A61 First payment of annual fees (during grant procedure)

Free format text: JAPANESE INTERMEDIATE CODE: A61

Effective date: 20131216

R150 Certificate of patent or registration of utility model

Ref document number: 5446076

Country of ref document: JP

Free format text: JAPANESE INTERMEDIATE CODE: R150

Free format text: JAPANESE INTERMEDIATE CODE: R150

R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250
