JP2002269545A - Face image processing method and face image processing device - Google Patents

Face image processing method and face image processing device

Info

Publication number
JP2002269545A
JP2002269545A
Authority
JP
Japan
Prior art keywords
luminance
face image
image
brightness
pixel
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
JP2001068820A
Other languages
Japanese (ja)
Inventor
Tsukasa Kanda
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Panasonic Holdings Corp
Original Assignee
Matsushita Electric Industrial Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Matsushita Electric Industrial Co Ltd filed Critical Matsushita Electric Industrial Co Ltd
Priority to JP2001068820A priority Critical patent/JP2002269545A/en
Publication of JP2002269545A publication Critical patent/JP2002269545A/en
Pending legal-status Critical Current

Landscapes

  • Image Processing (AREA)
  • Studio Circuits (AREA)
  • Image Analysis (AREA)

Abstract

PROBLEM TO BE SOLVED: To provide a face image processing method and a face image processing device capable of producing a face image in which the influence of reflected light is minimized, by removing the portions whose luminance has been raised by reflection from eyeglasses.

SOLUTION: The device comprises: an imaging camera 1 that captures a face image including a background image; an image capturing means 2 that captures the face image including the background image; an image storage means 3 that stores the face image including the background image; a background region extraction means 4 that extracts the background region from the face image including the background image; a background region storage means 5 that stores the extracted background region; a high-luminance part determination means 6 that identifies, in the face region image, the portions whose luminance has been raised by reflection from eyeglasses; a high-luminance part storage means 7 that stores the high-luminance parts; a high-luminance part removal means 8 that removes the high-luminance parts from the face image stored in the image storage means 3; and an image output means 9 that outputs the face image from which the high-luminance parts have been removed.

Description

DETAILED DESCRIPTION OF THE INVENTION

[0001]

[Technical Field of the Invention] The present invention relates to a face image processing method and a face image processing device that remove portions rendered highly luminous by reflection from eyeglasses, thereby producing a face image in which the influence of reflected light is minimized.

[0002]

[Description of the Related Art] A conventional face image capturing apparatus that prevents the eyes from becoming undetectable because of reflection from eyeglasses is described, for example, in Japanese Patent Application Laid-Open No. 9-21611. In that apparatus, when an eye detection routine fails to detect the eyes and a frame detection routine detects an eyeglass frame, it is concluded that the eyes cannot be detected because of surface reflection from the eyeglass lenses, and a near-infrared light source is turned on so that the eyes can be detected.

[0003]

[Problems to Be Solved by the Invention] The conventional method described above, however, requires complicated processing: a routine for detecting the eyes, a frame detection routine for detecting eyeglass frames, and a routine for turning on a near-infrared light source to detect the eyes.

[0004] The present invention solves this conventional problem. Its object is to provide a face image processing method and a face image processing device that remove portions rendered highly luminous by reflection from eyeglasses, thereby producing a face image in which the influence of reflected light is minimized.

[0005]

[Means for Solving the Problems] To achieve the above object, the present invention detects high-luminance pixels in a face image and repeatedly applies, pixel by pixel, a process that replaces each detected high-luminance pixel with the average color of its nearest neighboring pixels.

[0006] According to the present invention, even when part of an eye is hidden by a spot-like high-luminance region caused by eyeglass reflection, a face image free of reflected light can be obtained, which improves the eye detection accuracy that is important for face recognition.

[0007]

[Embodiments of the Invention] Embodiments of the present invention are described below with reference to the drawings.

[0008] FIG. 1 is a block diagram showing the configuration of a face image processing device according to an embodiment of the present invention. In FIG. 1, the face image processing device comprises: an imaging camera 1 that captures a face image including a background image; an image capturing means 2 that captures the face image, including the background image, taken by the imaging camera 1; an image storage means 3 that stores the face image including the background image captured by the image capturing means 2; a background region extraction means 4 that extracts the background region from the face image including the background image captured by the image capturing means 2; a background region storage means 5 that stores the extracted background region; a high-luminance part determination means 6 that identifies, in the face region image stored in the image storage means 3, the portions whose luminance has been raised by reflection from eyeglasses; a high-luminance part storage means 7 that stores the identified high-luminance parts; a high-luminance part removal means 8 that removes the high-luminance parts from the face image stored in the image storage means 3; and an image output means 9 that outputs the face image from which the high-luminance parts have been removed to, for example, a personal identification device based on face recognition.

[0009] The operation of the face image processing device configured as described above is explained with reference to the flowchart of FIG. 2 (in the figure, each step is abbreviated as S; the same applies below). In step 11, a face image including a background image is captured from the imaging camera 1, and the captured face image including the background image is stored in the image storage means 3.

[0010] In step 12, the background region extraction means 4 obtains the background region from the face image including the background image captured by the image capturing means 2, using the background subtraction method. The extracted background region is then stored in the background region storage means 5.
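As a minimal sketch of the background subtraction idea used in step 12 (not the patent's implementation; the pre-captured background-only reference frame and the fixed difference threshold are illustrative assumptions):

```python
def background_mask(frame, background, thresh=20):
    """Label each pixel as background (True) when the current frame differs
    little from a pre-captured background-only frame of the same size.
    frame / background: 2-D lists of luminance values."""
    h, w = len(frame), len(frame[0])
    return [[abs(frame[y][x] - background[y][x]) < thresh for x in range(w)]
            for y in range(h)]

# The pixel with a large difference (a face pixel) is separated out.
mask = background_mask([[10, 200], [12, 11]], [[10, 10], [10, 10]])
```

Pixels flagged True form the background region to be stored; the remaining pixels constitute the face region.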

[0011] In step 13, the data written in the image storage means 3 is divided equally into 4 × 4 = 16 rectangular processing regions, as shown in FIG. 3, and in the region selection of step 14 the regions containing the eyes are selected. This addresses the fact that, when the subject wears eyeglasses, reflection from the glasses on the eye portions of the face image makes it difficult to capture the face image accurately.

[0012] In step 15, it is determined whether any of the regions selected in step 14 remain unprocessed. If an unprocessed region remains among the selected regions, the region to be processed is designated in step 16. Steps 17 to 21 are then performed on the region designated in step 16. That is, in step 17 a luminance histogram of the region is created, and based on it a luminance threshold is determined in step 18.

[0013] The luminance threshold determination method of step 18 is now explained with reference to FIG. 4. The luminance threshold is determined as follows. When the entire processing target region is represented by a luminance histogram such as that of FIG. 4, the luminance value below which 10% of all pixels lie is denoted g10, and the luminance value below which 90% of all pixels lie is denoted g90. The average frequency between g10 and g90 is computed and denoted Thn1, and 20% of Thn1 is taken as Thn2. A search is then made from g90 toward higher luminance, comparing with Thn2 the average of the frequencies at the search position and at the luminance values immediately before and after it (see Equation 1 below); the first luminance position at which this average falls below Thn2 is adopted as the luminance threshold. That is, the minimum luminance value g satisfying

[Equation 1]  (H(g − 1) + H(g) + H(g + 1)) / 3 < Thn2

is adopted as the luminance threshold, where H denotes the frequency. If the luminance threshold cannot be determined by the above method, 20% of Thn1 is added to Thn2 and the method is executed again.

[0014] Next, in step 19, the high-luminance part determination means 6 identifies the high-luminance parts within the region on the basis of the luminance threshold obtained in step 18, and the identified high-luminance parts are stored in the high-luminance part storage means 7.

[0015] The method of determining a high-luminance region is explained here with reference to FIG. 5. For convenience of explanation, assume for now that a single high-luminance region exists. When there are multiple high-luminance regions, each can be processed in the same way as the single high-luminance region described below.

[0016] The condition for determining a single high-luminance region is as follows: a pixel is regarded as a high-luminance portion when it satisfies condition (a) below, or when it satisfies condition (b) or (c). That is: (a) the pixel has a luminance higher than a first threshold Th1 that defines high luminance (see FIG. 5); or, taking a pixel that satisfies condition (a) as the search start pixel and searching for pixels in each of the +x, −x, +y and −y directions on the two-dimensional coordinates, (b) every pixel up to the one immediately before the search target pixel has a luminance continuously greater than a second threshold Th2 (< Th1) (see FIG. 5); or (c) every pixel up to the one immediately before the search target pixel continuously satisfies condition (b), or the difference between the luminance of the immediately preceding pixel and that of the search target pixel is greater than (first threshold Th1 − second threshold Th2).

[0017] As described above, a pixel is treated as a high-luminance portion when it satisfies condition (a), or when it satisfies condition (b) or (c). FIG. 5 shows the result of searching for a high-luminance portion in the x direction: the pixels at positions 3 and 4 are found to satisfy condition (a), and then, taking pixel positions 3 and 4 as search start pixels, pixels satisfying conditions (b) and (c) are searched for in the +x and −x directions to find the high-luminance portion. The search for a high-luminance portion in the y direction proceeds in the same way as the x-direction search just described, and is therefore not illustrated; likewise, when there are multiple high-luminance portions (regions), they can be determined in the same way as the single high-luminance region described above.
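A rough one-dimensional sketch of conditions (a) and (b) along a single scan line follows; condition (c), the luminance-jump test, is omitted for brevity, and all names are illustrative rather than taken from the patent:

```python
def high_luminance_run(row, th1, th2):
    """1-D sketch of the seed-and-grow test of paragraph [0016]: seeds are
    pixels brighter than Th1 (condition (a)); from each seed the run is
    extended while neighbouring pixels stay brighter than Th2 (condition (b))."""
    marked = [v > th1 for v in row]       # condition (a): seed pixels
    for i, seed in enumerate(list(marked)):
        if not seed:
            continue
        for step in (1, -1):              # +x and -x search directions
            j = i + step
            while 0 <= j < len(row) and row[j] > th2:
                marked[j] = True
                j += step
    return marked
```

With Th1 = 80 and Th2 = 50, a bright spot of pixels above Th1 pulls in its moderately bright shoulders above Th2, mirroring the growth from pixel positions 3 and 4 in FIG. 5.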

[0018] Returning to the flowchart of FIG. 2, in step 20, based on the data stored in the face image storage means 5, the high-luminance parts lying outside the face are selected and deleted from the high-luminance part storage means 7. Next, in step 21, the high-luminance parts stored in the high-luminance part storage means 7 are painted over with an estimated background color; specifically, each pixel of a high-luminance part is replaced with the average luminance value of its four neighboring pixels.

[0019] In general, step 21 cannot remove a high-luminance part in a single pass; it must be repeated until the change before and after the process becomes small enough that the high-luminance part can be regarded as steady with respect to the processing of step 21. After a sufficiently large number of repetitions, the high-luminance part has been averaged with its surrounding pixels and is thereby removed. Accordingly, if the processing of step 21 is judged in step 22 to be still insufficient, the flow returns to step 21. When the number of repetitions is judged sufficient, processing of the region ends and the flow returns to step 15.

[0020] The high-luminance part removal method executed in steps 21 and 22 is explained here with reference to FIGS. 6 to 8. FIG. 6 shows the basic flow of high-luminance part removal: a high-luminance region is selected (see FIG. 6(b)) from the data stored in the face image storage means 5 (see FIG. 6(a)), and a pixel within the high-luminance region is then selected (see FIG. 6(c)).

[0021] Next, the average value of the four pixels neighboring the pixel of interest is calculated and stored in a separate area (see FIG. 6(d)). That is, when a pixel is to be replaced with a color inconspicuous against its surrounding pixels, it is advantageous to use the average value of those surrounding pixels. When the four neighboring pixels are chosen as the surrounding pixels, the pixel of interest M0(i, j) in the replacement target region A0 should be replaced with M1(i, j) given by Equation 2 below.

[Equation 2]  M1(i, j) = (M0(i − 1, j) + M0(i + 1, j) + M0(i, j − 1) + M0(i, j + 1)) / 4

[0022] Applying this process to the entire region A0 yields a region A1. If each connected component of A0 consists of a single pixel, A1 is the desired solution. In general, however, a connected component of A0 consists of multiple pixels; in that case at least one neighbor M0(i′, j′) of M0(i, j) is itself contained in A0, so that in general M1(i′, j′) differs from M0(i′, j′). The value of M1(i, j) is therefore not necessarily steady in A1, and the above process must be applied repeatedly to A1 to obtain the steady solution A.

[0023] That is, as shown in FIG. 7, let Mk(i, j) be the pixel of interest in a region Ak, and let Ak+1 be the region newly composed of the values Mk+1(i, j) obtained as the average of its four neighboring pixels Mk(i − 1, j), Mk(i, j − 1), Mk(i + 1, j) and Mk(i, j + 1). The solution is obtained by repeating the process for a number of iterations n large enough that maxi,j |Mk+1(i, j) − Mk(i, j)| becomes sufficiently small, where maxi,j P(i, j) denotes the maximum value of P(i, j) over all possible combinations of (i, j).

[0024] The value Mk+1(i, j) defined above is given in general by Equation 3 below.

[Equation 3]  Mk+1(i, j) = (Mk(i − 1, j) + Mk(i + 1, j) + Mk(i, j − 1) + Mk(i, j + 1)) / 4

[0025] FIG. 8 illustrates the removal of a high-luminance part. The high-luminance part present before processing (the set of hatched white rectangles in FIG. 8(a)) shrinks slightly after one pass of processing, as shown in FIG. 8(b); after n passes, as shown in FIG. 8(c), the values scarcely change even if the process is repeated further.

[0026] Returning again to the flowchart of FIG. 2, when the number of repetitions is judged sufficient in step 22, the flow returns to step 15. When it is determined in step 15 that all the regions selected in step 14 have been processed, the flow proceeds to step 23, where the resulting image is output; the device then waits for the next image to arrive, and processing ends.

[0027] FIG. 9 illustrates the operation of the face image processing device of the present invention. The face image captured by the imaging camera 1 (see FIG. 9(a)) is divided into partial regions by the face image extraction means 4 (see FIG. 9(b)); after portions other than the face image are removed, the regions containing high-luminance portions are selected (see FIG. 9(c)) and the high-luminance portions are removed from them (see FIG. 9(d)). A face image in which reflected light from the eyeglasses is inconspicuous is thus obtained, and eye detection accuracy can be improved.

[0028]

[Effects of the Invention] As is clear from the above description, the present invention can bring the image close to one free of reflected light even when part of the eyes is hidden by a spot-like high-luminance portion caused by eyeglass reflection, and can therefore improve the accuracy of eye position detection by the template matching method.

[0029] Furthermore, by removing the eyeglass reflection, an edge image of the face in which the influence of reflected light is minimized can be obtained.

[Brief Description of the Drawings]

FIG. 1 is a diagram illustrating the configuration of a face image processing device according to an embodiment of the present invention;

FIG. 2 is a flowchart illustrating the operation of the face image processing device according to the embodiment of the present invention;

FIG. 3 is a diagram showing an image processing area divided equally into 16 rectangular processing regions;

FIG. 4 is a diagram illustrating the luminance threshold determination method according to the present invention;

FIG. 5 is a diagram illustrating the high-luminance region determination method according to the present invention;

FIG. 6 is a diagram showing the basic flow of high-luminance part removal according to the embodiment of the present invention;

FIG. 7 is a diagram showing the relationship between a pixel of interest Mk(i, j) and its four neighboring pixels;

FIG. 8 is an image diagram showing the removal of a high-luminance part according to the embodiment of the present invention;

FIG. 9 is a diagram illustrating the operation of the face image processing device of the present invention.

[Explanation of Symbols]

1 imaging camera; 2 image capturing means; 3 image storage means; 4 face image extraction means; 5 face image storage means; 6 high-luminance part determination means; 7 high-luminance part storage means; 8 high-luminance part removal means; 9 image output means


Claims (6)

[Claims]

[Claim 1] A face image processing method comprising: a step of capturing from an imaging camera a face image including a background image; a step of cutting the captured face image including the background image into rectangular regions and obtaining the face image from the cut regions using a background subtraction method; a step of dividing the obtained face image equally into rectangular processing regions and selecting the regions containing the eyes; a step of creating a luminance histogram of each selected region and determining a luminance threshold based on it; a step of identifying high-luminance parts within the selected region based on the luminance threshold; and a step of painting over the identified high-luminance parts with an estimated background color.
[Claim 2] The face image processing method according to claim 1, wherein, in the step of painting over the high-luminance parts with the estimated background color, each pixel of a high-luminance part is replaced with the weighted average value of its four neighboring pixels, and this process is fed back until no selected region remains, so that the parts are painted over with the estimated background color.
[Claim 3] The face image processing method according to claim 1, wherein the step of creating a luminance histogram of the selected region and determining a luminance threshold based on it comprises: when the entire processing target region is represented by a luminance histogram, denoting by g10 the luminance value below which 10% of all pixels lie and by g90 the luminance value below which 90% of all pixels lie, a step of computing the average frequency between g10 and g90 and taking it as Thn1; a step of taking 20% of Thn1 as Thn2; a step of searching from g90 toward higher luminance, comparing with Thn2 the average of the frequencies at the search position and at the luminance values immediately before and after it, and adopting as the luminance threshold the minimum luminance value g at which this average first falls below Thn2; and, when no luminance threshold can be determined by the preceding steps, a step of adding 20% of Thn1 to Thn2 to obtain a new Thn2 and repeating the preceding steps until a luminance threshold is determined.
[Claim 4] The face image processing method according to claim 1, wherein, in the step of identifying high-luminance parts within the region selected based on the luminance threshold, a pixel is regarded as a high-luminance portion when it satisfies condition (a) below, or when it satisfies condition (b) or (c): (a) the pixel has a luminance higher than a first threshold Th1 that defines high luminance; or, taking a pixel that satisfies condition (a) as the search start pixel and searching for pixels in each of the +x, −x, +y and −y directions on the two-dimensional coordinates, (b) every pixel up to the one immediately before the search target pixel has a luminance continuously greater than a second threshold Th2 (< Th1); or (c) every pixel up to the one immediately before the search target pixel continuously satisfies condition (b), or the difference between the luminance of the immediately preceding pixel and that of the search target pixel is greater than (first threshold Th1 − second threshold Th2).
[Claim 5] A program for causing a computer to execute: a procedure of cutting a face image including a background image from an imaging camera into rectangular regions and obtaining the face image from the cut regions using a background subtraction method; a procedure of dividing the obtained face image equally into rectangular processing regions and selecting the regions containing the eyes; a procedure of creating a luminance histogram of each selected region and determining a luminance threshold based on it; a procedure of identifying high-luminance parts within the selected region based on the luminance threshold; and a procedure of painting over the identified high-luminance parts with an estimated background color.
[Claim 6] A face image processing device comprising: an imaging camera that captures a face image including a background image; an image capturing means that captures the face image including the background image taken by the imaging camera; an image storage means that stores the face image including the background image captured by the image capturing means; a face image extraction means that divides the face image including the background image captured by the image capturing means into rectangular regions and extracts a face region image; a face image storage means that stores the face region image extracted by the face image extraction means; a high-luminance part determination means that identifies, in the face region image stored in the face image storage means, the portions whose luminance has been raised by reflection from eyeglasses; a high-luminance part storage means that stores the high-luminance parts identified by the high-luminance part determination means; a high-luminance part removal means that removes the high-luminance parts identified by the high-luminance part determination means from the face image stored in the face image storage means; and an image output means that outputs the face image from which the high-luminance parts have been removed by the high-luminance part removal means.
JP2001068820A 2001-03-12 2001-03-12 Face image processing method and face image processing device Pending JP2002269545A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2001068820A JP2002269545A (en) 2001-03-12 2001-03-12 Face image processing method and face image processing device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2001068820A JP2002269545A (en) 2001-03-12 2001-03-12 Face image processing method and face image processing device

Publications (1)

Publication Number Publication Date
JP2002269545A true JP2002269545A (en) 2002-09-20

Family

ID=18926950

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2001068820A Pending JP2002269545A (en) 2001-03-12 2001-03-12 Face image processing method and face image processing device

Country Status (1)

Country Link
JP (1) JP2002269545A (en)

Cited By (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7978261B2 (en) 2001-09-18 2011-07-12 Ricoh Company, Limited Image pickup device, automatic focusing method, automatic exposure method, electronic flash control method and computer program
US8421899B2 (en) 2001-09-18 2013-04-16 Ricoh Company, Limited Image pickup device, automatic focusing method, automatic exposure method, electronic flash control method and computer program
US7787025B2 (en) 2001-09-18 2010-08-31 Ricoh Company, Limited Image pickup device that cuts out a face image from subject image data
US7903163B2 (en) 2001-09-18 2011-03-08 Ricoh Company, Limited Image pickup device, automatic focusing method, automatic exposure method, electronic flash control method and computer program
US7920187B2 (en) 2001-09-18 2011-04-05 Ricoh Company, Limited Image pickup device that identifies portions of a face
US7973853B2 (en) 2001-09-18 2011-07-05 Ricoh Company, Limited Image pickup device, automatic focusing method, automatic exposure method calculating an exposure based on a detected face
US7391900B2 (en) * 2002-10-31 2008-06-24 Korea Institute Of Science And Technology Image processing method for removing glasses from color facial images
JP2005327009A (en) * 2004-05-13 2005-11-24 Omron Corp Image correction device
JP2007272421A (en) * 2006-03-30 2007-10-18 Toyota Central Res & Dev Lab Inc Device, method and program for detecting object
US8026955B2 (en) 2007-08-30 2011-09-27 Honda Motor Co., Ltd. Camera exposure controller including imaging devices for capturing an image using stereo-imaging
JP2009060289A (en) * 2007-08-30 2009-03-19 Honda Motor Co Ltd Exposure controller for camera
DE102009046481A1 (en) 2008-11-17 2010-07-22 Denso Corporation, Kariya-City Image pickup device and method for image acquisition
US8208027B2 (en) 2008-11-17 2012-06-26 Denso Corporation Image shooting device and method for shooting image
DE102009046481B4 (en) 2008-11-17 2018-12-13 Denso Corporation Image pickup device and method for image acquisition
JP2013020352A (en) * 2011-07-08 2013-01-31 Fujifilm Corp Object detection device, method and program
US8644625B2 (en) 2011-07-08 2014-02-04 Fujifilm Corporation Object detection device, method and program
KR20150107598A (en) * 2014-03-14 2015-09-23 오므론 가부시키가이샤 Image processing apparatus and image processing method
KR101631012B1 (en) * 2014-03-14 2016-06-15 오므론 가부시키가이샤 Image processing apparatus and image processing method
US9811888B2 (en) 2014-03-14 2017-11-07 Omron Corporation Image processing apparatus and image processing method
CN111582005A (en) * 2019-02-18 2020-08-25 Oppo广东移动通信有限公司 Image processing method, image processing device, computer readable medium and electronic equipment
CN111582005B (en) * 2019-02-18 2023-08-15 Oppo广东移动通信有限公司 Image processing method, device, computer readable medium and electronic equipment
US11126824B2 (en) * 2019-12-23 2021-09-21 Ubtech Robotics Corp Ltd Face image quality evaluating method and apparatus and computer readable storage medium using the same

Similar Documents

Publication Publication Date Title
JP5090474B2 (en) Electronic camera and image processing method
JP4078334B2 (en) Image processing apparatus and image processing method
US20150003740A1 (en) Image processing device, method of controlling image processing device, and program for enabling computer to execute same method
KR20190028349A (en) Electronic device and method for human segmentation in image
CN107368806B (en) Image rectification method, image rectification device, computer-readable storage medium and computer equipment
JP5361524B2 (en) Pattern recognition system and pattern recognition method
US20230334235A1 (en) Detecting occlusion of digital ink
EP3798975B1 (en) Method and apparatus for detecting subject, electronic device, and computer readable storage medium
JP2010045613A (en) Image identifying method and imaging device
JP2006260397A (en) Eye opening degree estimating device
JP2000105829A (en) Method and device for face parts image detection
WO2006081018A1 (en) Object-of-interest image capture
JP2002269545A (en) Face image processing method and face image processing device
JP2009123081A (en) Face detection method and photographing apparatus
CN112581481B (en) Image processing method and device, electronic equipment and computer readable storage medium
JP2007080136A (en) Specification of object represented within image
RU2542876C2 (en) Apparatus for selecting highly detailed objects on scene image
US20060010582A1 (en) Chin detecting method, chin detecting system and chin detecting program for a chin of a human face
JP5128454B2 (en) Wrinkle detection device, wrinkle detection method and program
US8538142B2 (en) Face-detection processing methods, image processing devices, and articles of manufacture
JP2007219899A (en) Personal identification device, personal identification method, and personal identification program
JP6467817B2 (en) Image processing apparatus, image processing method, and program
JP2018147046A (en) Face detection device and method for controlling the same
JP2013029996A (en) Image processing device
JP2004030006A (en) Eye detection apparatus, eye detection program, recording medium for recording the program, and eye detection method