JP2000235648A - Extracting device for eye and blink detecting device - Google Patents

Extracting device for eye and blink detecting device

Info

Publication number
JP2000235648A
Authority
JP
Japan
Prior art keywords
eye
image
extracted
eyes
pattern
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
JP11038652A
Other languages
Japanese (ja)
Inventor
Katsutoshi Shimizu
勝敏 清水
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Fujitsu Ltd
Original Assignee
Fujitsu Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Fujitsu Ltd filed Critical Fujitsu Ltd
Priority to JP11038652A priority Critical patent/JP2000235648A/en
Publication of JP2000235648A publication Critical patent/JP2000235648A/en
Withdrawn legal-status Critical Current

Landscapes

  • Image Analysis (AREA)

Abstract

PROBLEM TO BE SOLVED: To extract eye positions correctly in an eye extraction device by eliminating the influence of glasses, eyebrows, and the like on the facial image.

SOLUTION: The eye extraction device, which extracts candidate eye positions from an image and then extracts eye-like pattern images from a prescribed area of the image containing each candidate position, has an image processing part 8 that binarizes the image of the prescribed area containing the extracted candidate position and extracts the pattern images, and a recognizing part 9 that performs comparison processing among the plurality of extracted pattern images on prescribed evaluation items and recognizes the pattern image with the highest evaluation score as the eye.

Description

DETAILED DESCRIPTION OF THE INVENTION

[0001]

BACKGROUND OF THE INVENTION

1. Field of the Invention

The present invention relates to improvements in an eye extraction device that extracts the eyes from a facial image, and in a blink detection device that detects blinks by tracking the eyes through a moving image.

[0002] In recent years, drowsiness alarm devices have been under development that monitor a vehicle driver's eyes and detect the driver's state, such as dozing, from the presence or absence of blinking, raising an alarm when needed. Such devices must extract and track the eyes and detect blinks accurately while eliminating the influence of individual differences such as the presence of glasses, the shape of the eyebrows, and the size of the eyes.

[0003]

2. Description of the Related Art

A common method of tracking eyes in real time is to find the eyes by pattern matching the facial image against a standard eye template. This approach, however, can mistake eye-like regions (eyebrows, eyeglass frames, and so on) for the eyes, or lose track of the eyes altogether when the matching criterion is set strictly. For blink detection, methods such as tracking changes in the distance between the eyes and the eyebrows have been proposed, but because eye size, eyelash length, and the like vary between individuals, accurate recognition with a fixed threshold has been difficult.

[0004] SUMMARY OF THE INVENTION

In view of the above problems, an object of the present invention is to provide an eye extraction device that extracts the eyes simply and quickly while eliminating individual differences, and a highly accurate blink detection device.

[0005]

To solve the above problems, the eye extraction device and blink detection device of the present invention are configured as follows.

(First invention) As shown in FIG. 1, the principle diagram of the present invention, the first invention is an eye extraction device that extracts candidate eye positions from an image, extracts eye-like pattern images from a prescribed area of the image containing each candidate position, and compares those pattern images on the basis of prescribed evaluation items to determine the eye position. The device has an image processing part 8 that binarizes the image of the prescribed area containing the extracted candidate eye position and extracts the pattern images, and a recognizing part 9 that performs comparison processing among the plurality of extracted pattern images on the prescribed evaluation items and recognizes the pattern image with the highest evaluation score as the eye.

[0006] As described above, even when a candidate eye position has been extracted by pattern matching or the like, it is not immediately taken as the eye position; the eye is recognized by comparing and evaluating the pattern images around the candidate position, which prevents other parts such as the eyebrows from being mistaken for the eye.

(Second invention) The second invention is an eye extraction device that extracts candidate eye positions from an image, extracts eye-like pattern images from a prescribed area of the image containing each candidate position, and compares those pattern images on the basis of prescribed evaluation items to determine the eye position. The device has an image processing part that binarizes the image of the prescribed area containing the extracted candidate eye position and extracts the pattern images, and a recognizing part that performs fuzzy evaluation of each extracted pattern image on a plurality of evaluation items and recognizes the pattern image with the highest evaluation score as the eye.

[0007] Because the criteria of the evaluation items are relatively ambiguous, evaluating a plurality of items in this fuzzy manner makes the eye recognition still more accurate.

(Third invention) The third invention is a blink detection device that monitors and tracks the eyes in an extracted moving image to detect blinks. It has a judgment part that tracks a plurality of evaluation items representing a blink, such as changes in the eye contour, in the degree of eye opening, and in the distance between the eye and other parts, performs fuzzy evaluation of each item, and judges that an eye blink has occurred when the total evaluation score reaches a predetermined value.

[0008] As described above, fuzzy inference over a plurality of evaluation items makes it possible to detect blinks still more accurately.

[0009]

DESCRIPTION OF THE PREFERRED EMBODIMENTS

Embodiments of the present invention are described below in detail with reference to the drawings. FIG. 2 is a configuration diagram of one embodiment; FIG. 3 shows an example of grayscale image processing; FIG. 4 is a flowchart of eye extraction; FIG. 5 illustrates the length of a black block; FIG. 6 shows the gap for each frame; FIG. 7 shows the eye contour; and FIG. 8 is a flowchart of blink detection. Throughout the drawings, the same reference numerals denote the same objects.

[0010] FIG. 2 shows a configuration example of an apparatus that has both a function of imaging a person's face and extracting the eyes from the obtained facial image, and a function of detecting blinks while tracking the extracted eyes. It applies to the first to third embodiments described below.

[0011] In FIG. 2, reference numeral 1 denotes an imaging part composed of an image sensor or the like, directed at, for example, the face of a person who is driving. The obtained images are stored frame by frame in memory 2. Numeral 3 denotes a standard template stored in memory, composed, for example, of an eye image obtained by averaging the eyes of a plurality of people. Numeral 4 denotes an eye-tracking memory, used as a work area for the tracking process and storing the eye position information that results from tracking. Numeral 5 denotes a blink-tracking memory that stores, frame by frame, measured values such as the eye thickness and the distance between the eye and other parts.

[0012] Numeral 6 denotes an eye extraction part composed of a pattern matching part 7, an image processing part 8, a recognizing part 9, and so on. As described later, it extracts candidate eye positions by pattern matching with the standard template 3, cuts out the image of a prescribed neighboring area containing each candidate position, performs evaluation processing on the evaluation items, and determines the eye position. Numeral 10 denotes an eye tracking part that tracks the eye position by continually searching the neighborhood of the determined position by pattern matching. Numeral 11 denotes a blink detection part composed of a tracking part 12, a judgment part 13, and so on. As described later, it temporally tracks a plurality of evaluation items that characterize a blink, such as changes in the distance between the upper and lower eyelids and changes in the eye thickness, performs fuzzy evaluation of each item, and detects the blink.

[0013] (First embodiment) The first embodiment implements the first invention described above. First, candidate eye positions are obtained by pattern matching between the image and the standard template. A plurality of vertical image lines are then cut out from a prescribed area containing a candidate position and binarized, and the bottom two of the resulting eye-like pattern images are extracted. A single evaluation item, here the uniformity of thickness, is compared to judge which of the two pattern images is the eye.

[0014] First, the pattern matching part 7 performs pattern matching while moving the standard template 3 over the image stored in memory 2, and takes the position with the highest matching rate as the candidate eye position. The search result of FIG. 3 shows the pattern where the standard template 3 matched the eye in the facial image. Next, the image processing part 8 acquires grayscale data along a plurality of vertical lines of prescribed length centered on the candidate position found by pattern matching. More vertical lines yield more detailed data, but too many prevent the processing from finishing within one frame, so nine lines are used in this embodiment. The grayscale data is binarized and separated into skin (white) and other (black) parts. Besides the eyes, the black parts include the eyebrows, eyeglass frames, bangs, moles, and so on, so the eyes must be separated from these other parts.
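The nine-line sampling and binarization step of paragraph [0014] can be sketched as follows. This is a minimal illustration: the threshold value, strip height, and line spacing are assumptions, since the patent fixes only the count of nine lines.

```python
def sample_vertical_lines(gray, cx, cy, half_h=12, n_lines=9, spacing=2, threshold=128):
    """Sample n_lines vertical grayscale strips centred on the eye
    candidate (cx, cy) and binarize each pixel: 1 = dark (eye, brow,
    frame, ...), 0 = light (skin).  gray is a row-major 2D list."""
    lines = []
    for i in range(n_lines):
        x = cx + (i - n_lines // 2) * spacing
        column = [1 if gray[y][x] < threshold else 0
                  for y in range(cy - half_h, cy + half_h + 1)]
        lines.append(column)
    return lines
```

On a synthetic face image with a dark horizontal band at the candidate, each of the nine columns would come back with 1s only across that band.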

[0015] Next, the orthogonal projection of the binarized data is obtained, and at the center position of each projection's black part (hereinafter, black block), the length of the black region continuing in the horizontal direction is measured. Black regions whose length is at or below a threshold are judged to be bangs, moles, or the like, and are excluded. The remaining black parts are the eyes, eyebrows, or eyeglass frames.
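A minimal sketch of the run-length filter of paragraph [0015], which discards bangs and moles by the horizontal extent of each black block. The function names and the binary-list representation are assumptions for illustration.

```python
def horizontal_run_length(binary, row, col):
    """Length of the contiguous run of 1s on `row` that contains
    (row, col); 0 if that pixel is not black."""
    if not binary[row][col]:
        return 0
    left, right = col, col
    while left > 0 and binary[row][left - 1]:
        left -= 1
    while right < len(binary[0]) - 1 and binary[row][right + 1]:
        right += 1
    return right - left + 1


def drop_short_blocks(binary, centers, min_len):
    """Keep only black blocks whose horizontal run at the block centre
    reaches min_len; shorter runs are treated as bangs or moles and
    excluded, as in paragraph [0015]."""
    return [(r, c) for r, c in centers
            if horizontal_run_length(binary, r, c) >= min_len]
```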

[0016] Next, the recognizing part 9 judges between the bottom two black parts. For a person wearing glasses this compares the lower eyeglass frame with the eye; for a person without glasses it compares the eye with the eyebrow. This embodiment exploits the fact that eyebrows and eyeglass frames are more uniform in thickness than the eye: the vertical length is obtained from the grayscale data of each line, and the pattern image with the larger variation is judged to be the eye.

(Second embodiment) In the first embodiment the eye was judged on a single evaluation item, for example thickness uniformity, but for a person with narrow eyes the judgment becomes somewhat inaccurate. The second embodiment achieves high accuracy, unaffected by such individual variation, by setting a plurality of evaluation items, adding evaluation points (a score) to each black part item by item, and making a fuzzy overall judgment that takes the higher-scoring part as the eye.
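The first embodiment's single-criterion judgment in paragraph [0016] can be sketched as follows. The choice of max-minus-min as the uniformity measure is an assumption; the patent says only that the pattern with the larger thickness variation is taken to be the eye.

```python
def thickness_variation(thicknesses):
    """Spread (max - min) of a pattern image's vertical black length
    across the scan lines.  Eyebrows and eyeglass frames are nearly
    uniform in thickness, so the eye shows the larger spread."""
    return max(thicknesses) - min(thicknesses)


def judge_eye(bottom, second):
    """Paragraph [0016]: of the two remaining black parts, the one
    whose thickness varies more across the lines is the eye."""
    return 'bottom' if thickness_variation(bottom) >= thickness_variation(second) else 'second'
```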

[0017] The processing before the eye judgment, that is, up to narrowing the candidate pattern images down to the bottom two black parts, is the same as in the first embodiment, as shown in steps (1) to (8) of the flowchart in FIG. 4. The recognizing part 9 then evaluates the bottom black part in step (9), evaluates the second black part from the bottom in step (10), compares the accumulated scores in step (11), and judges the part with the higher score to be the eye.

[0018] An example of the evaluation items and scoring rules follows.

(1) Uniformity of the black part's thickness: eyebrows and eyeglass frames are generally uniform in thickness, so the more uniform the part, the lower its score.

(2) Maximum thickness of the black part: the height of an eye varies somewhat between individuals but stays within a certain range, so values inside that range add points and values outside it subtract points.

(3) Horizontal length of the black part: eyebrows and eyeglass frames are usually longer horizontally than the eye, so the shorter part gains points.

(4) Distance between the two black parts: the distance between the lower eyeglass frame and the eye is generally longer than that between the eye and the eyebrow, so when the distance is at or above a threshold, points go to the second black part from the bottom; when it is below the threshold, points go to the bottom black part.
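A sketch of the four-item scoring of paragraph [0018]. Every numeric range and weight below is an illustrative assumption; the patent states only the direction of each rule.

```python
def score_black_part(thicknesses, width, is_bottom, pair_distance,
                     dist_threshold=14):
    """Score one candidate black part on the four evaluation items of
    paragraph [0018] (all constants are assumed for illustration)."""
    score = 0
    # (1) uniformity: the more uniform the thickness, the lower the score
    score += min(max(thicknesses) - min(thicknesses), 10)
    # (2) maximum thickness: plausible eye heights add points, others subtract
    score += 5 if 4 <= max(thicknesses) <= 15 else -5
    # (3) horizontal length: shorter (less brow-like) parts add points
    score += 5 if width < 40 else 0
    # (4) distance between the two parts: a large gap favours the second
    #     part from the bottom, a small gap favours the bottom part
    if pair_distance >= dist_threshold:
        score += 0 if is_bottom else 5
    else:
        score += 5 if is_bottom else 0
    return score
```

An eye-like bottom part (varying thickness, modest width) then outscores a brow-like upper part, matching the judgment in step (11) of FIG. 4.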

[0019] As described above, by preparing a plurality of evaluation items and comparing accumulated scores, the eye can be judged from inherently ambiguous evaluation items accurately and without the influence of individual differences.

[0020] Although the above describes the initial extraction of the eye position, it can of course also be applied when tracking the eye.

(Third embodiment) The third embodiment detects blinks according to the third invention described above. The blink detection method tracks the eye state obtained by the eye tracking part 10, for example frame by frame, and judges a blink at the point where a state change corresponding to a blink is detected.

[0021] Because this state change is affected by individual differences such as eye thickness, as well as by noise, posture changes, and the like, scores from a plurality of evaluation items are accumulated, and a blink is judged when the total exceeds a threshold.

[0022] FIG. 5 illustrates the inaccuracy of blink detection based on the change in eye thickness alone. As shown in FIG. 5 (1)-(a) and (b), the black block of the eye region (the length of the orthogonal projection) is generally long when the eye is open and short when the eye is closed. For a person with long eyelashes and narrow eyes, however, as shown in FIG. 5 (2)-(a) and (b), the black block's length hardly changes between the open and closed states. In other words, there are cases where a blink cannot be detected from the black block's length alone. A fuzzy method is therefore used in which a plurality of evaluation items is provided and a score is accumulated for each item.

[0023] In this example, two evaluation items are adopted: the change in the maximum distance (gap) between the upper and lower eyelids (the steepness of the gap's decrease and the time until it recovers) and the change in the eye contour. When the total score reaches or exceeds a threshold, a blink is judged to have occurred. The processing flow is shown in FIG. 8. The evaluation items are not limited to these.

(1) Extract and binarize the image of a prescribed area containing the eye. Since blink detection is generally performed simultaneously with eye tracking, the binarization here is omitted and the data binarized by the eye tracking part 10 is extracted and used.

(2) Obtain the maximum distance (gap) between the upper and lower eyelids, and

(3) arrange the values in time series and store them in memory. From the acquired gaps, measure the steepness of the gap's decrease (the eye closing) and add a score according to the steepness. During a blink the gap decreases sharply, so the steeper the decrease, the higher the score.

(4) Measure the time (number of frames) until the gap recovers, and add a score according to the frame count.
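Steps (3) and (4) of paragraph [0023], scoring the steepness of the gap's decrease and the recovery time, might look like the following sketch; the linear steepness scale, the openness level, and the frame-count window are assumptions.

```python
def steepness_score(gaps):
    """Largest one-frame decrease of the eyelid gap.  A blink closes
    the eye abruptly, so a bigger drop scores higher (the linear
    scale is an assumption)."""
    return max(max(gaps[i] - gaps[i + 1], 0) for i in range(len(gaps) - 1))


def recovery_score(gaps, open_level, min_frames=2, max_frames=15):
    """Number of frames the gap stays below open_level, scored as in
    paragraph [0024]: a blink lasts a few to roughly fifteen frames,
    so counts inside that range add points, counts outside subtract."""
    closed = sum(1 for g in gaps if g < open_level)
    return 5 if min_frames <= closed <= max_frames else -5
```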

[0024] Because a blink ends within a few to somewhat over ten frames, counts within that range add points and counts outside it subtract points.

(5) Obtain the eye contour data.

[0025] As shown in FIG. 7, the distance between the line connecting the two corners of the eye and the eyelid contour is large when the eye is open and small when it is closed, so the smaller the average of the contour data (the number of pixels in the vertical direction), the higher the score. This makes the score larger when a blink occurs.

(6) When the accumulated score exceeds the set threshold, a blink is judged to have occurred.
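The contour scoring and final threshold decision of paragraphs [0025] and [0026] can be sketched as follows; the normalising constant and the decision threshold are assumed values, since the patent does not fix them.

```python
def contour_score(contour_heights, scale=10):
    """Average vertical distance between the eye-corner line and the
    eyelid contour; small averages (eye nearly closed) score high.
    `scale` is an assumed normalising constant."""
    avg = sum(contour_heights) / len(contour_heights)
    return max(scale - avg, 0)


def is_blink(total_score, threshold=12):
    """Paragraph [0026]: declare a blink once the accumulated scores
    exceed the threshold (the value 12 is illustrative)."""
    return total_score > threshold
```

In use, the steepness, recovery, and contour scores for the current frame window are summed and passed to `is_blink`.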

[0026] As described above, fuzzy evaluation over a plurality of evaluation items enables blink detection that is accurate and free of individual differences.

[0027]

EFFECTS OF THE INVENTION

As described above, the present invention extracts the eyes by fuzzy inference over a plurality of evaluation items, which makes eye tracking in an image more accurate than tracking by pattern matching alone. Furthermore, performing blink detection by fuzzy inference enables highly accurate blink detection, making it possible to detect the driver's head swinging, dozing, and the like.

BRIEF DESCRIPTION OF THE DRAWINGS

FIG. 1 is a principle diagram of the present invention.

FIG. 2 is a configuration diagram of one embodiment.

FIG. 3 shows an example of grayscale image processing.

FIG. 4 is a flowchart of eye extraction.

FIG. 5 illustrates the length of a black block.

FIG. 6 shows the gap for each frame.

FIG. 7 shows the eye contour.

FIG. 8 is a flowchart of blink detection.

EXPLANATION OF SYMBOLS

1: imaging part; 2: memory; 3: standard template; 4: eye-tracking memory; 5: blink-tracking memory; 6: eye extraction part; 7: pattern matching part; 8: image processing part; 9: recognizing part; 10: eye tracking part; 11: blink detection part; 12: tracking part; 13: judgment part.

Claims (3)

1. An eye extraction device that extracts candidate eye positions from an image, extracts pattern images similar to an eye from a prescribed area of the image containing each candidate position, and compares the pattern images on the basis of prescribed evaluation items to determine the eye position, comprising: an image processing part that binarizes the image of the prescribed area containing the extracted candidate eye position and extracts pattern images; and a recognizing part that performs comparison processing among the plurality of extracted pattern images on prescribed evaluation items indicating features of the eye, and recognizes the pattern image with the highest evaluation score as the eye.

2. An eye extraction device that extracts candidate eye positions from an image, extracts pattern images similar to an eye from a prescribed area of the image containing each candidate position, and compares the pattern images on the basis of prescribed evaluation items to determine the eye position, comprising: an image processing part that binarizes the image of the prescribed area containing the extracted candidate eye position and extracts a plurality of pattern images; and a recognizing part that gives fuzzy evaluation scores to each of the plurality of extracted pattern images for a plurality of evaluation items indicating features of the eye, and recognizes the pattern image with the highest evaluation score as the eye.

3. A blink detection device that monitors the eyes in a moving image to detect blinks, comprising a judgment part that gives fuzzy evaluation scores to each of a plurality of evaluation items characteristically representing a blink, and judges that an eye blink has occurred when the total evaluation score reaches a predetermined value.
JP11038652A 1999-02-17 1999-02-17 Extracting device for eye and blink detecting device Withdrawn JP2000235648A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP11038652A JP2000235648A (en) 1999-02-17 1999-02-17 Extracting device for eye and blink detecting device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP11038652A JP2000235648A (en) 1999-02-17 1999-02-17 Extracting device for eye and blink detecting device

Publications (1)

Publication Number Publication Date
JP2000235648A true JP2000235648A (en) 2000-08-29

Family

ID=12531200

Family Applications (1)

Application Number Title Priority Date Filing Date
JP11038652A Withdrawn JP2000235648A (en) 1999-02-17 1999-02-17 Extracting device for eye and blink detecting device

Country Status (1)

Country Link
JP (1) JP2000235648A (en)

Cited By (28)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1328698C (en) * 2003-02-13 2007-07-25 富士胶片株式会社 Method ,device and program for correcting facial image
EP2391115A2 (en) 2010-05-24 2011-11-30 Canon Kabushiki Kaisha Image processing apparatus, control method, and program
US8374439B2 (en) 2008-06-25 2013-02-12 Canon Kabushiki Kaisha Image processing apparatus, image processing method, program, and computer-readable print medium
US8391595B2 (en) 2006-05-26 2013-03-05 Canon Kabushiki Kaisha Image processing method and image processing apparatus
US8442315B2 (en) 2010-07-16 2013-05-14 Canon Kabushiki Kaisha Image processing apparatus, image processing method, and computer-readable medium
US8457397B2 (en) 2010-07-16 2013-06-04 Canon Kabushiki Kaisha Image processing apparatus, image processing method, and computer-readable medium
US8532431B2 (en) 2007-05-08 2013-09-10 Canon Kabushiki Kaisha Image search apparatus, image search method, and storage medium for matching images with search conditions using image feature amounts
US8630503B2 (en) 2008-06-25 2014-01-14 Canon Kabushiki Kaisha Image processing apparatus, image processing method, and computer program
US8699844B2 (en) 2005-12-22 2014-04-15 Fuji Xerox Co., Ltd. Content distribution apparatus
US9002107B2 (en) 2010-07-16 2015-04-07 Canon Kabushiki Kaisha Color balance correction based on skin color and highlight color
US9014487B2 (en) 2012-07-09 2015-04-21 Canon Kabushiki Kaisha Image processing apparatus and image processing method
US9189681B2 (en) 2012-07-09 2015-11-17 Canon Kabushiki Kaisha Image processing apparatus, method thereof, and computer-readable storage medium
US9208595B2 (en) 2012-07-09 2015-12-08 Canon Kabushiki Kaisha Apparatus, image processing method and storage medium storing program
US9214027B2 (en) 2012-07-09 2015-12-15 Canon Kabushiki Kaisha Apparatus, method, and non-transitory computer-readable medium
US9275270B2 (en) 2012-07-09 2016-03-01 Canon Kabushiki Kaisha Information processing apparatus and control method thereof
US9280720B2 (en) 2012-07-09 2016-03-08 Canon Kabushiki Kaisha Apparatus, method, and computer-readable storage medium
US9292760B2 (en) 2012-07-09 2016-03-22 Canon Kabushiki Kaisha Apparatus, method, and non-transitory computer-readable medium
US9299177B2 (en) 2012-07-09 2016-03-29 Canon Kabushiki Kaisha Apparatus, method and non-transitory computer-readable medium using layout similarity
US9436706B2 (en) 2013-09-05 2016-09-06 Canon Kabushiki Kaisha Image processing apparatus, image processing method, and storage medium for laying out images
US9501688B2 (en) 2012-07-09 2016-11-22 Canon Kabushiki Kaisha Apparatus, processing method and storage medium storing program
US9509870B2 (en) 2013-09-05 2016-11-29 Canon Kabushiki Kaisha Image processing apparatus, image processing method, and storage medium enabling layout varations
US9519842B2 (en) 2012-07-09 2016-12-13 Canon Kabushiki Kaisha Apparatus and method for managing an object extracted from image data
US9542594B2 (en) 2013-06-28 2017-01-10 Canon Kabushiki Kaisha Information processing apparatus, method for processing information, and program
US9558212B2 (en) 2012-07-09 2017-01-31 Canon Kabushiki Kaisha Apparatus, image processing method and computer-readable storage medium for object identification based on dictionary information
US9563823B2 (en) 2012-07-09 2017-02-07 Canon Kabushiki Kaisha Apparatus and method for managing an object extracted from image data
US9846681B2 (en) 2012-07-09 2017-12-19 Canon Kabushiki Kaisha Apparatus and method for outputting layout image
US9904879B2 (en) 2013-09-05 2018-02-27 Canon Kabushiki Kaisha Image processing apparatus, image processing method, and storage medium
US10013395B2 (en) 2012-07-09 2018-07-03 Canon Kabushiki Kaisha Apparatus, control method thereof, and storage medium that determine a layout image from a generated plurality of layout images by evaluating selected target images

Cited By (36)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1328698C (en) * 2003-02-13 2007-07-25 富士胶片株式会社 Method ,device and program for correcting facial image
US8699844B2 (en) 2005-12-22 2014-04-15 Fuji Xerox Co., Ltd. Content distribution apparatus
US8391595B2 (en) 2006-05-26 2013-03-05 Canon Kabushiki Kaisha Image processing method and image processing apparatus
US8532431B2 (en) 2007-05-08 2013-09-10 Canon Kabushiki Kaisha Image search apparatus, image search method, and storage medium for matching images with search conditions using image feature amounts
US8374439B2 (en) 2008-06-25 2013-02-12 Canon Kabushiki Kaisha Image processing apparatus, image processing method, program, and computer-readable print medium
US8630503B2 (en) 2008-06-25 2014-01-14 Canon Kabushiki Kaisha Image processing apparatus, image processing method, and computer program
US8639030B2 (en) 2010-05-24 2014-01-28 Canon Kabushiki Kaisha Image processing using an adaptation rate
EP2391115A2 (en) 2010-05-24 2011-11-30 Canon Kabushiki Kaisha Image processing apparatus, control method, and program
US9398282B2 (en) 2010-05-24 2016-07-19 Canon Kabushiki Kaisha Image processing apparatus, control method, and computer-readable medium
US8442315B2 (en) 2010-07-16 2013-05-14 Canon Kabushiki Kaisha Image processing apparatus, image processing method, and computer-readable medium
US8457397B2 (en) 2010-07-16 2013-06-04 Canon Kabushiki Kaisha Image processing apparatus, image processing method, and computer-readable medium
US8842914B2 (en) 2010-07-16 2014-09-23 Canon Kabushiki Kaisha Image processing apparatus, image processing method, and computer-readable medium
US8934712B2 (en) 2010-07-16 2015-01-13 Canon Kabushiki Kaisha Image processing apparatus, image processing method, and computer-readable medium
US9002107B2 (en) 2010-07-16 2015-04-07 Canon Kabushiki Kaisha Color balance correction based on skin color and highlight color
US9406003B2 (en) 2010-07-16 2016-08-02 Canon Kabushiki Kaisha Image processing with color balance correction
US9280720B2 (en) 2012-07-09 2016-03-08 Canon Kabushiki Kaisha Apparatus, method, and computer-readable storage medium
US9519842B2 (en) 2012-07-09 2016-12-13 Canon Kabushiki Kaisha Apparatus and method for managing an object extracted from image data
US9275270B2 (en) 2012-07-09 2016-03-01 Canon Kabushiki Kaisha Information processing apparatus and control method thereof
US9208595B2 (en) 2012-07-09 2015-12-08 Canon Kabushiki Kaisha Apparatus, image processing method and storage medium storing program
US9292760B2 (en) 2012-07-09 2016-03-22 Canon Kabushiki Kaisha Apparatus, method, and non-transitory computer-readable medium
US9299177B2 (en) 2012-07-09 2016-03-29 Canon Kabushiki Kaisha Apparatus, method and non-transitory computer-readable medium using layout similarity
US9189681B2 (en) 2012-07-09 2015-11-17 Canon Kabushiki Kaisha Image processing apparatus, method thereof, and computer-readable storage medium
US9014487B2 (en) 2012-07-09 2015-04-21 Canon Kabushiki Kaisha Image processing apparatus and image processing method
US10127436B2 (en) 2012-07-09 2018-11-13 Canon Kabushiki Kaisha Apparatus, image processing method and storage medium storing program
US9501688B2 (en) 2012-07-09 2016-11-22 Canon Kabushiki Kaisha Apparatus, processing method and storage medium storing program
US10055640B2 (en) 2012-07-09 2018-08-21 Canon Kabushiki Kaisha Classification of feature information into groups based upon similarity, and apparatus, image processing method, and computer-readable storage medium thereof
US9214027B2 (en) 2012-07-09 2015-12-15 Canon Kabushiki Kaisha Apparatus, method, and non-transitory computer-readable medium
US10013395B2 (en) 2012-07-09 2018-07-03 Canon Kabushiki Kaisha Apparatus, control method thereof, and storage medium that determine a layout image from a generated plurality of layout images by evaluating selected target images
US9558212B2 (en) 2012-07-09 2017-01-31 Canon Kabushiki Kaisha Apparatus, image processing method and computer-readable storage medium for object identification based on dictionary information
US9563823B2 (en) 2012-07-09 2017-02-07 Canon Kabushiki Kaisha Apparatus and method for managing an object extracted from image data
US9846681B2 (en) 2012-07-09 2017-12-19 Canon Kabushiki Kaisha Apparatus and method for outputting layout image
US9852325B2 (en) 2012-07-09 2017-12-26 Canon Kabushiki Kaisha Apparatus, image processing method and storage medium storing program
US9542594B2 (en) 2013-06-28 2017-01-10 Canon Kabushiki Kaisha Information processing apparatus, method for processing information, and program
US9904879B2 (en) 2013-09-05 2018-02-27 Canon Kabushiki Kaisha Image processing apparatus, image processing method, and storage medium
US9509870B2 (en) 2013-09-05 2016-11-29 Canon Kabushiki Kaisha Image processing apparatus, image processing method, and storage medium enabling layout variations
US9436706B2 (en) 2013-09-05 2016-09-06 Canon Kabushiki Kaisha Image processing apparatus, image processing method, and storage medium for laying out images

Similar Documents

Publication Publication Date Title
JP2000235648A (en) Extracting device for eye and blink detecting device
Lee et al. Blink detection robust to various facial poses
US7742621B2 (en) Dynamic eye tracking system
CN106846734B (en) A kind of fatigue driving detection device and method
JP4811259B2 (en) Gaze direction estimation apparatus and gaze direction estimation method
US6611613B1 (en) Apparatus and method for detecting speaking person's eyes and face
CN109840565A (en) A kind of blink detection method based on eye contour feature point aspect ratio
WO2017036160A1 (en) Glasses removal method for facial recognition
US11449590B2 (en) Device and method for user authentication on basis of iris recognition
Boehnen et al. A fast multi-modal approach to facial feature detection
JP2000137792A (en) Eye part detecting device
Darshana et al. Efficient PERCLOS and gaze measurement methodologies to estimate driver attention in real time
Liu et al. A practical driver fatigue detection algorithm based on eye state
CN111616718B (en) Method and system for detecting fatigue state of driver based on attitude characteristics
Alioua et al. Driver’s fatigue and drowsiness detection to reduce traffic accidents on road
JP2001043382A (en) Eye tracking device
Nawaldgi Review of automated glaucoma detection techniques
Mohsin et al. Pupil detection algorithm based on feature extraction for eye gaze
JP2000123188A (en) Eye open/close discriminating device
WO2022110917A1 (en) Method for determining driving state of driver, computer storage medium, and electronic device
JP2004192552A (en) Eye opening/closing determining apparatus
Guo et al. Iris extraction based on intensity gradient and texture difference
He et al. A novel iris segmentation method for hand-held capture device
Kumar Morphology based facial feature extraction and facial expression recognition for driver vigilance
EP3244346A1 (en) Determining device and determination method

Legal Events

Date Code Title Description
A300 Withdrawal of application because of no request for examination
Free format text: JAPANESE INTERMEDIATE CODE: A300
Effective date: 20060509