JP2005081101A - System and methodology for detecting visual axis direction - Google Patents


Info

Publication number
JP2005081101A
JP2005081101A (application JP2003320290A)
Authority
JP
Japan
Prior art keywords
face
image
imaging
nose
line
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
JP2003320290A
Other languages
Japanese (ja)
Other versions
JP4118773B2 (en)
Inventor
Tatsuji Miyata
達司 宮田
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
UD Trucks Corp
Original Assignee
UD Trucks Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by UD Trucks Corp filed Critical UD Trucks Corp
Priority to JP2003320290A priority Critical patent/JP4118773B2/en
Publication of JP2005081101A publication Critical patent/JP2005081101A/en
Application granted granted Critical
Publication of JP4118773B2 publication Critical patent/JP4118773B2/en
Anticipated expiration legal-status Critical
Expired - Fee Related legal-status Critical Current

Landscapes

  • Eye Examination Apparatus (AREA)
  • Image Processing (AREA)
  • Image Analysis (AREA)
  • Fittings On The Vehicle Exterior For Carrying Loads, And Devices For Holding Or Mounting Articles (AREA)

Abstract

PROBLEM TO BE SOLVED: To detect the gaze direction indirectly and with high accuracy from a captured image of the face.

SOLUTION: A gaze direction detection system comprises: an imaging device 20 that images the face; facial contour extraction means 30A that extracts the facial contour from the image; facial surface information calculation means 30B that calculates the area ratios of the forehead and both cheeks to the whole face based on the image and the contour; face turning direction estimation means 30C that estimates the face turning direction relative to the imaging direction based on those area ratios; pupil coordinate detection means 30H and nose coordinate detection means 30I that detect the coordinates at which the pupils and the nose are located in the image; centroid calculation means 30J that calculates the centroid of the inverted triangle formed by joining the pupil and nose coordinates with line segments; gaze direction estimation means 30K that estimates the in-face gaze direction based on the inverted triangle and its centroid; and gaze direction correction means 30L that corrects the gaze direction, based on the face turning direction, into the gaze direction relative to the imaging direction.

COPYRIGHT: (C)2005,JPO&NCIPI

Description

The present invention relates to a technique for detecting the gaze direction indirectly and with high accuracy from a captured image of the face.

Various technologies have conventionally been developed that help prevent drowsy driving and improve safety during vehicle operation by alerting the vehicle driver when he or she enters a low-arousal state. As described in Patent Document 1 and Patent Document 2, techniques have been proposed that extract the eyes from a captured image of the vehicle driver's face and determine the arousal state by observing blinking, i.e., the opening and closing of the eyes.
Japanese Unexamined Patent Publication No. 2000-137792; Japanese Unexamined Patent Publication No. 11-147428

However, even if the vehicle driver does not actually fall asleep, when he or she becomes distracted by fatigue, for example, visual monitoring of the vehicle's surroundings is neglected and the driver may unconsciously stare at a single point (a fixation state). Such a fixation state cannot be detected from eye blinking, so to further improve safety during vehicle operation the driver's gaze direction must also be used. An "eye camera" is a known device for detecting the gaze direction, but mounting one on the driver's head is not only cumbersome but may also interfere with driving.

In view of these conventional problems, an object of the present invention is to propose a technique that detects the gaze direction indirectly and with high accuracy by estimating both the face turning direction and the in-face gaze direction from a captured image of the face, and correcting the in-face gaze direction based on the face turning direction.

To this end, the gaze direction detection device according to claim 1 comprises: imaging means for capturing an image of the face; facial contour extraction means for extracting the facial contour from the image captured by the imaging means; facial surface information calculation means for calculating the area ratios of the forehead and both cheeks to the whole face based on the captured image and the extracted facial contour; face turning direction estimation means for estimating the face turning direction relative to the imaging direction based on those area ratios; coordinate detection means for detecting, from the captured image, the coordinates at which both pupils and the nose are located; centroid calculation means for calculating the centroid of the inverted triangle formed by joining the pupil and nose coordinates with line segments; gaze direction estimation means for estimating the in-face gaze direction based on the inverted triangle and its centroid; and gaze direction correction means for correcting the estimated gaze direction, based on the estimated face turning direction, into the gaze direction relative to the imaging direction.

The invention according to claim 2 further comprises face turning direction correction means for correcting, based on the offset angle of the imaging direction relative to a reference direction, the face turning direction estimated by the face turning direction estimation means into the face turning direction relative to the reference direction; the gaze direction correction means then corrects the gaze direction estimated by the gaze direction estimation means into the gaze direction relative to the reference direction, based on the corrected face turning direction.

The invention according to claim 3 further comprises facial feature setting means for setting the range of the face occupied by both eyes and the nose; the coordinate detection means detects the coordinates at which both pupils and the nose are located from the image within the range set by the facial feature setting means.
The invention according to claim 4 further comprises facial feature learning means for updating, by learning from the latest image, the range set by the facial feature setting means.

In the invention according to claim 5, the coordinate detection means uses the midpoint of the line segment joining the centers of both nostrils as the nose coordinate.
The gaze direction detection method according to claim 6 causes a computer to execute the steps of: extracting the facial contour from an image of the face captured by an imaging device; calculating the area ratios of the forehead and both cheeks to the whole face based on the image and the facial contour; estimating the face turning direction relative to the imaging direction based on those area ratios; detecting, from the image, the coordinates at which both pupils and the nose are located; calculating the centroid of the inverted triangle formed by joining the pupil and nose coordinates with line segments; estimating the in-face gaze direction based on the inverted triangle and its centroid; and correcting the gaze direction, based on the face turning direction relative to the imaging direction, into the gaze direction relative to the imaging direction.

According to the invention of claim 1 or claim 6, the area ratios of the forehead and both cheeks to the whole face are calculated from the captured facial image and the facial contour extracted from it, and the face turning direction relative to the imaging direction is estimated from those ratios. The coordinates of both pupils and the nose are also detected from the image, and the centroid of the inverted triangle joining them with line segments is calculated. The in-face gaze direction is then estimated from the inverted triangle and its centroid, and corrected, based on the face turning direction, into the gaze direction relative to the imaging direction. The gaze direction can therefore be detected indirectly and with high accuracy from a captured image of the face, via the facial components, without using an eye camera or similar device attached directly to the subject.

According to the invention of claim 2, even if the imaging device is not installed at the ideal position directly facing the subject, the face turning direction is corrected into the face turning direction relative to the reference direction based on the offset angle of the imaging direction, so a loss of gaze detection accuracy can be prevented.
According to the invention of claim 3, the coordinates at which both pupils and the nose are located are detected only from the image within the range of the face occupied by the eyes and nose, which reduces the image processing load and improves throughput.

According to the invention of claim 4, the range of the face occupied by the eyes and nose is updated by learning from the latest image, so even if the subject's appearance changes, for example by growing a beard, a drop in throughput can be suppressed.
According to the invention of claim 5, the midpoint of the line segment joining the centers of both nostrils is used as the nose coordinate, so the position of the nose can be identified with high accuracy.

The present invention is described in detail below with reference to the attached drawings.
FIG. 1 shows the configuration of a gaze direction detection device constructed by applying the present invention to a cab-over type truck.
The gaze direction detection device comprises an imaging device 20 (imaging means), such as a CCD (Charge Coupled Device) camera, that images the face 10A of the vehicle driver 10, and a control device 30 with a built-in computer. The control device 30 detects the gaze direction of the vehicle driver 10 indirectly and with high accuracy by analyzing the image captured by the imaging device 20. In the control device 30, a program stored in memory such as a ROM (Read Only Memory) implements, as shown in FIG. 2, facial contour extraction means 30A, facial surface information calculation means 30B, face turning direction estimation means 30C, imaging direction setting means 30D, face turning direction correction means 30E, facial feature setting means 30F, facial feature learning means 30G, pupil coordinate detection means 30H, nose coordinate detection means 30I, centroid calculation means 30J, gaze direction estimation means 30K, and gaze direction correction means 30L. The pupil coordinate detection means 30H and the nose coordinate detection means 30I together constitute the coordinate detection means.

The facial contour extraction means 30A extracts the facial contour of the vehicle driver 10 from the image captured by the imaging device 20, using known image processing techniques.
The facial surface information calculation means 30B calculates, from the image and the facial contour, the area ratios of the forehead and both cheeks to the whole face (hereinafter "facial surface information"). As shown in FIG. 3, the image is divided into a mesh of predetermined spacing, and the rectangular elements falling within the ranges occupied by the forehead and cheeks are counted, yielding the proportion of the whole face, as delimited by the facial contour, occupied by the forehead and by each cheek.
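The mesh-counting idea behind 30B can be sketched as follows. This is a minimal illustration, not the patent's implementation: it assumes the forehead, cheek, and whole-face regions are already available as binary masks, and counts a mesh square as belonging to a region when the region covers most of it.

```python
import numpy as np

def facial_area_ratios(forehead_mask, left_cheek_mask, right_cheek_mask,
                       face_mask, cell=8):
    """Approximate the area ratios of forehead and cheeks to the whole
    face by dividing the image into `cell` x `cell` mesh squares and
    counting the squares inside each region (cf. FIG. 3)."""
    def count_cells(mask):
        h, w = mask.shape
        n = 0
        for y in range(0, h, cell):
            for x in range(0, w, cell):
                # count a mesh square when the region covers most of it
                if mask[y:y + cell, x:x + cell].mean() > 0.5:
                    n += 1
        return n

    face = count_cells(face_mask)
    return (count_cells(forehead_mask) / face,
            count_cells(left_cheek_mask) / face,
            count_cells(right_cheek_mask) / face)
```

Counting mesh squares instead of individual pixels trades a little accuracy for a much smaller workload, which matches the patent's emphasis on limiting processing load.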

The face turning direction estimation means 30C estimates, from the facial surface information, the face turning direction (the direction the face is pointing) relative to the imaging direction. Taking FIG. 4(A), which shows an image with the face squarely facing the imaging direction, as the reference, the area ratios of the forehead and cheeks to the whole face change characteristically as the face turns upward, downward, upper-right, rightward, or lower-right, as shown in FIGS. 4(B) to 4(F) respectively. For example, when the face turns upward as in FIG. 4(B), the area ratios of the left and right cheeks increase at a roughly constant rate while the area ratio of the forehead decreases. When the face turns to the upper right as in FIG. 4(D), the area ratios of the right cheek and the forehead decrease while that of the left cheek increases. The direction the face is pointing relative to the imaging direction can therefore be estimated from the changes in these area ratios.
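A coarse rule-based version of this estimation might look like the following sketch. The tolerance value and the rule table are illustrative assumptions built from the two examples given above (upward and upper-right); the patent does not specify concrete thresholds.

```python
def estimate_turn(forehead, left_cheek, right_cheek, baseline, tol=0.02):
    """Compare current area ratios with those of a frontal reference image
    and map the pattern of increases/decreases to a coarse turn direction.
    Thresholds and rule table are illustrative, not from the patent."""
    df = forehead - baseline["forehead"]
    dl = left_cheek - baseline["left_cheek"]
    dr = right_cheek - baseline["right_cheek"]

    def sign(d):
        return 0 if abs(d) < tol else (1 if d > 0 else -1)

    pattern = (sign(df), sign(dl), sign(dr))
    rules = {
        (0, 0, 0): "front",
        (-1, 1, 1): "up",            # forehead shrinks, both cheeks grow
        (1, -1, -1): "down",
        (-1, 1, -1): "upper-right",  # forehead and right cheek shrink
        (0, 1, -1): "right",
    }
    return rules.get(pattern, "unknown")
```

A production system would presumably interpolate continuous angles from the ratio magnitudes rather than classify into a handful of directions.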

The imaging direction setting means 30D sets the imaging direction relative to the reference direction, i.e., straight ahead of the vehicle; in other words, the offset angle of the imaging device 20 relative to vehicle-forward. Ideally, the face 10A would be imaged from the position directly facing the vehicle driver 10 when he or she is looking forward. In practice, however, an imaging device placed directly in front of the driver could obstruct the forward view, so installing the imaging device 20 at this ideal position is not realistic. The imaging device 20 must instead be installed at a position facing the driver from which the face 10A can be imaged, such as on the dashboard or the ceiling. With the imaging device 20 in such a position, the imaging direction deviates from the ideal directly-facing direction in both the vertical and horizontal planes, as shown in FIGS. 5 and 6. It is therefore desirable to provide a function that presets the offset angle of the imaging device 20 relative to vehicle-forward, so that the face turning direction relative to the imaging direction can be corrected into the face turning direction relative to vehicle-forward.

The face turning direction correction means 30E corrects the face turning direction relative to the imaging direction, based on the offset angle of the imaging device 20, to obtain the face turning direction relative to vehicle-forward. That is, subtracting the offset angle of the imaging device 20 from the face turning direction relative to the imaging direction yields the face turning direction with vehicle-forward as the reference direction.
The facial feature setting means 30F sets the range of the face 10A of the vehicle driver 10 expected to be occupied by the eyes and nose (hereinafter the "facial features"), in order to limit the image processing range and improve throughput. The driver's eyes and nose sit at heights that vary with physical characteristics such as height and sitting height. Extracting the eyes and nose from the entire image is possible, but it would demand high processing power from the control device 30 and is impractical on cost grounds. Limiting the image processing range based on the facial features therefore improves throughput by reducing the processing load.
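The subtraction performed by the face turning direction correction means 30E reduces to simple angle arithmetic. The sketch below assumes the turning direction is decomposed into a horizontal (yaw) and vertical (pitch) angle in degrees, which the patent implies through FIGS. 5 and 6 but does not state explicitly.

```python
def correct_turn_for_camera_offset(turn_yaw, turn_pitch,
                                   offset_yaw, offset_pitch):
    """Re-express the face turning direction, estimated relative to the
    camera axis, relative to vehicle-forward by subtracting the camera's
    preset offset angles (30E).  Yaw/pitch decomposition and sign
    conventions are assumptions for illustration."""
    return turn_yaw - offset_yaw, turn_pitch - offset_pitch
```

For example, a face turned 5 degrees right of the camera axis, seen by a camera mounted 20 degrees right of vehicle-forward, is actually turned 15 degrees left of vehicle-forward.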

The facial feature learning means 30G updates the facial features of the vehicle driver 10 by learning, as appropriate, from the difference with the latest image, since the features can change when, for example, the driver grows a beard.
The pupil coordinate detection means 30H extracts the eye regions from the image within the processing range limited by the latest facial features. Both pupils are then extracted from the eye regions, and their center coordinates are detected as the pupil coordinates. When the vehicle driver 10 is wearing sunglasses, the pupil positions cannot be identified from the image, so the lens centers of the sunglasses may be used as the pupil coordinates instead.
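One simple way to realize the pupil extraction in 30H is to take the centroid of the dark pixels inside the eye search range. The patent does not specify the extraction method, so the thresholding below is an illustrative stand-in.

```python
import numpy as np

def pupil_center(eye_roi, threshold=60):
    """Treat grayscale pixels below `threshold` inside the eye region of
    interest as the pupil, and return the centroid of that region as the
    pupil coordinate (x, y).  A hypothetical sketch of 30H."""
    ys, xs = np.nonzero(eye_roi < threshold)
    if len(xs) == 0:
        return None              # no dark blob found (e.g. eye closed)
    return float(xs.mean()), float(ys.mean())
```

Restricting `eye_roi` to the learned facial-feature range is what keeps this cheap: the threshold scan runs over a small window rather than the whole frame.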

The nose coordinate detection means 30I extracts the nostril region from the image within the processing range limited by the latest facial features. The center coordinate of the extracted nostrils, i.e., the midpoint of the line segment joining the centers of the left and right nostrils, is detected as the nose coordinate.
The centroid calculation means 30J calculates, as shown in FIG. 7, the centroid of the inverted triangle formed by joining the pupil and nose coordinates with line segments.
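Both of these computations are elementary coordinate geometry; a direct sketch of 30I's midpoint and 30J's centroid:

```python
def nose_coordinate(left_nostril, right_nostril):
    """Midpoint of the segment joining the two nostril centers (30I)."""
    (x1, y1), (x2, y2) = left_nostril, right_nostril
    return ((x1 + x2) / 2.0, (y1 + y2) / 2.0)

def triangle_centroid(p1, p2, p3):
    """Centroid of the inverted triangle joining both pupil coordinates
    and the nose coordinate (30J): the mean of the three vertices."""
    return ((p1[0] + p2[0] + p3[0]) / 3.0,
            (p1[1] + p2[1] + p3[1]) / 3.0)
```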

The gaze direction estimation means 30K estimates the in-face gaze direction of the face 10A from the inverted triangle and the movement of its centroid. For example, as shown in FIG. 8(A), when the gaze is directed to the upper right, the segment joining the pupils tilts downward toward the left and the centroid moves from the center of the inverted triangle toward the upper right. As shown in FIG. 8(B), when the gaze is directed to the lower right, the segment joining the pupils tilts downward toward the right and the centroid moves toward the lower right. Using this regularity between the triangle geometry and the gaze direction, the in-face gaze direction can be estimated indirectly and with high accuracy.
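The centroid-displacement part of this regularity can be sketched as below. This simplification classifies only the direction of centroid motion relative to a frontal reference centroid; the tolerance and the coarse left/right, up/down labels are illustrative assumptions, and a fuller version would also use the tilt of the pupil-to-pupil segment as FIG. 8 describes.

```python
def gaze_in_face(left_pupil, right_pupil, nose, ref_centroid, tol=1.0):
    """Classify the in-face gaze from how the inverted triangle's
    centroid has moved relative to its frontal reference position (30K).
    Image coordinates: y grows downward.  A hypothetical sketch."""
    cx = (left_pupil[0] + right_pupil[0] + nose[0]) / 3.0
    cy = (left_pupil[1] + right_pupil[1] + nose[1]) / 3.0
    dx, dy = cx - ref_centroid[0], cy - ref_centroid[1]
    horiz = "right" if dx > tol else "left" if dx < -tol else "center"
    vert = "up" if dy < -tol else "down" if dy > tol else "center"
    return horiz, vert
```

With a rightward-and-upward centroid shift this returns ("right", "up"), matching the FIG. 8(A) example of a gaze directed to the upper right.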

The gaze direction correction means 30L corrects the in-face gaze direction, based on the face turning direction relative to vehicle-forward, to obtain the gaze direction relative to vehicle-forward. That is, adding the in-face gaze direction to the face turning direction relative to vehicle-forward yields the gaze direction with vehicle-forward as the reference direction.
Next, the processing procedure executed by the gaze direction detection device is described with reference to the flowchart shown in FIG. 9.

In step 1 (abbreviated "S1" in the figure; likewise below), the imaging device 20 captures the face 10A of the vehicle driver 10.
In step 2, the facial contour is extracted from the image.
In step 3, the facial surface information is calculated from the image and the facial contour.
In step 4, the face turning direction relative to the imaging direction is estimated from the facial surface information.

In step 5, the face turning direction is corrected based on the offset angle of the imaging device 20 to obtain the face turning direction relative to vehicle-forward.
In step 6, the image processing ranges (extraction ranges) of the face 10A expected to be occupied by the eyes and nose are limited based on the latest facial features.
In step 7, the right eye region is extracted from the image within the limited right eye extraction range.

In step 8, the right pupil is extracted from the right eye region, and its center coordinate is calculated as the right pupil coordinate.
In step 9, the left eye region is extracted from the image within the limited left eye extraction range.
In step 10, the left pupil is extracted from the left eye region, and its center coordinate is calculated as the left pupil coordinate.

In step 11, the nostril region is extracted from the image within the limited nose extraction range.
In step 12, the midpoint of the line segment joining the centers of the left and right nostrils is calculated from the nostril region as the nose coordinate.
In step 13, the inverted triangle with the two pupil coordinates and the nose coordinate as its vertices is constructed.
In step 14, the centroid coordinate is calculated from the vertex coordinates of the inverted triangle.

In step 15, the in-face gaze direction is estimated from the inverted triangle and the movement of its centroid.
In step 16, the in-face gaze direction is corrected based on the face turning direction to obtain the gaze direction relative to vehicle-forward.
With this gaze direction detection device, the facial contour is extracted from the captured facial image, and the area ratios of the forehead and both cheeks to the whole face are calculated from the image and the contour. As shown in FIG. 4, these area ratios change with regularity according to the face turning direction relative to the imaging direction; using this regularity, the face turning direction relative to the imaging direction can be estimated.
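The correction chain running through steps 5 and 16 can be summarized in one line of arithmetic. The sketch below uses a single horizontal angle in degrees for simplicity; the one-dimensional treatment and the sign conventions are assumptions for illustration.

```python
def reference_gaze(turn_rel_camera, camera_offset, gaze_in_face):
    """Gaze direction relative to vehicle-forward (steps 5 and 16):
    subtract the camera's offset angle from the face turn estimated
    relative to the camera axis, then add the in-face gaze direction."""
    turn_rel_vehicle = turn_rel_camera - camera_offset
    return turn_rel_vehicle + gaze_in_face
```

For instance, with the camera 15 degrees right of vehicle-forward, the face turned 20 degrees right of the camera axis, and the eyes a further 5 degrees right within the face, the gaze is 10 degrees right of vehicle-forward.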

Ideally, the face would be imaged from the position directly facing the subject, but when the present invention is applied to a vehicle, the driver's forward view must be kept clear and the imaging device 20 cannot be installed at this ideal position. By correcting the face turning direction relative to the imaging direction into the face turning direction relative to the reference direction, based on the offset angle of the imaging device 20 relative to vehicle-forward as the reference direction, the resulting loss of gaze detection accuracy can be prevented even though the imaging device 20 is not ideally placed.

Meanwhile, the coordinates at which both pupils and the nose are located are detected from the captured facial image, and the centroid of the inverted triangle joining them with line segments is calculated. To limit the image processing range and improve throughput, these coordinates are detected only from the image within the range of the face expected to be occupied by the eyes and nose. As shown in FIG. 8, the inverted triangle and its centroid change with regularity according to the in-face gaze direction; using this regularity, the in-face gaze direction can be estimated.

The in-face gaze direction is then corrected based on the face turning direction, i.e., the face turning direction and the in-face gaze direction are added, so that the gaze direction relative to the reference direction is detected indirectly and with high accuracy.
Needless to say, the gaze direction detection device according to the present invention is not limited to detecting the gaze direction of a vehicle driver; it is applicable to the various uses in which the gaze direction has conventionally been detected with an "eye camera".

FIG. 1 Configuration diagram of a gaze direction detecting device constructed by applying the present invention to a truck
FIG. 2 Block diagram of the various functions in the control device
FIG. 3 Explanatory diagram of the method for calculating facial surface information
FIG. 4 Method for estimating the face turning direction with respect to the imaging direction, where (A) to (F) are explanatory diagrams for gaze directions front, upward, downward, upper right, right, and lower right, respectively
FIG. 5 Explanatory diagram of the offset state of the imaging device in the vertical plane
FIG. 6 Offset state of the imaging device in the horizontal plane, where (A) and (B) are explanatory diagrams for the imaging device offset to the right and to the left, respectively
FIG. 7 Explanatory diagram of the inverted triangle defined by the pupil and nose coordinates, and of its centroid
FIG. 8 Method for estimating the gaze direction within the face, where (A) and (B) are explanatory diagrams for the gaze directed to the upper right and to the lower right, respectively
FIG. 9 Flowchart showing the processing procedure executed by the gaze direction detecting device

Explanation of symbols

10 Vehicle driver
10A Face
20 Imaging device
30 Control device
30A Facial contour extraction means
30B Facial surface information calculation means
30C Face turning direction estimation means
30D Imaging direction setting means
30E Face turning direction correction means
30F Facial feature setting means
30G Facial feature learning means
30H Pupil coordinate detection means
30I Nose coordinate detection means
30J Centroid calculation means
30K Gaze direction estimation means
30L Gaze direction correction means

Claims (6)

1. A gaze direction detecting device comprising:
imaging means for capturing an image of a face;
facial contour extraction means for extracting a facial contour from the image captured by the imaging means;
facial surface information calculation means for calculating the area ratios of the forehead and of both cheeks to the whole face, based on the image captured by the imaging means and the facial contour extracted by the facial contour extraction means;
face turning direction estimation means for estimating the face turning direction with respect to the imaging direction, based on the area ratios of the forehead and both cheeks calculated by the facial surface information calculation means;
coordinate detection means for detecting, from the image captured by the imaging means, the coordinates at which both pupils and the nose are located;
centroid calculation means for calculating the centroid of the inverted triangle formed by connecting the pupil and nose coordinates detected by the coordinate detection means with line segments;
gaze direction estimation means for estimating the gaze direction within the face, based on the inverted triangle and on the centroid calculated by the centroid calculation means; and
gaze direction correction means for correcting the gaze direction estimated by the gaze direction estimation means, based on the face turning direction estimated by the face turning direction estimation means, so that it becomes the gaze direction with respect to the imaging direction.

2. The gaze direction detecting device according to claim 1, further comprising face turning direction correction means for correcting the face turning direction estimated by the face turning direction estimation means, based on the offset angle of the imaging direction from a reference direction, so that it becomes the face turning direction with respect to the reference direction,
wherein the gaze direction correction means corrects the gaze direction estimated by the gaze direction estimation means, based on the face turning direction corrected by the face turning direction correction means, so that it becomes the gaze direction with respect to the reference direction.

3. The gaze direction detecting device according to claim 1 or 2, further comprising facial feature setting means for setting the region of the face occupied by both eyes and the nose,
wherein the coordinate detection means detects the coordinates at which both pupils and the nose are located from the image within the region set by the facial feature setting means.

4. The gaze direction detecting device according to claim 3, further comprising facial feature learning means for updating, by learning from the latest image, the region set by the facial feature setting means.

5. The gaze direction detecting device according to any one of claims 1 to 4, wherein the coordinate detection means takes as the nose coordinate the midpoint of the line segment connecting the centers of both nostrils.

6. A gaze direction detecting method characterized by causing a computer to execute the steps of:
extracting a facial contour from an image of a face captured by an imaging device;
calculating the area ratios of the forehead and of both cheeks to the whole face, based on the image and the facial contour;
estimating the face turning direction with respect to the imaging direction, based on the area ratios of the forehead and both cheeks;
detecting, from the image, the coordinates at which both pupils and the nose are located;
calculating the centroid of the inverted triangle formed by connecting the pupil and nose coordinates with line segments;
estimating the gaze direction within the face, based on the inverted triangle and its centroid; and
correcting the gaze direction, based on the face turning direction with respect to the imaging direction, so that it becomes the gaze direction with respect to the imaging direction.
JP2003320290A 2003-09-11 2003-09-11 Gaze direction detection device and gaze direction detection method Expired - Fee Related JP4118773B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2003320290A JP4118773B2 (en) 2003-09-11 2003-09-11 Gaze direction detection device and gaze direction detection method


Publications (2)

Publication Number Publication Date
JP2005081101A true JP2005081101A (en) 2005-03-31
JP4118773B2 JP4118773B2 (en) 2008-07-16

Family

ID=34418971

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2003320290A Expired - Fee Related JP4118773B2 (en) 2003-09-11 2003-09-11 Gaze direction detection device and gaze direction detection method

Country Status (1)

Country Link
JP (1) JP4118773B2 (en)


Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0757172A (en) * 1993-08-16 1995-03-03 Mitsubishi Electric Corp Alarm device for driver abnormal state occurrence
JPH0765294A (en) * 1993-08-23 1995-03-10 Mitsubishi Electric Corp Preventive safety device for vehicle
JPH07249197A (en) * 1994-03-10 1995-09-26 Mitsubishi Electric Corp Detecting device for state of person
JP2000137788A (en) * 1998-10-29 2000-05-16 Fuji Photo Film Co Ltd Image processing method, image processor, and record medium


Cited By (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8115811B2 (en) 2005-11-17 2012-02-14 Aisin Seiki Kabushiki Kaisha Vehicle surrounding area display device
EP1962509A1 (en) * 2005-11-17 2008-08-27 Aisin Seiki Kabushiki Kaisha Vehicle surrounding display device
EP1962509A4 (en) * 2005-11-17 2010-08-18 Aisin Seiki Vehicle surrounding display device
JP2007272435A (en) * 2006-03-30 2007-10-18 Univ Of Electro-Communications Face feature extraction device and face feature extraction method
US8144992B2 (en) 2006-10-11 2012-03-27 Denso Corporation Eye condition detection apparatus and method for installing same
JP2008094221A (en) * 2006-10-11 2008-04-24 Denso Corp Eye state detector, and eye state detector mounting method
JP2011204182A (en) * 2010-03-26 2011-10-13 Konami Digital Entertainment Co Ltd Image generating device, image processing method, and program
US8862380B2 (en) 2010-10-11 2014-10-14 Hyundai Motor Company System and method for alarming front impact danger coupled with driver viewing direction and vehicle using the same
KR20150029317A (en) * 2013-09-10 2015-03-18 에스케이플래닛 주식회사 Apparatus, method and computer readable medium having computer program recorded for facial image correction
KR102171332B1 (en) * 2013-09-10 2020-10-28 에스케이플래닛 주식회사 Apparatus, method and computer readable medium having computer program recorded for facial image correction
CN111417335A (en) * 2017-10-06 2020-07-14 爱尔康公司 Tracking eye movement within a tracking range
CN111417335B (en) * 2017-10-06 2023-05-05 爱尔康公司 Tracking eye movement within a tracking range
CN117146828A (en) * 2023-10-30 2023-12-01 网思科技股份有限公司 Method and device for guiding picking path, storage medium and computer equipment
CN117146828B (en) * 2023-10-30 2024-03-19 网思科技股份有限公司 Method and device for guiding picking path, storage medium and computer equipment

Also Published As

Publication number Publication date
JP4118773B2 (en) 2008-07-16

Similar Documents

Publication Publication Date Title
JP5230748B2 (en) Gaze direction determination device and gaze direction determination method
JP6695503B2 (en) Method and system for monitoring the condition of a vehicle driver
KR101868597B1 (en) Apparatus and method for assisting in positioning user`s posture
CN104573623B (en) Face detection device and method
US7853051B2 (en) Recognizing apparatus and method, recording media, and program
JP4811259B2 (en) Gaze direction estimation apparatus and gaze direction estimation method
US11134238B2 (en) Goggle type display device, eye gaze detection method, and eye gaze detection system
JP5790762B2 (en) 瞼 Detection device
JP6840697B2 (en) Line-of-sight direction estimation device, line-of-sight direction estimation method, and line-of-sight direction estimation program
US20150173665A1 (en) State estimation device and state estimation program
EP3575926B1 (en) Method and eye tracking system for providing an approximate gaze convergence distance of a user
JP2008136789A (en) Eyeball parameter estimating instrument and method
JP2009116742A (en) Onboard image processor, image processing method, and program
JP7369184B2 (en) Driver attention state estimation
CN109725714B (en) Sight line determining method, device and system and head-mounted eye movement equipment
JP4118773B2 (en) Gaze direction detection device and gaze direction detection method
KR20140076413A (en) Apparatus and method for gaze tracking control
JP4552636B2 (en) Driver monitor system and processing method thereof
JP4173083B2 (en) Arousal state determination device and arousal state determination method
US10997861B2 (en) Luminance control device, luminance control system, and luminance control method
JP2018101212A (en) On-vehicle device and method for calculating degree of face directed to front side
JP6906943B2 (en) On-board unit
JP2017138645A (en) Sight-line detection device
JP2009278185A (en) Image recognition apparatus
JP2003256852A (en) Steady gaze judging method

Legal Events

Date Code Title Description
A621 Written request for application examination

Free format text: JAPANESE INTERMEDIATE CODE: A621

Effective date: 20060330

A977 Report on retrieval

Free format text: JAPANESE INTERMEDIATE CODE: A971007

Effective date: 20080416

TRDD Decision of grant or rejection written
A01 Written decision to grant a patent or to grant a registration (utility model)

Free format text: JAPANESE INTERMEDIATE CODE: A01

Effective date: 20080422

A61 First payment of annual fees (during grant procedure)

Free format text: JAPANESE INTERMEDIATE CODE: A61

Effective date: 20080423

R150 Certificate of patent or registration of utility model

Free format text: JAPANESE INTERMEDIATE CODE: R150

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20110502

Year of fee payment: 3

S533 Written request for registration of change of name

Free format text: JAPANESE INTERMEDIATE CODE: R313533

R350 Written notification of registration of transfer

Free format text: JAPANESE INTERMEDIATE CODE: R350

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20140502

Year of fee payment: 6

R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250

LAPS Cancellation because of no payment of annual fees