JPH1091325A - Gaze detection system - Google Patents

Gaze detection system

Info

Publication number
JPH1091325A
JPH1091325A
Authority
JP
Japan
Prior art keywords
gaze
user
point
detection system
vector
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
JP8243832A
Other languages
Japanese (ja)
Inventor
Tatsuaki Uchida (内田 竜朗)
Katsuya Tsuchida (土田 勝也)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Toshiba Corp
Original Assignee
Toshiba Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Toshiba Corp filed Critical Toshiba Corp
Priority to JP8243832A priority Critical patent/JPH1091325A/en
Publication of JPH1091325A publication Critical patent/JPH1091325A/en
Pending legal-status Critical Current

Abstract

PROBLEM TO BE SOLVED: To provide a non-contact gaze detection system that permits movement of the user's head, detects the gaze and its direction with high precision, and operates at a high detection speed. SOLUTION: The gaze detection system comprises a display device 1 and an image pickup device 2 with an automatic focusing function, provided at a distance from the eyeball of the user 3. The center E of the pupil of the user 3 and the positions of feature points on the face are obtained from the image data captured by the image pickup device 2, and the obtained position data are converted into a gaze direction to calculate the user's gaze point P.

Description

DETAILED DESCRIPTION OF THE INVENTION

[0001]

BACKGROUND OF THE INVENTION

1. Field of the Invention

The present invention relates to a gaze detection system, and more particularly to a non-contact gaze detection system that detects the gaze and its direction for use as an input means of an interface or the like.

[0002]

2. Description of the Related Art

In recent years, techniques for detecting the line of sight have been developed so that its movement can be used for instruction operations on a display device, such as menu selection, cursor control, and screen scrolling.

[0003] Conventional gaze detection systems include contact-type systems, such as head-mounted eye cameras, in which the detection unit touches or is placed close to the eyeball. These systems irradiate the eyeball with near-infrared light, capture the reflected light with a video camera or the like, and process the resulting image. Because the eyeball always lies within the detection unit regardless of head movement, no correction for head position is needed when calculating the gaze direction, so the detection processing is simple. However, they not only restrict head movement but also narrow the field of view, making them unsuitable for long-term use.

[0004] On the other hand, there are non-contact gaze detection methods in which the detection unit is placed away from the eyeball. Recently developed examples include a method that measures the spatial positions of several points on the face and of the pupil by stereo image measurement or the like, and a method that reduces the amount of image data by using the aspect method, which applies mosaic features to face-orientation detection.

[0005]

PROBLEMS TO BE SOLVED BY THE INVENTION

However, in the former of these gaze detection methods, the processing after the image data are input becomes very complicated, and much time is required for processing after the position information is measured.

[0006] In the latter aspect method, the accuracy of gaze detection is determined by how the optimal dictionary accumulating image data on face orientation (head direction) is constructed, so the quality of the dictionary directly and strongly affects the detection accuracy.

[0007]

SUMMARY OF THE INVENTION

The present invention has been made in view of these points, and its object is to provide a non-contact gaze detection system that permits movement of the user's head, detects the gaze and its direction with high accuracy, and operates at high speed.

[0008]

MEANS FOR SOLVING THE PROBLEMS

The gaze detection system of the present invention comprises a display screen and an imaging device provided at a distance from the eyeball of a user who uses the display screen, and is characterized by comprising position calculating means for obtaining, from the image data captured by the imaging device, the positions of the user's pupil and of a characteristic fixed point on the face, and gaze-point calculating means for converting the position data obtained by this means into a gaze direction and calculating the user's gaze point on the display screen.

[0009] In the position calculating means of the present invention, it is desirable to extract, from image data of the user's face and its surroundings, at least three points containing the maximum or minimum luminance values (these usually correspond to the tip of the nose and the ends of the two eyes, respectively), calculate the position of the centroid of the triangle having these three points as vertices, and take this point as the feature point fixed on the face. Besides the centroid of this triangle, its incenter, circumcenter, orthocenter, or the like may also be used as the feature point. Alternatively, the positions of moles or the like on the face may be registered in a storage device in advance, and the centroid of a triangle calculated from the registered data may be used as the characteristic point.
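
As an illustration of this feature-point construction, the following Python sketch (not part of the patent; the function name, the plain grayscale-array input, and the choice of two dark extrema plus one bright extremum are assumptions) extracts three luminance extrema and returns the centroid of the triangle they form.

```python
import numpy as np

def facial_feature_point(gray):
    """Hypothetical sketch of [0009]: take three luminance extrema on a
    grayscale face image (assumed: two darkest points for the eye regions,
    one brightest point for the nose tip) and return the centroid G of
    the triangle they span.
    """
    order = np.argsort(gray.ravel())
    # Assumed extrema: the two darkest pixels and the single brightest one.
    # A real system would enforce that the dark points fall in separate
    # eye regions; this sketch skips that step.
    extrema = np.concatenate([order[:2], order[-1:]])
    rows, cols = np.unravel_index(extrema, gray.shape)
    # Centroid of a triangle = mean of its three vertices.
    return rows.mean(), cols.mean()
```

Applying `facial_feature_point` to successive frames would give the fixed point G used as the head-position reference in the embodiment described below.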

[0010] In the gaze detection system of the present invention, an image of the user's face and its surroundings is captured by the imaging device provided at a distance from the user's eyeball, and from this image data the position calculating means calculates the center of the user's pupil and the position of the characteristic point on the face. The gaze-point calculating means then applies appropriate conversions and operations to the obtained position data to determine the direction of the user's gaze and the gaze point, which is the intersection of the gaze with the display screen.

[0011] Thus, the gaze detection system of the present invention performs highly accurate gaze detection regardless of head movement, and the detection can also be made faster.

[0012] In the present invention, the position data obtained from the image data by the position calculating means may also be compared along the time axis. By configuring the system so that the gaze-point calculating means stops operating, and the gaze direction and gaze point are not calculated, when the temporal change in the data is as small as fixational micro-movements and the gaze direction does not change, or while the user's gaze is directed outside the display screen, the power consumption of the system can be reduced. Furthermore, when gaze detection is used as an input means for a display device or the like, wobbling of a pointer or the like on the display screen due to fixational micro-movements is eliminated, and stable input can be performed.

[0013]

DESCRIPTION OF THE PREFERRED EMBODIMENTS

Embodiments of the present invention will now be described with reference to the drawings.

[0014] FIG. 1 is a configuration diagram showing an embodiment of the gaze detection system of the present invention.

[0015] As shown in the figure, the gaze detection system of the embodiment comprises a display device 1, such as a liquid crystal display, and an imaging device 2 that has an automatic focusing function and can measure the distance to a subject. The imaging device 2 is provided at a distance from the eyeball of the user 3, for example below the screen 1a of the display device 1. Connected to the imaging device 2 is a conversion/calculation device (not shown) that obtains, from the captured image data, the center E of the pupil of the user 3 and the position of a characteristic point on the face, converts the obtained position data into a gaze direction, and calculates the user's gaze point P, which is the intersection of this gaze direction with the display screen 1a.

[0016] FIG. 2 shows, in a flowchart, the gaze detection procedure (program) executed by the conversion/calculation device of the embodiment. It is described in detail below.

[0017] (1) From the image data of the head (face) of the user 3 captured by the imaging device 2 with the automatic focusing function, the position vector from an arbitrary point C on the imaging device 2 to the pupil center E of the user 3 is calculated → vector CE.

(2) From the image data of the face of the user 3, likewise captured by the imaging device 2, the characteristic fixed point on the face of the user 3 is calculated. For this purpose, three points showing the maximum and minimum luminance values on the face are extracted from the image data, regardless of the orientation of the face. The position of the centroid G of the triangle having these three points as vertices is then calculated, and this centroid G is taken as the characteristic fixed point.

[0018] (3) The position vector from the pupil center E to the centroid G, the feature point on the face, is calculated → vector EG.

(4) The position vector from the arbitrary point C on the imaging device 2 to the gaze point P, the intersection of the gaze direction with the display screen 1a, is calculated → vector CP.

(5) The sum of the vectors CE, EG, and CP obtained in (1) to (4) is calculated, and, for example, a pointer is moved to the position of the gaze point P at which this vector equation holds.
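
A minimal sketch of how steps (1) to (5) fit together as vector arithmetic, assuming all vectors are expressed in a camera frame with the reference point C at the origin. One simplification should be noted: the patent recovers the gaze direction from the vector GP via equations [1] to [3] of paragraph [0020] below, whereas this sketch takes a gaze direction as given and shows only the bookkeeping; every name here is hypothetical.

```python
import numpy as np

def gaze_point_on_screen(vec_CE, vec_EG, gaze_dir, plane_point, plane_normal):
    """Combine the measured vectors of steps (1)-(3) and intersect the
    gaze ray with the display plane to obtain the gaze point P (step (4)),
    returning the vectors CP and GP used in step (5).
    """
    E = np.asarray(vec_CE, float)          # step (1): pupil centre
    G = E + np.asarray(vec_EG, float)      # steps (2)-(3): facial fixed point
    d = np.asarray(gaze_dir, float)
    d = d / np.linalg.norm(d)
    n = np.asarray(plane_normal, float)
    denom = float(np.dot(d, n))
    if abs(denom) < 1e-9:
        raise ValueError("gaze is parallel to the display plane")
    # Ray/plane intersection: P = E + t*d with (P - plane_point) . n = 0.
    t = float(np.dot(np.asarray(plane_point, float) - E, n)) / denom
    vec_CP = E + t * d                     # step (4): gaze point P, seen from C
    vec_GP = vec_CP - G                    # step (5): CP = CG + GP rearranged
    return vec_CP, vec_GP
```

The last line is exactly equation [3] of paragraph [0020]: GP = CP - (CE + EG).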

[0019] The positions and position vectors of the points in the above procedure are shown in FIGS. 3(a) and 3(b), respectively. The vector sum used in step (5) is obtained from the following equations.

[0020]

Vector CP = Vector CG + Vector GP        [1]
Vector CG = Vector CE + Vector EG        [2]

Substituting equation [2] into equation [1] gives

Vector GP = Vector CP - (Vector CE + Vector EG)        [3]

Here, equation [1] is a vector expression showing the positional relationship among the user 3, the imaging device 2, and the gaze point P that allows the head of the user 3 to move. As shown in vector notation in FIG. 3(a), the vector CP from the arbitrary point C on the imaging device 2 to the point P on the display screen 1a at which the user is gazing is expressed as the sum of the vector CG from the point C to the feature point G fixed on the face of the user 3 and the vector GP from the point G to the gaze point P.

[0021] Equation [2] is a vector expression showing the positional relationship between the feature point G fixed on the face and the pupil center E. As shown in vector notation in FIG. 3(b), the vector CG from the arbitrary point C on the imaging device 2 to the point G is expressed as the sum of the vector CE from the point C to the pupil center E and the vector EG from the pupil center E to the point G. The gaze direction is then calculated from the vector GP obtained in this way. Here, the pupil center E can be obtained by applying the following processing to the image data obtained by the imaging device 2.

[0022] FIG. 4(a) shows an enlarged view of the area around the eyeball in the image captured by the imaging device 2.

[0023] To extract the pupil center E, this enlarged image is first scanned, for example from left to right, along lines parallel to the line connecting the two ends of the eye, at positions such as A and B; the luminance at each point is measured, and the relationship between luminance and scanning time is obtained.

[0024] In general, the luminance of the parts of the eyeball 4 increases in the order pupil 4a < iris 4b < conjunctiva 4c, so scanning at the positions A and B shown in FIG. 4(a) yields luminance distributions like those shown in FIG. 4(b). When a signal representing such a luminance distribution is converted through a comparator so that pulses are formed only at the boundaries between the pupil 4a and the iris 4b, the output shown in FIG. 4(c) is obtained, and counting clock pulses reveals the boundaries with the iris 4b at both ends of the pupil 4a. Scanning is performed at least three times, at positions such as A and B from the upper end to the lower end of the eyeball 4, and the above counting operation is performed for each scan to find the boundaries between the pupil 4a and the iris 4b; the pupil center E can thereby be obtained.

[0025] When no comparator is used, the boundary between the pupil 4a and the iris 4b can be determined by setting a threshold within the range of luminance values corresponding to the pupil 4a and the iris 4b in the luminance distribution shown in FIG. 4(b).
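
The scan-line procedure of paragraphs [0023] to [0025] can be sketched as follows. This hypothetical version uses the threshold variant of [0025] rather than a comparator, and the function and parameter names are assumptions.

```python
import numpy as np

def pupil_center(eye, threshold, rows):
    """Estimate the pupil centre E from a grayscale eye image.

    Each scan line in `rows` (at least three, per [0024]) is swept left
    to right; pixels darker than `threshold` are taken as pupil
    (pupil < iris < conjunctiva in luminance), and the first and last
    such pixels mark the pupil/iris boundaries on that line.
    """
    if len(rows) < 3:
        raise ValueError("the procedure scans at least three lines")
    chords = []
    for r in rows:
        dark = np.flatnonzero(eye[r] < threshold)
        if dark.size:                        # this scan line crossed the pupil
            left, right = dark[0], dark[-1]  # pupil/iris boundaries
            chords.append((r, (left + right) / 2.0, right - left))
    if not chords:
        return None
    # The widest chord lies nearest the pupil's horizontal diameter, so its
    # row approximates the vertical centre; the chord midpoints give the
    # horizontal centre.
    row, _, _ = max(chords, key=lambda c: c[2])
    x = float(np.mean([mid for _, mid, _ in chords]))
    return row, x
```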

[0026] Thus, by executing the gaze detection program according to the above procedure, the gaze detection system of the embodiment permits head movement and, compared with conventional non-contact gaze detection systems, enables faster processing and highly accurate gaze detection.

[0027] Next, another embodiment of the present invention will be described. FIG. 5 is a flowchart showing the gaze detection procedure in this embodiment.

[0028] In this embodiment, the position data on the user's pupil center E obtained from the image data captured by the imaging device are compared along the time axis, and execution of the detection program is stopped when the data show no temporal change; only the amount of change within a unit time is used as the movement of a pointer or cursor on the display screen, or as the amount of scrolling.
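
A sketch of the gating logic of this embodiment, assuming pupil-centre positions arrive once per frame; the tolerance standing in for the magnitude of fixational micro-movements, and all names, are assumptions.

```python
import numpy as np

class GazeGate:
    """Stop downstream gaze processing while the pupil centre shows no
    temporal change, passing on only the per-unit-time displacement
    (used as pointer/cursor motion or a scroll amount).
    """

    def __init__(self, tolerance_px=2.0):
        self.tolerance_px = tolerance_px  # assumed micro-movement bound
        self._last = None

    def update(self, pupil_center):
        """Return the displacement since the last significant change,
        or None while the detection program should remain stopped."""
        p = np.asarray(pupil_center, float)
        if self._last is None:
            self._last = p
            return None
        delta = p - self._last
        if np.linalg.norm(delta) <= self.tolerance_px:
            return None                   # no temporal change: skip computation
        self._last = p
        return delta                      # drives the pointer, cursor, or scroll
```

Skipping all further computation whenever `update` returns None is what yields the power saving and pointer stability described in the next paragraph.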

[0029] In a gaze detection system with such a program, compared with conventional gaze detection systems, no arithmetic processing is needed when the gaze direction and head do not move, so gaze detection becomes faster and power consumption is reduced. In addition, the detected gaze does not waver because of fine eye movements such as fixational micro-movements, and stable input can be performed.

[0030]

EFFECTS OF THE INVENTION

As is apparent from the above description, the system of the present invention permits head movement and performs gaze detection at high speed with high detection accuracy.

BRIEF DESCRIPTION OF THE DRAWINGS

FIG. 1 is a configuration diagram showing an embodiment of the gaze detection system of the present invention.

FIG. 2 is a flowchart showing the gaze detection program executed by the conversion/calculation device of the embodiment.

FIG. 3 shows the positions and position vectors of the points used in the gaze detection program of the embodiment.

FIG. 4 is a view for explaining the method of extracting the pupil center in the embodiment.

FIG. 5 is a flowchart showing the gaze detection program in another embodiment of the present invention.

EXPLANATION OF SYMBOLS

1 ... display device
1a ... display screen
2 ... imaging device
3 ... user
4 ... eyeball
4a ... pupil
4b ... iris
4c ... conjunctiva

Claims (3)

[Claims]

1. A gaze detection system comprising: a display screen; an imaging device provided at a distance from an eyeball of a user who uses the display screen; position calculating means for obtaining, from image data captured by the imaging device, the positions of the user's pupil and of a characteristic fixed point on the user's face; and gaze-point calculating means for converting the position data obtained by this means into a gaze direction and calculating the user's gaze point on the display screen.
2. The gaze detection system according to claim 1, further comprising means for comparing the position data obtained by the position calculating means along a time axis and stopping the operation of the gaze-point calculating means when the data show no change.
3. The gaze detection system according to claim 1 or 2, wherein the position calculating means extracts, from the image data captured by the imaging device, at least three points containing the maximum or minimum luminance values, calculates the centroid of the triangle having the three points as vertices, and takes this point as the characteristic fixed point on the face.
JP8243832A 1996-09-13 1996-09-13 Gaze detection system Pending JPH1091325A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP8243832A JPH1091325A (en) 1996-09-13 1996-09-13 Gaze detection system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP8243832A JPH1091325A (en) 1996-09-13 1996-09-13 Gaze detection system

Publications (1)

Publication Number Publication Date
JPH1091325A true JPH1091325A (en) 1998-04-10

Family

ID=17109611

Family Applications (1)

Application Number Title Priority Date Filing Date
JP8243832A Pending JPH1091325A (en) 1996-09-13 1996-09-13 Gaze detection system

Country Status (1)

Country Link
JP (1) JPH1091325A (en)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH02138673A (en) * 1988-07-14 1990-05-28 A T R Tsushin Syst Kenkyusho:Kk Image pickup device
JPH06242883A (en) * 1993-02-19 1994-09-02 Fujitsu General Ltd Inputting device
JPH086708A (en) * 1994-04-22 1996-01-12 Canon Inc Display device

Cited By (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002282210A (en) * 2001-03-27 2002-10-02 Japan Science & Technology Corp Method and apparatus for detecting visual axis
JP4729188B2 (en) * 2001-03-27 2011-07-20 独立行政法人科学技術振興機構 Gaze detection device
WO2002080527A1 (en) * 2001-03-30 2002-10-10 Koninklijke Philips Electronics N.V. Remote camera control device
JP2004362569A (en) * 2003-05-30 2004-12-24 Microsoft Corp Assessment method and system of head
JP2008015800A (en) * 2006-07-06 2008-01-24 Omron Corp Device for detecting impersonation
US8032842B2 (en) 2006-07-25 2011-10-04 Korea Institute Of Science & Technology System and method for three-dimensional interaction based on gaze and system and method for tracking three-dimensional gaze
KR100820639B1 (en) 2006-07-25 2008-04-10 한국과학기술연구원 System and method for 3-dimensional interaction based on gaze and system and method for tracking 3-dimensional gaze
KR100845274B1 (en) 2006-11-06 2008-07-09 주식회사 시공테크 Apparatus and method for generating user-interface based on face recognition in a exhibition system
KR100960269B1 (en) * 2008-10-07 2010-06-07 한국과학기술원 Apparatus of estimating user's gaze and the method thereof
JP5229928B1 (en) * 2012-08-30 2013-07-03 広太郎 海野 Gaze position specifying device and gaze position specifying program
WO2014084224A1 (en) * 2012-11-27 2014-06-05 京セラ株式会社 Electronic device and line-of-sight input method
WO2015080063A1 (en) * 2013-11-27 2015-06-04 株式会社ニコン Electronic apparatus
US9547798B2 (en) * 2014-05-20 2017-01-17 State Farm Mutual Automobile Insurance Company Gaze tracking for a vehicle operator
US10088899B2 (en) 2014-05-20 2018-10-02 State Farm Mutual Automobile Insurance Company Eye gaze tracking utilizing surface normal identification
CN117017235A (en) * 2023-10-09 2023-11-10 湖南爱尔眼视光研究所 Visual cognition detection method, device and equipment

Similar Documents

Publication Publication Date Title
US9998654B2 (en) Information processing device and method for controlling information processing device
EP2927634A2 (en) Single-camera ranging method and system
JP4537901B2 (en) Gaze measurement device, gaze measurement program, and gaze calibration data generation program
JP6123694B2 (en) Information processing apparatus, information processing method, and program
JPH1091325A (en) Gaze detection system
Toennies et al. Feasibility of hough-transform-based iris localisation for real-time-application
US20220100268A1 (en) Eye tracking device and a method thereof
CN109634431B (en) Medium-free floating projection visual tracking interaction system
CN103654709A (en) Line-of-sight detection apparatus, line-of-sight detection method, and program therefor
JPH0651901A (en) Communication equipment for glance recognition
US20230080861A1 (en) Automatic Iris Capturing Method And Apparatus, Computer-Readable Storage Medium, And Computer Device
JP7081599B2 (en) Information processing equipment, information processing methods, and programs
JP5995217B2 (en) A method to detect an ellipse that approximates the pupil
JPH09167049A (en) Line of sight input device for console
JP2004062393A (en) Method and device for determining attention
WO2018220963A1 (en) Information processing device, information processing method, and program
JPH0720987A (en) Estimation device of closely observed point
US20230386038A1 (en) Information processing system, eye state measurement system, information processing method, and non-transitory computer readable medium
JP2005323905A (en) Instrument and program for measuring eyeball movement
EP4185184A1 (en) Method for determining a coronal position of an eye relative to the head
CN114740966A (en) Multi-modal image display control method and system and computer equipment
JPH0634779B2 (en) Eye measuring device
CN110334579B (en) Iris recognition image determining method and device, terminal equipment and storage medium
CN113902791B (en) Three-dimensional reconstruction method and device based on liquid lens depth focusing
US20230236666A1 (en) Optical apparatus, image pickup apparatus, control method of optical apparatus, and storage medium

Legal Events

Date Code Title Description
A977 Report on retrieval

Free format text: JAPANESE INTERMEDIATE CODE: A971007

Effective date: 20060731

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20060808

A02 Decision of refusal

Free format text: JAPANESE INTERMEDIATE CODE: A02

Effective date: 20070109