JPH0377533A - Apparatus for measuring gaze attitude and absolute position of gaze point - Google Patents

Apparatus for measuring gaze attitude and absolute position of gaze point

Info

Publication number
JPH0377533A
JPH0377533A · JP1213917A · JP21391789A
Authority
JP
Japan
Prior art keywords
head
image
visual field
point
subject
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
JP1213917A
Other languages
Japanese (ja)
Other versions
JPH0371129B2 (en)
Inventor
Sukeyasu Kanno
救泰 漢野
Kozo Sakikawa
崎川 浩三
Shinkichi Miwa
三輪 信吉
Toshio Tanabe
田辺 敏夫
Takeshi Yamaguchi
毅 山口
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
HOKURIKU HAANESU KK
Ishikawa Prefecture
Ishikawa Prefectural Government
Original Assignee
HOKURIKU HAANESU KK
Ishikawa Prefecture
Ishikawa Prefectural Government
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by HOKURIKU HAANESU KK, Ishikawa Prefecture, Ishikawa Prefectural Government filed Critical HOKURIKU HAANESU KK
Priority to JP1213917A priority Critical patent/JPH0377533A/en
Publication of JPH0377533A publication Critical patent/JPH0377533A/en
Publication of JPH0371129B2 publication Critical patent/JPH0371129B2/ja
Granted legal-status Critical Current

Landscapes

  • Eye Examination Apparatus (AREA)

Abstract

PURPOSE: To enable gaze point analysis with an eye camera, together with computation of the position and direction of the subject's head, even when the head rotates about a horizontal or vertical axis, by setting up four or more feature points indicating absolute positions on the object at which the subject is gazing.

CONSTITUTION: Nine LEDs P1-P9 are arranged as feature points on a drawing board 1 at spacings Dx and Dy in the x and y directions, respectively. Their positions are chosen so that at least four LEDs, centered on the point gazed at by the subject 2, fall within the imaging range of a visual field camera 3a fixed to the subject's head 2a. The visual field camera 3a outputs image signals of the drawing board in the direction the subject's face is pointing, and eye mark cameras 3b detect the directions of the subject's right and left eyes separately and output coordinate signals. It is therefore possible to detect the posture of the subject's head from the data output by the eye camera and to analyze changes in the absolute position of the gaze point even when the subject moves the gaze point by turning the head up and down or left and right.

Description

Detailed Description of the Invention

(Field of Industrial Application) The present invention relates to an apparatus that measures the posture of the head of a subject wearing an eye camera from the visual field image signal output by the visual field camera and from the signals of feature points placed within the target field of view. It further relates to an apparatus that, using these signals together with the coordinates of the gaze point on the visual field image, makes it possible to measure the absolute position of the gaze point within the target field of view even when the head rotates up-down or left-right.

(Conventional Technology) An eye camera is a device used to analyze a person's gaze point.

The eye camera comprises a visual field camera and an eye mark camera, and is used while fixed to the subject's head. The visual field camera photographs the subject's frontal field of view, while the eye mark camera detects the deviation of the eyeball from the straight-ahead direction and outputs the position of the gaze point as coordinates on the visual field image.

Therefore, if measurement is performed with the subject's head fixed, the movement of the gaze point within the target field of view while the subject observes the object can be analyzed automatically by tracking the eye mark coordinates output from the eye mark camera. However, when the subject moves the head while gazing at one point within the target field of view, or moves the gaze point by translating or rotating the head, the change in the coordinates output from the eye mark camera (relative coordinates) no longer reflects the change in the coordinates of the gaze point within the actual target field of view (absolute coordinates). Consequently, when the gaze point must be analyzed under natural head movement, a human operator has had to step through the visual field images on which the eye mark indicating the gaze point is displayed and read off the absolute coordinates of the gaze point from the images of the target field of view; measuring the absolute coordinates of the gaze point and analyzing the amount and direction of its movement could not be done automatically.

As a way to solve this problem, a method has been proposed in which a plurality of feature points are installed at fixed positions within the target field of view and their coordinates on the visual field image are detected. The deviation of the visual field image from a reference position is then determined and used to correct the gaze point coordinates detected by the eye mark camera, so that changes in the absolute coordinates of the gaze point can be analyzed automatically even when the head moves.

As one such method, Japanese Patent Application Laid-Open No. 62-53630 proposes detecting the absolute coordinates of the gaze point under head movement by obtaining the transformation matrix that maps the visual field image at measurement time onto the reference image of the first frame, and converting the eye mark signal into coordinates on the reference image to calculate the absolute position.

(Problem to be Solved by the Invention) However, the method described in the above publication can cope with movement of the subject's head parallel to the plane of the target field of view, movement along the normal of that plane, and rotation about that normal, but it cannot cope with left-right or up-down rotation of the head, that is, with movement involving rotation about the head's vertical axis or about a horizontal left-right axis. In actual observation of a target field of view the subject very often moves the gaze point by turning the head up and down or left and right, so with the above method the gaze point cannot be analyzed automatically unless the subject's movement is severely restricted.

Accordingly, the object of the present invention is to provide an apparatus that detects the posture of the subject's head from the data output by the eye camera and automatically analyzes changes in the absolute position of the gaze point even when the subject moves the gaze point by rotating the head up and down or left and right.

(Means for Solving the Problems) In the present invention, four or more feature points P1 to P9 are installed within the target field of view 1 to be observed, and their mutual positions are chosen so that at least four of them always fall within the visual field image 4 captured by the visual field camera 3a. The feature points are made sufficiently brighter than their surroundings so that their position coordinates on the visual field image 4 can be detected by image processing. Each detected feature point is made identifiable, for example by lighting the points sequentially in synchronization with the vertical synchronizing signal at each frame scan of the visual field image (e.g. every 1/30 second), or by giving them different emission colors and applying color image processing.

The apparatus of the present invention comprises feature point signal detection means 32 that separates the signals of the feature points P1 to P9 from the image signal of the visual field camera 3a, feature point coordinate detection means 33 that obtains the coordinates of the detected feature point images p1 to p4 on the visual field image 4, and feature point distance calculation means 34 that obtains the distances dx1, dx2, dy1, dy2 between adjacent feature points on the visual field image 4.
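
As a concrete illustration of what the feature point distance calculation means 34 produces, the following minimal Python sketch (not taken from the patent) computes the four opposite-side distances dx1, dx2, dy1, dy2 from the image coordinates of the four feature point images; the top-left/top-right/bottom-left/bottom-right ordering and the function name are assumptions made for the example.

```python
from math import hypot

def side_distances(p1, p2, p3, p4):
    """Opposite-side lengths of the quadrilateral formed by p1..p4.

    p1, p2, p3, p4: (x, y) coordinates of the feature point images on the
    visual field image, assumed ordered top-left, top-right, bottom-left,
    bottom-right.  Returns (dx1, dx2, dy1, dy2): the x-direction lengths of
    the top and bottom sides and the y-direction lengths of the left and
    right sides, which the head position/posture calculation compares.
    """
    dx1 = hypot(p2[0] - p1[0], p2[1] - p1[1])  # top side    p1-p2
    dx2 = hypot(p4[0] - p3[0], p4[1] - p3[1])  # bottom side p3-p4
    dy1 = hypot(p3[0] - p1[0], p3[1] - p1[1])  # left side   p1-p3
    dy2 = hypot(p4[0] - p2[0], p4[1] - p2[1])  # right side  p2-p4
    return dx1, dx2, dy1, dy2

# A square of side 100 pixels seen head-on gives four equal distances:
print(side_distances((0, 0), (100, 0), (0, 100), (100, 100)))
```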

The apparatus further comprises head position and posture calculation means 35 that compares the two inter-feature-point distances corresponding to opposite sides of the quadrilateral whose vertices are the four feature point images p1 to p4, as a means of detecting posture changes of the head 2a relative to the target field of view 1, including left-right and up-down rotations, and of measuring the absolute coordinates of the gaze point Q.

(Operation) When the subject gazes at a point Q in the target field of view 1, the coordinates of the four feature point images p1 to p4 detected from the image signal of the visual field camera 3a fixed to the head 2a represent the positional relationship between the subject's head 2a and the target field of view 1 in which the feature points P1 to P4 are installed. Before the gaze point Q is measured, the relationship between the distance from the target field of view 1 carrying the feature points P1 to P9 to the subject's head 2a and the inter-image distances dx, dy of the feature points on the visual field image 4 is measured in advance. Movement of the head 2a parallel to the plane on which the feature points P1 to P9 are installed, rotation of the head about the normal of that plane, and movement of the head along that normal can then be detected by measuring, respectively, the parallel displacement of the feature point images p1 to p4, their rotation, and the change in the spacing between them. Rotation of the subject's head up-down or left-right can be detected by comparing the distances dx1, dx2, dy1, dy2 of the opposite sides of the quadrilateral formed by the four feature point images p1 to p4 on the visual field image 4.

The relative coordinates of the eye mark can therefore be converted into absolute coordinates using the data detected in this way.

For example, suppose feature points are installed at the four corners of a rectangle. When the normal of the target viewing plane coincides with the line of sight of the visual field camera 3a, the feature point images p1 to p4 on the visual field image 4 also form a rectangle (Fig. 4), and as long as the distance from the visual field camera 3a to the target field of view 1 is constant, the shape and size of this rectangle on the visual field image 4 remain the same even if the head 2a translates left-right or up-down. The amount of translation in this case is obtained by following a single feature point and measuring the displacement of its image on the visual field image 4. When the head 2a moves forward or backward, the shape of the rectangle does not change but its size does: the inter-image distances dx, dy of the feature points change at the same rate, so measuring these distances determines the distance between the head and the target viewing plane along its normal, and the absolute position of the gaze point is determined from the positional relationship between the eye mark image q and the feature point images p1 to p4. When the head 2a rotates left or right, the inter-image distances satisfy dx1 = dx2 and dy1 ≠ dy2, as shown in Fig. 5; when it rotates up or down, dx1 ≠ dx2 and dy1 = dy2. From these relationships the distance of each feature point P1 to P4 from the visual field camera 3a and the angle between the normal of the target field of view and the camera's line of sight are obtained, so the absolute coordinates of the gaze point Q can be determined in these cases as well.
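
The comparison rules described above lend themselves to a very small decision routine. The fragment below is an illustrative Python sketch, not the patent's implementation; the tolerance parameter and the returned labels are assumptions for the example.

```python
def classify_head_motion(dx1, dx2, dy1, dy2, ref_dx, ref_dy, tol=0.02):
    """Classify head motion from the quadrilateral side lengths on the image.

    dx1, dx2: x-direction lengths of the top and bottom sides.
    dy1, dy2: y-direction lengths of the left and right sides.
    ref_dx, ref_dy: the corresponding lengths in the reference posture.
    tol: relative tolerance below which two lengths count as equal.
    """
    def eq(a, b):
        return abs(a - b) <= tol * max(abs(a), abs(b))

    if eq(dx1, dx2) and eq(dy1, dy2):
        if eq(dx1, ref_dx) and eq(dy1, ref_dy):
            return "translation parallel to the board (or no motion)"
        return "movement toward or away from the board (uniform scaling)"
    if eq(dx1, dx2):
        return "rotation about the vertical axis (left-right turn)"
    if eq(dy1, dy2):
        return "rotation about the horizontal axis (up-down turn)"
    return "combined rotation"

print(classify_head_motion(100, 100, 90, 110, ref_dx=100, ref_dy=100))
```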

(Embodiment) An embodiment of the present invention will now be described in detail with reference to the accompanying drawings.

Fig. 2 shows an overall view of a system embodying the present invention, which analyzes the movement of the gaze point Q of a subject 2 working face-to-face with a drawing board 1 that serves as the target field of view. Nine LEDs (light-emitting diodes) P1 to P9 serving as feature points are installed on the drawing board 1 at constant intervals Dx in the x direction and Dy in the y direction. The LEDs P1 to P9 are positioned so that at least four of them fall within the imaging area of the visual field camera 3a fixed to the subject's head 2a, centered on the region at which the subject 2 mainly gazes.
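
For reference, the board-side geometry of the nine LEDs can be written down in a few lines. This is only a sketch: the patent fixes the constant spacings Dx and Dy but not the numbering or the origin, so the row-major indexing and the example spacing values below are assumptions.

```python
# Board coordinates of the nine LEDs P1..P9 on a 3 x 3 grid with spacings Dx, Dy.
Dx, Dy = 200.0, 200.0  # example spacings (e.g. in millimetres)

led_positions = {
    3 * row + col + 1: (col * Dx, row * Dy)  # LED index 1..9 -> (X, Y) on the board
    for row in range(3)
    for col in range(3)
}

for idx in sorted(led_positions):
    print(f"P{idx}: {led_positions[idx]}")
```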

An eye camera 3 comprising one visual field camera 3a and two eye mark cameras 3b is fixed to the subject's head 2a. The visual field camera 3a outputs an image signal of the drawing board 1 in the direction the subject 2 is facing. The eye mark cameras 3b detect the directions of the subject's left and right eyeballs separately and output their coordinate signals. The output of the eye camera 3 is supplied to a recording device 20 and an LED controller 10, and the LED controller 10 controls the lighting timing of the LEDs P1 to P9. The measurement information recorded in the recording device 20 is analyzed by an analysis device 30, which calculates the position and posture of the subject's head 2a and the absolute coordinates of the gaze point Q.

The means for controlling the lighting timing of the LEDs P1 to P9 and the means for detecting the subject's position and posture from the image signal of the visual field camera 3a are described with reference to the system block diagram (Fig. 1). The visual field image signal is output from the visual field camera 3a of an eye camera (for example, an eye mark recorder made by NAC), and the eye mark coordinate signal is output from the eye mark cameras 3b. In an actual eye camera, the eye mark coordinate signal is superimposed on the visual field image signal as an image signal and is simultaneously output as a digital coordinate signal. The recording device 20 records both signals on a recording medium.

In the LED controller 10, a synchronizing signal separator 11 separates the vertical synchronizing signal, which occurs at 1/30 second intervals, from the visual field image signal, and an LED lighting signal generator 12 distributes the vertical synchronizing signal so that LED lighting signals are output at the timing shown in Fig. 3, lighting the LEDs P1 to P9 in sequence. The lighting interval of any one LED is therefore 9/30 second. As shown in Fig. 4, the visual field image signal contains at least four LED images p1 to p4 and the eye mark image q, and is recorded by the recording device 20 on a recording medium such as magnetic tape.
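
Because exactly one LED is lit per vertical sync period, the identity of the LED visible in any given frame follows from the frame counter alone. Below is a minimal sketch of that bookkeeping, under the assumption that frame 0 coincides with P1 being lit.

```python
FIELD_RATE = 30   # vertical syncs per second
NUM_LEDS = 9      # P1..P9 lit in sequence, one per field

def lit_led(frame_index, first_lit=1):
    """Return the index (1..9) of the LED lit during the given frame.

    Assumes one LED per 1/30 s field and that frame 0 shows LED `first_lit`,
    so each LED recurs every NUM_LEDS / FIELD_RATE = 9/30 s.
    """
    return (first_lit - 1 + frame_index) % NUM_LEDS + 1

print([lit_led(f) for f in range(NUM_LEDS)])        # one full cycle: 1..9
print(NUM_LEDS / FIELD_RATE, "s between lightings of the same LED")
```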

The recorded visual field image signal is reproduced by a playback unit 31 of the analysis device 30 and supplied to an LED signal detector 32; the timing signals produced when the LED images p1 to p4 are detected in the image signal are supplied to an LED coordinate detector 33. The LEDs P1 to P9 are kept sufficiently brighter than the background, and the LED signal detector 32 binarizes the image signal with an appropriate threshold to detect the signals of the LED images; from the detection timing, the LED coordinate detector 33 determines the coordinate values on the visual field image 4. The coordinates of the LED images p1 to p9 on the visual field image are detected as follows: the Y coordinate is given by the scan line count, and the X coordinate by counting clock pulses generated so as to divide one scan line into 320 parts.
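
In software terms, the thresholding and coordinate read-out performed by the LED signal detector 32 and LED coordinate detector 33 can be imitated on a digitized frame as follows. This is a sketch only, assuming the frame is available as a NumPy array with one row per scan line and 320 samples per line; the embodiment itself derives the X coordinate from a pixel clock and the Y coordinate from the scan line count in hardware.

```python
import numpy as np

def detect_led(frame, threshold=200):
    """Return the (x, y) image coordinates of the lit LED, or None.

    frame: 2-D uint8 array, one row per scan line, 320 samples per line.
    threshold: brightness above which a sample is treated as part of the LED.
    The LED is assumed far brighter than the background, so the centroid of
    the thresholded samples is taken as its image position.
    """
    bright = frame >= threshold                  # binarization
    ys, xs = np.nonzero(bright)
    if xs.size == 0:
        return None                              # LED outside this frame
    return float(xs.mean()), float(ys.mean())    # x: sample count, y: scan line

# Synthetic 240-line, 320-sample frame with a bright spot near (100, 50):
frame = np.zeros((240, 320), dtype=np.uint8)
frame[49:52, 99:102] = 255
print(detect_led(frame))
```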

Although the embodiment shown in the figures reproduces a recorded image signal to analyze the gaze point, it is also possible to transfer the visual field image signal of the eye camera's visual field camera 3a directly to the LED signal detector 32 and analyze it in real time.

The inter-LED distance calculation means 34 calculates the inter-image distances dx1, dx2, dy1, dy2 between the LEDs from the coordinates of each LED image obtained by the LED coordinate detector 33. From these inter-image distances, the head position and posture calculation means 35 calculates the distance between each of the LEDs P1 to P4 and the head 2a, and obtains the position and posture of the head 2a with respect to the drawing board 1. Gaze point absolute coordinate calculation means 36 calculates the absolute position of the gaze point Q from the head position and posture signal and from the eye mark coordinate signal supplied by the playback unit 31. In the embodiment, the processing up to detection of the coordinates of the LED images p1 to p9 is done in hardware; the detected coordinate values are transferred to a computer every 1/30 second, the head is treated as stationary over each 9/30 second period, and the processing from calculation of the LED inter-image distances onward is done in software. The software processing is described in detail below.
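
One way to picture the software side is a loop that gathers the per-frame LED coordinates over one nine-frame lighting cycle, during which the head is treated as stationary, and only then computes the inter-image distances and the head pose. The sketch below is hypothetical; the function names and data layout are not from the patent.

```python
def collect_cycle(frames, detect_led, lit_led):
    """Gather {LED index: (x, y)} over one nine-frame lighting cycle.

    frames: iterable of (frame_index, frame_image) pairs covering nine fields.
    detect_led: returns the (x, y) image position of the lit LED, or None if
                it lies outside the camera's view.
    lit_led: maps a frame index to the index of the LED lit in that frame.
    The head is assumed stationary over the 9/30 s cycle, as in the embodiment.
    """
    coords = {}
    for frame_index, image in frames:
        pos = detect_led(image)
        if pos is not None:
            coords[lit_led(frame_index)] = pos
    return coords  # by construction at least four LEDs should be present

# Usage idea, with detect_led / lit_led as sketched earlier:
#   coords = collect_cycle(grab_nine_fields(), detect_led, lit_led)
#   if len(coords) >= 4:
#       ...compute dx1, dx2, dy1, dy2 and the head position/posture...
```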

The visual field camera 3a is fixed to the head 2a so that, when the face of the subject 2 squarely faces the drawing board 1, the direction of the visual field camera 3a coincides with the normal of the drawing board 1. Let t0 be the distance from the visual field camera 3a to the drawing board 1 in this reference posture, and let the visual field image 4 output by the visual field camera 3a be as shown in Fig. 4. The conversion coefficients between the LED inter-image distances dx1, dx2, dy1, dy2 on the visual field image 4 and the absolute distances Dx, Dy on the drawing board 1 are then, for the x and y directions respectively,

αx0 = Dx/dx0,  αy0 = Dy/dy0   (1)

where

dx0 = dx1 = dx2,  dy0 = dy1 = dy2   (2)

The absolute position of the gaze point is calculated as follows. Let αx, αy be the conversion coefficients, (xI, yI) the coordinates of the eye mark image q on the visual field image 4, (xL1, yL1) the coordinates of the feature point image p1, and (XL1, YL1) the coordinates of the corresponding feature point on the drawing board. The absolute coordinates (X, Y) are then

X = XL1 + (xI − xL1)·αx   (3)
Y = YL1 + (yI − yL1)·αy   (4)

where αx = αx0 and αy = αy0.
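
The reference-posture conversion of equations (1) to (4) amounts to only a few lines. The sketch below is an illustration, not the patent's code, and assumes the coefficient form of equation (1) together with the quantities named in the text (Dx, Dy, dx0, dy0, the board coordinates of P1, and the image coordinates of p1 and of the eye mark q).

```python
def gaze_absolute_position(Dx, Dy, dx0, dy0, board_p1, image_p1, image_q):
    """Absolute gaze coordinates on the drawing board in the reference posture.

    Dx, Dy:    actual LED spacings on the drawing board.
    dx0, dy0:  LED inter-image distances in the reference posture (eq. 2).
    board_p1:  (XL1, YL1), board coordinates of feature point P1.
    image_p1:  (xL1, yL1), coordinates of its image p1 on the visual field image.
    image_q:   (xI, yI), coordinates of the eye mark q on the visual field image.
    """
    alpha_x = Dx / dx0            # eq. (1): x-direction conversion coefficient
    alpha_y = Dy / dy0            # eq. (1): y-direction conversion coefficient
    XL1, YL1 = board_p1
    xL1, yL1 = image_p1
    xI, yI = image_q
    X = XL1 + (xI - xL1) * alpha_x   # eq. (3)
    Y = YL1 + (yI - yL1) * alpha_y   # eq. (4)
    return X, Y

# Example: 200 mm LED spacing imaged as 80 pixels; the eye mark lies 40 px to
# the right of and 20 px below p1, whose board position is (0, 0):
print(gaze_absolute_position(200.0, 200.0, 80.0, 80.0,
                             (0.0, 0.0), (100.0, 60.0), (140.0, 80.0)))
# -> (100.0, 50.0)
```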

When the head 2a translates left-right or up-down parallel to the drawing board 1 while keeping the distance t0 from it, the mutual positions of the feature points on the image do not change, so the absolute coordinates are calculated by equations (1) to (4).

When the head 2a moves forward or backward and its distance t from the drawing board 1 changes from t0, the LED inter-image distances on the image change at the same rate in the x and y directions. Writing dx and dy for the LED inter-image distances in this case, the conversion coefficients become

αx = Dx/dx,  αy = Dy/dy   (5)

and the absolute coordinates are again calculated by equations (3) and (4).

In general, the distance t and the conversion coefficients are related by

αx = αx0·t/t0,  αy = αy0·t/t0   (6)

although a correction is required for each individual visual field camera; with correction coefficients η(dx) and η(dy), the distance t is expressed as a function of the LED inter-image distances dx and dy by equation (7).

When the head 2a rotates, the image changes as shown in Fig. 5 (the case of rotation about the vertical axis).

That is, the x-direction distance between the feature point images p1 and p2 and that between p3 and p4 remain equal, but the y-direction distances between p1 and p3 and between p2 and p4 change with the rotation angle of the head. If a is the distance of the feature points p1 and p3 from the visual field camera 3a along the camera's line of sight and b is the corresponding distance of p2 and p4, the rotation angle of the head is obtained from a and b as shown in Fig. 6.

The distance of the gaze point Q from the visual field camera 3a is

Sy = dy1 + (dy2 − dy1)·λ   (9)

where

λ = Sx/dx,  dx = dx1 − dx2   (10)

and is obtained from equation (7); denoting this distance by C, the absolute coordinate of the gaze point image q in the X direction is given by equation (11), and its absolute coordinate in the Y direction by equation (12).

When the head 2a rotates in the up-down direction, the roles of x and y above are interchanged and the calculation proceeds in the same way.

(Effects of the Invention) As described above, according to the present invention, by installing four or more feature points indicating absolute positions on the object at which the subject gazes, gaze point analysis with an eye camera can be performed automatically even when the subject's head rotates about its horizontal or vertical axis. Furthermore, since the apparatus not only measures and tracks the absolute position of the gaze point to analyze it automatically, but also determines the position and direction of the head at all times, it is useful for analyzing various kinds of work and movement.

Brief Description of the Drawings

Fig. 1 is a block diagram showing the configuration of an embodiment of the present invention; Fig. 2 is an overall view of the system; Fig. 3 shows the lighting timing of the nine LEDs; Fig. 4 shows the positional relationship of the LEDs on the visual field image; Fig. 5 shows the positional relationship of the LEDs on the visual field image when the head rotates; Fig. 6 shows the positional relationship between the target viewing plane and the visual field camera when the head rotates; and Fig. 7 schematically shows the relationship between the target field of view and the visual field image at the reference position.

In the figures: 1: drawing board; 2a: head; 3a: visual field camera; 3b: eye mark camera; 4: visual field image; 32: LED signal detector; 33: LED coordinate detector; 34: inter-LED distance calculation means; 35: head position and posture calculation means.

Claims (2)

[Claims]
(1) A gaze posture measuring apparatus that measures the posture of a head (2a) with respect to a target field of view (1) from the image information of a visual field image (4) captured by a visual field camera (3a), the apparatus comprising: four or more feature points (P_1 to P_9) installed at fixed positions within the target field of view (1); feature point signal detection means (32) for detecting the signals of said feature points from the image information of the visual field camera (3a); feature point coordinate detection means (33) for obtaining the coordinates on the visual field image (4) of the detected feature point signals; feature point distance calculation means (34) for obtaining the distances on the visual field image (4) between adjacent feature points; and head position and posture calculation means (35) including means for comparing the two inter-feature-point distances corresponding to opposite sides of a quadrilateral whose vertices are four feature points on the visual field image (4); whereby posture changes of the head (2a) with respect to the target field of view (1), including left-right and up-down rotations, can be detected.
(2) A gaze point absolute position measuring apparatus that measures the absolute coordinates of the image (q) of a gaze point on a visual field image (4) from the image information of the visual field image (4) output by an eye camera (3) and an eye mark coordinate signal indicating the position of the gaze point (Q) on the visual field image, the apparatus comprising: four or more feature points (P_1 to P_9) installed at fixed positions within the target field of view (1); feature point signal detection means (32) for detecting the signals of said feature points from the image information of the visual field camera (3a); feature point coordinate detection means (33) for obtaining the coordinates on the visual field image (4) of the detected feature point images; feature point distance calculation means (34) for obtaining the distances on the visual field image (4) between adjacent feature points; and head position and posture calculation means (35) including means for comparing the two inter-feature-point distances corresponding to opposite sides of a quadrilateral whose vertices are the four feature points on the visual field image (4); whereby the absolute coordinates of the gaze point (Q) can be measured during posture changes of the head (2a) with respect to the target field of view (1), including left-right and up-down rotations.
JP1213917A 1989-08-19 1989-08-19 Apparatus for measuring gaze attitude and absolute position of gaze point Granted JPH0377533A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP1213917A JPH0377533A (en) 1989-08-19 1989-08-19 Apparatus for measuring gaze attitude and absolute position of gaze point

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP1213917A JPH0377533A (en) 1989-08-19 1989-08-19 Apparatus for measuring gaze attitude and absolute position of gaze point

Publications (2)

Publication Number Publication Date
JPH0377533A true JPH0377533A (en) 1991-04-03
JPH0371129B2 JPH0371129B2 (en) 1991-11-12

Family

ID=16647180

Family Applications (1)

Application Number Title Priority Date Filing Date
JP1213917A Granted JPH0377533A (en) 1989-08-19 1989-08-19 Apparatus for measuring gaze attitude and absolute position of gaze point

Country Status (1)

Country Link
JP (1) JPH0377533A (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100491469B1 (en) * 2002-06-12 2005-05-25 주식회사 애트랩 Optical mouse with display function
JP2008212718A (en) * 2008-05-15 2008-09-18 Eyemetrics Japan Co Ltd Visual field detection system
JP2009199004A (en) * 2008-02-25 2009-09-03 Nippon Steel Corp Technology managing device and technology managing method
JP2014155635A (en) * 2013-02-18 2014-08-28 Iwate Prefectural Univ Line-of-sight measurement apparatus, display method for watching region, display method for gaussian distribution of watching points
JP2015228992A (en) * 2014-06-05 2015-12-21 大日本印刷株式会社 Visual axis analysis system and visual axis analysis device

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS6253631A (en) * 1985-09-03 1987-03-09 日本電気株式会社 Method and apparatus for measuring position of steady gaze point
JPS6468235A (en) * 1987-09-07 1989-03-14 Agency Ind Science Techn Head mount type three-dimensional optometer equipped with eyeball/head cooperative movement analyser

Also Published As

Publication number Publication date
JPH0371129B2 (en) 1991-11-12

Similar Documents

Publication Publication Date Title
JP5858433B2 (en) Gaze point detection method and gaze point detection device
JP5915981B2 (en) Gaze point detection method and gaze point detection device
US5870136A (en) Dynamic generation of imperceptible structured light for tracking and acquisition of three dimensional scene geometry and surface characteristics in interactive three dimensional computer graphics applications
WO2018061925A1 (en) Information processing device, length measurement system, length measurement method, and program storage medium
CN104871176B (en) Scanning device and method for positioning scanning device
WO2019172351A1 (en) Object identification device, object identification system, object identification method, and program recording medium
GB2452944A (en) Imaging system and method for generating a depth map using three light sources having different frequencies
JPH07286837A (en) Instrument and method for measuring rotational amount of spherical body
JP6816773B2 (en) Information processing equipment, information processing methods and computer programs
JP6879375B2 (en) Information processing equipment, length measurement system, length measurement method and computer program
JPH1163927A (en) Head position and posture measuring device, and operation monitoring device
JP2009210331A (en) Camera calibration apparatus and camera calibration method
JP2002143094A (en) Visual axis detector
CN110441326A (en) Defect inspection method and detection device and computer readable storage medium
JPH0377533A (en) Apparatus for measuring gaze attitude and absolute position of gaze point
CN109102548A (en) It is a kind of for identifying the method and system of following range
JP2012055418A (en) View line detection device and view line detection method
JPS63289406A (en) Three-dimensional configuration measuring instrument
JP2003023562A (en) Image photographic system and camera system
KR20050041525A (en) Motion capture system
US20240071085A1 (en) Detection system, method, and storage medium
JP3042773B2 (en) 3D motion analyzer
JPH0650983A (en) Operation analyzer for specimen part
JPS62176427A (en) Apparatus for measuring position of notice point
JPS59156078A (en) Inspecting method of display condition of color picture tube