JPH11276438A - Visual line measuring device - Google Patents

Visual line measuring device

Info

Publication number
JPH11276438A
JPH11276438A
Authority
JP
Japan
Prior art keywords
subject
head
eyeball
intersection
plane
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
JP8429798A
Other languages
Japanese (ja)
Inventor
Takeshi Fujimura
武志 藤村
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Isuzu Motors Ltd
Original Assignee
Isuzu Motors Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Isuzu Motors Ltd filed Critical Isuzu Motors Ltd
Priority to JP8429798A priority Critical patent/JPH11276438A/en
Publication of JPH11276438A publication Critical patent/JPH11276438A/en
Pending legal-status Critical Current

Landscapes

  • Image Processing (AREA)
  • Position Input By Displaying (AREA)
  • User Interface Of Digital Computer (AREA)
  • Image Analysis (AREA)
  • Length Measuring Devices With Unspecified Measuring Means (AREA)
  • Eye Examination Apparatus (AREA)

Abstract

PROBLEM TO BE SOLVED: To provide a gaze measuring device capable of determining the position of the object a subject is looking at. SOLUTION: The gaze measuring device comprises gaze direction output means 1 that detects the eyeball direction and the head direction of a subject and outputs a gaze direction obtained by combining them, intersection detecting means 2 that detects an intersection C at which this gaze direction crosses one of the surfaces of a polyhedron T surrounding the subject, and intersection display means 3 that displays the position of the detected intersection C on a development plane of the polyhedron T.

Description

DETAILED DESCRIPTION OF THE INVENTION

[0001]

BACKGROUND OF THE INVENTION

1. Field of the Invention

The present invention relates to a gaze measuring apparatus that measures the direction in which a subject is looking, i.e., the gaze direction, and more particularly to a gaze measuring apparatus capable of determining the position of the object the subject is looking at.

[0002]

2. Description of the Related Art

To enhance safety while driving a vehicle, it is desirable to reduce the area invisible from the driver's seat, i.e., the blind spot. To examine the blind spot, it is effective to seat a subject in the driver's seat and measure the subject's gaze direction. However, gaze direction is difficult to measure, and no suitable device has previously existed.

[0003]

For example, JP-A-7-208927 describes a technique for detecting the position of an eyeball from an image of the driver's face. However, even if the eyeball position in the face image can be detected, the gaze direction cannot. Because it does not yield the gaze direction, this technique cannot be used to examine blind spots.

[0004]

Problems to Be Solved by the Invention

The gaze direction is the direction in which a person is actually looking. A person changes gaze direction not only by moving the eyeballs but also by moving the head. Since the head can move freely in the driver's seat, head movement must be taken into account when measuring the gaze direction of a subject seated there.

[0005]

The present applicant has proposed a method in which the displacement of the subject's eyeballs and the direction of the subject's head are detected independently by respective detecting means, the eyeball direction is obtained from the eyeball displacement, and this eyeball direction is combined with the head direction. However, the resulting gaze direction only indicates the direction in which the subject is looking; it does not reveal the position of the object being looked at. It is desirable to be able to determine that position.

[0006]

Accordingly, an object of the present invention is to solve the above problems and to provide a gaze measuring apparatus capable of determining the position of the object the subject is looking at.

[0007]

Means for Solving the Problems

To achieve the above object, the present invention comprises: gaze direction output means for detecting the eyeball direction and the head direction of a subject and outputting a gaze direction obtained by combining them; intersection detecting means for detecting the intersection at which this gaze direction crosses a surface facing the subject (a surface of a polyhedron surrounding the subject); and intersection display means for displaying the position of the detected intersection on a development plane of the polyhedron.

[0008]

DESCRIPTION OF THE PREFERRED EMBODIMENT

An embodiment of the present invention will now be described in detail with reference to the accompanying drawings.

[0009]

As shown in FIG. 1, the gaze measuring apparatus of the present invention comprises: gaze direction output means 1 that detects the eyeball direction and the head direction of a subject and outputs a gaze direction obtained by combining them; intersection detecting means 2 that detects the intersection at which this gaze direction crosses a surface facing the subject, for example one of the surfaces of a polyhedron surrounding the subject; and intersection display means 3 that displays the position of the detected intersection on a development plane of the polyhedron.

[0010]

To detect the eyeball direction, the gaze direction output means 1 includes: an eyeball sensor 11 that is mounted on the subject's head and detects the position of the subject's pupil; information holding means 12 that stores, as information, a previously determined correlation between pupil position and eyeball direction; eyeball direction detecting means 13 that uses this correlation to detect the subject's eyeball direction corresponding to the detected pupil position; and eyeball direction correcting means 14 that corrects the detected eyeball direction to one based on the subject's front-view state. Instead of using the eyeball sensor 11, the subject's eyeball and its vicinity may be imaged, and the pupil may be extracted and located by image processing.
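The image-processing alternative just mentioned could be realized, for instance, by thresholding the dark pupil region and taking its centroid. This is a toy sketch on a grayscale array; the function name and threshold value are illustrative, not from the patent:

```python
def pupil_position(image, threshold=50):
    # Toy pupil locator: treat pixels darker than `threshold` as pupil
    # and return their centroid (H, V) in pixel coordinates.
    # `image` is a list of rows of grayscale values (0-255).
    xs, ys = [], []
    for v, row in enumerate(image):
        for h, value in enumerate(row):
            if value < threshold:
                xs.append(h)
                ys.append(v)
    if not xs:
        return None  # no pupil found in this frame
    return sum(xs) / len(xs), sum(ys) / len(ys)

# 5x5 frame with a dark 2x2 "pupil" region:
frame = [[200] * 5 for _ in range(5)]
for v in (2, 3):
    for h in (2, 3):
        frame[v][h] = 10
H, V = pupil_position(frame)
```

In practice the centroid would be refined (e.g., by fitting an ellipse to the pupil boundary), but the centroid alone already gives the H, V coordinates the mapping below consumes.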

[0011]

To detect the head direction, the gaze direction output means 1 also includes: a head sensor 15 that detects the position and angle of a coordinate system fixed to the subject's head relative to a coordinate system installed in the measurement environment; head direction detecting means 16 that detects the subject's head direction from the position and angle given by the head sensor 15; and head direction correcting means 17 that corrects the detected head direction to one based on the subject's front-view state.

[0012]

Each of these means is described below.

[0013]

To measure the subject's gaze direction, both the eyeball direction and the head direction must be measured. In the present invention, the position (displacement) of the eyeball is measured by the eyeball sensor 11 or by the image processing described above, and the eyeball direction is detected from that displacement. The eyeball direction is the gaze direction due solely to the displacement of the eyeball.

[0014]

For the eyeball displacement, a reference point is set near the eyeball, an orthogonal coordinate system is defined with this reference point as its origin, and the pupil position detected by the eyeball sensor 11 is expressed as coordinates H, V (horizontal and vertical coordinates).

[0015]

First, to detect the subject's eyeball direction from the pupil position coordinates H, V, the correlation between pupil position and eyeball direction is examined in advance (mapping). For this purpose, an index 21 as shown in FIG. 2 is prepared. The index 21 has three horizontal points and three vertical points of known absolute position, five points in total (the center point being shared). It is placed a predetermined distance from the subject, with the subject's eyeball level with the center point. The subject gazes at each point of the index 21, and the eyeball direction θ (solid angle) for each point is obtained from the relation tan θ = L2/L1, where L1 is the distance from the subject to the index 21 and L2 is the distance from the center point to the point in question. The relationship between the eyeball direction θ and the position coordinate H or V differs from subject to subject, but is reproducible for the same subject. The correlation thus examined is therefore stored as a map in the information holding means 12. From this correlation, the eyeball direction pH, pV (horizontal and vertical angles) corresponding to the detected pupil position H, V can be detected.
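The mapping step can be sketched as follows. The piecewise-linear lookup and all distances are illustrative stand-ins for whatever map the information holding means 12 actually stores:

```python
import math

def calibration_angle(L1, L2):
    # Eyeball direction theta (degrees) for one calibration point:
    # tan(theta) = L2 / L1, with L1 the subject-to-index distance
    # and L2 the offset of the point from the center point.
    return math.degrees(math.atan2(L2, L1))

def build_axis_map(pupil_coords, offsets, L1):
    # Pupil coordinate -> eyeball angle for one axis (H or V),
    # built from the three calibration points of the index 21.
    return sorted((c, calibration_angle(L1, d))
                  for c, d in zip(pupil_coords, offsets))

def lookup(axis_map, coord):
    # Piecewise-linear interpolation of the stored correlation.
    if coord <= axis_map[0][0]:
        return axis_map[0][1]
    for (c0, a0), (c1, a1) in zip(axis_map, axis_map[1:]):
        if coord <= c1:
            t = (coord - c0) / (c1 - c0)
            return a0 + t * (a1 - a0)
    return axis_map[-1][1]

# Horizontal axis: pupil coordinates recorded while gazing at points
# 200 mm left, center, and 200 mm right, with the index at L1 = 1000 mm.
h_map = build_axis_map([-14.0, 0.0, 14.0], [-200.0, 0.0, 200.0], 1000.0)
pH = lookup(h_map, 7.0)  # detected pupil coordinate H -> angle pH (degrees)
```

A second map built the same way for the vertical points yields pV; the two maps together implement the per-subject calibration described above.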

[0016]

Next, head direction detection will be described.

[0017]

A three-dimensional sensor is used as the head sensor 15. The three-dimensional sensor uses two orthogonal coils, each having three mutually perpendicular coil axes; the first orthogonal coil is installed independently of the object, and the second orthogonal coil is attached to the object. A coil generates a magnetic field when a current flows through it, and generates an electromotive force when placed in a magnetic field. The field strength is inversely proportional to the cube of the distance. Using this property, one orthogonal coil generates a magnetic field and the other detects its strength. As shown in FIG. 3, a first orthogonal coil 31 installed in the measurement environment generates the field, and a second orthogonal coil 32 mounted on the subject's head detects it; the position and angle of the head are thereby detected as the relative position and angle of the coordinate system of the second orthogonal coil 32 with respect to that of the first orthogonal coil 31. The head direction is calculated from the head position and angle thus detected.

[0018]

The second orthogonal coil 32 is fixed to the head so that the front direction of the head coincides with the x-axis of the coordinate system of the second orthogonal coil 32, so the direction of the x-axis is calculated as the head direction. That is, the head direction is the vector dcX whose elements are the components of the x-axis (of the x-, y-, z-axes forming the coordinate system of the second orthogonal coil 32) along the X-, Y-, and Z-axes of the coordinate system of the first orthogonal coil 31.

[0019]

Head direction: dcX = (dcX.X, dcX.Y, dcX.Z)

At the same time, the directions of the y-axis and the z-axis of the second orthogonal coil 32 are obtained as vectors dcY and dcZ.
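If the 3D sensor reports the head attitude as a rotation matrix relating the two coils, the vectors dcX, dcY, dcZ are simply its columns. A sketch under that assumption, using an illustrative Euler-angle convention (the patent does not specify one):

```python
import math

def head_axes(yaw, pitch, roll):
    # Return dcX, dcY, dcZ: the x-, y-, z-axes of the head-fixed
    # coil 32 expressed in the sensor coordinate system of coil 31.
    # Angles in radians; the Rz(yaw) @ Ry(pitch) @ Rx(roll) order is
    # an illustrative choice, not taken from the patent.
    cy, sy = math.cos(yaw), math.sin(yaw)
    cp, sp = math.cos(pitch), math.sin(pitch)
    cr, sr = math.cos(roll), math.sin(roll)
    R = [
        [cy * cp, cy * sp * sr - sy * cr, cy * sp * cr + sy * sr],
        [sy * cp, sy * sp * sr + cy * cr, sy * sp * cr - cy * sr],
        [-sp,     cp * sr,                cp * cr],
    ]
    # Columns of R are the head axes in sensor coordinates.
    dcX = (R[0][0], R[1][0], R[2][0])
    dcY = (R[0][1], R[1][1], R[2][1])
    dcZ = (R[0][2], R[1][2], R[2][2])
    return dcX, dcY, dcZ

# Head turned 30 degrees (yaw), no pitch or roll:
dcX, dcY, dcZ = head_axes(math.radians(30), 0.0, 0.0)
```

With no pitch or roll, dcX lies in the horizontal plane at 30 degrees from the sensor X-axis, matching the intuition that dcX is the "nose direction" of the head.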

[0020]

The head position and angle detected by the above configuration are the position and angle indicated by the second orthogonal coil 32 in the coordinate system set by the first orthogonal coil 31. This coordinate system of the first orthogonal coil 31 will be called the sensor coordinate system. The calculated head direction is expressed in the sensor coordinate system, whose origin direction is, for example, the direction of the X-axis. On the other hand, the direction that the subject perceives as straight ahead in the measurement environment is called the front-view direction. To indicate which direction the subject's gaze points in a given measurement environment, it is appropriate to express it relative to the subject's front-view direction. However, since the sensor coordinate system is set by the first orthogonal coil 31, its origin direction (the X-axis direction) and the subject's front-view direction are offset from each other.

[0021]

In addition, the mapping of the correlation between pupil position and eyeball direction assumed that the front-view state is the one in which the subject's eyeball is level with the center point of the index 21; in practice, however, the subject does not necessarily perceive the frontal direction of the eyeball according to this assumption. The frontal direction of the eyeball varies from person to person and does not necessarily coincide with the horizontal viewing direction defined using the index 21.

[0022]

Therefore, when examining the gaze of a subject in, for example, a truck cab, the gaze direction is measured while the subject sits in the seat, but the front-view direction of the head as perceived by the subject deviates from the origin direction of the sensor coordinate system, and also varies between individuals.

[0023]

As shown in FIG. 4(a), the sensor coordinate system has a Z-axis perpendicular to the ground G and horizontal X- and Y-axes. The origin direction for both the eyeball direction and the head direction in this sensor coordinate system is the X-axis.

[0024]

As shown in FIG. 4(b), the front-view direction θ0 used when mapping the correlation between pupil position and eyeball direction does not necessarily coincide with the origin direction (X-axis) of the sensor coordinate system.

[0025]

Furthermore, as shown in FIG. 4(c), the head of a subject in the vehicle driving posture makes a three-axis solid angle γ0 with respect to the Z-axis (the figure shows the head from the side, so only its backward tilt is visible). At this time, the subject perceives as the front-view direction a direction making a solid angle β0 with the front-view direction θ0 used at mapping time. Therefore, when the subject in the driving posture recognizes the front-view state, the eyeball direction correcting means 14 and the head direction detecting means 16 detect the eyeball direction β0 and the head direction γ0. Accordingly, the eyeball direction β0 and the head direction γ0 at the moment the subject recognizes the front-view state in the measurement environment are measured beforehand, and these measurement results are stored as correction values. The correction is performed so that, when the subject recognizes the front-view state in the measurement environment, both the eyeball direction and the head direction become zero.

[0026]

By correcting the detected eyeball direction with the eyeball direction correction value and the detected head direction with the head direction correction value, eyeball and head directions whose origin direction is the front-view direction perceived by the subject are obtained. The head direction and eyeball direction thus corrected are then combined to yield the gaze direction.

[0027]

The calculation of the gaze direction, including the corrections, proceeds as follows.

[0028]

First, the eyeball direction and the head direction in the subject's front-view state are obtained and stored as correction values. The eyeball direction is two-dimensional data; the head direction is three-dimensional data.

[0029]

Eyeball direction: (0rgH, 0rgV) (corresponding to β0)
Head direction: (Ψ0, θ0, Ψ0) (corresponding to γ0)

The eyeball direction pH, pV obtained by the eyeball direction detecting means 13 and the head direction obtained by the head direction detecting means 16 are as follows.

[0030]

Eyeball direction: (pH, pV)
Head direction: dcX = (dcX.X, dcX.Y, dcX.Z)

1) First, the γ0 correction is applied.

[0031]

Since the origin direction in the sensor coordinate system is the X-axis direction, it can be expressed as (1, 0, 0). Let the rotation matrix that transforms the head direction correction value (Ψ0, θ0, Ψ0) into (1, 0, 0) be

[0032]

(Equation 1)

[0033]

Then the head direction corrected for γ0, dcX′ = (dcX.X′, dcX.Y′, dcX.Z′), is

[0034]

(Equation 2)

[0035]

This completes the γ0 correction.

[0036]

2) To match the three-dimensional head direction data with the two-dimensional eyeball direction data, a three-dimensional-to-two-dimensional conversion is performed: the head direction is converted into horizontal/vertical components fH, fV.

[0037]

fH = −atan(dcX.Y′ / dcX.X′)
fV = asin(dcX.Z′)

3) The head tilt component fS is calculated from the directions of the y-axis and z-axis of the second orthogonal coil 32. The tilt component fS is the lateral tilt from the subject's point of view, expressed as the rotation angle produced when the yz-plane is rotated about the x-axis of the second orthogonal coil 32.

[0038]

fS = −atan(dcY.Z′ / dcZ.Z′)

The eyeball data are rotated by the head tilt, converting the eyeball direction (pH, pV) into horizontal/vertical components pH′, pV′. At this point the β0 correction, i.e. the correction by (0rgH, 0rgV), is applied.

[0039]

pH′ = (pH − 0rgH) × (π/180) × cos(fS) − (pV − 0rgV) × (π/180) × sin(fS)
pV′ = (pH − 0rgH) × (π/180) × sin(fS) + (pV − 0rgV) × (π/180) × cos(fS)

4) The combined gaze direction is obtained from the results of steps 2) and 3).

[0040]

dH = fH + pH′
dV = fV + pV′

Since the detected eyeball direction and head direction are corrected before being combined into the gaze direction, the gaze direction can be measured relative to the front-view direction of a subject in the vehicle driving posture.
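Steps 1) to 4) can be assembled into one routine. For brevity this sketch reduces the γ0 correction of Equations 1 and 2 to a yaw-only rotation; function and variable names are illustrative:

```python
import math

def correct_head(dcX, yaw0):
    # Step 1: rotate dcX so that the stored front-view head direction
    # maps to the sensor origin direction (1, 0, 0). A yaw-only
    # correction stands in for the full Equations 1 and 2.
    c, s = math.cos(-yaw0), math.sin(-yaw0)
    x, y, z = dcX
    return (c * x - s * y, s * x + c * y, z)

def head_to_2d(dcXp):
    # Step 2: fH = -atan(Y'/X'), fV = asin(Z') (radians).
    x, y, z = dcXp
    return -math.atan2(y, x), math.asin(z)

def eye_to_2d(pH, pV, orgH, orgV, fS):
    # Step 3: apply the beta0 offset (orgH, orgV), convert degrees
    # to radians, and rotate by the head tilt fS.
    h = math.radians(pH - orgH)
    v = math.radians(pV - orgV)
    return (h * math.cos(fS) - v * math.sin(fS),
            h * math.sin(fS) + v * math.cos(fS))

def gaze(dcX, yaw0, pH, pV, orgH, orgV, fS):
    # Step 4: dH = fH + pH', dV = fV + pV'.
    fH, fV = head_to_2d(correct_head(dcX, yaw0))
    pHp, pVp = eye_to_2d(pH, pV, orgH, orgV, fS)
    return fH + pHp, fV + pVp

# Head at 10 degrees yaw in sensor coordinates, stored front-view yaw
# also 10 degrees (so the head correction cancels), eyes offset
# 5 degrees from their calibrated front-view position, no tilt:
a = math.radians(10)
dH, dV = gaze((math.cos(a), math.sin(a), 0.0), a, 5.0, 0.0, 0.0, 0.0, 0.0)
```

In this configuration the head contributes nothing after correction, so the combined gaze is just the 5-degree eyeball offset expressed in radians.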

[0041]

As described above, the gaze direction is obtained as a vector. In practice, however, it is sometimes necessary not only to capture the gaze vector but also to determine where on an object the subject is looking. For example, when displaying goods in a show window, if gaze measurement can determine which parts the subject tends to look at, reflecting that result in the display can be expected to increase its effectiveness. Likewise, if it can be determined which parts of a truck cab the driver tends to look at, the result is useful when considering the placement of in-vehicle equipment such as warning devices.

[0042]

Accordingly, the coordinates of any three points on the surface being viewed are measured in the sensor coordinate system, the plane equation of that surface is calculated from these coordinates, and the intersection at which the gaze direction crosses the plane is detected using the plane equation and the gaze vector obtained as described above. The surface being viewed is, for example, one surface of a polyhedron surrounding the subject. If the measurement environment is a truck cabin, the polyhedron T consists, as shown in FIG. 5(a), of a ceiling plane 51, a right side door plane 52, a front window plane 53, a left side door plane 54, a lower front plane 55, and so on. The gaze intersects one of the surfaces of the polyhedron, and the intersection C is obtained from the plane equation of each plane and the gaze vector.
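A minimal sketch of this step: build the plane equation from three measured points and intersect the gaze ray with it. The coordinates (in meters) and function names are illustrative:

```python
def plane_from_points(p0, p1, p2):
    # Plane n.x = d through three measured points: n is the cross
    # product of two edge vectors, d the offset along n.
    u = [p1[i] - p0[i] for i in range(3)]
    v = [p2[i] - p0[i] for i in range(3)]
    n = (u[1] * v[2] - u[2] * v[1],
         u[2] * v[0] - u[0] * v[2],
         u[0] * v[1] - u[1] * v[0])
    d = sum(n[i] * p0[i] for i in range(3))
    return n, d

def ray_plane_intersection(origin, direction, plane):
    # Intersection C of the gaze ray with the plane, or None if the
    # ray is parallel to the plane or the plane lies behind the eye.
    n, d = plane
    denom = sum(n[i] * direction[i] for i in range(3))
    if abs(denom) < 1e-12:
        return None
    t = (d - sum(n[i] * origin[i] for i in range(3))) / denom
    if t < 0:
        return None
    return tuple(origin[i] + t * direction[i] for i in range(3))

# Front window plane at x = 1.0 m (three measured points); gaze from
# an eye point 1.2 m above the floor, straight ahead and slightly up:
windshield = plane_from_points((1.0, 0.0, 0.0), (1.0, 1.0, 0.0), (1.0, 0.0, 1.0))
C = ray_plane_intersection((0.0, 0.0, 1.2), (1.0, 0.0, 0.1), windshield)
```

Running this test against each face of the polyhedron and keeping the nearest valid hit selects the surface the subject is actually looking at.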

[0043]

The intersection C thus obtained lies on one of the surfaces of the polyhedron T. To display it clearly, the polyhedron T is unfolded onto a single plane, as shown in FIG. 5(b), and the intersection is displayed on this development plane. Here, the surfaces adjoining each side of the front window plane 53 are unfolded by folding them out about their shared sides, bringing them into the same plane as the front window plane 53.

[0044]

In the calculation, the coordinates of eleven points (the vertices of the polyhedron plus one point inside it) are used to approximately construct a wire model made up of the principal straight lines such as the edges. The plane equation of each surface is calculated from this wire model, and the intersection is obtained from each plane equation and the gaze vector. As shown in FIG. 6, the wire model is displayed in unfolded form, and the intersection C is displayed on the development plane. This makes it possible to determine the position of the object the subject is looking at. When the intersection moves over time, a trajectory D is obtained by keeping the intersection for each time displayed. A continuous trajectory D is obtained even when the gaze moves to a different plane.
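One way to realize the unfolding is to give each face a 2D frame (a 3D origin, two in-plane unit axes, and an offset placing its unfolded copy in the development plane) and project each intersection C into the frame of the face it lies on. The face data here are hypothetical:

```python
def to_development_plane(C, face):
    # Project a 3D intersection C onto the 2D development plane by
    # expressing it in the face's in-plane frame, then shifting by
    # the face's position in the unfolded layout.
    o3, u, v, o2 = face["origin3d"], face["u"], face["v"], face["origin2d"]
    r = [C[i] - o3[i] for i in range(3)]
    s = sum(r[i] * u[i] for i in range(3))
    t = sum(r[i] * v[i] for i in range(3))
    return (o2[0] + s, o2[1] + t)

# Hypothetical front window plane at x = 1.0 m: u points along +y,
# v along +z; its unfolded copy is centered at (0, 0).
front = {"origin3d": (1.0, 0.0, 1.0), "u": (0.0, 1.0, 0.0),
         "v": (0.0, 0.0, 1.0), "origin2d": (0.0, 0.0)}

# Trajectory D: successive intersections mapped frame by frame.
trajectory = [to_development_plane(C, front)
              for C in [(1.0, -0.2, 1.1), (1.0, 0.0, 1.2), (1.0, 0.3, 1.25)]]
```

For an adjoining face, origin2d and the axes would be chosen so that points on the shared edge get the same 2D coordinates from both faces, which is what makes the trajectory continuous across faces.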

[0045]

EFFECTS OF THE INVENTION

The present invention exhibits the following excellent effect.

[0046]

(1) Since the intersection of the gaze direction with a surface of the polyhedron is displayed on the development plane, the position of the object the subject is looking at is easy to grasp visually.

BRIEF DESCRIPTION OF THE DRAWINGS

FIG. 1 is a block diagram of a gaze measuring apparatus according to an embodiment of the present invention.

FIG. 2 illustrates the principle of the preliminary measurement of the correlation between pupil position and eyeball direction according to the present invention.

FIG. 3 illustrates the principle of head direction detection according to the present invention.

FIG. 4 shows the coordinate-system offsets corrected in the present invention.

FIG. 5 shows the polyhedron surrounding the subject used in the present invention: (a) a perspective view; (b) a developed plan view.

FIG. 6 is a plan view of the display means according to the present invention.

EXPLANATION OF SYMBOLS

1 gaze direction output means
2 intersection detecting means
3 intersection display means
C intersection
T polyhedron

Claims (1)

[Claim 1] A gaze measuring apparatus comprising: gaze direction output means for detecting an eyeball direction of a subject and a head direction of the subject and outputting a gaze direction obtained by combining the eyeball direction and the head direction; intersection detecting means for detecting an intersection at which the gaze direction crosses a surface facing the subject; and intersection display means for displaying the position of the detected intersection on a development plane of said polyhedron.
JP8429798A 1998-03-30 1998-03-30 Visual line measuring device Pending JPH11276438A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP8429798A JPH11276438A (en) 1998-03-30 1998-03-30 Visual line measuring device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP8429798A JPH11276438A (en) 1998-03-30 1998-03-30 Visual line measuring device

Publications (1)

Publication Number Publication Date
JPH11276438A true JPH11276438A (en) 1999-10-12

Family

ID=13826547

Family Applications (1)

Application Number Title Priority Date Filing Date
JP8429798A Pending JPH11276438A (en) 1998-03-30 1998-03-30 Visual line measuring device

Country Status (1)

Country Link
JP (1) JPH11276438A (en)


Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009204615A (en) * 2007-02-16 2009-09-10 Mitsubishi Electric Corp Road/feature measuring device, feature identifying device, road/feature measuring method, road/feature measuring program, measuring device, measuring method, measuring terminal device, measuring server device, plotting device, plotting method, plotting program, and plotted data
JP2013048807A (en) * 2011-08-31 2013-03-14 Nidek Co Ltd Visual restoration aiding device
WO2016002656A1 (en) * 2014-06-30 2016-01-07 凸版印刷株式会社 Line-of-sight measurement system, line-of-sight measurement method, and program
JPWO2016002656A1 (en) * 2014-06-30 2017-05-25 凸版印刷株式会社 Gaze measurement system, gaze measurement method, and program
US10460466B2 (en) 2014-06-30 2019-10-29 Toppan Printing Co., Ltd. Line-of-sight measurement system, line-of-sight measurement method and program thereof
CN104814717A (en) * 2015-04-14 2015-08-05 赵桂萍 Detecting method and device for compensation type variation position error elimination nystagmus total graph
JP2017125739A (en) * 2016-01-13 2017-07-20 日本放送協会 Three-dimensional position calibration device and program of the same, and free view point image generation device
CN111527374A (en) * 2018-01-05 2020-08-11 三菱电机株式会社 Sight direction correction device, sight direction correction method, and sight direction correction program
CN111527374B (en) * 2018-01-05 2021-10-22 三菱电机株式会社 Sight direction correction device, sight direction correction method, and sight direction correction program

Similar Documents

Publication Publication Date Title
US11412993B2 (en) System and method for scanning anatomical structures and for displaying a scanning result
US8446268B2 (en) System for displaying views of vehicle and its surroundings
US7423553B2 (en) Image display apparatus, image display method, measurement apparatus, measurement method, information processing method, information processing apparatus, and identification method
US8077217B2 (en) Eyeball parameter estimating device and method
US7533988B2 (en) Eyeshot detection device using distance image sensor
EP2042079B1 (en) Visual axis direction detection device and visual line direction detection method
Royden Analysis of misperceived observer motion during simulated eye rotations
CN107284352A (en) Periphery surveillance device for vehicles
US11275437B2 (en) Eye tracking applications in virtual reality and augmented reality
JP6479272B1 (en) Gaze direction calibration apparatus, gaze direction calibration method, and gaze direction calibration program
CN109543651A (en) A kind of driver's dangerous driving behavior detection method
JP2017068424A (en) Attitude measuring apparatus and attitude measurement method
JPH11276438A (en) Visual line measuring device
JP2005182452A (en) Device for detecting direction of face
WO2021156914A1 (en) Attention direction determination device and attention direction determination method
JP5568761B2 (en) Visual field estimation apparatus, visual field estimation method, computer program, and recording medium
JPH11281324A (en) Measuring apparatus for line of sight
JP2004106596A (en) Device and method for displaying image and device and method for measurement
US11087491B2 (en) Method for determining a coordinate of a feature point of an object in a 3D space
JP2014149794A (en) Visual line analysis device
JPH04288123A (en) Line-of-sight display device
EP4133452B1 (en) Method for calibrating a vehicle cabin camera
JP5522799B2 (en) Gaze position estimation system and gaze position estimation program
CN112336374B (en) Accurate positioning device and method for carotid artery ultrasonic probe
Way et al. A method for measuring the field of view in vehicle mirrors

Legal Events

Date Code Title Description
A977 Report on retrieval

Free format text: JAPANESE INTERMEDIATE CODE: A971007

Effective date: 20041006

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20041102

A02 Decision of refusal

Free format text: JAPANESE INTERMEDIATE CODE: A02

Effective date: 20050308