JP4517049B2 - Gaze detection method and gaze detection apparatus - Google Patents

Gaze detection method and gaze detection apparatus

Info

Publication number
JP4517049B2
JP4517049B2 (application JP2003429344A)
Authority
JP
Japan
Prior art keywords: camera, subject, pupil, θ, point
Prior art date
Legal status: Active (an assumption, not a legal conclusion)
Application number
JP2003429344A
Other languages: Japanese (ja)
Other versions: JP2005185431A
Inventor
Yoshinobu Ebisawa (嘉伸 海老澤)
Original Assignee
National University Corporation Shizuoka University (国立大学法人静岡大学)
Application filed by National University Corporation Shizuoka University
Priority to JP2003429344A
Priority claimed from US10/584,635 (US7533989B2)
Publication of JP2005185431A
Application granted
Publication of JP4517049B2
Status: Active


Description

  The present invention relates to a line-of-sight detection method and a line-of-sight detection apparatus capable of detecting the line of sight while minimizing restrictions on the subject's posture and without attaching markers to the subject.

Because line-of-sight detection methods and apparatus are expected to find wide application, for example in medical examinations and man-machine interfaces (line-of-sight input), many proposals have been made.
Most of them are based on the principle of detecting the line of sight from the corneal reflection point and the center of the pupil. The invention described in Patent Document 2 relates to such detection and is directed to the detection of feature points. The invention described in Patent Document 3 proposes in detail a measurement that takes the shape of the cornea into consideration.

Most of the line-of-sight detection devices proposed so far are of two types: those in which the subject's head is fixed, and those in which the detection device is attached to the subject's head. Both significantly hinder the normal behavior of the subject and place a great burden on the subject during gaze measurement. Recently, non-contact types have been developed to reduce this burden, but because the detection device is remote from the subject, the measurable range is extremely limited, and measurement is possible only while the subject does not move.
Methods that detect the line of sight from an eye image using image processing, proposed in order to remove the restriction on the subject's movement, require time for image processing and often have poor temporal characteristics.
In addition, since the line of sight is detected by the same camera used for face detection, such methods have low accuracy.
To address this problem, the non-contact gaze measuring apparatus described in Patent Document 1 attaches a spectacle frame carrying three markers to the face. While these measurement glasses are worn, ordinary corrective glasses cannot be worn, and the measurement glasses are themselves a burden for users and examinees who do not normally wear glasses.
Patent Document 1: JP-A-10-66678
Patent Document 2: JP-A-11-56782
Patent Document 3: JP-A-2002-102172

  An object of the present invention is to provide a line-of-sight detection method capable of detecting the line of sight while minimizing restrictions on the subject's posture and without attaching markers. A further object of the present invention is to provide a line-of-sight detection apparatus that implements this method.

To achieve this object, the gaze detection method according to claim 1 of the present invention detects the gaze of a subject using:
a first camera for measuring the position of the pupil relative to a coordinate system;
a second camera, provided with a light source arranged at a known position in the coordinate system for forming a corneal reflection point, which acquires data on the distance r between the center of the pupil and the corneal reflection point and on the angle φ of r with respect to a coordinate axis of the coordinate system; and
a computing means that computes the gaze direction by executing the following steps based on the information from each camera.
In a relational-expression determination stage:
a step of having the subject gaze at a known point G of the coordinate system and acquiring data on the coordinates of the position of the subject's pupil with the first camera;
a step of acquiring, in that state, the corneal reflection point data, the distance r between the reflection point and the pupil center P, and the inclination φ of r with respect to the coordinate axis with the second camera;
a step of calculating, by the computing means, the angle θ between the line connecting the reference position O of the second camera with the center of the pupil and the subject's line of sight; and
a step of determining, from the measured and calculated values, an expression θ = f(r*) representing the relationship between θ and r*, where r* is either r itself or a value obtained by correcting r based on the distance OP.
In a line-of-sight measurement stage:
a step of having the subject gaze at an unknown point G′ of the coordinate system and acquiring data on the coordinates of the position of the subject's pupil with the first camera;
a step of acquiring the corneal reflection point data, the distance r′ between the reflection point and the pupil center P′, and the inclination φ′ of r′ with respect to the coordinate axis with the second camera; and
a step of calculating θ′ = f(r*′) using the relational expression, where r*′ is either r′ itself or a value obtained by correcting r′ based on the distance OP′, and obtaining the unknown point G′ from the inclination φ′ and θ′.

In the method according to claim 2 of the present invention, the first camera is a stereo camera arranged with its baseline aligned with the horizontal axis of the coordinate system, and the optical axis of the light source of the second camera substantially coincides with the optical axis of the second camera.
In the method according to claim 3 of the present invention, the expression θ = f(r*) representing the relationship between r* and θ is given by θ = k × r* (where k is a constant).
In the method according to claim 4 of the present invention, the pupil is one of the two pupils of the subject.

The apparatus according to claim 5 of the present invention comprises:
a first camera for measuring the position of the pupil relative to a coordinate system;
a second camera, provided with a light source arranged at a known position in the coordinate system, which acquires data on the distance r between the center of the pupil illuminated by the light source and a corneal reflection point and on the angle φ of r with respect to a coordinate axis; and
a computing means that performs the steps of:
having the subject gaze at a known point G of the coordinate system and acquiring data on the coordinates of the position of the subject's pupil with the first camera;
acquiring, in that state, the corneal reflection point data, the distance r from the reflection point to the pupil center P, and the inclination φ of r with respect to the coordinate axis with the second camera;
calculating the angle θ between the line connecting the reference position O of the second camera with the center of the pupil and the subject's line of sight, and determining an expression θ = f(r*) representing the relationship between θ and r*, where r* is either r itself or a value obtained by correcting r based on the distance OP;
having the subject gaze at an unknown point G′ of the coordinate system and acquiring data on the coordinates of the position of the subject's pupil with the first camera;
acquiring the corneal reflection point data, the distance r′ from the reflection point to the pupil center P′, and the inclination φ′ of r′ with respect to the coordinate axis with the second camera; and
calculating θ′ = f(r*′) using the relational expression, where r*′ is either r′ itself or a value obtained by correcting r′ based on the distance OP′, and obtaining the unknown point G′ from φ′ and θ′.

As described above in detail, the method of the present invention detects the line of sight of a subject using the first camera for measuring the position of the pupil relative to the coordinate system, the second camera, provided with the light source arranged at a known position in the coordinate system for forming a corneal reflection point, which acquires the distance r between the center of the pupil and the corneal reflection point and the angle φ of r with respect to the coordinate axis, and the computing means that executes the steps described above according to the information from each camera.
In the relational-expression determination stage, an expression θ = f(r*) is determined in advance from the values measured with the first and second cameras and the values calculated by the computing means, where r* is derived from r (the distance between the corneal reflection point and the pupil center) and θ is the angle between the line connecting the second camera with the center of the pupil and the subject's line of sight.
In the line-of-sight measurement stage, θ′ = f(r*′) is calculated using this relational expression, and the unknown point G′ is obtained from θ′ and the inclination φ′ measured in that stage.
Therefore, the line of sight can be measured without severely constraining the subject.
Because the relational-expression determination stage is carried out for each specific subject, and measurement is performed using the relational expression obtained in that stage, there is no room for measurement errors due to individual differences between subjects. As a result, accurate line-of-sight data can be obtained.

Hereinafter, the best mode for carrying out the invention will be described in detail with reference to the drawings and the like.
FIG. 1 is a schematic diagram showing the arrangement of an embodiment of an apparatus for carrying out the method according to the invention. FIG. 4 is a perspective view showing the arrangement of the devices for carrying out the method of the present invention in relation to the world coordinate system.
The second camera 12 is a high-magnification camera for eyeball photography. In this embodiment it is arranged at the origin of the world coordinate system, the point O (0, 0) in the figure. In FIG. 4, the camera 12 is not shown.
An example of the positional relationship of the corneal reflection (point), based on the data acquired by the second camera 12, is shown enlarged in the figure. This image is displayed by an output device described later.
The first cameras 10 and 11 form a stereo camera pair for detecting the three-dimensional position of the pupil.
The baseline between the first cameras 10 and 11 is parallel to the X axis of the world coordinate system.
The second camera 12 is aimed at at least one pupil, shown enlarged in the figure, and acquires data on the corneal reflection point and the center of the pupil.
The first cameras 10 and 11 and the second camera 12 are each provided with illumination means (not shown), either integrally or in association with them.

FIG. 2 is a block diagram showing the configuration of an apparatus or the like for carrying out the method of the present invention.
Image outputs acquired by the CCDs of the first cameras 10 and 11 and the second camera 12 are fed to the bus line 20 via the interface 14.
In this embodiment, the second camera 12 is provided with a light source for forming a corneal reflection point, arranged at the known position O in the coordinate system, and acquires data on the distance r between the center of the pupil and the corneal reflection point and on the angle φ of r with respect to the coordinate axis. In this embodiment, the optical axis of the light from the light source is arranged to coincide with the optical axis of the second camera 12.
The second camera 12 is driven via the drive device 16 by signals supplied from the processing device (CPU) 21 and is aimed at (directed toward and focused on) the target eye. The operator operates the input device 24 to aim the camera as necessary while viewing the eye image (see FIG. 1) displayed on the display screen forming part of the output device 31. The illumination means of each camera (not shown) are operated by signals from the illumination control device 15.
The storage device 26 holds a program for executing the control described later and a RAM area for calculation. Image information of the subject, calculation results, system operation information, and the like are output to the output device 31 via the output interface 30. The output device 31 includes an image display device, a printer, and the like.

The first cameras 10 and 11 are provided for detecting the three-dimensional position, in the coordinate system, of the center of the pupil of the eye. The second camera 12 is a high-magnification line-of-sight detection camera that captures only the region around the eye, including the pupil.
An automatic tracking control means (drive device 16) is provided for tracking one or both of the two eyes based on the three-dimensional pupil position information from the first cameras.
The optical axis of the camera can also be adjusted manually. As will be described later, the line-of-sight vector is obtained from the three-dimensional pupil position P obtained from the outputs of the first cameras 10 and 11 and the position of the corneal reflection image center obtained from the output of the second camera 12.

(Illumination light sources) Two kinds of illumination light source can be provided: one arranged in or near the aperture of the camera lens (hereinafter, the inner light source) and one arranged at a position away from the aperture (hereinafter, the outer light source). In the image produced under the inner light source, the pupil tends to appear brighter than the rest of the face. This is explained in connection with FIG. 13B of JP-B-7-82539.
Conversely, under the outer light source the pupil tends to appear darker than the rest of the face. This is explained in connection with FIG. 13C of the same publication.
By synchronizing the inner and outer light sources with the video signal, turning them on alternately, and subtracting the latter image from the former in real time, the areas other than the pupil cancel out, making it easy to detect only the pupil. Such light sources are provided for each camera.
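The alternating bright-pupil/dark-pupil differencing described above can be sketched with NumPy. The frames are assumed to be aligned 8-bit grayscale images; the threshold value and the function names are illustrative, not from the patent:

```python
import numpy as np

def pupil_difference_image(bright_frame, dark_frame, threshold=50):
    """Subtract the outer-light-source (dark-pupil) frame from the
    inner-light-source (bright-pupil) frame. Regions other than the
    pupil largely cancel, leaving the pupil as the dominant blob."""
    diff = bright_frame.astype(np.int16) - dark_frame.astype(np.int16)
    diff = np.clip(diff, 0, 255).astype(np.uint8)
    return diff > threshold  # boolean pupil mask

def pupil_center(mask):
    """Centroid of the pupil mask in image coordinates (x, y)."""
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return None
    return float(xs.mean()), float(ys.mean())
```

In practice the mask would be cleaned with morphological operations before taking the centroid, but the cancellation idea is exactly the subtraction above.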

In this embodiment, light sources are attached to a total of three cameras, the first and second cameras.
To prevent the light emitted by the light sources of the other cameras from reflecting off spectacles and appearing in the image as spectacle reflection images, the light sources attached to the cameras are given different center wavelengths, and each camera is fitted with a bandpass filter centered on the wavelength of its own light source.
In general, however, when the emission wavelength range of a light source is broad, the emission wavelengths of three or more light sources overlap, so the cameras interfere with each other no matter how narrow a bandpass filter band is selected. To prevent this, a bandpass filter is also installed in front of each light source to narrow the wavelength range of the light emitted toward the face and reduce the overlap of the emission ranges as much as possible; by installing, in front of each camera, a bandpass filter with the same center wavelength as the filter in front of that camera's light source, mutual interference can be eliminated.

The method of the present invention will be described below with reference to the drawings. The line-of-sight measurement method according to the present invention includes a relational-expression determination stage, in which the relational expression is determined, and a line-of-sight measurement stage, in which an arbitrary line of sight of the subject is determined based on that expression.
The relational-expression determination stage consists of Steps 0 to 10, and the line-of-sight measurement stage consists of Steps 1 to 10.

FIG. 4 shows an arrangement of apparatuses for carrying out the method of the present invention in relation to the world coordinate system, and related planes.
(Relational expression determination stage)
(Step 0) Localization of each camera with respect to the world coordinate system.
The origin of the world coordinate system is set to O (0, 0), and the first camera is arranged.
(Step 1) Have the subject gaze at one known point. Let the subject look at a point Q(x_Q, y_Q, z_Q) whose coordinates in the world coordinate system are known.
(Step 2) Acquire the coordinate position data of the subject's pupil center P with the stereo camera (the first cameras) to obtain the three-dimensional coordinates P(x_P, y_P, z_P) of the pupil center.
(Step 3) Obtain the direction vector of the vector OP,
(l_x, l_y, l_z) = (x_P − x_O, y_P − y_O, z_P − z_O),
and its length
|OP| = [(x_P − x_O)^2 + (y_P − y_O)^2 + (z_P − z_O)^2]^(1/2).
(Step 4) The virtual viewpoint plane H, perpendicular to OP and passing through O, is given by the following equation:
(l_x·x + l_y·y + l_z·z) − (l_x·x_O + l_y·y_O + l_z·z_O) = 0 (1)
(Step 5) The direction vector (s_x, s_y, s_z) of the vector PQ representing the line of sight is
(s_x, s_y, s_z) = (x_Q − x_P, y_Q − y_P, z_Q − z_P),
and since the straight line PQ passes through P(x_P, y_P, z_P), its equation is
(x − x_P)/s_x = (y − y_P)/s_y = (z − z_P)/s_z (= t) (2)
(Step 6) Calculate the coordinates G(x_G, y_G, z_G) of the intersection of the straight line PQ and the virtual viewpoint plane H. Setting equation (2) equal to t gives
x = s_x·t + x_P
y = s_y·t + y_P (3)
z = s_z·t + z_P
Substituting these into the equation of plane H yields
t = −(l_x^2 + l_y^2 + l_z^2)/(l_x·s_x + l_y·s_y + l_z·s_z) (4)
Substituting this t into (3) gives the intersection point G(x_G, y_G, z_G) in the world coordinate system.
(Step 7) Calculate the angle θ between the vector OP and the vector PG, that is, the angle formed by the direction vector (l_x, l_y, l_z) of OP and the direction vector (s_x, s_y, s_z) of PG:

θ = cos⁻¹[|l_x·s_x + l_y·s_y + l_z·s_z| / ((l_x^2 + l_y^2 + l_z^2)^(1/2)·(s_x^2 + s_y^2 + s_z^2)^(1/2))] ≥ 0 (5)
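Steps 3 to 7 amount to standard vector geometry. A minimal sketch, assuming the world coordinates of O, P, and Q are held in NumPy arrays (the function names are illustrative, not from the patent):

```python
import numpy as np

def virtual_viewpoint_G(O, P, Q):
    """Intersection G of the line of sight PQ with the virtual viewpoint
    plane H (the plane through O perpendicular to OP); Steps 3-6."""
    l = P - O                          # direction vector of OP, (l_x, l_y, l_z)
    s = Q - P                          # direction vector of the line PQ
    # Plane H: l . (x - O) = 0.  Line: x = P + s t.  Solving for t gives
    # t = l.(O - P) / l.s, i.e. -(l_x^2 + l_y^2 + l_z^2) / (l.s) as in (4).
    t = np.dot(l, O - P) / np.dot(l, s)
    return P + s * t

def theta_between(O, P, G):
    """Angle theta between the vector OP and the line of sight PG, eq. (5)."""
    l = P - O
    s = G - P
    c = abs(np.dot(l, s)) / (np.linalg.norm(l) * np.linalg.norm(s))
    return np.arccos(np.clip(c, -1.0, 1.0))
```

For example, with the camera at the origin, the pupil 600 units away on the z axis, and Q on the plane z = 0, G coincides with Q and θ is the expected small off-axis angle.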
(Step 8) Measure r and φ_a with the line-of-sight detection camera (the second camera 12).
The distance r between the corneal reflection center and the pupil center, and the angle φ_a of r with respect to the coordinate axis, are obtained.
Let the center coordinates of the corneal reflection and of the pupil detected from the image of the line-of-sight detection camera (second camera 12) be (g_x, g_y) and (p_x, p_y). When |r| ≠ 0,
|r| = [(p_x − g_x)^2 + (p_y − g_y)^2]^(1/2) (6)
φ_a′ = cos⁻¹[(p_y − g_y)/|r|] (7)
(0 ≤ φ_a′ ≤ π)
φ_a = φ_a′ when p_x < g_x
φ_a = −φ_a′ when p_x > g_x
When the subject looks at Q, |r| is obtained as |r_Q| (in pixels).
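Equations (6) and (7), together with the sign rule for φ_a, can be sketched as follows (a minimal illustration; the boundary case p_x = g_x is resolved arbitrarily here):

```python
import numpy as np

def r_and_phi(g, p):
    """Distance |r| and signed angle phi_a between the corneal-reflection
    center g = (g_x, g_y) and the pupil center p = (p_x, p_y), eqs. (6)-(7)."""
    r = np.hypot(p[0] - g[0], p[1] - g[1])     # eq. (6)
    if r == 0:
        return 0.0, 0.0
    phi_prime = np.arccos((p[1] - g[1]) / r)   # eq. (7), 0 <= phi' <= pi
    # Sign rule: phi_a = phi' when p_x < g_x, -phi' when p_x > g_x.
    # (p_x == g_x falls into the second branch here, an arbitrary choice.)
    phi = phi_prime if p[0] < g[0] else -phi_prime
    return float(r), float(phi)
```
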
(Step 9) Correct |r| using |OP| to obtain |r*|.
The second camera 12 detects the center of the corneal reflection image and the center of the pupil image, and the angle between the optical axis of the line-of-sight detection camera and the line of sight is estimated from their relative positions.
Here the image magnification is an important factor, because the relative positions of the corneal reflection image center and the pupil image center change with magnification even when the subject looks in the same direction. To deal with this, the relationship between camera-pupil distance and magnification is first measured in advance and expressed as a relational expression. Since the three-dimensional position of the pupil is known, the distance between the line-of-sight detection camera and the pupil is calculated, the magnification is obtained from the relational expression, and the distance between the corneal reflection image center and the pupil center is corrected accordingly.
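Step 9 might be sketched as below. The patent obtains the distance-magnification relation empirically in advance; here a simple pinhole model, in which the magnification is assumed inversely proportional to the camera-pupil distance relative to an assumed reference distance, stands in for that measured relational expression:

```python
def corrected_r(r_pixels, distance_OP, reference_distance=600.0):
    """Normalize the measured corneal-reflection to pupil-center distance
    r to a reference camera-pupil distance (Step 9).

    Assumption: magnification = reference_distance / distance_OP (a
    pinhole-model stand-in; the patent instead uses a relational
    expression measured in advance)."""
    magnification = reference_distance / distance_OP
    return r_pixels / magnification    # r* = r * distance_OP / reference_distance
```

A pupil twice as far from the camera appears at half the magnification, so its measured r is doubled to obtain r*.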

(Step 10) Determine the relational expression f between |r*| and θ from θ and |r*|.
Let θ and r when the subject looks at O be θ_O and |r_O|, respectively.
Since θ_O = 0 and |r_O| = |r*_O| = 0, a linear relationship
θ = k|r*| (8)
is assumed, and the coefficient k is obtained from the values θ_Q and |r*_Q| measured while the subject looks at Q:
k = θ_Q/|r*_Q| (9)
In addition,
φ_a = −φ_a′ (10)
The line of sight is thus calibrated by obtaining the coefficient k (more generally, the relational expression f between |r*| and θ).
θ is treated as a function of |r*| alone, independent of φ_a; that is, k is assumed to be constant regardless of φ_a or φ_a′.
This is further illustrated in the drawing: in the correspondence from (|r*|, φ_a′) to (|θ|, φ_a), the origin maps to the origin, and the map is simply a scaling combined with a reflection.
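The one-point calibration of Step 10 can be sketched in a few lines; θ_Q and |r*_Q| are the values measured while the subject gazes at the known point Q, and the function names are illustrative:

```python
def calibrate_k(theta_Q, r_star_Q):
    """One-point gaze calibration, eq. (9): with theta = 0 when |r*| = 0,
    assume the linear relation theta = k |r*| and fit k from the single
    known target Q."""
    return theta_Q / r_star_Q

def gaze_angle(k, r_star):
    """Eq. (8): theta = k |r*|."""
    return k * r_star
```

Once k is known, any later measurement of |r*′| yields θ′ directly, which is the whole content of Step 6 of the measurement stage below.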

(Line-of-sight measurement stage) The function obtained in the relational-expression determination stage above is now used.
(Step 1) Aim the first cameras at the subject as necessary.
The subject's point of attention is an unknown point Q′(x_Q′, y_Q′, z_Q′) on the viewing target plane.
Finding these coordinates is the purpose of the following procedure.
(Step 2) Measure the three-dimensional coordinates of the pupil center. As shown in FIG. 5, the coordinate position data (x_P, y_P, z_P) of the subject's pupil center P′ are measured with the first cameras 10 and 11.
(Step 3) Obtain the direction vector of the vector OP′,
(l_x, l_y, l_z) = (x_P − x_O, y_P − y_O, z_P − z_O),
and its length
|OP′| = [(x_P − x_O)^2 + (y_P − y_O)^2 + (z_P − z_O)^2]^(1/2).
(Step 4) Obtain the corneal reflection center and pupil center coordinates with the second camera, and measure the distance r′ and the angle φ_b with respect to the coordinate axis.
|r′| = [(p_x − g_x)^2 + (p_y − g_y)^2]^(1/2) (6′)
φ_b′ = cos⁻¹[(p_y − g_y)/|r′|] (7′)
(0 ≤ φ_b′ ≤ π)
φ_b = φ_b′ when p_x < g_x
φ_b = −φ_b′ when p_x > g_x

(Step 5) Correct r′ using |OP′| to obtain r*′.
(Step 6) Obtain θ′ using the relational expression determined in the previous stage:
θ′ = f(|r*′|) = k|r*′|
(Step 7) Obtain the point G′.

As shown in FIG. 6, take an orthogonal coordinate system whose x′-y′ plane is the virtual viewpoint plane H (its origin O coincides with that of the absolute coordinate system, and the z′ axis passes through the point P).
The x′ axis and y′ axis of the plane H are rotated by α in the horizontal direction and β in the vertical direction with respect to the x axis and y axis of the absolute coordinate system.
The left side of FIG. 7 shows the plane H viewed from the z′-axis direction, and the right side of FIG. 7 shows the plane passing through the three points O, G′, and P.
The point G′ in the orthogonal coordinate system of the virtual viewpoint plane H is obtained from θ, φ_b, and |OP|.
As can be seen from the right side of FIG. 7,
|OG′| = |OP|·tan θ (8)
and the coordinates x_G′, y_G′ of the point G′(x_G′, y_G′, 0) in the orthogonal coordinate system of the virtual viewpoint plane H are obtained, with φ_b = φ_b′, as
x_G′ = |OG′|·cos φ_b′ (9)
y_G′ = |OG′|·sin φ_b′
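Given θ′, φ_b′, and |OP|, the computation of G′ in the plane-H coordinate system (Step 7, equations (8) and (9)) is a few lines; a sketch with illustrative names:

```python
import numpy as np

def G_prime_on_plane(theta, phi_b, OP_len):
    """Point G' in the orthogonal coordinate system of the virtual
    viewpoint plane H: |OG'| = |OP| tan(theta), then polar-to-Cartesian
    with the angle phi_b; G' lies in the plane, so z' = 0."""
    og = OP_len * np.tan(theta)
    return og * np.cos(phi_b), og * np.sin(phi_b), 0.0
```
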

(Step 8) Obtain the vector PG′.
In the orthogonal coordinate system of the virtual viewpoint plane H, the coordinates of the point P are (0, 0, |OP|), so the line-of-sight vector in this coordinate system is represented by the vector PG′ connecting P to G′:
PG′ = (x_G′, y_G′, −|OP|) (10)
This is also the direction vector of the line-of-sight vector in the orthogonal coordinate system of the virtual viewpoint plane H.

(Step 9) Calculate the direction vector of the line-of-sight vector in the world coordinate system. The direction vector (l_x, l_y, l_z) of the vector OP in the world coordinate system coincides with the z′ axis of the orthogonal coordinate system of the virtual viewpoint plane H. If the x′ axis of this orthogonal coordinate system is rotated about the origin by −α so as to coincide with the x axis of the world coordinate system, and the y′ axis is then rotated about the origin so as to coincide with the y axis, the orthogonal coordinate system of the virtual viewpoint plane H coincides with the world coordinate system. After this rotation of the coordinate system, the direction vector of the vector PG′ coincides with the line-of-sight vector in the world coordinate system. Here the rotation is given by
(11)
and by this rotation the direction vector (s_x, s_y, s_z) of the line-of-sight vector in the world coordinate system is given by
(12)
The relationship between the orthogonal coordinate systems and the angles is shown in FIG. 8.
(Step 10) Calculate the viewpoint Q′. Using (s_x, s_y, s_z) obtained in the previous step, the equation of the line of sight in the world coordinate system is
(x − x_P)/s_x = (y − y_P)/s_y = (z − z_P)/s_z (= t) (2)
The object being viewed is a plane whose equation is
m_x·x + m_y·y + m_z·z + d = 0 (13)
and the viewpoint Q′ is obtained as the intersection of equations (2) and (13).
From equation (2),
x = s_x·t + x_P
y = s_y·t + y_P (3)
z = s_z·t + z_P
Substituting these into the equation (13) of the viewing target plane gives
t = −(m_x·x_P + m_y·y_P + m_z·z_P + d)/(m_x·s_x + m_y·s_y + m_z·s_z) (14)
Substituting this t into equation (3), the viewpoint Q′(x_Q′, y_Q′, z_Q′) is obtained.
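The line-plane intersection of Step 10 can be sketched directly from equations (2), (13), and (14), assuming the normal (m_x, m_y, m_z) and offset d of the viewing target plane are known (names are illustrative):

```python
import numpy as np

def viewpoint_on_plane(P, s, m, d):
    """Intersection Q' of the line of sight x = P + s t with the viewing
    target plane m . x + d = 0 (Step 10).  t as in eq. (14)."""
    t = -(np.dot(m, P) + d) / np.dot(m, s)
    return P + s * t            # eq. (3) with the solved t
```

For a pupil at (0, 0, 600) looking straight down the −z axis at the plane z = 0, the viewpoint is the origin; an oblique gaze lands at the expected offset.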

Various modifications of the embodiment described in detail above can be envisaged; these also belong to the technical scope of the present invention.
All cameras may be cameras with high sensitivity to near infrared light, and a fixed-focus lens can be used by making the depth of field of the lens as large as possible.
That is, a state can be created in which the head stays in focus even if it moves considerably, and the head can always be photographed even with the camera direction fixed.

  Further, since the second camera 12 generally operates at high magnification, it may be able to focus only within a narrow distance range. Depending on the application, however, a manual-focus lens may suffice, namely when the movement of the head is limited. For example, for a user or subject sitting on a chair, once the seat position is set, the head does not move greatly. In this case it is sufficient for the focus to cover a narrow range (about 10 cm), so the user or subject can focus the lens manually, or a motorized focus lens can be focused once with a hand switch or remote control. Likewise, the direction of the camera can be set by having the user or subject adjust it by hand, or, in a system including an actuator, by adjusting the direction and position toward the eye once with a hand switch or remote control. Such a system can be made relatively inexpensive. Even in this case, the three-dimensional position of the pupil is necessary as information for determining the line of sight.

  If the line of sight must be detected while the head moves considerably, and a manually focused or fixed-focus lens cannot stay in focus, an autofocus lens or a motorized focus lens must be used. An autofocus lens is generally controlled using the camera image; it can be used when the image contrast is good, whereas a motorized focus lens can be controlled externally when the image contrast is poor, for example because a bandpass filter is attached.

In the method above, it was assumed that θ and |r| have a linear relationship passing through the origin.
In general, however, because there is a slight offset between the optical axis of the eyeball and the visual axis, the corneal reflection does not necessarily appear at the center of the pupil image when the subject looks at O; that is, the relationship does not pass through the origin. For high-precision line-of-sight detection this must be corrected. To do so, at calibration time the subject looks not only at one point Q but also at O, and |r| and φ′ are measured there as well; that is, the line of sight is calibrated by having the subject look at at least these two points.
In this case, the corneal-reflection-center to pupil-center vector r (the shift component) obtained when the subject looks at O is subtracted from the measured vectors so that r becomes the zero vector when gazing at O; after this origin correction, |r| and φ′ are obtained and calibration is performed. In real-time gaze detection as well, the obtained r is corrected in the same way before |r| and φ′ are computed and the gaze is calculated from them.
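The origin correction described here is a simple vector subtraction before |r| and φ′ are computed; a sketch, assuming the shift component was recorded as a 2D image-plane vector while the subject looked at O (names are illustrative):

```python
import numpy as np

def origin_corrected_r(r_vec, r_offset):
    """Two-point calibration correction: subtract the corneal-reflection
    to pupil-center vector measured while the subject looks at O (the
    shift component), so the corrected vector is zero when gazing at O.
    |r| and phi' are then computed from the corrected vector."""
    return np.asarray(r_vec, dtype=float) - np.asarray(r_offset, dtype=float)
```
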

  The present invention can be widely used as a line-of-sight input means in the fields of monitoring human behavior and of human living and working environments. It is also expected to be used in the manufacture of electronic devices and in computer-use environments.

Brief description of the drawings: a schematic diagram showing the arrangement of an embodiment of an apparatus for carrying out the method according to the invention; an enlarged example of the positional relationship between the pupil and the corneal reflection (point); a block diagram showing the configuration of the apparatus for carrying out the method of the present invention; a flowchart for explaining the method according to the present invention; a perspective view showing the arrangement of the apparatus and the like in relation to the world coordinate system; a graph for explaining the mapping between the pupil and the virtual viewpoint plane [the correspondence between (r′, φ) and (|θ| = k|r′|, −φ)]; a perspective view for explaining the relationship between the viewing target plane and the virtual viewpoint plane (H); a diagram for explaining the meanings of θ and φ in the virtual viewpoint plane (H); and a graph showing the relationship between the direction vectors and the angles.

Explanation of symbols

10, 11: first cameras; 12: second camera; 14: camera interface; 15: illumination control device; 16: camera drive device; 20: bus line; 22: processing device (CPU); 24: input device; 26: storage device; 30: output interface; 31: output device

Claims (5)

  1. A gaze detection method for detecting a gaze of a subject using:
    a first camera for measuring the position of the pupil relative to a coordinate system;
    a second camera, provided with a light source arranged at a known position in the coordinate system for forming a corneal reflection point, which acquires data on the distance r between the center of the pupil and the corneal reflection point and on the angle φ of r with respect to a coordinate axis of the coordinate system; and
    a computing means that computes the gaze direction by performing the following steps based on the information from each camera:
    in a relational-expression determination stage,
    a step of having the subject gaze at a known point G of the coordinate system and acquiring data on the coordinates of the position of the subject's pupil with the first camera;
    a step of acquiring, in that state, the corneal reflection point data, the distance r between the reflection point and the pupil center P, and the inclination φ of r with respect to the coordinate axis with the second camera;
    a step of calculating, by the computing means, the angle θ between the line connecting the reference position O of the second camera with the center of the pupil and the subject's line of sight; and
    a step of determining, from the measured and calculated values, an expression θ = f(r*) representing the relationship between θ and r*, where r* is either r itself or a value obtained by correcting r based on the distance OP; and
    in a line-of-sight measurement stage,
    a step of having the subject gaze at an unknown point G′ of the coordinate system and acquiring data on the coordinates of the position of the subject's pupil with the first camera;
    a step of acquiring the corneal reflection point data, the distance r′ between the reflection point and the pupil center P′, and the inclination φ′ of r′ with respect to the coordinate axis with the second camera; and
    a step of calculating θ′ = f(r*′) using the relational expression, where r*′ is either r′ itself or a value obtained by correcting r′ based on the distance OP′, and obtaining the unknown point G′ from the inclination φ′ and θ′.
  2. The method for detecting the line of sight of a subject according to claim 1, wherein the first camera is a stereo camera arranged with its baseline aligned with the horizontal axis of the coordinate system, and the optical axis of the light source of the second camera substantially coincides with the optical axis of the second camera.
  3. The method for detecting the line of sight of a subject according to claim 1, wherein the expression θ = f(r*) indicating the relationship between r* and θ is given by θ = k × r* (where k is a constant).
  4. The method for detecting the line of sight of a subject according to claim 1, wherein the pupil is either one of the subject's pupils.
  5. A device for detecting the line of sight of a subject, comprising:
    a first camera for measuring the position of the pupil relative to a coordinate system;
    a second camera, provided with a light source arranged at a known position in the coordinate system, that acquires data on the distance r between the center of the pupil irradiated by the light source and a corneal reflection point and on the angle φ of r with respect to a coordinate axis; and
    computing means for performing the steps of:
    acquiring, with the first camera, data on the coordinates of the position of the subject's pupil while the subject gazes at a known point G in the coordinate system;
    acquiring, with the second camera and in the same state of the subject, data on the corneal reflection point, the distance r between the reflection point and the pupil center P, and the inclination φ of r with respect to the coordinate axis;
    calculating the angle θ between the line connecting the reference position O of the second camera to the pupil center and the line of sight of the subject, and determining a relational expression θ = f(r*) between θ and r*, where r* is r itself or r corrected based on the distance OP;
    acquiring, with the first camera, data on the coordinates of the position of the subject's pupil while the subject gazes at an unknown point G′ in the coordinate system;
    acquiring, with the second camera, data on the corneal reflection point, the distance r′ between the reflection point and the pupil center P′, and the inclination φ′ of r′ with respect to the coordinate axis; and
    calculating θ′ = f(r*′) with the relational expression, where r*′ is r′ itself or r′ corrected based on the distance OP′, and obtaining the unknown point G′ from φ′ and θ′.
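The two-phase procedure in the claims — calibrate a model θ = f(r*) while the subject gazes at a known point G, then invert it for an unknown point G′ — can be sketched as follows. The function names and the numeric values are illustrative assumptions, not from the patent; the linear form θ = k·r* is the one named in claim 3, where a single known gaze point suffices to fix the constant k.

```python
import numpy as np

def calibrate_k(theta_known, r_star):
    """Phase 1: the subject gazes at a known point G, so the angle theta
    between the camera-pupil line and the line of sight is known.  With
    the linear model theta = k * r* (claim 3), one observation of the
    (corrected) pupil-to-corneal-reflection distance r* fixes k."""
    return theta_known / r_star

def estimate_gaze_angle(k, r_star_prime):
    """Phase 2: for an unknown gaze point G', the magnitude of the gaze
    angle theta' follows from the calibrated model; together with the
    measured azimuth phi' it locates G'."""
    return k * r_star_prime

# Made-up calibration sample: a 10 degree gaze offset was observed with a
# (normalized) pupil-to-corneal-reflection distance r* = 0.04.
k = calibrate_k(np.deg2rad(10.0), 0.04)

# A new observation r*' = 0.02 then maps to half the calibration angle.
theta_prime = estimate_gaze_angle(k, 0.02)
print(round(float(np.rad2deg(theta_prime)), 1))  # → 5.0
```

With the linear model the correction of r to r* (based on the camera-to-pupil distance OP) is what keeps a single constant k valid as the subject's head moves toward or away from the camera.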
JP2003429344A 2003-12-25 2003-12-25 Gaze detection method and gaze detection apparatus Active JP4517049B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2003429344A JP4517049B2 (en) 2003-12-25 2003-12-25 Gaze detection method and gaze detection apparatus

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JP2003429344A JP4517049B2 (en) 2003-12-25 2003-12-25 Gaze detection method and gaze detection apparatus
US10/584,635 US7533989B2 (en) 2003-12-25 2004-12-24 Sight-line detection method and device, and three-dimensional view-point measurement device
PCT/JP2004/019311 WO2005063114A1 (en) 2003-12-25 2004-12-24 Sight-line detection method and device, and three-dimensional view-point measurement device

Publications (2)

Publication Number Publication Date
JP2005185431A JP2005185431A (en) 2005-07-14
JP4517049B2 true JP4517049B2 (en) 2010-08-04

Family

ID=34788040

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2003429344A Active JP4517049B2 (en) 2003-12-25 2003-12-25 Gaze detection method and gaze detection apparatus

Country Status (1)

Country Link
JP (1) JP4517049B2 (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2013141098A1 (en) 2012-03-21 2013-09-26 国立大学法人浜松医科大学 Asperger's diagnosis assistance method and system, and asperger's diagnosis assistance device
WO2013176265A1 (en) 2012-05-25 2013-11-28 国立大学法人静岡大学 Pupil detection method, corneal reflex detection method, facial posture detection method, and pupil tracking method
US9538947B2 (en) 2011-09-05 2017-01-10 National University Corporation Hamamatsu University School Of Medicine Method, system and device for assisting diagnosis of autism

Families Citing this family (24)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7874917B2 (en) * 2003-09-15 2011-01-25 Sony Computer Entertainment Inc. Methods and systems for enabling depth and direction detection when interfacing with a computer program
SE0602545L (en) * 2006-11-29 2008-05-30 Tobii Technology Ab Eye tracking illumination
FR2915290B1 * 2007-04-18 2009-07-03 Essilor Int Method of measuring at least one geometrico-physionomic parameter for positioning a frame of vision-correcting eyeglasses on the face of a wearer
US8231220B2 (en) 2007-07-26 2012-07-31 Essilor International (Compagnie Generale D'optique) Method of measuring at least one geometrico-physionomic parameter for positioning a frame of vision-correcting eyeglasses on the face of a wearer
TWI432172B (en) * 2008-10-27 2014-04-01 Utechzone Co Ltd Pupil location method, pupil positioning system and storage media
US8371693B2 (en) * 2010-03-30 2013-02-12 National University Corporation Shizuoka University Autism diagnosis support apparatus
JP5590487B2 (en) * 2010-07-30 2014-09-17 公立大学法人広島市立大学 Gaze measurement method and gaze measurement device
WO2012020760A1 (en) 2010-08-09 2012-02-16 国立大学法人静岡大学 Gaze point detection method and gaze point detection device
WO2012077713A1 (en) 2010-12-08 2012-06-14 国立大学法人静岡大学 Method for detecting point of gaze and device for detecting point of gaze
JP5387557B2 (en) * 2010-12-27 2014-01-15 カシオ計算機株式会社 Information processing apparatus and method, and program
JP5818233B2 (en) * 2011-05-24 2015-11-18 国立大学法人神戸大学 Gaze measurement apparatus and method
JP5761049B2 (en) * 2012-01-24 2015-08-12 株式会社Jvcケンウッド Autism diagnosis support device and autism diagnosis support method
EP2907453B1 (en) 2012-09-28 2018-11-21 JVC Kenwood Corporation Diagnosis assistance device and diagnosis assistance method
JP6217445B2 (en) 2013-03-07 2017-10-25 株式会社Jvcケンウッド Diagnosis support apparatus and diagnosis support method
WO2014208761A1 (en) 2013-06-28 2014-12-31 株式会社Jvcケンウッド Diagnosis assistance device and diagnosis assistance method
JP6135550B2 (en) 2013-07-31 2017-05-31 株式会社Jvcケンウッド Diagnosis support apparatus and diagnosis support method
US10417782B2 (en) 2014-08-22 2019-09-17 National University Corporation Shizuoka University Corneal reflection position estimation system, corneal reflection position estimation method, corneal reflection position estimation program, pupil detection system, pupil detection method, pupil detection program, gaze detection system, gaze detection method, gaze detection program, face orientation detection system, face orientation detection method, and face orientation detection program
KR101610496B1 (en) * 2014-08-26 2016-04-07 현대자동차주식회사 Method and apparatus for gaze tracking
KR101745140B1 2015-09-21 2017-06-08 현대자동차주식회사 Gaze tracker and method for tracking gaze thereof
WO2017134918A1 (en) * 2016-02-01 2017-08-10 アルプス電気株式会社 Line-of-sight detection device
WO2017217026A1 (en) * 2016-06-16 2017-12-21 アルプス電気株式会社 Cornea center detection device and gaze detection device
CN106123819B * 2016-06-29 2018-07-24 华中科技大学 A focus-of-attention measurement method
JPWO2018030515A1 (en) 2016-08-12 2019-06-13 国立大学法人静岡大学 Gaze detection device
JP6617662B2 (en) * 2016-08-24 2019-12-11 株式会社Jvcケンウッド Gaze detection device, gaze detection method, and computer program

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9514538B2 (en) 2012-05-25 2016-12-06 National University Corporation Shizuoka University Pupil detection method, corneal reflex detection method, facial posture detection method, and pupil tracking method

Also Published As

Publication number Publication date
JP2005185431A (en) 2005-07-14

Similar Documents

Publication Publication Date Title
US10650594B2 (en) Surgeon head-mounted display apparatuses
JP5467303B1 (en) Gaze point detection device, gaze point detection method, personal parameter calculation device, personal parameter calculation method, program, and computer-readable recording medium
US9713420B2 (en) Optical instrument alignment system
US9967475B2 (en) Head-mounted displaying of magnified images locked on an object of interest
CN103429139B (en) Spectacle device with an adjustable field of view and method
US10268290B2 (en) Eye tracking using structured light
US10134166B2 (en) Combining video-based and optic-based augmented reality in a near eye display
US7369101B2 (en) Calibrating real and virtual views
US8077914B1 (en) Optical tracking apparatus using six degrees of freedom
US9329683B2 (en) Method for detecting point of gaze and device for detecting point of gaze
US9244529B2 (en) Point-of-gaze estimation robust to head rotations and/or device rotations
US6891518B2 (en) Augmented reality visualization device
DE60218406T2 (en) Ophthalmic device
JP5230748B2 (en) Gaze direction determination device and gaze direction determination method
JP2019141620A (en) Fixation point determination method and device on three-dimensional object
JP5915981B2 (en) Gaze point detection method and gaze point detection device
US7134992B2 (en) Gravity referenced endoscopic image orientation
US20140218281A1 (en) Systems and methods for eye gaze determination
KR100975488B1 (en) Image display apparatus and image distortion correction method of the same
US20160179193A1 (en) Content projection system and content projection method
Moore et al. A geometric basis for measurement of three-dimensional eye position using image processing
Hennessey et al. Noncontact binocular eye-gaze tracking for point-of-gaze estimation in three dimensions
US7657062B2 (en) Self-calibration for an eye tracker
EP1333306B1 (en) Method and system for stereoscopic microscopy
US7600873B2 (en) Method of determining the spatial relationship of an eye of a person with respect to a camera device

Legal Events

Date Code Title Description
RD03 Notification of appointment of power of attorney

Free format text: JAPANESE INTERMEDIATE CODE: A7423

Effective date: 20061212

A621 Written request for application examination

Free format text: JAPANESE INTERMEDIATE CODE: A621

Effective date: 20061212

RD05 Notification of revocation of power of attorney

Free format text: JAPANESE INTERMEDIATE CODE: A7425

Effective date: 20061222

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A821

Effective date: 20061222

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20091117

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20100115

TRDD Decision of grant or rejection written
A01 Written decision to grant a patent or to grant a registration (utility model)

Free format text: JAPANESE INTERMEDIATE CODE: A01

Effective date: 20100309

R150 Certificate of patent or registration of utility model

Free format text: JAPANESE INTERMEDIATE CODE: R150