JP4568024B2 - Eye movement measuring device and eye movement measuring program - Google Patents

Eye movement measuring device and eye movement measuring program

Info

Publication number
JP4568024B2
Authority
JP
Japan
Prior art keywords
camera
eyeball
imaging
eye movement
unit
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
JP2004146068A
Other languages
Japanese (ja)
Other versions
JP2005323905A (en)
Inventor
一晃 小峯
康仁 澤畠
Original Assignee
日本放送協会
Priority date
Filing date
Publication date
Application filed by 日本放送協会
Priority to JP2004146068A
Publication of JP2005323905A
Application granted
Publication of JP4568024B2

Description

  The present invention relates to an eye movement measurement apparatus and an eye movement measurement program, and more particularly, to an eye movement measurement apparatus and an eye movement measurement program for realizing highly accurate eye movement measurement.

  Conventionally, the eye movement of a subject has been measured by applying image processing or the like to an eyeball image captured by an eyeball camera. In order to prevent the eyeball from falling outside the camera's imageable range (imaging range) during eye movement measurement, the position of the eyeball is detected continuously and the camera orientation is controlled toward the detected eyeball position; when the subject is photographed via a mirror, the mirror is moved so that the eyeball does not leave the imaging range.

  For example, there is a method of measuring an eyeball by identifying the eyeball position with a monochrome camera that captures a wide area using an infrared light source, and moving the eyeball camera based on the identified position. There is also a technique in which, with the entire face as one image area, a grayscale image and a precise image of the eyeball viewing a predetermined display are acquired, the eyeball position is determined from blinking of the eyes, and the gazing point on the display is determined from the precise image of the eyeball (for example, see Patent Document 1).

Furthermore, in order to correct the gaze shift that occurs during gaze measurement, a line-of-sight measurement system has been proposed in which an eyeball model is introduced into the gaze calculation means to correct in advance the refraction of light at the corneal surface that causes the shift, and the personal calibration means corrects the remaining deviation by a simple calibration based on two-point correction (for example, see Non-Patent Document 1).
JP-A-10-154220 (Patent Document 1)
Takehiko Ohno, Naoki Takekawa, Atsushi Yoshikawa, "Gaze Measurement System Realizing Simple Calibration with Two-Point Correction", Transactions of Information Processing Society of Japan, Vol. 44, No. 4, Apr. 2003 (Non-Patent Document 1)

  However, in the conventional method, if the subject moves at a speed that the eyeball camera cannot follow, the eye movement measurement device no longer knows which direction to point the camera and loses tracking. In this case, the orientation of the camera or mirror must be adjusted manually before tracking can resume. Likewise, at the start of measurement, the orientation of the camera and mirror had to be adjusted manually, which took time and effort.

  In addition, the method of installing only a camera for photographing a wide area requires a special light source and a control device, which makes the system expensive and complicated. Moreover, since eye tracking depends solely on the wide-area camera, when the subject is moving, the eyeball could not be photographed correctly even if the camera (or mirror) was pointed in the direction indicated by the wide-area camera, because of the difference in processing time (time lag).

  In addition, in the method in which the orientation of the camera or mirror is corrected so that the center of the eyeball coincides with the center of the photographing range even for a slight movement of the subject, the eye movement cannot be measured while the correction is in progress, and the measurement is therefore affected.

  Furthermore, as a method other than the above, there is a contact-type gaze measurement method in which a sensor or the like is attached to the subject; this makes it possible to track the position of the eyeball at all times, but places a large burden on the subject.

  The present invention has been made in view of the above-described problems, and an object thereof is to provide an eye movement measuring device and an eye movement measuring program that realize highly accurate eye movement measurement by controlling the orientation of a camera and quickly correcting its position.

  In order to solve the above problems, the present invention employs means for solving the problems having the following characteristics.

The invention described in claim 1 is an eye movement measuring device that measures the eye movement of a subject using a first imaging device that images the subject at a wide angle and a second imaging device that images the eyeball of the subject, the device comprising: a camera orientation calculation unit that calculates control information for controlling the orientation of the second imaging device from the video captured by the first imaging device; an eyeball tracking unit that tracks the position of the eyeball from the video captured by the second imaging device and generates control information for controlling the orientation of the second imaging device; a line-of-sight calculation unit that calculates position information of the line of sight from the video captured by the second imaging device; a tracking state check unit that checks the tracking state of the eyeball in the line-of-sight calculation unit; and a camera control unit that corrects the position by controlling the orientation of the second imaging device using the control information obtained from the camera orientation calculation unit or the eyeball tracking unit, based on the check result obtained by the tracking state check unit, wherein the tracking state check unit checks the tracking state based on a variance value of the coordinate values obtained from the line-of-sight calculation unit and acquired within a preset time interval, and on a preset threshold value.

According to the first aspect of the present invention, highly accurate eye movement measurement can be realized by controlling the orientation of the camera and quickly correcting its position. Specifically, the position is corrected quickly using the control information obtained from the video captured by the first imaging device and the control information obtained from the video captured by the second imaging device. Further, by checking the tracking state based on the variance value of the coordinate values acquired within the time interval, it is possible to quickly grasp whether or not the position of the eyeball is being measured correctly.

  According to a second aspect of the present invention, the camera orientation calculation unit calculates the camera orientation control information by converting the coordinate value of the eyeball in the video captured by the first imaging device into a coordinate value corresponding to the second imaging device.

  According to the second aspect of the present invention, highly accurate position correction can be performed using the coordinate values. Thereby, highly accurate eye movement measurement can be realized.

  According to a third aspect of the present invention, the eyeball tracking unit provides a preset allowable range within the imaging range of the second imaging device, and outputs control information for correcting the orientation of the second imaging device to the camera control unit when the center position of the pupil of the eyeball imaged by the second imaging device falls outside the allowable range.

  According to the third aspect of the invention, no control processing needs to be performed while the pupil center remains within the allowable range, so that the processing efficiency can be improved. Thereby, highly accurate eye movement measurement can be realized.

  In the invention described in claim 4, the tracking state check unit performs the tracking state check by determining whether the coordinate value of the center of the pupil obtained from the line-of-sight calculation unit can be acquired within a preset time interval.

  According to the fourth aspect of the present invention, by checking the tracking state based on the coordinate value acquisition interval, it is possible to quickly detect that the position of the eyeball is not being measured correctly. Thereby, highly accurate eye movement measurement can be realized.

The invention described in claim 5 further comprises a mirror for moving the imaging range of the second imaging device and a mirror driving unit for driving the mirror, wherein the camera control unit generates control information for controlling the position of the mirror and outputs the generated control information to the mirror driving unit.

According to the fifth aspect of the present invention, since the lightweight mirror is driven to correct the photographing position, the position of the photographing range can be corrected more quickly.

The invention described in claim 6 is an eye movement measurement program for causing a computer to execute processing for measuring the eye movement of a subject using a first imaging device that images the subject at a wide angle and a second imaging device that images the eyeball of the subject, the program causing the computer to perform: a camera orientation calculation process for calculating control information for controlling the orientation of the second imaging device from the video captured by the first imaging device; an eyeball tracking process for tracking the position of the eyeball from the video captured by the second imaging device and generating control information for controlling the orientation of the second imaging device; a line-of-sight calculation process for calculating position information of the line of sight from the video captured by the second imaging device; a tracking state check process for checking the tracking state of the eyeball in the line-of-sight calculation process; and a camera control process for correcting the position by controlling the orientation of the second imaging device using the control information obtained from the camera orientation calculation process or the eyeball tracking process, based on the check result obtained by the tracking state check process, wherein the tracking state check process checks the tracking state based on a variance value of the coordinate values obtained from the line-of-sight calculation process and acquired within a preset time interval, and on a preset threshold value.

According to the sixth aspect of the present invention, highly accurate eye movement measurement can be realized by controlling the orientation of the camera and quickly correcting its position. Specifically, the position is corrected quickly using the control information obtained from the video captured by the first imaging device and the control information obtained from the video captured by the second imaging device. Further, by checking the tracking state based on the variance value of the coordinate values acquired within the time interval, it is possible to quickly grasp whether or not the position of the eyeball is being measured correctly. Furthermore, eye movement measurement can be realized easily simply by installing the program.

  According to the present invention, highly accurate eye movement measurement can be realized.

<Outline of the present invention>
When measuring eye movement, the present invention installs a camera (second imaging device) that photographs and tracks the eyeball together with a wide-angle camera (first imaging device) that photographs the area around the subject. When the camera that photographs the eyeball loses track of the eyeball, the orientation of that camera or of its mirror is quickly controlled and corrected to the correct position based on the eyeball position detected from the image of the wide-angle camera. This realizes highly accurate eye movement measurement.

<Embodiment>
Embodiments to which an eye movement measurement device and an eye movement measurement program according to the present invention are applied will be described below with reference to the drawings. FIG. 1 is a diagram showing a first embodiment of an eye movement measurement device according to the present invention.

  An eye movement measurement device 10 shown in FIG. 1 includes a subject imaging device 11 and an eyeball imaging device 12. The subject imaging device 11 includes a subject photographing camera (first imaging device) 20, a face feature extraction unit 21, a camera orientation calculation unit 22, a first camera orientation control unit 23, and a tracking state check unit 24. The eyeball photographing device 12 includes an eyeball photographing camera (second imaging device) 30, a light source device 31, a servo controller 32, a second camera orientation control unit 33, an eyeball tracking unit 34, and a line-of-sight calculation unit 35.

  The subject photographing device 11 photographs the subject at a wide angle so that the subject 13 remains within the photographing range even when moving to some extent, and generates camera orientation control information for correcting the photographing position of the eyeball photographing device 12 from the captured image of the subject 13. The eyeball photographing device 12 photographs the eyeball of the subject 13 and corrects its photographing position in accordance with the movement of the eyeball. When the eyeball leaves the photographing range, the photographing position is corrected using the camera orientation control information from the subject photographing device 11.

  Here, the subjects photographed by the subject photographing camera 20 and the eyeball photographing camera 30 will be described with reference to the drawings. FIG. 2 is a diagram for explaining how the objects to be photographed are captured: FIG. 2(a) shows an image taken by the subject photographing camera 20, and FIG. 2(b) shows an image taken by the eyeball camera 30.

  As shown in FIG. 2(a), the subject photographing camera 20 captures the subject from around the shoulders upward so that, in particular, the entire face is photographed. As shown in FIG. 2(b), the image photographed by the eyeball camera 30 captures the left and right eyeball portions of the subject 13.

  Next, specific functions of the subject imaging apparatus 11 and the eyeball imaging apparatus 12 in FIG. 1 will be described. The face feature extraction unit 21 performs face feature extraction processing or the like based on the face image included in the video captured by the subject photographing camera 20, and extracts the position of one of the left and right eyeballs, for example, the center coordinates of the pupil. Note that the eyeball extracted, left or right, is the same eyeball as that photographed by the eyeball photographing device 12.

  Here, there are various face feature extraction methods. For example, there is a method of applying an edge extraction filter or the like to a grayscale image of the face and extracting the position of the eyeball by matching the edge image against a template of a facial part composed of edges. The face feature extraction unit 21 outputs the extracted eyeball position information to the camera orientation calculation unit 22.

  The camera orientation calculation unit 22 calculates camera orientation control information for the eyeball camera 30 from the eyeball position information extracted by the face feature extraction unit 21 and the current orientation of the subject photographing camera 20, and outputs the calculated camera orientation control information to the first camera orientation control unit 23.

  The tracking state check unit 24 determines, from the position information of the subject's line of sight obtained from the line-of-sight calculation unit 35, whether or not the eyeball camera 30 is correctly measuring the eyeball of the subject 13. Based on this check result, the first camera orientation control unit 23 outputs its control information to the servo controller 32.

  The tracking state check unit 24 checks the tracking state based on the position information of the eyeball photographed by the eyeball camera 30 (for example, the coordinate value of the center of the pupil) obtained from the line-of-sight calculation unit 35, and outputs the check result to the first camera orientation control unit 23.

  On the other hand, in the eyeball photographing apparatus 12, the eyeball photographing camera 30 photographs one eyeball of the subject, set in advance. Note that the camera body of the eyeball photographing camera 30 is movable, and the eyeball can be tracked to some extent by moving the photographing range.

  The light source device 31 irradiates the subject 13 with infrared light from an LED (Light Emitting Diode) or the like. The eyeball camera 30 photographs the subject in this state, detects information in the vicinity of the iris of the eyeball, and acquires the line-of-sight direction. Specifically, the position of the iris is calculated from the difference in intensity of the reflected light at multiple wavelengths, and the line-of-sight direction can then be obtained by following the reflection from the cornea and the changes in the apparent areas of the iris and pupil. Note that the eyeball position detection method is not limited to this, and the eyeball may be detected using another image processing method. The eyeball image captured by the eyeball camera 30 is output to the eyeball tracking unit 34 and the line-of-sight calculation unit 35.
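The patent text gives no concrete formula for this step. As background only, a widely used approach (the pupil-corneal reflection method) estimates the gaze direction from the vector between the pupil center and the corneal reflection (glint) of the infrared light source. The following Python sketch illustrates that idea; the function name, the linear gain, and the sample pixel coordinates are illustrative assumptions, not values from the patent.

```python
def gaze_from_pupil_and_glint(pupil_xy, glint_xy, gain=(0.5, 0.5)):
    """Rough gaze-direction estimate (degrees) from the pupil-glint vector.

    pupil_xy: pupil center in the eyeball-camera image (pixels)
    glint_xy: corneal reflection (glint) of the infrared LED (pixels)
    gain:     pixels-to-degrees factors, normally obtained by calibration
    """
    dx = pupil_xy[0] - glint_xy[0]
    dy = pupil_xy[1] - glint_xy[1]
    return dx * gain[0], dy * gain[1]

# When the pupil center and the glint coincide, the gaze is straight ahead.
h, v = gaze_from_pupil_and_glint((160, 120), (160, 120))
```

In a real system the gain would be replaced by a per-subject calibration, as the personal calibration discussed for Non-Patent Document 1 suggests.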

  The servo controller 32 controls the eyeball photographing camera 30 by generating a correction signal for correcting its position, orientation, and posture based on the control information obtained from the second camera orientation control unit 33 or the first camera orientation control unit 23, and outputting the signal to the eyeball photographing camera 30. The second camera orientation control unit 33 corrects the position of the photographing range of the eyeball photographing camera 30 from the eyeball position obtained from the eyeball tracking unit 34. The eyeball tracking unit 34 outputs the tracked eyeball position information to the second camera orientation control unit 33.

  The line-of-sight calculation unit 35 outputs information such as the line-of-sight information and pupil diameter of the subject 13 from the video captured by the eyeball camera 30, and also outputs eyeball position information (for example, the coordinate value of the center of the pupil) to the tracking state check unit 24. The line-of-sight calculation unit 35 may output new eyeball position information each time it is calculated, output it at a preset period, or output coordinate values in response to an acquisition request from the tracking state check unit 24.

  Thus, when the eyeball photographing device 12 can no longer photograph the eyeball because the subject 13 has moved during eyeball photographing, the control information obtained from the eyeball position information photographed by the subject photographing camera 20 can, based on the check result of the tracking state check unit 24, be output to the eyeball photographing device 12 to correct the position by controlling the camera orientation of the eyeball photographing camera 30.

<Camera orientation calculation unit: conversion method from position information to control information>
Here, the conversion method by which the camera orientation calculation unit 22 converts the eyeball position information of the subject 13, obtained from the video photographed at a wide angle by the subject photographing camera 20, into control information for controlling the camera orientation of the eyeball photographing camera 30 will be described with reference to the drawings. FIG. 3 is a diagram for explaining the conversion into control information for controlling the orientation of the camera in the present embodiment.

  As shown in FIG. 3, in the subject photographing device 11 and the eyeball photographing device 12, the position information is converted into control information by treating the position of the eyeball of the subject 13 as the same coordinates. Specifically, from the coordinate value of the center of the pupil in the video photographed by the subject photographing camera 20 (the eyeball position information), control information is calculated that indicates the parameters determining the orientation (position) of the eyeball camera 30, that is, how much to move the camera in pan and tilt.

  When the conversion is performed, the coordinate axes (xy axes) of the imaging ranges 41 and 42 captured by the respective cameras (the subject photographing camera 20 and the eyeball photographing camera 30) are converted into a common coordinate system (mn axes). Note that the xy coordinate values coincide with the coordinate values in the video captured by the subject photographing camera 20 (the wide-angle camera), while the mn coordinate values are parameters that specify the imaging region of the eyeball camera 30.

First, three calibration points 43 are set in the photographing range 41 of the subject photographing camera 20 shown in FIG. 3, and their coordinates are (x1, y1), (x2, y2), and (x3, y3). The same positions are expressed in the other coordinate system as (m1, n1), (m2, n2), and (m3, n3). Thereby, the conversion from the xy axes to the mn axes can be expressed by the following equation (1).

In the above-described equation (1), the conversion is performed by linear interpolation using two calibration points in each of the x and y directions. Specifically, in the horizontal direction, the equation of the straight line passing through (x2, m2) and (x3, m3) is obtained as the conversion formula in the m direction (x direction) using the calibration points 43-2 and 43-3, and in the vertical direction, the equation of the straight line passing through (y1, n1) and (y2, n2) is obtained using the calibration points 43-1 and 43-2.

Here, when the coordinate value of the center 44 of the pupil in the video photographed by the subject photographing camera 20 is (xe, ye), the conversion formula to the mn coordinates can be expressed by the following equation (2).

Here, the coordinate value (m, n) in the mn coordinate system corresponds to the orientation of the camera (its angles in the horizontal and vertical directions). That is, based on the coordinate value (m, n) obtained in the mn coordinate system, the orientation of the eyeball camera 30 is moved by m° in the horizontal direction and n° in the vertical direction, whereby the camera position is corrected and the camera can quickly recover from a state in which the eyeball is outside the imaging range. Thereby, highly accurate eye movement measurement can be realized.
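The images for equations (1) and (2) are not reproduced in this text; the following Python sketch reconstructs the linear interpolation described above, using calibration points 43-2 and 43-3 for the horizontal fit and 43-1 and 43-2 for the vertical fit. The function name and the sample calibration values are illustrative assumptions.

```python
def make_xy_to_mn(cal_xy, cal_mn):
    """Build an xy -> mn converter from three calibration points.

    cal_xy: [(x1, y1), (x2, y2), (x3, y3)]  coordinates in the wide-angle image
    cal_mn: [(m1, n1), (m2, n2), (m3, n3)]  corresponding pan/tilt parameters
    The horizontal fit uses points 2 and 3, the vertical fit points 1 and 2,
    as described for calibration points 43-2/43-3 and 43-1/43-2.
    """
    (x1, y1), (x2, y2), (x3, y3) = cal_xy
    (m1, n1), (m2, n2), (m3, n3) = cal_mn
    # Slopes of the straight lines through the calibration points.
    dm_dx = (m3 - m2) / (x3 - x2)      # horizontal: m as a function of x
    dn_dy = (n1 - n2) / (y1 - y2)      # vertical:   n as a function of y

    def convert(xe, ye):
        # Linear interpolation corresponding to equations (1)/(2).
        m = m2 + dm_dx * (xe - x2)
        n = n2 + dn_dy * (ye - y2)
        return m, n

    return convert

# Example: a 320x240 image mapped onto roughly +/-20 degrees of pan/tilt.
convert = make_xy_to_mn(
    cal_xy=[(160, 0), (0, 120), (320, 120)],
    cal_mn=[(0.0, 20.0), (-20.0, 0.0), (20.0, 0.0)],
)
pan, tilt = convert(160, 120)  # pupil at the image center -> no correction
```

A pupil center at the image center maps to (0, 0), i.e., no camera movement, while points near the image edge map to the corresponding pan/tilt correction.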

<Eyeball tracking unit>
Next, a specific example of the eyeball tracking unit 34 will be described. There are cases where the eyeball of the subject 13 suddenly leaves the photographing range, and cases where the eyeball photographed by the eyeball photographing camera 30 gradually shifts out of the photographing range and goes out of frame. The eyeball tracking unit 34 handles the latter case, correcting the photographing range of the eyeball photographing camera 30 by tracking the eyeball as it gradually shifts within the photographing range.

  Here, an example of tracking will be described. FIG. 4 is a diagram illustrating examples of an eyeball photographed by an eyeball camera. As shown in FIG. 4(a), it is ideal for eyeball photography that the center of the pupil is positioned at the center of the imaging range 51; however, as shown in FIGS. 4(b) and 4(c), the pupil may shift away from the center and eventually go out of frame. Moreover, if the center of the pupil is always realigned with the center of the photographing range, camera position corrections occur frequently.

  Therefore, the eyeball tracking unit 34 provides a preset allowable range within the photographing range; the camera position is not corrected while the pupil center position is within the allowable range, and control information is output to the second camera orientation control unit 33 when the pupil center moves outside the allowable range. Thereby, frequent position corrections can be prevented.

  Here, an example of setting the allowable range will be described with reference to the drawings. FIG. 5 is a diagram illustrating an example of a preset allowable range. In FIG. 5, for example, the resolution of the captured image is 320 pixels in the x direction and 240 pixels in the y direction.

  As shown in FIG. 5, the camera position is corrected when the center of the pupil falls within the correction target region (the shaded portion in FIG. 5) of the camera's shooting range 51. The width of the correction target region is preferably about 40 to 50 pixels inward from the outer frame of the imaging range 51, but an appropriate width should be set depending on the size of the photographed eyeball or pupil, the accuracy of the camera (e.g., its resolution), and so on. Within the photographing range 51, the region other than the correction target region is the allowable range, and position correction is not performed while the center of the pupil is within this range.

  Here, the procedure for controlling the camera orientation based on the eyeball tracking of the eyeball tracking unit will be described using a flowchart. FIG. 6 is a flowchart illustrating an example of the camera orientation control processing procedure based on eyeball tracking. Note that the correction target range width in FIG. 6 is 40 pixels inward from the outer frame of the imaging area at the resolution shown in FIG. 5.

  In FIG. 6, first, the center coordinate position (x, y) of the pupil is acquired (S01). Next, it is determined whether the x-coordinate value is smaller than 40 pixels (S02). If the x-coordinate value is smaller than 40 pixels (YES in S02), control information for pointing the eyeball camera 30 to the left is generated and output to the second camera orientation control unit 33 (S03). At this time, the coordinates of the center position of the eyeball are also output. If the x-coordinate value is 40 or more (NO in S02), it is next determined whether the x-coordinate value is larger than 280 pixels (S04).

  If the x-coordinate value is larger than 280 pixels (YES in S04), control information for controlling the direction of the eyeball camera 30 to the right is generated and output to the second camera direction control unit 33 ( S05). At this time, the coordinates of the center position of the eyeball are also output.

  In S04, when the x-coordinate value is 280 or less (NO in S04), the x-coordinate of the pupil center position (x, y) is within the allowable range, so it is next determined whether the y-coordinate value is smaller than 40 pixels (S06).

  If the y-coordinate value is smaller than 40 pixels (YES in S06), control information for controlling the direction of the eyeball camera 30 upward is generated and output to the second camera direction control unit 33 ( S07). At this time, the coordinates of the center position of the eyeball are also output. If the y coordinate value is 40 or more in S06 (NO in S06), it is then determined whether the y coordinate value is greater than 200 pixels (S08).

  If the y-coordinate value is larger than 200 pixels (YES in S08), control information for controlling the direction of the eyeball camera 30 downward is generated and output to the second camera direction control unit 33 ( S09). At this time, the coordinates of the center position of the eyeball are also output.

  Next, when the y-coordinate value is 200 or less (NO in S08), or after any of the processes of S03, S05, S07, and S09 is completed, it is determined whether to end the camera orientation control process in eyeball tracking (S10). When the process is not to be ended (NO in S10), the flow returns to S01 and eyeball tracking continues; when the process is to be ended (YES in S10), the process ends.

  Based on the control information obtained by the above camera orientation control process in eyeball tracking, the second camera orientation control unit 33 generates a correction signal for correcting the position of the eyeball camera 30 and outputs it to the servo controller 32. Thus, position correction while the pupil center remains within the imaging range of the eyeball camera 30 can be realized by the eyeball imaging device 12 alone.
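The control flow of FIG. 6 can be sketched in Python as follows; the thresholds 40 and 280 (x direction) and 40 and 200 (y direction) follow from the 320 x 240 resolution and the 40-pixel correction target width, while the function and return-value names are illustrative.

```python
# Thresholds for a 320x240 image with a 40-pixel correction target width.
WIDTH, HEIGHT, MARGIN = 320, 240, 40

def correction_direction(x, y):
    """Return the camera correction for pupil center (x, y), or None.

    Mirrors S02-S09 of FIG. 6: the horizontal checks take priority over
    the vertical ones, and no correction is issued inside the allowable
    range.
    """
    if x < MARGIN:                 # S02: pupil too far left
        return "left"
    if x > WIDTH - MARGIN:         # S04: pupil too far right (x > 280)
        return "right"
    if y < MARGIN:                 # S06: pupil too high
        return "up"
    if y > HEIGHT - MARGIN:        # S08: pupil too low (y > 200)
        return "down"
    return None                    # inside the allowable range
```

For example, a pupil center at (160, 120) requires no correction, while one at (30, 120) triggers a leftward correction of the camera orientation.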

<Tracking state check unit>
Next, a specific example of the tracking state check in the tracking state check unit 24 will be described. The tracking state check unit 24 acquires the position information of the line of sight (pupil center) obtained by the line-of-sight calculation unit 35, for example, the coordinate values (xg, yg), and checks from these coordinate values whether the line of sight is being measured correctly. If, as a result of the check, it is determined that the measurement is being performed correctly, control information indicating that the state is normal (OK) is output to the first camera orientation control unit 23 as the check result (when the measurement is correct, nothing need be output to the first camera orientation control unit 23). If it is determined that the measurement is not being performed correctly, control information is output as the check result that causes the first camera orientation control unit 23 to output the camera orientation control information obtained by the camera orientation calculation unit 22 to the servo controller 32.

  Here, there are various methods for determining whether the line of sight is being measured correctly. For example, there is a method of determining whether or not the next coordinate value of the eyeball can be acquired at a predetermined interval (that is, whether coordinate values can be acquired at an allowable time interval), and a method of determining that the line of sight is not being measured correctly when the coordinate values acquired within a time interval lack spatial locality, that is, when their variance is large.

  The processing procedure of the tracking state check described above will now be explained using flowcharts. FIG. 7 is a flowchart illustrating an example of the processing procedure of the first tracking state check in the present embodiment. The procedure shown in FIG. 7 determines whether the next coordinate value is acquired within a preset time.

  In FIG. 7, a time limit (timer) is first set for checking the tracking state (S21). Then the acquisition interval of the coordinate values obtained by the line-of-sight calculation unit 35 is calculated (S22).

  Next, it is determined whether the coordinate value acquisition interval calculated in S22 is longer than the time set in S21 (S23). If the acquisition interval is longer (YES in S23), control information is output that causes the first camera orientation control unit 23 to output the camera orientation control information obtained by the camera orientation calculation unit 22 to the servo controller 32 (S24).

  If the acquisition interval is equal to or shorter than the set time (NO in S23), the process ends as it is. In this case, control information indicating that measurement is being performed correctly may be output to the first camera orientation control unit 23.

  This makes it possible to determine efficiently, from the coordinate value acquisition interval (rate) and the set time, whether the line of sight can be measured correctly.
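As a minimal illustration, the interval comparison of S22 and S23 can be sketched as follows; the function name, argument names, and the default value of the set time are assumptions for illustration, not values from the embodiment:

```python
def tracking_ok_by_interval(last_acquired_at, now, set_time=0.1):
    """First tracking state check (Fig. 7): the line of sight is judged to be
    measured correctly only while consecutive gaze coordinates arrive within
    the preset time. `set_time` (seconds) stands in for the timer set in S21."""
    acquisition_interval = now - last_acquired_at   # S22: interval calculation
    return acquisition_interval <= set_time         # S23: NO (not longer) means OK
```

In a measurement loop, `last_acquired_at` would be the timestamp of the previous coordinate value and `now` the current time; a `False` result corresponds to the YES branch of S23, which falls back to the camera orientation obtained from the subject photographing camera.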

  FIG. 8 is a flowchart showing an example of the processing procedure of the second tracking state check in the present embodiment. FIG. 8 shows the procedure for the case where the determination is made from the variance of a predetermined number of accumulated coordinate values and a preset threshold value.

  In FIG. 8, the coordinate values (xg, yg) of the line of sight measured by the line-of-sight calculation unit 35 are first acquired (S31). Next, the acquired coordinate values xg and yg are each stored in a buffer held by the tracking state check unit 24 (S32). Once a predetermined number of coordinate values have accumulated, the variance of each of the xg and yg buffers is calculated, and it is determined whether the variances of both buffers are equal to or greater than a threshold value (S34).

  If the variances of both buffers are equal to or greater than the threshold (YES in S34), control information is output that causes the first camera orientation control unit 23 to output the camera orientation control information obtained by the camera orientation calculation unit 22 to the servo controller 32 (S35).

  If the variances of both buffers are below the threshold (NO in S34), the process ends as it is. In this case, control information indicating that measurement is being performed correctly may be output to the first camera orientation control unit 23.

  In this way, by checking the variance of the acquired coordinate values, it is possible to grasp whether the eyeball position (the coordinate value of the pupil center) is being acquired correctly overall, so highly accurate eye movement measurement can be performed.
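A compact sketch of this second check (Fig. 8) follows; the buffer length and threshold here are chosen purely for illustration and are not values taken from the embodiment:

```python
from collections import deque
from statistics import pvariance

class VarianceTrackingCheck:
    """Second tracking state check (Fig. 8): buffer recent pupil-center
    coordinates and flag tracking as lost when the variances of both the
    x and y buffers reach a threshold."""

    def __init__(self, window=30, threshold=400.0):
        self.xs = deque(maxlen=window)   # buffer for xg values (S32)
        self.ys = deque(maxlen=window)   # buffer for yg values (S32)
        self.threshold = threshold

    def add(self, x, y):
        """S31/S32: acquire a coordinate value and store it in the buffers."""
        self.xs.append(x)
        self.ys.append(y)

    def tracking_lost(self):
        """S34: True when the variance of BOTH buffers meets the threshold."""
        if len(self.xs) < self.xs.maxlen:
            return False   # not enough samples accumulated yet
        return (pvariance(self.xs) >= self.threshold
                and pvariance(self.ys) >= self.threshold)
```

A `True` result corresponds to the YES branch of S34, which switches the servo controller to the camera orientation control information derived from the subject photographing camera.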

  Further, when the eyeball imaging device 12 cannot acquire the pupil center, the procedures shown in FIG. 7 or FIG. 8 output the camera orientation control information obtained from the subject imaging device 11 to the servo controller 32, so the position of the eyeball photographing camera 30 can be corrected quickly through camera orientation control.

  In the embodiment shown in FIG. 1, the eye movement measuring device 10 includes the subject imaging device 11 and the eyeball imaging device 12, but it may also be configured as a single device that combines the functions of both.

  In the description above, the movement of the eyeball photographing camera 30 is controlled by the servo controller 32. Instead, by providing two mirrors that can rotate in the x-axis and y-axis directions respectively, together with a mirror driving unit that drives them, position correction can be performed quickly, because a mirror is lighter than the camera body. This configuration is described below as a second embodiment.

  FIG. 9 is a diagram showing a second embodiment of the eye movement measuring device according to the present invention. In FIG. 9, the subject imaging device 11 and the eyeball imaging device 12 are configured as an integral unit that further includes mirrors and a mirror driving unit. The second embodiment may, however, also be configured as an eye movement measuring device in which the subject imaging device 11 and the eyeball imaging device 12 remain separate, each having a mirror and a mirror driving unit.

  The eye movement measuring device 60 shown in FIG. 9 comprises a subject photographing camera (first imaging device) 61, a face feature extraction unit 62, a mirror direction calculation unit 63, a tracking state check unit 64, an eyeball photographing camera (second imaging device) 65, a light source device 66, a servo controller 67, a mirror direction control unit 68, an eyeball tracking unit 69, a line-of-sight calculation unit 70, mirrors 71, and a mirror driving unit 72. Among the components in FIG. 9, those with the same names as in FIG. 1 have the same functions.

  The subject photographing camera 61 captures an image of the entire subject, as in the first embodiment. The face feature extraction unit 62 extracts the position information of the eyeball from the facial features, as described above, and outputs the extracted position information to the mirror direction calculation unit 63.

  The mirror direction calculation unit 63 calculates control information for directing the imaging range of the eyeball photographing camera 65, based on the eyeball position information extracted by the face feature extraction unit 62 from the video currently captured by the subject photographing camera 61, and outputs the calculated control information to the mirror direction control unit 68.

  The tracking state check unit 64 checks, from the position of the subject's line of sight obtained from the line-of-sight calculation unit 70, whether the eyeball photographing camera 65 can capture the eyeball of the subject 13 via the mirrors 71. Based on the check result, it generates control information that causes the mirror direction control unit 68 to select either the control information from the mirror direction calculation unit 63 or the control information from the eyeball tracking unit 69, and outputs it to the mirror direction control unit 68.

  The eyeball photographing camera 65 photographs one preset eyeball of the subject. In the second embodiment, the main body of the eyeball photographing camera 65 is fixed. The light source device 66 irradiates the subject 13 with infrared light using an LED (Light Emitting Diode) or the like; by photographing the subject in that state with the eyeball photographing camera 65, information near the iris portion of the eyeball is detected and the line-of-sight direction is acquired. The eyeball image photographed by the eyeball photographing camera 65 is output to the eyeball tracking unit 69 and the line-of-sight calculation unit 70.

  The servo controller 67 outputs, based on the control information obtained from the mirror direction control unit 68, a correction signal for correcting the position and orientation of the mirrors 71-1 and 71-2 to the mirror driving unit 72, thereby controlling the mirror directions and, in turn, the imaging range of the eyeball photographing camera 65.

  The mirror direction control unit 68 selects either the control information obtained from the eyeball tracking unit 69 or the control information obtained from the mirror direction calculation unit 63, according to the check result (selection control information) obtained from the tracking state check unit 64, and controls the mirror direction accordingly.
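The selection performed here reduces to a simple rule, sketched below; the function and argument names are illustrative stand-ins, not names from the embodiment:

```python
def select_mirror_control(check_ok, tracking_control, wide_angle_control):
    """Selection done by the mirror direction control unit: while the tracking
    state check reports OK, steer the mirrors with the control information from
    the eyeball tracking unit; once tracking is lost, fall back to the control
    information derived from the wide-angle subject photographing camera."""
    return tracking_control if check_ok else wide_angle_control
```

The fallback path is what lets the device re-acquire the eyeball automatically after the subject moves out of the eyeball camera's view.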

  The eyeball tracking unit 69 acquires eyeball position information from the eyeball image obtained by the eyeball photographing camera 65, generates control information based on the eyeball tracking described above, and outputs the generated control information to the mirror direction control unit 68.

  The line-of-sight calculation unit 70 outputs the line-of-sight information of the subject 13 and information such as the pupil diameter from the video captured by the eyeball photographing camera 65, and outputs eyeball position information (for example, the coordinate value of the pupil center) to the tracking state check unit 64. The line-of-sight calculation unit 70 may output new eyeball position information to the tracking state check unit 64 every time it is calculated, output it at preset intervals, or output the coordinate values in response to an acquisition request from the tracking state check unit 64.
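The delivery modes just described (output on every calculation, or on request) can be sketched as follows; the class, the subscriber callback, and the method names are assumptions for illustration only:

```python
class GazeOutput:
    """Sketch of the delivery modes of the line-of-sight calculation unit:
    push every newly calculated coordinate to a subscriber, and keep the
    latest value available for pull on an acquisition request."""

    def __init__(self, subscriber=None):
        self.subscriber = subscriber   # push mode: called on every calculation
        self.latest = None

    def publish(self, x, y):
        self.latest = (x, y)
        if self.subscriber is not None:
            self.subscriber(x, y)      # push the new coordinate immediately

    def request(self):
        return self.latest             # pull mode: answer an acquisition request
```

Periodic output would simply call `request()` on a timer instead of subscribing.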

  The mirror driving unit 72 moves the two mirrors 71-1 and 71-2 to a predetermined angle based on a correction signal for controlling the mirror direction obtained by the servo controller 67.

  Thus, when the subject 13 moves during eyeball photographing, the eye movement measuring device 60 can, according to the check result of the tracking state check unit 64, select either the control information obtained from the eyeball position information photographed by the subject photographing camera 61 or the control information obtained by eyeball tracking from the video photographed by the eyeball photographing camera 65, and correct the imaging range of the eyeball photographing camera 65 through mirror direction control. Moreover, since a mirror is lighter than the camera body, the position correction can be performed quickly and highly accurate eye movement imaging can be realized.

  The position correction of the mirrors 71 in the second embodiment is performed by making the coordinate values (m, n) in the m-n coordinate system described above correspond to the mirror directions (horizontal and vertical angles).

  FIG. 10 is a diagram illustrating an example of mirror position correction. As shown in FIG. 10, the imaging range of the eyeball photographing camera 65 can be corrected quickly by rotating the mirror 71-1 by n° in the vertical direction and the mirror 71-2 by m° in the horizontal direction, corresponding to the coordinate value (m, n) in the m-n coordinate system described above.
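One way to realize the correspondence between (m, n) coordinates and mirror angles is a linear mapping centered on the image, sketched below; the function name and the angular ranges are assumptions for illustration and do not come from the patent:

```python
def mirror_angles(m, n, width, height, h_range_deg=20.0, v_range_deg=15.0):
    """Map a pupil-center coordinate (m, n) in the m-n image coordinate system
    to a horizontal angle for mirror 71-2 and a vertical angle for mirror 71-1.
    A coordinate at the image center maps to zero deflection for both mirrors."""
    horizontal = (m / width - 0.5) * h_range_deg   # drives mirror 71-2 (m direction)
    vertical = (n / height - 0.5) * v_range_deg    # drives mirror 71-1 (n direction)
    return horizontal, vertical
```

The servo controller would then convert these target angles into correction signals for the mirror driving unit.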

  The eye movement measurement described above can also be realized by generating an execution program that causes a computer to execute the processing described above, and installing that program on, for example, a general-purpose personal computer or workstation.

<Hardware configuration>
Here, a hardware configuration example of a computer capable of performing eye movement measurement according to the present invention will be described with reference to the drawings. FIG. 11 is a diagram illustrating an example of a hardware configuration capable of realizing eye movement measurement according to the present invention.

  The computer shown in FIG. 11 includes an input device 81, an output device 82, a drive device 83, an auxiliary storage device 84, a memory device 85, a CPU (Central Processing Unit) 86 that performs various controls, and a network connection device 87, which are connected to one another by a system bus B.

  The input device 81 includes a keyboard and a pointing device such as a mouse operated by the user, and accepts various operation signals from the user, such as an instruction to execute a program. The output device 82 includes a display that shows the windows and data necessary for operating the computer, and can display the progress and results of program execution under the control program run by the CPU 86.

  In the present invention, the execution program installed in the computer is provided by, for example, a recording medium 88 such as a CD-ROM. The recording medium 88 on which the program is recorded is set in the drive device 83, and the execution program contained on the recording medium 88 is installed into the auxiliary storage device 84 via the drive device 83.

  The auxiliary storage device 84 is storage means such as a hard disk; it stores the execution program according to the present invention and the control programs provided with the computer, and performs input and output as necessary.

  The CPU 86 realizes each process in the eye movement measurement by controlling various operations and the input and output of data to and from each hardware component, based on a control program such as an OS (Operating System) and the execution program read into and stored in the memory device 85. Various information needed during program execution can be acquired from, and results stored in, the auxiliary storage device 84.

  The network connection device 87, by connecting to a communication network or the like, can acquire the execution program from another terminal connected to the network, or provide other terminals with the results obtained by executing the program, or with the execution program itself.

  With the hardware configuration described above, eye movement measurement can be realized at low cost without requiring a special device configuration, and simply installing the program makes the measurement easy to carry out.

  Next, a processing procedure in the execution program will be described using a flowchart.

<Eye movement measurement processing procedure>
FIG. 12 is a flowchart showing an example of the eye movement measurement processing procedure in the present invention. The procedure shown in FIG. 12 performs position correction by controlling the camera orientation, as in the first embodiment described above.

  First, camera photographing is started by the subject photographing camera and the eyeball photographing camera (S41). Next, the eyeball is tracked using the eyeball photographing camera (S42), and at the same time the camera orientation is calculated from the video taken by the subject photographing camera (S43).

  Next, it is determined whether the position of the eyeball photographing camera needs to be corrected (S44); specifically, correction is needed when the pupil center of the subject is outside the allowable range of the eyeball photographing camera. If correction is necessary (YES in S44), the tracking state is checked, as described above, to determine whether the line of sight is being measured correctly (S45).
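The S44 decision can be sketched as a test of whether the pupil center lies inside an allowable range set within the imaging range; here the allowable range is assumed to be the frame shrunk by a margin on every side, and the margin ratio is illustrative:

```python
def needs_correction(pupil_center, frame_size, margin_ratio=0.2):
    """S44: correction is needed when the pupil center falls outside the
    allowable range set inside the eyeball camera's imaging range."""
    x, y = pupil_center
    w, h = frame_size
    mx, my = w * margin_ratio, h * margin_ratio   # margin on each side
    inside = (mx <= x <= w - mx) and (my <= y <= h - my)
    return not inside
```

Placing the allowable range inside the frame, rather than at its edge, gives the correction time to act before the eyeball actually leaves the imaging range.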

  If the result of the tracking state check is OK (YES in S45), the camera orientation is controlled based on the control information obtained by the eyeball tracking process of S42 (S46). If the check indicates that measurement is not being performed correctly (NO in S45), the camera orientation is controlled based on the control information obtained by the camera orientation calculation process of S43 (S47).

  When correction is not necessary (NO in S44), or after the processing of S46 or S47 has finished, it is determined whether the eye movement measurement is to be ended (S48). If not (NO in S48), the processing from S42 and S43 is continued; if the measurement is to be ended (YES in S48), camera photographing is terminated (S49).
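The flow of S41 to S49 can be sketched as the following loop; every name and callable here is an illustrative stand-in for a block of the flowchart, not part of the embodiment:

```python
def run_measurement(stream, tracking_ok, track_control, wide_control,
                    needs_correction, apply_control):
    """Sketch of the Fig. 12 loop: `stream` yields per-frame measurements,
    `needs_correction` and `tracking_ok` are the S44 and S45 decisions, the two
    control callables produce the S46/S47 control information, and
    `apply_control` plays the role of the servo controller."""
    for frame in stream:                                  # S42/S43 each frame
        if needs_correction(frame):                       # S44
            if tracking_ok(frame):                        # S45: state check
                apply_control(track_control(frame))       # S46: eyeball tracking
            else:
                apply_control(wide_control(frame))        # S47: wide-angle camera
    # The loop ends when the stream ends (S48 YES), i.e. photographing stops (S49).
```

The same skeleton applies to the second embodiment if `apply_control` drives the mirrors instead of the camera.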

  Thereby, eye movement measurement can be realized with high accuracy, at low cost, and without requiring a special device configuration, and installing the program makes the measurement easy to carry out. Further, as shown in the second embodiment, when mirrors are provided, the same effect can be obtained by generating an execution program that performs mirror control instead of camera control.

  As described above, according to the present invention, highly accurate eye movement measurement can be realized by correcting the position quickly. Specifically, the eyeball is photographed and tracked by a camera with pan and tilt functions, while a wide-angle camera captures the subject and its surroundings; when eyeball tracking is lost, the eyeball position detected from the wide-angle camera's video is used to redirect the eyeball camera to the correct position. This increases the subject's freedom of movement during eye movement measurement and allows the otherwise laborious measurement work to be performed efficiently.

  In other words, the subject need not wear anything, and even if eyeball tracking by the eye movement measuring device fails, tracking is resumed automatically, so eye movement measurement can easily be realized in an environment that allows the subject relatively free movement.

  The eye movement measurement of the present invention can be applied, for example, to computer operation: a user ordinarily acquires various information such as icons, menus, and charts from a display, and by clarifying what the line of sight is directed at, the computer can use that information to realize various gaze-based operation methods. The present invention is not limited to such man-machine interface applications; by measuring eye movement it can also be applied to fields such as medicine and psychology.

  Although a preferred embodiment of the present invention has been described in detail above, the present invention is not limited to that specific embodiment, and various modifications and changes are possible within the scope of the gist of the present invention described in the claims.

FIG. 1 is a diagram showing a first embodiment of the eye movement measuring device according to the present invention.
FIG. 2 is a diagram for explaining how the photographed subject is captured.
FIG. 3 is a diagram for explaining the method of converting to control information that controls the camera orientation in the present embodiment.
FIG. 4 is a diagram showing an example of an eyeball photographed by the eyeball photographing camera.
FIG. 5 is a diagram showing an example of a preset allowable range.
FIG. 6 is a flowchart of an example for explaining the camera orientation control processing procedure based on eyeball tracking.
FIG. 7 is a flowchart showing an example of the processing procedure of the first tracking state check in the present embodiment.
FIG. 8 is a flowchart showing an example of the processing procedure of the second tracking state check in the present embodiment.
FIG. 9 is a diagram showing a second embodiment of the eye movement measuring device according to the present invention.
FIG. 10 is a diagram of an example showing the mirror position correction.
FIG. 11 is a diagram showing an example of a hardware configuration capable of realizing the eye movement measurement of the present invention.
FIG. 12 is a flowchart of an example showing the eye movement measurement processing procedure in the present invention.

Explanation of symbols

10, 60 Eye movement measuring device
11 Subject imaging device
12 Eyeball imaging device
13 Subject
20, 61 Subject photographing camera
21, 62 Face feature extraction unit
22 Camera orientation calculation unit
23 First camera orientation control unit
24, 64 Tracking state check unit
30, 65 Eyeball photographing camera
31, 66 Light source device
32, 67 Servo controller
33 Second camera orientation control unit
34, 69 Eyeball tracking unit
35, 70 Line-of-sight calculation unit
41, 42, 51 Imaging range
43 Calibration point
44 Pupil center
63 Mirror direction calculation unit
68 Mirror direction control unit
71 Mirror
72 Mirror driving unit
81 Input device
82 Output device
83 Drive device
84 Auxiliary storage device
85 Memory device
86 CPU
87 Network connection device
88 Recording medium

Claims (6)

  1. An eye movement measuring device that measures eye movement of a subject by means of a first imaging device that photographs the subject at a wide angle and a second imaging device that photographs the eyeball of the subject, the device comprising:
    A camera orientation calculator that calculates control information for controlling the orientation of the second imaging device camera from the video imaged by the first imaging device;
    An eyeball tracking unit that tracks a position of an eyeball from an image captured by the second imaging device and generates control information for controlling a direction of the second imaging device camera;
    A line-of-sight calculation unit that calculates position information of the line of sight from an image captured by the second imaging device;
    A tracking state check unit that checks the tracking state of the eyeball in the line-of-sight calculation unit;
    a camera control unit that corrects the position by controlling the orientation of the second imaging device using the control information obtained from the camera orientation calculation unit or the eyeball tracking unit, based on the check result obtained by the tracking state check unit,
    wherein the tracking state check unit checks the tracking state based on the variance of coordinate values, acquired within a preset time interval and obtained from the line-of-sight calculation unit, and a preset threshold value.
  2. The eye movement measuring device according to claim 1, wherein the camera orientation calculation unit calculates the camera orientation control information by converting the coordinate value of the eyeball in the video captured by the first imaging device into a coordinate value corresponding to the second imaging device.
  3. The eye movement measuring device according to claim 1, wherein the eyeball tracking unit provides a preset allowable range within the imaging range of the second imaging device, and outputs to the camera control unit control information for correcting the second imaging device when the center position of the pupil in the eyeball image captured by the second imaging device is outside the allowable range.
  4. The eye movement measuring device according to claim 1, wherein the tracking state check unit checks the tracking state by determining whether the coordinate value of the pupil center obtained from the line-of-sight calculation unit can be acquired within a preset time interval.
  5. The eye movement measuring device according to any one of claims 1 to 4, further comprising a mirror for moving the imaging range of the second imaging device and a mirror driving unit for driving the mirror, wherein the camera control unit generates control information for controlling the position of the mirror and outputs the generated control information to the mirror driving unit.
  6. An eye movement measurement program for causing a computer to execute processing that measures eye movement of a subject by means of a first imaging device that photographs the subject at a wide angle and a second imaging device that photographs the eyeball of the subject, the program causing the computer to execute:
    A camera orientation calculation process for calculating control information for controlling the orientation of the second imaging device camera from the video imaged by the first imaging device;
    Eyeball tracking processing for tracking the position of the eyeball from the video imaged by the second imaging device and generating control information for controlling the orientation of the second imaging device camera;
    Line-of-sight calculation processing for calculating line-of-sight position information from the video imaged by the second imaging device;
    A tracking state check process for checking a tracking state of an eyeball in the line-of-sight calculation process;
    a camera control process for correcting the position by controlling the orientation of the second imaging device using the control information obtained from the camera orientation calculation process or the eyeball tracking process, based on the check result obtained by the tracking state check process,
    wherein the tracking state check process checks the tracking state based on the variance of coordinate values, acquired within a preset time interval and obtained from the line-of-sight calculation process, and a preset threshold value.
JP2004146068A 2004-05-17 2004-05-17 Eye movement measuring device and eye movement measuring program Expired - Fee Related JP4568024B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2004146068A JP4568024B2 (en) 2004-05-17 2004-05-17 Eye movement measuring device and eye movement measuring program

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2004146068A JP4568024B2 (en) 2004-05-17 2004-05-17 Eye movement measuring device and eye movement measuring program

Publications (2)

Publication Number Publication Date
JP2005323905A JP2005323905A (en) 2005-11-24
JP4568024B2 true JP4568024B2 (en) 2010-10-27

Family

ID=35470712

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2004146068A Expired - Fee Related JP4568024B2 (en) 2004-05-17 2004-05-17 Eye movement measuring device and eye movement measuring program

Country Status (1)

Country Link
JP (1) JP4568024B2 (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5089940B2 (en) 2006-08-29 2012-12-05 株式会社トプコン Eye movement measuring device, eye movement measuring method, and eye movement measuring program
JP5622431B2 (en) * 2010-04-21 2014-11-12 オリンパス株式会社 Head-mounted pupil detection device
JP5719216B2 (en) * 2011-04-05 2015-05-13 日本放送協会 Gaze measurement apparatus and gaze measurement program
JP5949319B2 (en) * 2012-08-21 2016-07-06 富士通株式会社 Gaze detection apparatus and gaze detection method
JP6176070B2 (en) * 2013-11-13 2017-08-09 株式会社デンソー Gaze direction detector

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001276111A (en) * 2000-03-31 2001-10-09 Nidek Co Ltd Ophthalmologic surgical apparatus
JP2002291792A (en) * 2001-03-29 2002-10-08 Sumitomo Heavy Ind Ltd Control method and device for eyeball radiation

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0868630A (en) * 1994-08-29 1996-03-12 Nissan Motor Co Ltd Visual line direction measuring apparatus for vehicle and image input device used for it
JP3050808B2 (en) * 1996-06-28 2000-06-12 ミノルタ株式会社 Positioning device
JP2991134B2 (en) * 1996-11-21 1999-12-20 日本電気株式会社 Attention point detection system on screen

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001276111A (en) * 2000-03-31 2001-10-09 Nidek Co Ltd Ophthalmologic surgical apparatus
JP2002291792A (en) * 2001-03-29 2002-10-08 Sumitomo Heavy Ind Ltd Control method and device for eyeball radiation

Also Published As

Publication number Publication date
JP2005323905A (en) 2005-11-24

Similar Documents

Publication Publication Date Title
US9921663B2 (en) Moving object detecting apparatus, moving object detecting method, pointing device, and storage medium
US9791927B2 (en) Systems and methods of eye tracking calibration
US10192135B2 (en) 3D image analyzer for determining the gaze direction
US10670395B2 (en) Dual-resolution 3D scanner and method of using
JP6690041B2 (en) Method and device for determining point of gaze on three-dimensional object
CN108351514B (en) Use the eye tracks of structure light
US10165176B2 (en) Methods, systems, and computer readable media for leveraging user gaze in user monitoring subregion selection systems
US10650533B2 (en) Apparatus and method for estimating eye gaze location
KR101483501B1 (en) Ophthalmologic apparatus and control method of the same
US9111177B2 (en) Position/orientation measurement apparatus, processing method therefor, and non-transitory computer-readable storage medium
US9172931B2 (en) Projection display device, information processing device, projection display system, and program
JP6573354B2 (en) Image processing apparatus, image processing method, and program
JP5474202B2 (en) Method and apparatus for detecting a gazing point based on face detection and image measurement
KR101506525B1 (en) Point of gaze detection device, point of gaze detection method, individual parameter computation device, individual parameter computation method, program, and computer-readable recording medium
JP5297415B2 (en) Ophthalmic device and ophthalmic method
JP5812599B2 (en) information processing method and apparatus
US20170098117A1 (en) Method and apparatus for robustly collecting facial, ocular, and iris images
JP4517049B2 (en) Gaze detection method and gaze detection apparatus
JP5189057B2 (en) Gaze tracking method and gaze tracking system
US9785233B2 (en) Systems and methods of eye tracking calibration
EP2543483A1 (en) Information processing apparatus and information processing method
Jianfeng et al. Eye-model-based gaze estimation by RGB-D camera
WO2010035472A1 (en) Line-of-sight direction determination device and line-of-sight direction determination method
JP5949319B2 (en) Gaze detection apparatus and gaze detection method
JP2004517359A (en) System and method for automatically adjusting lens performance by gaze tracking

Legal Events

Date Code Title Description
A621 Written request for application examination (JAPANESE INTERMEDIATE CODE: A621), effective 20070226
A977 Report on retrieval (JAPANESE INTERMEDIATE CODE: A971007), effective 20100126
A131 Notification of reasons for refusal (JAPANESE INTERMEDIATE CODE: A131), effective 20100216
A521 Written amendment (JAPANESE INTERMEDIATE CODE: A523), effective 20100303
TRDD Decision of grant or rejection written
A01 Written decision to grant a patent or to grant a registration (utility model) (JAPANESE INTERMEDIATE CODE: A01), effective 20100713
A61 First payment of annual fees (during grant procedure) (JAPANESE INTERMEDIATE CODE: A61), effective 20100806
R150 Certificate of patent or registration of utility model (JAPANESE INTERMEDIATE CODE: R150)
FPAY Renewal fee payment (event date is renewal date of database), payment until 20130813, year of fee payment: 3
FPAY Renewal fee payment (event date is renewal date of database), payment until 20140813, year of fee payment: 4
LAPS Cancellation because of no payment of annual fees