EP2836893A1 - Method for determining the direction in which a user is looking - Google Patents
Method for determining the direction in which a user is looking
- Publication number
- EP2836893A1
- Authority
- EP
- European Patent Office
- Prior art keywords
- eye
- user
- images
- image
- information
- Prior art date
- Legal status
- Withdrawn
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/013—Eye tracking input arrangements
-
- G06F3/005—Input arrangements through a video camera
-
- G06F3/012—Head tracking input arrangements
-
- G06F3/0425—Digitisers, e.g. for touch screens or touch pads, characterised by opto-electronic transducing means, using a single imaging device such as a video camera for tracking the absolute position of one or more objects with respect to an imaged reference surface, e.g. a video camera imaging a display or projection screen, a table or a wall surface on which a computer-generated image is displayed or projected
Definitions
- the present invention relates to the determination of the direction of gaze of a person by monitoring his eye movements, especially when this person views a screen of a computer system.
- US 6,113,237 relates to a device for detecting the horizontal and vertical movements of a person's eyes. It gives the position of the pupil, but not the direction of gaze, and does not achieve high precision.
- Patent applications US 2005/0110950 and US 2010/0283972 relate to medical applications of the detection of saccadic eye movements and do not teach, more particularly, how to determine the direction of gaze.
- the application US 2010/045933 proposes a method for identifying the direction of gaze involving a reflection on the retina, which requires a complex optical apparatus (mobile mirrors scanning the measured area from a very small point light source) associated with deferred processing of the information based on fixed references.
- This device does not exploit the synergistic and self-sustaining aspects of a dual source of real-time information with relative referencing and therefore cannot satisfactorily address the issues of instantaneous information processing and self-correction of data drifts.
- the application EP 2 261 772 describes a method for reconstructing the direction of gaze from a still, remote camera facing the user.
- a first process identifies and then tracks the position and orientation of the head by optical flow, and a second process identifies one or both of the user's irises.
- This fixed, non-embedded device significantly restricts the mobility of the user and does not meet the precision constraints expected of such a system.
- the application US 2012/019645 describes a device for determining gaze data on an on-board display screen. It does not include referencing to the outside world.
- the invention aims to meet all or part of these needs.
- the object of the invention is thus, according to a first of its aspects, a method of determining the direction of the gaze of a user comprising the acquisition of images of the eye, in particular using an optical sensor, preferably a camera, the method comprising:
- the improved accuracy reduces drift over time relative to the initial calibration and thus extends the duration of use without the need for a new calibration.
- user comfort is increased.
- the first and second processing operations do not necessarily take place in the order a) then b) and may also take place in the order b) then a), the processing operations being performed in a loop to provide information about the direction of gaze at regular time intervals.
- the image acquisition used in the method may advantageously be non-intrusive to the eye.
- the images used to determine the position and/or the displacement of the eye are images of the outside of the eye, not images of a deep surface of the eye, such as the retina or the macula, nor images involving refractions and/or reflections on elements of the eye, such as the retina or cornea.
- the images that can be used in the invention are surface-appearance images of the eye, such as at least a portion of the contour of the iris and/or sclera. This may make it possible to keep a relatively simple acquisition objective that is easy to integrate into glasses, for example. It can also simplify the lighting integrated into the device worn by the user, because it is not necessary to make a light beam penetrate the eye under particular incidence conditions.
- Diffuse illumination of the surface of the eye, especially the iris and sclera may be appropriate.
- the invention is not based on the observation of the reflection of light sources on the eye and can thus be more independent of the light environment.
- Acquired images may represent anatomical and / or external kinematic data of the eye.
- the first and second processing operations may be based on image acquisition at different frequencies, and preferably the first processing is performed at a first acquisition frequency lower than that of the second processing, which takes place at a second frequency, preferably with a factor of at least 10 between the first and second frequencies.
- the first and second processing operations may be based on image acquisition at different resolutions, and preferably the first processing is based on the acquisition of images at a resolution greater than that of the acquisition made during the second processing, for example greater by a factor of at least 5.
- the information delivered by the first processing can make it possible to correct drifts of the second processing in the precision of the measurement.
- the first processing provides information on the orientation of the eye, from the observation of an area of the eye whose appearance varies with the rotation of the eye.
- the orientation is determined statically, regardless of its evolution over time.
- This first processing can be performed from a single image. It can be done for each image.
- This first processing may comprise determining parameters related to the shape of the ellipse corresponding to the pupil, in particular to determine the position of a point of the eye, for example the center of the pupil.
- An algorithm for reconstructing the shape of the pupil in the image can be used, determining by image processing the parameters of the ellipse corresponding to the circle of the pupil observed in a direction making an angle with the normal to the plane of this circle. The orientation of the eye can be deduced by analyzing the evolution of the shape of this ellipse.
- the reconstruction of an ellipse is for example possible from five points of its contour.
- a limited number of contour points, in particular five, are sufficient to calculate the descriptive parameters of this ellipse (center, minor axis, major axis and rotation angle). Knowing these characteristics and those of the camera, it is thus possible to reconstruct in 3D the direction of the vector normal to the disk whose projection is the ellipse.
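The five-point reconstruction mentioned above can be sketched as follows. This is an illustrative Python sketch, not the patent's actual algorithm: the function name is made up, and the tilt estimate assumes a weak-perspective model in which a circle viewed at angle t projects to an ellipse with axis ratio b/a = cos(t).

```python
import numpy as np

def ellipse_from_5_points(pts):
    """Fit the conic A x^2 + B xy + C y^2 + D x + E y + F = 0
    through five contour points (null space of a 5x6 linear system)."""
    rows = [[x * x, x * y, y * y, x, y, 1.0] for x, y in pts]
    # The conic coefficients are the right-singular vector associated
    # with the smallest singular value.
    _, _, vt = np.linalg.svd(np.asarray(rows))
    A, B, C, D, E, F = vt[-1]
    # Center of the ellipse: the gradient of the conic vanishes there.
    cx, cy = np.linalg.solve([[2 * A, B], [B, 2 * C]], [-D, -E])
    # Semi-axis lengths from the eigenvalues of the quadratic part.
    M = np.array([[A, B / 2], [B / 2, C]])
    k = A * cx * cx + B * cx * cy + C * cy * cy + D * cx + E * cy + F
    axes = np.sqrt(-k / np.linalg.eigvalsh(M))
    a, b = max(axes), min(axes)
    # Weak-perspective assumption: b/a = cos(tilt of the pupil disk).
    tilt = np.degrees(np.arccos(b / a))
    return (cx, cy), (a, b), tilt

# Synthetic pupil contour: circle of radius 3 centered at (2, 1),
# foreshortened to semi-minor axis 1.5, i.e. viewed at 60 degrees.
t = np.linspace(0.3, 5.5, 5)
pts = np.c_[2 + 3 * np.cos(t), 1 + 1.5 * np.sin(t)]
center, (a, b), tilt = ellipse_from_5_points(pts)
```

The descriptive parameters (center, axes) then give the 3D direction of the normal to the pupil disk, as described above.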
- the reconstruction of the ellipse may require minimal resolution to obtain the necessary precision.
- an ellipse reconstruction algorithm can be used that takes into account the intensity of the gray levels in the entire image, rather than focusing on the outline of the spot corresponding to the pupil to trace the ellipse.
- the direction of the eye in the reference frame of the camera is given by the normal to the disc of the pupil taken at its center. This normal and the center of the disc, which differs from the center of the ellipse, are obtained from the characteristics of the ellipse.
- the direction of gaze in the reference frame of the head is obtained by applying a 3-axis rotation matrix which is fixed and known.
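This fixed change of reference frame can be sketched as follows; the mounting angles are hypothetical placeholders, as the actual matrix would come from the geometry of the worn device.

```python
import numpy as np

def rotation_xyz(rx, ry, rz):
    """Fixed 3-axis rotation matrix (X, then Y, then Z; radians)."""
    cx, sx = np.cos(rx), np.sin(rx)
    cy, sy = np.cos(ry), np.sin(ry)
    cz, sz = np.cos(rz), np.sin(rz)
    Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    return Rz @ Ry @ Rx

# Hypothetical camera mounting angles (the fixed, known calibration).
R = rotation_xyz(np.radians(15), np.radians(-10), np.radians(5))
gaze_cam = np.array([0.0, 0.0, 1.0])   # pupil normal in the camera frame
gaze_head = R @ gaze_cam               # same direction in the head frame
```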
- the processed image does not necessarily include a view of the iris of the eye in its entirety.
- the invention makes it possible to obtain the ellipse even when the image comprises only a portion of the iris, whether because the eye was partially closed when the image was taken or because the image was taken with significant magnification.
- the processing can provide the desired information even with only a partial view of the border of the iris, or only a border between the iris and the white of the eye.
- Unlike conventional methods that calculate the orientation of the eye as a non-linear function of the distance between the center of the ellipse and the center of one of the reflections of an external illumination on the cornea (the so-called Purkinje points), this method makes it possible to know the orientation of the eye without using an external reference such as Purkinje points. The linearity of this solution makes it possible to dispense with the calibration procedures inherent in the classical method.
- the second processing provides information on the kinematics of the eye by comparing at least two successive images, that is to say, information on the evolution of the position of the eye in its orbit between two instants.
- This second processing can be performed from two consecutive images. It can be performed for each pair of successive images acquired by the optical sensor.
- the second processing provides information on the movement of the eye in its orbit, including information on the angle of rotation of the eye, the eye being assumed to be a sphere.
- the second processing involves determining the optical flow between two successive images of the eye, that is to say, the apparent movement caused by the relative movement between the optical sensor acquiring the images and the eye.
- a corresponding algorithm is known from the article "Determining Optical Flow", Berthold K.P. Horn et al., Artificial Intelligence, vol. 17, pp. 185-203, 1981.
- the variation of the flux of pixels in an image is measured by difference between two or more images.
- Two successive images can be very close together in time.
- Two successive images are preferably of constant intensity.
- One can locate, between two images of the same area of a moving part of the eye having contrasts (for example two successive images of the sclera comprising blood vessels, or any region including all or part of the iris or pupil, the choice of region not being limiting), the variation of the location of areas characterized by their shape or intensity.
- the variation in the location of the characteristic zones is observed by the variation of the intensity flux of pixels in the plane parallel to the plane of the sensor.
- the measurement of this variation of the optical pixel flow does not require explicitly identifying a reference image or reference features or patterns in the image.
- these "images" can be taken at very high frequencies by a second sensor independent of the first processing. This second sensor need not have good resolution or good focus, provided that the measured area is contrasted.
- the measurement of this variation of the optical flux is facilitated if the intensity of each image is constant.
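A minimal illustration of such a differential measurement, estimating a single global translation between two frames from image gradients under the brightness-constancy assumption. This is a simplification of the Horn et al. formulation (which computes a dense flow field); the function name and synthetic data are illustrative.

```python
import numpy as np

def global_flow(i1, i2):
    """Estimate one (dx, dy) translation between two frames by least
    squares over all pixels: Ix*dx + Iy*dy + It = 0."""
    iy, ix = np.gradient(i1)          # np.gradient returns d/drow, d/dcol
    it = i2 - i1
    A = np.array([[np.sum(ix * ix), np.sum(ix * iy)],
                  [np.sum(ix * iy), np.sum(iy * iy)]])
    b = -np.array([np.sum(ix * it), np.sum(iy * it)])
    return np.linalg.solve(A, b)      # (dx, dy) in pixels

# Synthetic contrasted patch (vessel-like blob) shifted 0.5 px to the right.
y, x = np.mgrid[0:64, 0:64]
blob = lambda cx: np.exp(-((x - cx) ** 2 + (y - 32.0) ** 2) / 50.0)
dx, dy = global_flow(blob(32.0), blob(32.5))
```

Note that, as stated above, no reference image or explicit feature needs to be identified: only gradients and the inter-frame difference are used.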
- the information resulting from the two processing operations can be combined to generate information about the direction of gaze, which can be information about the direction of one eye.
- the information concerning the direction of gaze may result from the addition of the two pieces of information, each being weighted according to a law that may depend on the two pieces of information themselves.
- the first processing makes it possible to obtain information concerning a change of orientation of the direction of gaze, in the reference frame of the camera, by measuring the normal to the pupil.
- the second processing makes it possible to know the differential of movement of one or more points of the sphere modeling the eye in the plane of the camera.
- These two pieces of information come from two independent processing operations. They are combined and weighted to obtain a measure of the final direction of the eye. This weighting is a function of the coherence of the movement at that moment.
- the solution is the rotation enabling the final state at time t + dt, which can be the next image, to be obtained as a function of the two pieces of information.
- the chosen solution is a weighting of the two results as a function of their consistency with the imposed rotation model. Both processing operations can be performed simultaneously or almost simultaneously. This can make it possible to take advantage of their functional and/or topographical complementarity to accelerate or facilitate the processing or to improve its quality.
- One of the pieces of information can be used to correct the other or to help determine it. One can, for example, proceed by interpolation to determine one of them.
- the second processing makes it possible to obtain information on the displacement of the eye in its orbit with less influence of refraction, insofar as the zone of the eye used is preferably situated relatively toward the corner of the eye, for example on the sclera.
- the second processing can produce errors when the eye rotates too fast relative to the capture speed of the optical sensors, hence the advantage of using successive images close together in time, and therefore a high acquisition frequency of the optical sensor, and of the combination with the first processing.
- the center of rotation of the pupil in space is the solution minimizing, for example by the least-squares method, the equation of a sphere, knowing the positions in the camera plane of multiple points of this sphere as well as the normal to the sphere at each of these points.
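The sphere equation is linear in the unknowns once rearranged, so such a least-squares fit can be sketched as follows (an illustrative sketch on synthetic 3D points, not the patent's implementation, which works from camera-plane positions and normals):

```python
import numpy as np

def fit_sphere(pts):
    """Least-squares sphere through 3D points, using the identity
    ||p||^2 = 2 c.p + (r^2 - ||c||^2), linear in center c and the scalar."""
    pts = np.asarray(pts)
    A = np.c_[2 * pts, np.ones(len(pts))]
    b = np.sum(pts ** 2, axis=1)
    sol, *_ = np.linalg.lstsq(A, b, rcond=None)
    center, d = sol[:3], sol[3]
    radius = np.sqrt(d + center @ center)
    return center, radius

# Synthetic points on a sphere of radius 12 (e.g. mm) centered at (1, 2, 3).
rng = np.random.default_rng(0)
v = rng.normal(size=(40, 3))
v /= np.linalg.norm(v, axis=1, keepdims=True)
c, r = fit_sphere(np.array([1.0, 2.0, 3.0]) + 12.0 * v)
```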
- the first processing makes it possible to precisely know the position of the pupil when the movement of the eye is slow or zero compared to the acquisition frequency, an angular velocity corresponding to a movement frequency of, for example, 30 Hz. If the displacement is very fast, especially with an angular velocity corresponding to a frequency greater than or equal to 200 Hz, the image is distorted, appears blurred and scrambled, and the reconstruction is poor.
- the second processing measures a rotation speed.
- the angular orientation is obtained by integrating this speed over time.
- Since the acquisition system is much faster than the movement of the eye, this method allows accurate measurement during very fast movements of the eye. Conversely, it is less useful during slow movements of the eye because the background noise of the image becomes significant relative to the small displacement.
- the combination of the two processing operations is therefore particularly advantageous and makes it possible to obtain both good spatial accuracy and good temporal accuracy of the orientation of the eye, by allowing reciprocal self-correction of the two processing operations.
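This complementary behavior can be illustrated with a toy simulation: a drifting rate measurement (the fast, differential processing) integrated at high frequency and corrected at roughly 30 Hz by an absolute orientation measurement (the slow, static processing). The drift value, frequencies and correction gain are arbitrary assumptions, not figures from the patent.

```python
import numpy as np

# Simulated eye rotation (degrees) over 2 s at 500 Hz.
dt, n = 1.0 / 500, 1000
t = np.arange(n) * dt
true = 10.0 * np.sin(2 * np.pi * 0.5 * t)
# Rate measurement: true angular speed plus a 2 deg/s drift.
rate = np.gradient(true, dt) + 2.0

est_integ, est_fused = 0.0, 0.0
for i in range(1, n):
    est_integ += rate[i] * dt            # integration alone: drifts away
    est_fused += rate[i] * dt
    if i % 16 == 0:                      # absolute measurement at ~30 Hz
        absolute = true[i]               # static orientation (first processing)
        est_fused += 0.2 * (absolute - est_fused)   # weighted correction
```

After two seconds, pure integration has accumulated several degrees of drift, while the fused estimate stays close to the true orientation: the reciprocal self-correction described above.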
- a device carried by the user comprising at least a first camera configured to acquire an image of all or part of an eye of the user.
- the device may also comprise an electronic circuit for processing the images acquired according to the method of the invention, so as to determine a relative movement of the eye with respect to the camera and the evolution of the direction of gaze over time.
- This device can be worn in whole or in part by the user.
- the device may further comprise at least one information-representation system, such as a screen or semi-transparent screen, a projector, a speaker or earpiece, or a vibratory force-feedback system.
- the representation system can, for example, represent visually, graphically, audibly or otherwise the information obtained in step a) or b), or information derived therefrom.
- the device may further comprise at least one sensor of physiological information relating to the wearer of the device.
- This sensor is for example selected from a microphone, a motion or acceleration sensor, an image sensor, one or more electrodes, and a sweat sensor.
- the processing can take place fully integrated into glasses worn by the user, for example, or remotely, thanks to remote transmission of the images for example, or in a mixed manner, partly at the glasses and partly remotely, a preprocessing being for example performed by an electronic circuit carried by the glasses and the information sent to a remote circuit over a wireless link. Preprocessing reduces the data transfer rate.
- the eye or both eyes are preferably illuminated with infrared lighting, in particular with one or more LEDs.
- the light source or sources have, for example, a relatively wide angular aperture, especially greater than 60°, to cover a large area of the eye.
- the light source or sources can be arranged on the sides, especially on the outside, which advantageously makes it possible to minimize the reflections observed by the camera.
- the device may have only one optical sensor to acquire an image of the user's eye.
- the first and only optical sensor can alone provide the images required for both processing operations.
- the successive images captured by the optical sensor (s) may be images of the corner of the eye and / or the pupil and / or the sclera.
- the device may comprise at least one second optical sensor configured to acquire an image of all or part of the eye.
- the first and second optical sensors can allow the acquisition of images of the same eye of the user.
- the use of two sensors for one eye makes it possible to be less dependent on any significant variations in the direction of gaze.
- the first and second optical sensors are each directed to a different part of the same eye of the user, for example the pupil and the sclera.
- the acquisition and processing chain associated with one eye can be duplicated for the other eye.
- the device may comprise at least a third optical sensor carried by the user and configured to acquire an image of all or part of the second eye of the user.
- the device may include a fourth optical sensor carried by the user and configured to acquire an image of all or part of the second eye of the user.
- the third and fourth optical sensors are each directed to a different part of the user's second eye, for example the pupil and the sclera.
- the frequency of closure of the eyelid of the studied eye or the studied eyes can be detected.
- An electromyogram and/or an electrooculogram can be used.
- the closing of one or both eyelids can be used to indicate that the user validates the positioning of the mouse at the target point of the screen, that is to say that he clicks with his eyes on the screen.
- the user may, for example, have to close his eyes strongly enough to indicate that he wishes to validate a choice of position.
- the sensor or sensors, in particular the second and/or fourth sensors, may be sufficiently fast digital sensors, for example operating at a frequency of at least 200 Hz, or even at least 500 Hz, or even at least 1000 Hz, for example at a frequency of about 1500 Hz, or even a frequency of at least 2000 Hz.
- the sensor or sensors, in particular the second and/or fourth sensors, may have a relatively low resolution, for example less than 500 * 500 pixels, or even less than 200 * 200 pixels, for example between 16 * 16 and 100 * 100 pixels.
- An image captured may for example have a resolution of the order of 100 * 100 pixels, or even 32 * 32 pixels, or even 24 * 24 pixels.
- the sensor or sensors may be sensitive to thermal or infrared radiation.
- the sensor or sensors may be associated with one or more LEDs.
- the sensor or sensors may include a camera, in particular an RGB camera.
- the camera or cameras can be thermal.
- the camera (s) can be configured so that the eye is located in the focal plane of the camera.
- the sensor or sensors are preferably arranged on the device off-center with respect to the eye. They are preferably at a distance from a plane passing through the pupil of the corresponding eye.
- the placement of the sensors can provide enough contrast between two acquired images to visualize a difference between the two images.
- the sensors are preferably arranged sufficiently far from the eyelashes.
- the sensor or sensors can capture an area of the eye of a size of about 5 mm by 5 mm.
- the sensor or sensors may for example be arranged at a distance from the eye of between 1 cm and 5 cm, for example of the order of 2 cm. The choice of distance ensures that the sensor is close enough to the eye, and therefore sees only the eye, so as to improve accuracy.
- the light source or sources may be configured to provide light that is preferably not localized, i.e. rather diffuse. Indeed, one seeks to avoid producing on the eye a reflection of the light source.
- the first sensor and the optional third sensor may have speed and resolution characteristics different from the second and fourth sensors.
- the first sensor and the optional third sensor can operate at a frequency of between 10 Hz and 150 Hz, in particular about 30 images/s, and have a resolution of at least 100 * 100 pixels, for example between 300 * 300 pixels and 1280 * 960 pixels.
- the optional second sensor and the optional fourth sensor can operate at a frequency greater than 200 Hz, in particular between 200 Hz and 2000 Hz, or even between 300 Hz and 2000 Hz, in particular equal to 500 Hz, and have a resolution of between 32 * 32 pixels and 140 * 140 pixels.
- the frequency ratio between the two sensors is preferably about 10 or more, i.e., the acquisition frequency of the second sensor is preferably greater than 10 times that of the first sensor.
- the movement of the device relative to a basic module can be determined.
- the basic module can for example be attached to a screen of a computer system.
- By computer system is meant any computer system comprising a screen for interacting with the user, for example a desktop or laptop computer, a tablet, or a landline or mobile phone, this list not being limiting.
- optical beacons or light sources may be disposed on the device, in particular fixed to it, for example on its branches, together with a sensor, for example a camera, arranged on the base module.
- optical beacon here designates a passive element such as an optical reflector or a material reflecting a certain wavelength when it is illuminated by an external source.
- beacons offer an alternative to the presence of active elements, such as light sources of a certain wavelength, belonging to the device.
- the wavelength emitted directly by a light source of the device or reflected by a beacon can be in the visible or the near infrared, the latter being preferable as it is invisible to the human eye.
- This luminous flux emitted directly by the light source of the device or reflected by a beacon can have one or more wavelengths simultaneously or alternately in time, can be polarized linearly, circularly, or in any other way, can be continuous in amplitude or in phase, or modulated in amplitude or phase.
- the movement of the device relative to the base module is determined for example by triangulation.
- at least three points defined by beacons or light sources can be used on the device, together with a camera on the base module.
- the at least three points are not aligned and are arranged in a plane not perpendicular to the optical axis of said camera.
- the at least three points may be arranged in a plane substantially parallel to the optical axis of said camera.
- a corresponding algorithm is described in the article "3D Pose from 3 Matching Points under Weak-Perspective Projection" by T.D. Alter, Massachusetts Institute of Technology, Artificial Intelligence Laboratory, AI Memo No. 1378, July 1992.
- Optical beacons or light sources can be of different shapes.
- They may be point-like, for example of the LED type, their number possibly being between 3 and 9, each point then comprising an optical beacon or a light source.
- They may be of linear shape, curved, straight, continuous or non-continuous, for example comprising a lateral illumination optical fiber.
- Optical beacons or light sources can form an identifiable and unambiguous pattern.
- Sources or beacons are for example arranged to the outside of the head.
- the light sources used to determine the movement of the device relative to the base module can be for example infrared LEDs.
- the camera can be an infrared camera.
- one can for example use two gyroscopes and a distance sensor, one disposed on the device and the other disposed on the base module, which can be attached to a screen.
- the screen is not independent of the device but integral with the latter, so that the determination of the direction of the user's gaze relative to the device is sufficient to determine the direction of gaze of the user relative to the screen.
- the screen is remote from the device.
- the observation point of the user can be determined on a screen to which the base module is fixed, in particular a screen of the computer system mentioned above, from the meeting point of the two visual axes of the user determined from the information concerning the direction of gaze.
- visual axis is meant the axis between the fovea and the optical center of the eye of the user.
- the fovea is an area of the retina slightly offset from the center of the retina, and allowing the best vision of the color.
- the meeting point of the two visual axes thus indicates the point of observation on the screen.
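In practice the two measured visual axes rarely intersect exactly, so the observation point can be taken as the midpoint of the shortest segment between the two rays. A sketch of this standard construction (the eye positions and target are made-up numbers, and the function name is illustrative):

```python
import numpy as np

def gaze_intersection(p1, d1, p2, d2):
    """Midpoint of the shortest segment between two visual axes
    (lines p1 + s*d1 and p2 + u*d2)."""
    d1, d2 = d1 / np.linalg.norm(d1), d2 / np.linalg.norm(d2)
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    w = p1 - p2
    denom = a * c - b * b                 # nonzero for non-parallel axes
    s = (b * (d2 @ w) - c * (d1 @ w)) / denom
    u = (a * (d2 @ w) - b * (d1 @ w)) / denom
    return 0.5 * ((p1 + s * d1) + (p2 + u * d2))

# Two eyes 6 cm apart, both looking at a point on a screen 50 cm away.
target = np.array([5.0, 2.0, 50.0])
left, right = np.array([-3.0, 0.0, 0.0]), np.array([3.0, 0.0, 0.0])
point = gaze_intersection(left, target - left, right, target - right)
```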
- the "optical axis” is the axis between the center of the retina of the user and the optical center of the eye.
- the invention makes it possible to obtain the measurement of the visual axes converging on the screen.
- the measurement of the direction of the optical axes does not call for the reconstruction of the corneal reflection point ("glint"), which introduces a non-linearity to be corrected by a heavy calibration.
- the invention facilitates the determination of the observation point.
- Calibration can be done by asking the user to follow a moving target on the screen, i.e. the eye is forced to track the target.
- the eye physiologically optimizes its movement to minimize it, and uses the same area of the fovea, so that this calibration makes it possible to better define the position of the fovea.
- An alternative is to ask the user to fixate one or more points on the screen at specific times. In either case, the user is warned that a calibration is being performed.
- Another aspect of the invention is a device for determining the direction of a user's gaze, in particular glasses or a headset, intended to be immobilized on the user's head, for controlling a computer system, comprising:
- At least one first optical sensor configured to acquire an image of all or part of an eye of the user
- the wireless transmission of data makes it necessary to reduce the volume of data to be transmitted.
- the wireless transmitter can have receiver functions and be a transmitter / receiver.
- the transmitted data can be analog or digital. These are, for example, data of the audio or video type.
- the processor speeds up the final processing of data through preprocessing.
- the use of an on-board processor makes it possible to pre-process the data obtained from the sensors and thus to reduce the amount of data to be transmitted to the basic module, thereby speeding up the process.
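As a rough illustration of the benefit of on-board preprocessing, using the sensor figures given above (a 100 * 100 pixel sensor at 500 Hz) and assuming, purely for the sake of the example, 8 bits per pixel for raw frames and a preprocessed stream of two 32-bit angles per sample:

```python
# Raw bandwidth of the fast sensor vs. the preprocessed gaze stream.
fast_sensor = 100 * 100 * 500   # pixels/frame * frames/s -> bytes/s at 1 B/pixel
gaze_stream = 2 * 4 * 500       # two 32-bit angles per sample at 500 Hz
reduction = fast_sensor / gaze_stream
```

Even this crude estimate shows a reduction of three orders of magnitude in the data to be transmitted over the wireless link.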
- the processor may include at least some of the previously mentioned algorithms necessary for data processing.
- the device is preferably positioned on the nose and / or the ears of the user, similarly to a pair of glasses.
- the device may comprise a battery giving it sufficient autonomy, for example at least several hours of operation without recharging, or even at least a day.
- the use of low resolution images makes it possible to reduce the power consumption and thus to increase the autonomy of the device.
- the user's head may be stationary relative to the device.
- a basic module intended to be attached to a screen of a computer system and connected to the latter.
- the device can communicate with the base module by a wireless link.
- the base module can be configured to be usable as soon as it is connected to the computer system. In other words, it can be recognized quickly, easily and automatically by the operating system of the computer system upon connection, or upon restart after the hardware installation ("Plug and Play"). This procedure allows installation with a minimum of user intervention, thus minimizing handling and parameterization errors.
- the assembly may comprise several devices able to communicate with the same basic module.
- FIG. 1 is a schematic and partial perspective view of a device for determining the direction of gaze according to the invention
- FIG. 2 schematically and partially illustrates an assembly comprising the device of FIG. 1,
- FIG. 3 is a block diagram illustrating the method for determining the gaze direction according to the invention
- FIG. 4 illustrates the capture of the images on the eye
- FIG. 5 schematically illustrates an assembly according to the invention in a given environment.
- FIG. 1 shows a user U carrying a device 10 for determining the direction of the gaze, in the form of spectacles worn by the user, comprising branches 11 resting on the ears and a central portion 12 resting on the nose, the glasses 13 of the glasses may include an anti-reflective coating.
- in the example described, the device comprises two infrared LEDs 16 disposed in the central portion 12 on either side of the nose, each directed towards one of the user's eyes, and four RGB cameras 15a, 15b capable of detecting infrared radiation.
- each of the cameras 15a, 15b is arranged and oriented towards one of the user's eyes, being disposed below the eyes on the periphery of the lenses 13 of the device, each offset from the vertical plane passing through the center of the user's pupil.
- for each eye, two cameras 15a, 15b are arranged on either side of it, one on the side of the nose and the other on the side of the branch resting on the corresponding ear, oriented towards the corresponding eye to acquire images of it.
- the cameras 15a are oriented towards the pupil of the user, and the cameras 15b towards the sclera of the user.
- the device 10 also comprises an electronic circuit 17, housed in the example described in one of the branches 11 of the device, this electronic circuit 17 being intended to process the images acquired by the cameras 15a, 15b, as will be described later.
- the device further comprises a battery, not visible in the figure, arranged for example in the second branch of the device and giving it sufficient autonomy to operate without recharging for an acceptable duration, for example several hours, or even a whole day.
- the device further comprises a wireless transmitter, also housed in one of the branches, transmitting the data to a base module 20 attached to a screen 25 of a computer system and connected thereto, as illustrated in FIG. 2.
- the device further comprises light sources 19 which make it possible, thanks to a camera 22 arranged on the base module 20, to determine the movement of the device 10 relative to the base module 20, so as to determine the displacement of the eye relative to the camera and the evolution of the direction of gaze over time.
- Said light sources 19 are, in the example described, infrared LEDs, for example arranged in a non-aligned manner and in a plane that is not perpendicular to an axis of the camera of the base module, as illustrated.
- one of the light sources 19 is disposed above the nose of the user, while the others are arranged on either side of the eyes in the upper part of the device.
- the light sources 19 are oriented outwards, that is to say towards a camera 22 of the base module 20.
- optical beacons are used instead of the light sources 19.
- each of the cameras 15a, 15b captures images I at regular time intervals of the corresponding eye of the user, or more precisely of part of that eye.
- the cameras 15a capture an image A of an area of the eye containing at least part of the pupil, while the cameras 15b capture an at least partial image B of the sclera of the eye, as can be seen in FIG. 4.
- each of the cameras could capture images of both the pupil and the sclera without going beyond the scope of the present invention.
- the images I are processed in a step 40 to perform, at least in part, two separate processing operations (a) and (b) by means of the electronic circuit 17 of the device 10, the processing then being continued in the base module.
- the two processing operations can be performed in the processor on board the device, the data then being transmitted over a wireless link F to the base module, in which the rest of the processing is performed.
- the first processing (a) provides information on the orientation of the eye from the observation of an area of the eye whose appearance varies with the rotation of the eye. This processing can be performed for each captured image. It comprises determining, in step 41, the parameters related to the shape of the ellipse corresponding to the pupil.
- the first processing (a) makes it possible to deduce, in step 42, the optical axes in the reference frame of the user's head.
- the pupil orientation reconstruction steps 41 and 42 comprise selecting the darkest zone of the processed image, isolating the pupil in this image, and then obtaining the contour of the pupil.
- the center of rotation of the pupil is then reconstructed by determining a sphere from measurement points and previously determined normals; the center of rotation and the radius of this sphere are deduced.
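The patent does not specify how the ellipse parameters of step 41 are extracted from the pupil contour; as a hedged illustration (function names hypothetical), a standard algebraic least-squares conic fit could recover the ellipse, and hence its center, from the contour points obtained above:

```python
import numpy as np

def fit_conic(points):
    """Algebraic least-squares fit of a conic a*x^2 + b*x*y + c*y^2
    + d*x + e*y + f = 0 to 2D contour points (N x 2 array)."""
    x, y = points[:, 0], points[:, 1]
    D = np.column_stack([x * x, x * y, y * y, x, y, np.ones_like(x)])
    # The smallest right singular vector minimizes ||D p|| under ||p|| = 1
    _, _, Vt = np.linalg.svd(D)
    return Vt[-1]

def ellipse_center(conic):
    """Center of the conic: the point where both partial derivatives
    vanish, i.e. 2a*x + b*y + d = 0 and b*x + 2c*y + e = 0."""
    a, b, c, d, e, _ = conic
    M = np.array([[2 * a, b], [b, 2 * c]])
    return np.linalg.solve(M, np.array([-d, -e]))
```

The elongation of the fitted ellipse (ratio of its axes) is what varies with the rotation of the eye, which is the observable exploited by processing (a).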
- a second image processing (b) is performed which provides information on the kinematics of the eye by comparing at least two successive images in step 45.
- the change of orientation of the eye is calculated from the set of infinitesimal displacements determined previously.
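The patent does not detail how the infinitesimal displacements of step 45 are measured; one common way to compare two successive images is phase correlation, sketched below under the simplifying assumption of a pure integer-pixel translation between frames (function name hypothetical):

```python
import numpy as np

def frame_shift(prev, curr):
    """Estimate the (dy, dx) translation between two equally sized
    grayscale frames via phase correlation, such that
    np.roll(prev, (dy, dx), axis=(0, 1)) best matches curr."""
    F1, F2 = np.fft.fft2(prev), np.fft.fft2(curr)
    cross = F1 * np.conj(F2)
    cross /= np.abs(cross) + 1e-12          # keep only the phase
    corr = np.abs(np.fft.ifft2(cross))
    peaks = np.unravel_index(np.argmax(corr), corr.shape)
    shift = []
    for p, n in zip(peaks, corr.shape):
        if p > n // 2:                      # wrap to negative shifts
            p -= n
        shift.append(-p)
    return tuple(shift)
```

Accumulating such per-frame shifts over time gives the kinematic information that processing (b) integrates into a change of orientation.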
- the second processing (b) is also used, at step 47, to correct the non-linearities produced by refraction of the pupil image through the cornea, which makes it possible to correct the information of the first processing (a) at step 42.
- the direction of the normal obtained above makes it possible to determine a direction of the optical axis, which can be tainted by error due to corneal refraction and by noise from the reconstruction of the ellipse.
- the angular displacement makes it possible to obtain the variation of the optical axis from any initial direction. This measurement may also be marred by an error related to the resolution of the correlation measurement.
- the direction of the optical axis is obtained by weighting these two measurements at each iteration, taking into account the positioning obtained in the previous step, the three-dimensional reconstruction of the eye, and a parametrization resulting from the calibration.
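The weighting scheme is not disclosed in this excerpt; a minimal sketch, assuming a scalar angle and a fixed blend weight (a complementary-filter style fusion, names hypothetical), illustrates the idea of combining the drift-free but noisy absolute measurement with the low-noise but drifting incremental one:

```python
def fuse_direction(prev_angle, delta_angle, absolute_angle, weight=0.05):
    """Blend the integrated incremental measurement (low noise, drifts
    over time) with the absolute ellipse-based measurement (noisier,
    but drift-free). `weight` is the confidence in the absolute term."""
    predicted = prev_angle + delta_angle        # dead-reckoning update
    return (1.0 - weight) * predicted + weight * absolute_angle
```

In practice the weight would depend on the positioning obtained at the previous step, the three-dimensional reconstruction of the eye, and the calibration parameters, as the text indicates.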
- a processing 50 is performed to determine the position of the user's head relative to the base module, that is to say the movement of the device 10 relative to the base module 20.
- the camera 22 arranged on the base module 20 captures images of the user's head on which the light from at least three points, defined by the optical beacons or light sources 19 of the device 10, is visible. These images are processed to determine the position of the device relative to the base module.
- in a first step 51, the light spots in the image are selected.
- the noise is eliminated so as to ensure that the light spots displayed on the images correspond to the beacons or light sources 19 intended to determine the movement of the device relative to the base module.
- the position of the light spots in the image is then determined and, from these data, in a step 55, the position of the head relative to the base module is calculated.
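Recovering the full 3D pose of the device from the image positions of at least three light sources is a perspective-n-point problem, not detailed in this excerpt. As a simplified planar sketch (assuming the spots stay roughly in a plane facing the camera; function name hypothetical), a least-squares 2D similarity transform between a reference spot layout and the observed one yields in-plane rotation, apparent scale (related to distance), and translation:

```python
import numpy as np

def similarity_from_spots(ref, obs):
    """Least-squares similarity obs ~ s * e^{i*theta} * ref + t, with
    the 2D spot positions encoded as complex numbers.
    Returns (scale, theta, t)."""
    zr = ref[:, 0] + 1j * ref[:, 1]
    zo = obs[:, 0] + 1j * obs[:, 1]
    zr_c, zo_c = zr - zr.mean(), zo - zo.mean()
    a = np.vdot(zr_c, zo_c) / np.vdot(zr_c, zr_c)   # s * e^{i*theta}
    t = zo.mean() - a * zr.mean()
    return abs(a), np.angle(a), t
```

The non-aligned, non-coplanar arrangement of the light sources 19 described above is precisely what makes the full 3D version of this problem well-posed.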
- the four angles of the fovea with respect to the cameras are furthermore determined from physiological data 61, preset parameters 62 and pre-calibrated screen coordinates 63.
- the preset parameters 62 may be device and base module construction parameters, for example the position of the cameras or lighting.
- the coordinates of the point on the screen viewed by the user are calculated at step 80 and transmitted at step 81 to the computer system.
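Step 80 is not detailed in this excerpt; assuming the gaze is available as a ray in the base-module reference frame and the screen as a plane, the viewed point follows from a ray–plane intersection (a hedged sketch, names hypothetical):

```python
import numpy as np

def gaze_point_on_screen(eye_pos, gaze_dir, screen_origin, screen_normal):
    """Intersect the gaze ray with the screen plane.
    Returns the 3D intersection point, or None if the gaze is parallel
    to the screen or the screen lies behind the user."""
    gaze_dir = gaze_dir / np.linalg.norm(gaze_dir)
    denom = gaze_dir @ screen_normal
    if abs(denom) < 1e-9:
        return None                      # gaze parallel to the screen
    lam = ((screen_origin - eye_pos) @ screen_normal) / denom
    if lam < 0:
        return None                      # screen is behind the user
    return eye_pos + lam * gaze_dir
```

The resulting 3D point would then be converted to pixel coordinates using the pre-calibrated screen coordinates 63 mentioned above.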
- Example 1: evolution in a given environment
- the device makes it possible to establish a relationship between the direction of an operator's gaze and a work environment in order to improve the design and ergonomics of this environment.
- the environment may be, for example, an aircraft cockpit, as shown in FIG. 5, a car, a simulator or a multi-screen control environment.
- the assembly 100 of FIG. 5 comprises a device 10 in the form of glasses, not illustrated, communicating by a wireless link with the base module 20, the latter being connected to a computer system 90.
- the assembly 100 is arranged in an environment 200, here an aircraft cockpit.
- the base module 20 can be used as soon as it is connected to the computer system.
- the assembly 100 shown comprises several devices 10 and 10 ', worn for example by the pilot and the co-pilot.
- each device 10, 10' comprises a transmitter/receiver 40 for wireless communication with the base module 20, and an information representation system 60 comprising, in the example illustrated, a semi-transparent screen 62 partially superimposed on the spectacle lenses and an earpiece 63.
- the device 10 also comprises a physiological information sensor 70 in order to evaluate the psychological state of the wearer of the device 10, in particular in an emergency situation generating potential stress.
- Example 2: training and/or gaming
- the device makes it possible to quantify the gaze and the effectiveness of training, in particular to measure an operator's compliance with safety standards in a critical environment.
- the device according to the invention may in particular act as a substitute for the mouse and the keyboard for users with paralysis of the upper limbs.
- binocular measurement, that is to say simultaneous measurement of the direction of the two eyes in the same reference frame, gives access to the vergence of the user, which is a fundamental parameter with applications in the field of ophthalmology or three-dimensional registration in space.
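Given the two simultaneously measured gaze directions, the vergence angle itself is simply the angle between the two unit vectors (a minimal sketch):

```python
import numpy as np

def vergence_angle(left_dir, right_dir):
    """Angle in radians between the two gaze directions, after
    normalizing each to a unit vector."""
    l = left_dir / np.linalg.norm(left_dir)
    r = right_dir / np.linalg.norm(right_dir)
    # Clip guards against arccos domain errors from rounding
    return np.arccos(np.clip(l @ r, -1.0, 1.0))
```

A vergence of zero corresponds to parallel gaze (fixation at infinity); larger angles correspond to nearer fixation points.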
Abstract
Description
Claims
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
FR1253395A FR2989482B1 (en) | 2012-04-12 | 2012-04-12 | METHOD FOR DETERMINING THE DIRECTION OF A USER'S LOOK. |
PCT/IB2013/052930 WO2013153538A1 (en) | 2012-04-12 | 2013-04-12 | Method for determining the direction in which a user is looking |
Publications (1)
Publication Number | Publication Date |
---|---|
EP2836893A1 true EP2836893A1 (en) | 2015-02-18 |
Family
ID=46889151
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP13725787.9A Withdrawn EP2836893A1 (en) | 2012-04-12 | 2013-04-12 | Method for determining the direction in which a user is looking |
Country Status (4)
Country | Link |
---|---|
US (1) | US9715274B2 (en) |
EP (1) | EP2836893A1 (en) |
FR (1) | FR2989482B1 (en) |
WO (1) | WO2013153538A1 (en) |
Families Citing this family (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
FR3011952B1 (en) | 2013-10-14 | 2017-01-27 | Suricog | METHOD OF INTERACTION BY LOOK AND ASSOCIATED DEVICE |
US9563805B2 (en) * | 2014-09-02 | 2017-02-07 | Hong Kong Baptist University | Method and apparatus for eye gaze tracking |
CN104391567B (en) * | 2014-09-30 | 2017-10-31 | 深圳市魔眼科技有限公司 | A kind of 3D hologram dummy object display control method based on tracing of human eye |
US9885887B2 (en) * | 2015-04-22 | 2018-02-06 | Kurt Matthew Gardner | Method of determining eyeglass frame measurements from an image by executing computer-executable instructions stored on a non-transitory computer-readable medium |
US20170168323A1 (en) * | 2015-04-22 | 2017-06-15 | Kurt Matthew Gardner | Method of Determining Eyeglass Fitting Measurements from an Image by Executing Computer-Executable Instructions Stored on a Non-Transitory Computer-Readable Medium |
FR3041230B1 (en) | 2015-09-18 | 2022-04-15 | Suricog | METHOD FOR DETERMINING ANATOMICAL PARAMETERS |
FR3041231B1 (en) | 2015-09-18 | 2017-10-20 | Suricog | PORTABLE SYSTEM COMPRISING A DEFORMABLE SUPPORT |
US10565446B2 (en) * | 2015-09-24 | 2020-02-18 | Tobii Ab | Eye-tracking enabled wearable devices |
CN108700934B (en) | 2015-09-24 | 2022-07-26 | 托比股份公司 | Wearable device capable of eye tracking |
JP2017163180A (en) * | 2016-03-07 | 2017-09-14 | 富士通株式会社 | Deviation determination program, deviation determination method, and information processing device |
US9898082B1 (en) * | 2016-11-01 | 2018-02-20 | Massachusetts Institute Of Technology | Methods and apparatus for eye tracking |
CN106526869A (en) * | 2016-12-14 | 2017-03-22 | 柯轩 | Protective instrument for eyes |
EP3672478A4 (en) | 2017-08-23 | 2021-05-19 | Neurable Inc. | Brain-computer interface with high-speed eye tracking features |
US10664050B2 (en) | 2018-09-21 | 2020-05-26 | Neurable Inc. | Human-computer interface using high-speed and accurate tracking of user interactions |
US11353952B2 (en) | 2018-11-26 | 2022-06-07 | Tobii Ab | Controlling illuminators for optimal glints |
CN111208905A (en) * | 2020-01-08 | 2020-05-29 | 北京未动科技有限公司 | Multi-module sight tracking method and system and sight tracking equipment |
CN111556305B (en) | 2020-05-20 | 2022-04-15 | 京东方科技集团股份有限公司 | Image processing method, VR device, terminal, display system and computer-readable storage medium |
Family Cites Families (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6152563A (en) * | 1998-02-20 | 2000-11-28 | Hutchinson; Thomas E. | Eye gaze direction tracker |
US6113237A (en) | 1999-12-06 | 2000-09-05 | Ober; Jan Krzysztof | Adaptable eye movement measurement device |
EP1405122B1 (en) * | 2000-10-07 | 2007-07-18 | David Dickerson | Device for determining the orientation of an eye |
US6943754B2 (en) | 2002-09-27 | 2005-09-13 | The Boeing Company | Gaze tracking system, eye-tracking assembly and an associated method of calibration |
US20050110950A1 (en) | 2003-03-13 | 2005-05-26 | Thorpe William P. | Saccadic motion sensing |
US7682024B2 (en) | 2003-03-13 | 2010-03-23 | Plant Charles P | Saccadic motion sensing |
US7682026B2 (en) * | 2006-08-22 | 2010-03-23 | Southwest Research Institute | Eye location and gaze detection system and method |
US8269822B2 (en) * | 2007-04-03 | 2012-09-18 | Sony Computer Entertainment America, LLC | Display viewing system and methods for optimizing display view based on active tracking |
DE102008053248A1 (en) | 2008-10-25 | 2010-04-29 | Schaeffler Kg | Rolling bearing cage |
KR101564387B1 (en) | 2009-01-26 | 2015-11-06 | 토비 에이비 | Detection of gaze point assisted by optical reference signals |
WO2010118292A1 (en) * | 2009-04-09 | 2010-10-14 | Dynavox Systems, Llc | Calibration free, motion tolerant eye-gaze direction detector with contextually aware computer interaction and communication methods |
EP2261772A1 (en) * | 2009-06-12 | 2010-12-15 | Star Nav | Method for controlling an input device based on the detection of attitude or eye gaze |
US8531394B2 (en) * | 2010-07-23 | 2013-09-10 | Gregory A. Maltz | Unitized, vision-controlled, wireless eyeglasses transceiver |
AU2011204946C1 (en) | 2011-07-22 | 2012-07-26 | Microsoft Technology Licensing, Llc | Automatic text scrolling on a head-mounted display |
-
2012
- 2012-04-12 FR FR1253395A patent/FR2989482B1/en not_active Expired - Fee Related
-
2013
- 2013-04-12 WO PCT/IB2013/052930 patent/WO2013153538A1/en active Application Filing
- 2013-04-12 US US14/391,579 patent/US9715274B2/en active Active
- 2013-04-12 EP EP13725787.9A patent/EP2836893A1/en not_active Withdrawn
Non-Patent Citations (2)
Title |
---|
None * |
See also references of WO2013153538A1 * |
Also Published As
Publication number | Publication date |
---|---|
US9715274B2 (en) | 2017-07-25 |
FR2989482B1 (en) | 2022-12-23 |
FR2989482A1 (en) | 2013-10-18 |
WO2013153538A1 (en) | 2013-10-17 |
US20150042558A1 (en) | 2015-02-12 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2013153538A1 (en) | Method for determining the direction in which a user is looking | |
US20220061660A1 (en) | Method for Determining at Least One Parameter of Two Eyes by Setting Data Rates and Optical Measuring Device | |
EP2498671B1 (en) | Method and device for automatically measuring at least one refractive characteristic of both eyes of person | |
EP0596868B1 (en) | Eye tracking method using an image pickup apparatus | |
KR20180115285A (en) | Spherical specular tracking of cornea to create eye model | |
US20140055747A1 (en) | Optical Measuring Device and Method for Capturing at Least One Parameter of at Least One Eye Wherein an Illumination Characteristic is Adjustable | |
US20180336720A1 (en) | Systems and Methods For Generating and Using Three-Dimensional Images | |
EP3092616A1 (en) | Mapping glints to light sources | |
EP0733338A1 (en) | Device for measuring the position of the fixation point of an eye on a screen | |
KR101094766B1 (en) | Apparatus and mehtod for tracking eye | |
JPH0782539B2 (en) | Pupil imager | |
WO2018154272A1 (en) | Systems and methods for obtaining information about the face and eyes of a subject | |
JP7165994B2 (en) | Methods and devices for collecting eye measurements | |
EP0547931B1 (en) | Process and apparatus to measure the movements of the eyes | |
EP3758578B1 (en) | Device for exploring the visual system | |
WO2020128173A1 (en) | Medical device for improving environmental perception for blind or visually impaired users | |
CA3216150A1 (en) | Method for simulating optical products |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase |
Free format text: ORIGINAL CODE: 0009012 |
|
17P | Request for examination filed |
Effective date: 20141009 |
|
AK | Designated contracting states |
Kind code of ref document: A1 Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
AX | Request for extension of the european patent |
Extension state: BA ME |
|
RAP1 | Party data changed (applicant data changed or rights of an application transferred) |
Owner name: SURICOG |
|
RIN1 | Information on inventor provided before grant (corrected) |
Inventor name: SURICOG |
|
DAX | Request for extension of the european patent (deleted) | ||
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: EXAMINATION IS IN PROGRESS |
|
17Q | First examination report despatched |
Effective date: 20180723 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN |
|
18D | Application deemed to be withdrawn |
Effective date: 20190205 |