EP3583542A1 - Device and method for real-time gaze analysis - Google Patents

Device and method for real-time gaze analysis

Info

Publication number
EP3583542A1
Authority
EP
European Patent Office
Prior art keywords
subject
scene
gaze
eye
eye tracking
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP18704561.2A
Other languages
English (en)
French (fr)
Inventor
Yannick James
Serge Couvet
Pascal PEYRONNET
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Thales SA
Original Assignee
Thales SA
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Thales SA filed Critical Thales SA
Publication of EP3583542A1
Legal status: Withdrawn

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16Human faces, e.g. facial parts, sketches or expressions
    • G06V40/161Detection; Localisation; Normalisation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/18Eye characteristics, e.g. of the iris
    • G06V40/193Preprocessing; Feature extraction

Definitions

  • the invention relates to the field of gaze analysis, and in particular to a device and a method for capturing, measuring and analyzing a person's gaze in real time.
  • a particularly interesting application is the assessment of the behavior of a subject operating in a cockpit environment, whether a land-vehicle driving position or an aircraft cockpit, airplane or helicopter.
  • Solutions known as REGT (Remote Eye Gaze Tracker), based on installations with cameras fixed relative to a cockpit and facing the subject to be observed, make it possible on the one hand to detect the position and the orientation of the head in the frame of reference of the fixed cameras, and on the other hand to detect the position and size of the pupil and the orientation of the gaze relative to the head of the subject, and then by construction to express these measurements in the frame of reference of the fixed cameras.
  • This type of device is effective as long as the subject remains within a fixed and limited angular coverage, due to the number of fixed cameras to be placed in the environment. It is therefore constrained, both in angular coverage of the head orientation and in the capture volume of the head position, by the number of cameras arranged in the working environment, which in current devices ranges from 4 to 8.
  • also known is a solution combining, on the one hand, an "eye-tracking" device based on cameras fixed on a frame worn by a subject and providing a gaze orientation and a pupil position relative to the frame of reference of the mount, and, on the other hand, a "head-tracking" device based on a camera fixed on the subject's head (so-called "inside-out tracking") with markers placed in the scene where the subject evolves, which determines the position of the head in the frame of reference of these fixed markers.
  • the position of the gaze and of the pupil in the frame of reference of the fixed markers is then calculated.
  • also known is a solution combining, on the one hand, an "eye-tracking" device based on cameras fixed on a frame worn by a subject and providing an orientation of the gaze and the position of the pupil with respect to the frame of reference of the mount, and, on the other hand, a forward-facing camera positioned on the frame, a so-called egocentric camera, providing a video of the field observed by the subject, on which a point locating the position of the gaze can be positioned after processing.
  • the video from the egocentric camera can be exploited to perform a registration process in the scene without markers. Even if such a device makes it possible to perform a registration in the frame of reference of the scene, it nevertheless does not operate in real time but in deferred processing.
  • there is thus a need for a device and a method for capturing, measuring and analyzing the gaze that provides, in real time, the position of a point viewed by a subject, in the frame of reference of a scene, without requiring any particular intervention or installation that modifies or alters the environment in which the subject evolves (a cockpit or a driving position).
  • the present invention proposes to meet the aforementioned needs.
  • An object of the present invention is to provide a device and a method allowing the real-time capture, measurement and analysis of the gaze of a subject.
  • the general principle of the invention consists in performing a 3D and real-time registration in the scene where a subject evolves without adding markers, and without modifying the environment.
  • the device of the invention does not use markers added to the environment and makes it possible to exploit characteristic points naturally present in cockpit environments, aircraft cabins or flight decks, such as material contrasts, the button panels, the equipment, etc.
  • the registration in the frame of reference of the scene is performed by exploiting a video stream from egocentric cameras integrated in the device of the invention.
  • the invention thus relates to a lightweight, frame-type device that can be worn by a subject, equipped with cameras that follow the gaze of the subject wearing the frame and with one or more egocentric cameras oriented towards the outside environment, coupled to a data processing module or computer able to analyze and combine in real time the information from the camera captures and to perform a 3D registration in the frame of reference of the scene observed by the subject.
  • the invention will find advantageous applications in the fields of crew control, called “Crew Monitoring", as a new service for the control, analysis and evaluation of the behavior of a crew.
  • the device of the invention can be used in existing driving simulators without having to degrade or alter the capabilities and certification of the simulator.
  • the invention thus relates to a device for analyzing the gaze of a subject observing from an aircraft cockpit a three-dimensional scene, the device comprising:
  • an eye tracking system capable of providing information on the direction of at least one eye of the subject in a measurement frame linked to the device
  • a head posture measurement system capable of providing information on the position and orientation of the subject's head in a measurement frame related to the scene observed by the subject;
  • a video capture system synchronized with the eye tracking and head posture measurement systems, capable of capturing an instantaneous video stream of the scene observed by the subject;
  • a data processing system in communication with the eye tracking, head posture measurement and video capture systems, configured to combine the information produced by the eye tracking and head posture measurement systems to compute in real time the direction of the gaze of the subject in the measurement frame related to the scene observed by the subject, and to reproject in real time, in the captured video stream, a marker corresponding to the position of an impact point of the calculated direction of the subject's gaze.
  • the device of the invention may comprise:
  • an eye tracking system which comprises at least one camera directed towards the pupil of at least one eye allowing the capture of the position of the eye;
  • a head posture measurement system which includes at least one egocentric camera directed to a predefined area of registration in the scene observed by the subject;
  • a video capture system that includes an egocentric forward-facing camera for capturing video from the point of view of the subject;
  • an eye tracking system, a head posture measurement system and a video capture system integrated on an eyeglass-type frame or on a headband-type frame;
  • a data processing system coupled, by wire or wirelessly, to the eye tracking, head posture measurement and video capture systems;
  • a data processing system which comprises a processor configured to synchronize the information provided by the eye tracking and head posture measurement systems with the video capture system.
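  • As a purely illustrative sketch of such a synchronization (the class and function names below are hypothetical, not taken from the patent), asynchronous eye, head-posture and video samples can be aligned by timestamp, using the video stream (the lowest rate) as the reference clock:

```python
from bisect import bisect_left
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class Sample:
    t: float       # timestamp in seconds
    data: object   # payload: eye direction, head pose, or video frame

def nearest(stream: List[Sample], t: float) -> Sample:
    """Return the sample of `stream` whose timestamp is closest to t."""
    times = [s.t for s in stream]
    i = bisect_left(times, t)
    candidates = stream[max(0, i - 1):i + 1]
    return min(candidates, key=lambda s: abs(s.t - t))

def synchronize(eye: List[Sample], head: List[Sample],
                video: List[Sample]) -> List[Tuple[Sample, Sample, Sample]]:
    """For each video frame, pick the eye and head-pose samples closest in time."""
    return [(nearest(eye, f.t), nearest(head, f.t), f) for f in video]
```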
  • the remote viewing station comprises a human-machine interface capable of displaying the scene in the form of a three-dimensional model.
  • the point that is calculated by the processor is displayed on the three-dimensional model.
  • generating, by a head posture measurement system, information on the position and orientation of the subject's head in a measurement frame linked to the scene observed by the subject;
  • the step of generating information on the direction of at least one eye comprises the use of a "mapping" algorithm.
  • the step of generating information on the position and orientation of the subject's head includes the use of a simultaneous localization and mapping algorithm;
  • the step of generating information on the position and orientation of the subject's head includes the use of a Kalman filter sensor fusion algorithm
  • the step of calculating the direction of gaze of the subject in the measurement frame related to the observed scene consists in determining a point of the scene viewed by the subject, and the reprojecting step consists in determining in real time, from the calculated gaze direction of the subject, the 3D coordinates of an impact point in the three-dimensional model of the scene;
  • an initial step including a modeling step and a calibration step is performed.
  • the modeling step generates a three-dimensional model of the scene
  • the calibration step calibrates the eye tracking, head posture measurement and video capture systems.
  • the invention also covers a computer program product, said computer program comprising code instructions for performing the steps of the method of real-time analysis of the gaze of a subject observing a scene, when said program is running on a computer.
  • Figure 1 schematically illustrates the device of the invention according to a spectacle embodiment
  • Figure 2 schematically illustrates the device of the invention according to a headband embodiment
  • Figure 3 shows a sequence of steps of the method of capturing a scene according to one embodiment
  • FIG. 4 shows a sequence of steps of the method of the invention according to one embodiment
  • Figure 5 schematically illustrates the use of the device of the invention according to Figure 1 in a cockpit environment.
  • the problem solved by the invention is that of providing a lightweight device for capturing, measuring and analyzing in real time the gaze of a subject, without resorting to additions of hardware or markers in the environment in which the subject evolves.
  • the lightness of the device and its non-intrusiveness in the scene make it easy to integrate into any training and instruction environment, even in aircraft in flight.
  • the device also makes it possible, in the context of a training or instruction session, to display the impact of the gaze direction on the scene as seen by an observer, who can be located at any position and orientation in or out of the scene.
  • the real-time rendering of the subject's gaze analysis provides an understanding of the distribution of the areas observed, the sequence of the subject's actions and his mastery of the procedures and of the situation.
  • real-time rendering is defined as the ability to restore the information in a time that allows an instructor to react on the spot during a training or instruction exercise, and to act on the current scenario.
  • the order of magnitude of this real-time capacity can vary from a few tens of milliseconds to a few seconds (or even minutes) depending on the envisaged playback device and on the quantities and indicators used to convey the relevant information to the instructor. Calculations on each datum and at each step of the process have a duration equal to or less than the sampling period (typically 60 samples per second or less), and the entire process does not exceed three sampling cycles.
  • the restitution can comprise both raw parameters and parameters composed from the raw parameters and computed over time windows. It allows the instructor to have information enabling him to react directly during the exercise or session and to make changes to the exercise, recommendations, alerts, etc., thus making it easier to adapt the training session to the student's behavior and to provide advice.
  • the device of the invention can be implemented as a system portable by a subject, which can take various forms. Two examples are described to explain the principles of the invention, but they are not limiting, and other embodiments may be provided.
  • Figure 1 schematically illustrates a device (100) for real-time measurement of the gaze according to a spectacles-type embodiment of the invention.
  • the shape and design of the spectacles shown in FIG. 1 are simplified to allow a clear description of the characteristics of the device of the invention.
  • a pair of spectacles generally comprises two zones in a front part (102) of the frame (for inserting lenses in the case of corrective or sun glasses), and temples (104) which are connected to the front part.
  • the front portion (102) generally comprises a central bridge (106) which bears against the nose.
  • the configuration of the spectacles allows them to be worn stably by a subject.
  • the elements can be made of plastic or metal or other materials for the insertion of sensors and other components of the device.
  • the device (100) includes an eye tracking system capable of measuring the direction of each eye, consisting of a camera (108, 110) for tracking each eye.
  • Each camera is positioned on the spectacles so as to be directed towards the pupil and allow the capture of the respective position of an eye.
  • the cameras are arranged on the lower part of the frame.
  • the eye tracking cameras are cameras configured to use the visible spectrum in the case of a measurement method based on image analysis without a specific light source. Alternatively, they can be configured to use the near-infrared spectrum in the case of the use of an external light source (a source provided for example by infrared LEDs (Light Emitting Diodes) generating reflections or "glints") to obtain more precise measurements.
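  • By way of illustration only, a minimal pupil-localization step of the kind such an eye camera could feed might look as follows (Python with OpenCV); the fixed threshold and the assumption that the pupil is the largest dark blob are simplifications for the sketch, not the patented method:

```python
import cv2
import numpy as np

def pupil_center(eye_image_gray: np.ndarray, dark_threshold: int = 40):
    """Estimate the pupil centre as the centroid of the largest dark blob.

    eye_image_gray: 8-bit grayscale image from an eye-tracking camera.
    Returns (x, y) in pixel coordinates, or None if no blob is found.
    """
    # The pupil is darker than the iris and sclera: keep only very dark pixels.
    _, mask = cv2.threshold(eye_image_gray, dark_threshold, 255, cv2.THRESH_BINARY_INV)
    mask = cv2.medianBlur(mask, 5)  # remove speckle noise
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    blob = max(contours, key=cv2.contourArea)  # assume the pupil is the largest dark blob
    m = cv2.moments(blob)
    if m["m00"] == 0:
        return None
    return (m["m10"] / m["m00"], m["m01"] / m["m00"])  # centroid of the blob
```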
  • the device (100) also includes a head posture measurement system, which is composed of at least one egocentric camera (114, 116) positioned on the spectacles so as to be directed towards a predefined registration area in the scene observed by the subject. The registration area can be defined as an area at the top of the scene or above the subject's head.
  • the head posture measurement system measures the position and orientation of the subject's head in the frame of reference of the scene, relative to a predefined registration area.
  • the device (100) also comprises an egocentric camera (112) disposed in the central front portion (106) of the frame, on the median plane between the two eyes of the subject.
  • the camera (112) is directed forward to capture video from the observer's point of view.
  • the central camera (112) can also be used to measure the position and orientation of the head, in particular for cases where the focal length is sufficiently short, or for certain scene configurations where the opaque elements occupy a large volume.
  • the egocentric cameras may be short-focal-length cameras in the visible or near-infrared spectral range, or depth (Z) cameras giving scene distance information for each pixel (LIDAR), or light-field cameras giving information on the direction of arrival of the optical ray at each pixel of the camera.
  • an inertial measurement unit ("IMU") (118), composed of a 3-axis gyroscope and a 3-axis accelerometer, can be added to the device of the invention and integrated into the frame.
  • the use of an IMU in the frame may require the addition of a second IMU attached to the cabin to improve the measurements of the movements of the cabin.
  • the difference in measurement between the two IMUs makes it possible to correct the errors of inertial measurement introduced by the movements of the cabin.
  • An alternative to adding a second IMU may be to use real-time position and orientation information of the cabin relative to the inertial reference, provided by another measurement device.
  • the device of the invention also comprises a data processing system in communication with the eye tracking, head posture measurement and video capture systems able to receive and combine the information produced by the devices.
  • the data processing system may be an on-board calculation processor coupled to the various sensors in a wired or wireless manner (according to known technologies such as WiFi or RFID, for example), configured to calculate in real time the direction of the gaze of the subject in the measurement frame linked to the scene observed by the subject.
  • FIG. 2 schematically illustrates in a profile view a device (200) for real-time measurement of the gaze according to an embodiment of the invention of the headband type.
  • the headband is generally held on the subject's head by a rear clamp (202) and a forehead support (204).
  • the device is attached from above to the structure of the headband, and comprises an eye tracking system capable of measuring the direction of each eye, composed of at least one camera (108) directed towards the pupil to follow the movements of an eye and allow the capture of the respective position of the eye.
  • a second camera can be positioned symmetrically to follow the movements of the second eye.
  • the device (200) also includes a head posture measurement system, composed of at least one egocentric camera (114, 116) positioned on the mount so as to be directed towards the top of the scene that the subject will observe, or above his head.
  • the head posture measurement system measures the position and orientation of the subject's head in the frame of reference of the scene.
  • the device (200) also includes an egocentric camera (112) directed forward to capture video from the viewpoint of the observer.
  • the egocentric video stream capturing camera may be disposed in the vertical plane of an eye for monocular type embodiments or positioned in a median vertical plane between the subject's two eyes for binocular type embodiments.
  • Figure 3 shows a sequence of steps of the method (300) of capturing a scene according to one embodiment.
  • a preparation phase (called "off-line"), relating to the environment where a subject will evolve, is performed before the implementation (or "on-line" phase) of the method of measuring the gaze in real time described below. This preparatory phase consists of the capture of the scene that will be observed by the subject and on which his gaze will bear.
  • the method of capturing the scene comprises a step (302) of acquiring a three-dimensional model (3D) of the entire scene. In the case of a cockpit, the entire scene covers the dashboard, the dome light, the pedestal, but also the canopy, the side walls, and the rear of the cockpit.
  • the step (302) allows the acquisition of a precise model, with a precision that can be millimetric for the measurement of the posture of the head, covering the entire operating scene.
  • the 3D model of the scene is subsequently used as a reference virtual object during the step of resetting the posture of the subject's head (detailed with reference to FIG. 4).
  • the acquisition of the 3D model can be performed from a 3D scan.
  • the generated 3D model moreover serves as a synthetic 3D display object on a human-machine interface (HMI), in which it is possible to add elements that may be useful to an instructor or to an operator responsible for exploiting the data analyzed by the processor.
  • in a next step (304) of the method, areas of interest are represented on the 3D model generated in the previous step.
  • the areas of interest are represented as labeled surfaces.
  • the labeled surfaces represent a segmentation of the scene corresponding to elements for which the method of measuring the gaze in real time will determine whether they are instantly viewed.
  • the labeled areas may correspond to parts of the dashboard, to instruments, to areas seen through the canopy, to areas occupied by other crew members, etc. Areas of interest can be defined according to the application and the scene in which the subject is moving.
  • This step (304) makes it possible to obtain a labeled 3D model of the operation scene.
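  • As a purely illustrative sketch (all names and the bounding-box representation are assumptions, not taken from the patent), a labeled segmentation of the scene can be represented in software as areas of interest against which a gaze impact point is tested:

```python
from dataclasses import dataclass
from typing import Optional, Sequence, Tuple

Vec3 = Tuple[float, float, float]

@dataclass
class AreaOfInterest:
    label: str     # e.g. "primary flight display", "overhead panel", "canopy left"
    box_min: Vec3  # axis-aligned bounding box in the scene frame (metres)
    box_max: Vec3

    def contains(self, p: Vec3) -> bool:
        return all(lo <= c <= hi for lo, c, hi in zip(self.box_min, p, self.box_max))

def label_of_impact(point: Vec3, areas: Sequence[AreaOfInterest]) -> Optional[str]:
    """Return the label of the first area of interest containing the gaze impact point."""
    for area in areas:
        if area.contains(point):
            return area.label
    return None
```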
  • the scene capture method also includes a step (306) of acquiring a photographic image of the entire operation scene.
  • Photo-textures are created in a next step (308).
  • so that the photo-textures are as realistic as possible, they are created under several lighting conditions: a daytime atmosphere and a night atmosphere, for example. Other conditions may be provided, such as more or less sunny, cloudy or rainy days, rising-sun and setting-sun atmospheres, night atmospheres with and without moon, etc.
  • the photo-textures are combined with the labeled 3D model (304).
  • the method then makes it possible (312) to generate a photorealistic 3D object in a unified frame of reference of the scene, under different lighting conditions.
  • This object is then in direct coherence with the measurement of gaze that is performed according to the method of the invention in this same frame of the scene.
  • the model can be transmitted to a remote station and thus, in the operational phase, be observed by an instructor whatever the angle of view and the position of the observation point.
  • the preparation phase also consists of a calibration phase of the device of the invention.
  • the calibration covers the calibration of hardware and software components. It consists in calibrating the cameras directed at the pupil, the egocentric camera(s), and the system measuring the position and orientation of the head in the frame of reference of the scene.
  • the calibration also covers the inertial measurement unit of the mount or head and the inertial measurement unit attached to the scene.
  • the calibration also bears (via the position of the pupil and of the "glints", and possibly via an initial theoretical eye model) on the algorithm that measures the direction of gaze in the frame of reference of the mount, and on the algorithm that transforms the measurement of the gaze direction from the frame of reference of the mount or head into the frame of reference of the scene.
  • FIG. 4 shows, according to one embodiment, a sequence of steps of the method (400) of the invention for measuring the gaze in real time.
  • the calculation steps are performed by the on-board data processing system and, according to the principle of the invention, the method makes it possible, after the instantaneous capture of a set of gaze parameters, to register this capture with respect to the frame of reference of the scene in which the subject whose gaze is measured evolves.
  • the method allows the capture of the position of the eyes (or of at least one eye) with respect to the mount, using the corresponding cameras (108, 110) positioned in the lower part of the mount.
  • the capture of the data is preferably at a frequency of 60 or 120 Hz.
  • the captured information concerns not only the direction of the eyes but also intrinsic parameters of the eye, such as the eyelid opening, the dilation of the pupil, blinking, etc.
  • This step provides information on the direction of the eyes in the frame of reference of the mount (glasses or other support constituting the reference frame worn by the subject).
  • the calculation of the direction of the eyes can be performed according to various algorithmic techniques, for example those called "2D-2D mapping", "3D-3D" or "2D-3D", described in the article by Mansouryar et al., 2016, "3D Gaze Estimation from 2D Pupil Positions on Monocular Head-Mounted Eye Trackers".
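  • The "2D-2D mapping" family of techniques is commonly implemented as a polynomial regression from pupil positions to calibration-target positions. The following NumPy sketch illustrates that generic idea and is not the specific algorithm claimed in the patent:

```python
import numpy as np

def _features(px: np.ndarray, py: np.ndarray) -> np.ndarray:
    # Second-order polynomial features of the pupil position.
    return np.stack([np.ones_like(px), px, py, px * py, px ** 2, py ** 2], axis=-1)

def fit_2d2d_mapping(pupil_xy: np.ndarray, target_xy: np.ndarray) -> np.ndarray:
    """Fit coefficients mapping pupil positions (N, 2) to gaze calibration targets (N, 2)."""
    A = _features(pupil_xy[:, 0], pupil_xy[:, 1])            # (N, 6) design matrix
    coeffs, *_ = np.linalg.lstsq(A, target_xy, rcond=None)   # (6, 2) least-squares solution
    return coeffs

def apply_2d2d_mapping(coeffs: np.ndarray, pupil_xy: np.ndarray) -> np.ndarray:
    """Map pupil positions (N, 2) to estimated gaze positions (N, 2)."""
    return _features(pupil_xy[:, 0], pupil_xy[:, 1]) @ coeffs
```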
  • in a step (404), the method makes it possible to capture the videos of the egocentric cameras (114, 116) of the head posture measurement system installed on the frame, in order to calculate in real time the posture of the mount, and therefore of the head (in orientation and position), in the frame of reference of the scene.
  • Different known methods or algorithms can be used to perform this calculation.
  • a first technique is based on simultaneous localization and mapping, for example the "SLAM" (Simultaneous Localization and Mapping) method or the "CML" (Concurrent Mapping and Localization) method, which can be assisted and accelerated using the 3D model of the scene already built.
  • Another approach may be to rely solely on the photorealistic 3D model and perform a pose calculation without using a localization based on the history of the video images, but instead a localization based on a single video image.
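  • A single-image pose computation against the known 3D model typically reduces to a Perspective-n-Point (PnP) problem: 3D points of the cockpit model are matched to 2D detections in the egocentric image and the camera (hence head) pose is recovered. The OpenCV-based sketch below illustrates this generic step; the 2D-3D matching itself is assumed to be done elsewhere:

```python
import cv2
import numpy as np

def head_pose_from_single_image(model_points_3d: np.ndarray,
                                image_points_2d: np.ndarray,
                                camera_matrix: np.ndarray,
                                dist_coeffs: np.ndarray):
    """Estimate the egocentric-camera pose in the scene frame from 2D-3D correspondences.

    model_points_3d: (N, 3) points of the cockpit 3D model, expressed in the scene frame.
    image_points_2d: (N, 2) corresponding detections in the egocentric image.
    Returns (R, t) such that X_camera = R @ X_scene + t, or None on failure.
    """
    ok, rvec, tvec = cv2.solvePnP(model_points_3d.astype(np.float64),
                                  image_points_2d.astype(np.float64),
                                  camera_matrix, dist_coeffs,
                                  flags=cv2.SOLVEPNP_ITERATIVE)
    if not ok:
        return None
    R, _ = cv2.Rodrigues(rvec)  # rotation vector -> 3x3 rotation matrix
    return R, tvec.reshape(3)
```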
  • the use of a Kalman-filter sensor-fusion algorithm makes it possible to obtain a precision equal to that obtained by calculating the posture of the head from the video camera, but augmented with a response time and a sampling frequency equivalent to those of an inertial unit.
  • this step (404) makes it possible to provide information on the posture of the head (or of the frame carried by the subject) in the frame of reference of the scene.
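  • As an illustration of the fusion idea (accurate but low-rate camera pose, corrected at high rate by gyroscope integration), the following deliberately simplified sketch uses a per-axis complementary filter; the patent mentions a Kalman filter, and a real implementation would work on quaternions or rotation matrices rather than independent angles:

```python
import numpy as np

class OrientationFuser:
    """Toy fusion of a low-rate camera orientation estimate with high-rate gyro data.

    Per-axis complementary filter, for illustration only: the patent mentions a
    Kalman-filter fusion, and a real system would work on quaternions or SO(3).
    """

    def __init__(self, blend: float = 0.05):
        self.angles = np.zeros(3)  # yaw, pitch, roll in radians, scene frame
        self.blend = blend         # weight given to each camera correction

    def predict_with_gyro(self, gyro_rates: np.ndarray, dt: float) -> np.ndarray:
        """High-rate step (e.g. 200 Hz): integrate the angular rates."""
        self.angles = self.angles + gyro_rates * dt
        return self.angles

    def correct_with_camera(self, camera_angles: np.ndarray) -> np.ndarray:
        """Low-rate step (e.g. 30 Hz): pull the estimate toward the camera measurement."""
        self.angles = (1.0 - self.blend) * self.angles + self.blend * camera_angles
        return self.angles
```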
  • the method allows the video capture by the central egocentric camera (112) from the point of view of the subject.
  • the capture of the video is preferably at a frequency of 24 or 30 Hz, but can also be done at higher frequencies.
  • This step makes it possible to obtain an instantaneous video stream in the direction of the head, the egocentric camera being positioned on the frame.
  • the steps of capturing the parameters relating to the eyes, the head and video are performed by the various sensors integrated into a frame (40).
  • the method makes it possible to perform a real-time calculation of a change of reference of the direction of the gaze.
  • Using the information from the eye, head and video sensors, the method makes it possible to register the two eye direction vectors calculated during the eye-direction determination step, in order to determine the direction of the gaze in the frame of reference of the scene and to determine a viewed point.
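  • The change of reference frame essentially amounts to rotating the eye-direction vector, measured in the frame of the mount, by the head/mount orientation and anchoring it at the eye position in the scene. A minimal sketch, assuming the head pose is available as a rotation matrix and a translation mapping mount coordinates to scene coordinates:

```python
import numpy as np

def gaze_in_scene_frame(eye_dir_mount: np.ndarray,
                        eye_origin_mount: np.ndarray,
                        R_scene_mount: np.ndarray,
                        t_scene_mount: np.ndarray):
    """Express the gaze ray (origin, direction) in the frame of reference of the scene.

    eye_dir_mount:    unit gaze direction in the mount frame, shape (3,).
    eye_origin_mount: eye position in the mount frame, shape (3,).
    R_scene_mount, t_scene_mount: head/mount pose such that
    X_scene = R_scene_mount @ X_mount + t_scene_mount.
    """
    origin_scene = R_scene_mount @ eye_origin_mount + t_scene_mount
    direction_scene = R_scene_mount @ eye_dir_mount
    direction_scene = direction_scene / np.linalg.norm(direction_scene)
    return origin_scene, direction_scene
```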
  • Step (408) is a real-time reprojection of a marker into the video stream.
  • the reprojection, which consists in converting the coordinates of a point in one system into coordinates in another system, is performed in real time.
  • the direction of the eyes is projected into the 3D model, which gives an impact point of coordinates (X, Y, Z) in the three-dimensional model. From this 3D impact point, the equivalent 2D coordinates (X, Y) are calculated in the video image.
  • a marker is embedded in real time in the video stream of the scene, at the calculated position (X, Y) (410).
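  • For illustration, once the 3D impact point is known in the scene frame, its reprojection into the egocentric video image and the drawing of a marker can be sketched with OpenCV as follows (the function name and argument layout are assumptions):

```python
import cv2
import numpy as np

def draw_gaze_marker(frame: np.ndarray, impact_3d: np.ndarray,
                     rvec: np.ndarray, tvec: np.ndarray,
                     camera_matrix: np.ndarray, dist_coeffs: np.ndarray) -> np.ndarray:
    """Reproject a 3D gaze impact point (scene frame) into the egocentric video frame.

    rvec, tvec: pose of the egocentric camera with respect to the scene frame,
    as produced by the head-posture step (e.g. by solvePnP).
    """
    pts_2d, _ = cv2.projectPoints(impact_3d.reshape(1, 1, 3).astype(np.float64),
                                  rvec, tvec, camera_matrix, dist_coeffs)
    x, y = pts_2d.ravel()
    cv2.drawMarker(frame, (int(round(x)), int(round(y))), color=(0, 0, 255),
                   markerType=cv2.MARKER_CROSS, markerSize=20, thickness=2)
    return frame
```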
  • the proposed method makes it possible, by real-time processing, to determine the position of the gaze of a subject, unlike most known methods for worn devices, which do not perform the restitution of an observed point in real time but in a phase subsequent to the session, in deferred processing.
  • the method of the invention allows a registration of the position of the viewed point in the frame of reference of the scene, unlike most known methods for worn devices, which register the viewed point in an image in the local frame of reference of the mount.
  • the real-time method of the invention operates without resorting to the installation of elements or markers or hardware (sensor, camera) modifying the observed scene.
  • the use of an egocentric camera installed on the mount to perform the registration in the frame of reference of the scene, combined with the processing of the various recorded data, allows, through the processing performed by the processor, a marker-free registration in the frame of reference of the scene.
  • the method allows the real-time embedding of the intersection point of the gaze direction with the scene, both in the video stream of the front egocentric camera and in the photorealistic 3D model displayed on a remote station.
  • the impact of the gaze can be represented on the video symbolically by a point or by any other form, by conventional techniques that the person skilled in the art can apply.
  • the impact of the gaze may be represented by a highlighted point for example or by a highlight of the viewed element (instrument or other), also by conventional techniques.
  • Other useful parameters, such as a heat map of fixations or saccades, can be represented on the interface of the observer (or instructor), who observes a dynamic synthetic 3D scene of the position, with an angle of view that can be modified in real time.
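  • A heat map of fixations can be obtained, for example, by accumulating gaze impact points over a time window into a coarse grid; the following sketch is a generic illustration, not the patent's specific visualization:

```python
import numpy as np

def fixation_heatmap(points_2d: np.ndarray, width: int, height: int,
                     cell: int = 20) -> np.ndarray:
    """Accumulate gaze impact points (N, 2, pixel coordinates) into a coarse heat-map grid."""
    grid = np.zeros((height // cell + 1, width // cell + 1), dtype=np.float64)
    for x, y in points_2d:
        if 0 <= x < width and 0 <= y < height:
            grid[int(y) // cell, int(x) // cell] += 1.0
    if grid.max() > 0:
        grid /= grid.max()  # normalise to [0, 1] for display as a heat map
    return grid
```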
  • FIG. 5 schematically illustrates a cockpit (500), where a pilot (or co-pilot) wears a spectacle-type frame equipped with the device of the invention (100).
  • the cockpit, in the chosen example, includes two pilot seats. Although the illustration shows only a single pilot equipped with a frame, the person skilled in the art can extend the principles described for use by each of the pilots.
  • the cockpit also includes, on a dashboard (502), display screens (504-1 to 504-n), actuators, joysticks and other conventional instruments of a cockpit.
  • the scene considered is constituted by the pilot's environment and covers the dashboard (502), the overhead panel (506), the canopy (508), the side walls (510), the center console (512), the floor and the rear of the cockpit.
  • the areas of interest that are represented as labeled surfaces correspond to elements of the environment, and may be parts of the dashboard, instruments, areas visible behind the canopy, an area occupied by another pilot, etc.
  • the device of the invention worn by the pilot makes it possible, through analysis by the embedded processor (not shown) of the measurements provided by the various sensors (eye tracking, head posture, instantaneous video stream), to determine in real time the direction of the pilot's gaze (illustrated by the black arrow).

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Human Computer Interaction (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Ophthalmology & Optometry (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Eye Examination Apparatus (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Image Analysis (AREA)
EP18704561.2A 2017-02-14 2018-02-14 Vorrichtung und verfahren zur echtzeitblickverfolgungsanalyse Withdrawn EP3583542A1 (de)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
FR1700159A FR3062938B1 (fr) 2017-02-14 2017-02-14 Dispositif et procede d'analyse du regard en temps reel
PCT/EP2018/053676 WO2018149875A1 (fr) 2017-02-14 2018-02-14 Dispositif et procede d'analyse du regard en temps-reel

Publications (1)

Publication Number Publication Date
EP3583542A1 true EP3583542A1 (de) 2019-12-25

Family

ID=59811349

Family Applications (1)

Application Number Title Priority Date Filing Date
EP18704561.2A Withdrawn EP3583542A1 (de) 2017-02-14 2018-02-14 Vorrichtung und verfahren zur echtzeitblickverfolgungsanalyse

Country Status (4)

Country Link
EP (1) EP3583542A1 (de)
AU (1) AU2018222619A1 (de)
FR (1) FR3062938B1 (de)
WO (1) WO2018149875A1 (de)

Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019154510A1 (en) 2018-02-09 2019-08-15 Pupil Labs Gmbh Devices, systems and methods for predicting gaze-related parameters
EP3750029A1 (de) 2018-02-09 2020-12-16 Pupil Labs GmbH Vorrichtungen, systeme und verfahren zur vorhersage von parametern im zusammen mit dem blick unter verwendung eines neuronalen netzes
WO2019154509A1 (en) 2018-02-09 2019-08-15 Pupil Labs Gmbh Devices, systems and methods for predicting gaze-related parameters
WO2020147948A1 (en) 2019-01-16 2020-07-23 Pupil Labs Gmbh Methods for generating calibration data for head-wearable devices and eye tracking system
WO2020244752A1 (en) 2019-06-05 2020-12-10 Pupil Labs Gmbh Devices, systems and methods for predicting gaze-related parameters
FR3098389A1 (fr) * 2019-07-11 2021-01-15 Thales Dispositif et procede d'analyse du comportement d'un sujet
CN111680546A (zh) * 2020-04-26 2020-09-18 北京三快在线科技有限公司 注意力检测方法、装置、电子设备及存储介质
CN112633128A (zh) * 2020-12-18 2021-04-09 上海影创信息科技有限公司 余光区域中感兴趣对象信息的推送方法和系统
CN114415832B (zh) * 2022-01-07 2023-08-04 中山大学 一种战斗机头盔显示瞄准系统、方法、设备及终端

Also Published As

Publication number Publication date
WO2018149875A1 (fr) 2018-08-23
FR3062938B1 (fr) 2021-10-08
AU2018222619A1 (en) 2019-09-05
FR3062938A1 (fr) 2018-08-17

Similar Documents

Publication Publication Date Title
EP3583542A1 (de) Vorrichtung und verfahren zur echtzeitblickverfolgungsanalyse
CN106575039B (zh) 具有确定用户眼镜特性的眼睛跟踪设备的平视显示器
KR102493749B1 (ko) 동적 환경에서의 좌표 프레임의 결정
EP2783252B1 (de) Verfahren zur verwendung von augenverfolgung zur konzentration auf einen bildinhalt auf einer anzeige
US8971570B1 (en) Dual LED usage for glint detection
US11269402B1 (en) User interface interaction paradigms for eyewear device with limited field of view
EP2030193B1 (de) System und verfahren zur anzeige von wartungs- und betriebsanweisungen einer vorrichtung unter verwendung von erweiterter realität
EP2933707B1 (de) Darstellungseinstellung für kopfmontierte Anzeige
CN103558909B (zh) 交互投射显示方法及交互投射显示系统
US10510137B1 (en) Head mounted display (HMD) apparatus with a synthetic targeting system and method of use
FR3011952A1 (fr) Procede d'interaction par le regard et dispositif associe
WO2016077508A1 (en) System for automatic eye tracking calibration of head mounted display device
US20180053055A1 (en) Integrating augmented reality content and thermal imagery
CN105378632A (zh) 用户焦点控制的有向用户输入
US20170263017A1 (en) System and method for tracking gaze position
WO2016130533A1 (en) Dynamic lighting for head mounted device
EP2533095B1 (de) Pilotenassistenzsystem und Luftfahrzeug
US20230359038A1 (en) Eyewear having unsynchronized rolling shutter cameras
US11656471B2 (en) Eyewear including a push-pull lens set
WO2022133219A1 (en) Mixed-reality visor for in-situ vehicular operations training
CN108351689A (zh) 使用全息显示系统的交互方法和系统
US20210373336A1 (en) Systems and methods for providing mixed-reality experiences under low light conditions
US20220365354A1 (en) Segmented illumination display
Hwang et al. A rapport and gait monitoring system using a single head-worn IMU during walk and talk
Botezatu et al. Development of a versatile assistive system for the visually impaired based on sensor fusion

Legal Events

Date Code Title Description
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: UNKNOWN

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE

PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20190808

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

DAV Request for validation of the european patent (deleted)
DAX Request for extension of the european patent (deleted)
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

17Q First examination report despatched

Effective date: 20211020

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20220503