EP3058444A1 - Verfahren zur interaktion durch benutzerblicke und zugehörige vorrichtung - Google Patents

Verfahren zur interaktion durch benutzerblicke und zugehörige vorrichtung

Info

Publication number
EP3058444A1
Authority
EP
European Patent Office
Prior art keywords
user
electronic device
environment
gaze
marker
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP14798990.9A
Other languages
English (en)
French (fr)
Inventor
Marc Massonneau
Marc SWYNGHEDAUW
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Suricog
Original Assignee
Suricog
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Suricog filed Critical Suricog
Publication of EP3058444A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013Eye tracking input arrangements
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/002Specific input/output arrangements not covered by G06F3/01 - G06F3/16
    • G06F3/005Input arrangements through a video camera
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/012Head tracking input arrangements
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/16Sound input; Sound output
    • G06F3/167Audio in a user interface, e.g. using voice commands for navigating, audio feedback
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/18Eye characteristics, e.g. of the iris
    • G06V40/193Preprocessing; Feature extraction
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted
    • G02B2027/0178Eyeglass type

Definitions

  • the present invention relates to a method allowing a user to interact, by gaze, with a real environment in which he moves, as well as to an interaction system and a gaze tracking device for implementing such a method.
  • Known gaze tracking devices make it possible to determine the orientation of the gaze relative to the device (and therefore relative to the head if the device is fixed on the head).
  • It is known to use a Wiimote-type intelligent optical sensor and eyeglass-mounted LED beacons to determine, with respect to a screen, the movements of a user's head, without measuring an absolute position and orientation of the user's head relative to the screen.
  • US patent application 2006/0227211 discloses a system for locating objects in space provided with three visual markers and a gyro sensor. The reconstruction of the position and orientation of the objects is achieved by combining the visual data of a camera and embedded gyroscopic data, but this system does not perform any coupling with a digital map of the environment.
  • the patent application US 2013/0076617 A1 proposes an optical method for identifying and reconstructing the position and orientation of objects in a room in order to establish an interaction link between screens, objects, bodies, gloves or wands.
  • Since this method uses cameras fixed in the environment, the geo-spatialization of objects does not rely on a digital representation of the surrounding space but on luminous markers signalling all the objects likely to interact. Unmarked objects are ignored. This document does not mention interaction between the environment and the user.
  • the US 2012/155713 patent application thus discloses an indoor geo-spatialization system, using passive markers arranged in a room and coding their position in space.
  • the user wears a scene camera and a computational module that recognizes the markers and reconstructs the user's position and orientation relative to these markers, but this document is not concerned with gaze interaction.
  • US Patent 4568159 relates to an on-board tracking system for measuring the direction of gaze in the reference frame of the head.
  • the positioning of the head relative to the screen is deduced from the image seen by a scene camera embedded on the frame; this document does not provide for coupling with a digital cartography of the environment.
  • wearing a scene camera raises the problem of having to embed, together with the camera, a processing unit, a battery and/or a wireless transmission system, which makes the portable device heavier, slows the acquisition rate and data availability, and reduces overall battery life.
  • real-time recognition of objects or areas of interest by image processing requires computing power that is not yet available on embedded processors.
  • the camera is not aligned with the visual axis of the user's eyes and cannot satisfactorily replace a gaze tracking device because of parallax errors.
  • US Patent 5016282 discloses a tracking system using a camera not worn by the user, wherein the positioning of the head is determined by three markers positioned in a triangle on glasses.
  • Since the structure of the marker pattern is non-coding, the markers are passive and the system is not embedded, the mobility of the user is greatly reduced.
  • US patent application 2009/0196460 A1 describes a system combining a measurement of the direction of gaze using a lens provided with an infrared reflector carried by the eye, and a measurement of the direction/position of the head relative to an external module using three passive markers placed on the user's forehead and two stereoscopic cameras. This device does not provide for interaction with a three-dimensional environment.
  • the application EP 1 195 574 A1 describes a method in which an audio message is generated when, in a cockpit, a pilot is looking in a reference direction, in a reference frame linked to the cockpit. This application does not describe storing a digital representation of the cockpit.
  • the invention aims to meet all or part of the aforementioned needs. It achieves this by means of a system for locating the position and detecting the orientation of one or more mobile users with respect to a physical environment, thanks to one or more electronic recognition and calculation devices making it possible to establish an interaction relationship between the gaze of these users and regions of interest of the environment or of other users present in the environment.
  • the invention makes it possible to establish an interaction link between the user's gaze and real objects or virtual objects represented on a display system of the environment.
  • the display system is, for example, of a known shape and type: a TV, computer, mobile or tablet screen, projection onto a wall or screen by a video projector, etc.
  • According to a first aspect, the subject of the invention is thus an electronic device comprising:
  • a receiver of information transmitted by a gaze tracking device comprising a marker forming a visual identification pattern and carried by a user moving in a real environment,
  • the received information providing information on the direction of the user's gaze in a reference frame of the gaze tracking device,
  • at least one camera for acquiring an image of the marker,
  • a memory storing a digital representation of the environment of the user in a frame of reference specific to the environment,
  • a processor configured, on the basis of the information received, of the image and of the digital representation stored in the memory, to determine in real time the direction of the user's gaze relative to the environment and/or the zone of the environment viewed by the user.
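A minimal sketch, in Python, of the frame chaining such a processor could perform, assuming the marker image yields a head-to-device transform and the stored digital representation yields a device-to-environment transform; all function and variable names are illustrative assumptions, not part of the patent.

```python
import numpy as np

def gaze_in_environment(gaze_dir_head,            # unit gaze vector in the gaze-tracker (head) frame
                        R_head_dev, t_head_dev,   # head pose in the device frame, from the marker image
                        R_dev_env, t_dev_env):    # device pose in the environment frame, from the memory
    """Express the measured gaze direction and eye position in the environment frame."""
    gaze_dir_dev = R_head_dev @ np.asarray(gaze_dir_head, dtype=float)
    gaze_dir_env = R_dev_env @ gaze_dir_dev
    eye_pos_env = R_dev_env @ t_head_dev + t_dev_env   # head-frame origin mapped to the environment
    return eye_pos_env, gaze_dir_env / np.linalg.norm(gaze_dir_env)
```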
  • By "environment" is meant a delimited space in which the user moves.
  • the environment can be indoor or outdoor.
  • the environment is, for example, a closed space such as a room, a shed, a cockpit or a vehicle interior.
  • the environment may comprise several closed spaces, for example inside a building, in which the user moves and the system then preferably comprises an electronic device for each closed space.
  • the environment may include a display device, including a screen, displaying virtual objects.
  • By "digital representation of the environment in a frame of reference specific to the environment" is meant a set of location data of the object or objects in said frame of reference, these data being known, for example, by means of a modeling method known elsewhere, in particular a CAD model or a three-dimensional scan, translated into a three-dimensional digital representation stored in the memory of the electronic device.
  • the modeling method can use known photogrammetry techniques and algorithms: a camera is moved in said environment to take a set of photos, then a 3D model of the environment is built from a set of remarkable points common to the different shots.
  • the camera used to model the environment may be the camera of the electronic device or a third-party camera.
  • Obtaining a digital model of the environment can thus make use of 3D scanning techniques using at least one depth camera operating in a near-infrared mode, compatible with the emission wavelength of the markers, for example a Kinect®-type camera.
  • the electronic device thus has a digital map of the environment.
  • By "user" is meant a person wearing a gaze tracking device.
  • the user may be sighted, visually impaired or blind.
  • the gaze tracking device can determine the direction of gaze by observing at least one eye of the user, in particular by video-oculography, electro-oculography, "scleral coil" or any other known method.
  • the gaze tracking device can determine the direction of gaze by observing the two eyes of the user, in particular by video-oculography, electro-oculography, "scleral coil" or any other known method.
  • the direction of the user's gaze calculated by the gaze tracking device may correspond to the optical axis of the eye, the visual axis of the eye or the cone of vision of the fovea.
  • the gaze tracking device preferably comprises an embedded, wireless system, in particular mounted on a glasses-type frame, a headset or a head-mounted display (HMD) video device, on the head of a user.
  • A system embedded on the frame makes it possible to determine the direction of the gaze of the user in the reference frame of the gaze tracking device, that is to say a reference frame linked to the head of the user.
  • the gaze tracking device comprises a communication means, in particular wireless, for example of the RF, Bluetooth, Wifi or other type, for sending at least one piece of information to the electronic device according to the invention.
  • the communication means is of the transmitter / receiver type.
  • the term "camera” means any optical sensor comprising at least one camera, in particular an infrared or non-infrared camera, a 3D camera, a set of two stereoscopic calibrated cameras, or a camera equipped with a camera. rotation.
  • the camera may be associated with another sensor or positioning or orientation detection system, for example a GPS device, an accelerometer or a gyroscope.
  • the camera is for example an infrared camera.
  • the camera may comprise a bimodal system, manual or automatic, filtering incident light, which has the advantage of simplifying and accelerating the identification of markers, as well as the calibration of the position of the electronic device relative to its environment, as detailed below.
  • In a first mode, no filtering of the light is performed.
  • the camera is for example sensitive to the wavelengths of the visible range and the near infrared, in particular between 380 and 1000 nm.
  • In a second mode, a filter is placed on the optics of the camera and passes, for example, only near-infrared wavelengths between 700 nm and 1000 nm. Filtering and mode selection can also be done electronically, if necessary.
  • the camera is preferably provided with a two-dimensional optical sensor of the CCD or CMOS type.
  • the camera may be provided with a polarizing filter, the polarization of the light being in particular linear or circular.
  • the electronic device preferably comprises a housing integrating the receiver, the camera, the memory and the processor, which can process in real time the information received from the gaze tracking device and the images of the camera.
  • the electronic device may comprise an autonomous power supply system, in particular integrated into the housing.
  • the electronic device may include a wireless communication system, configured to receive information from the eye tracking device carried by the user or by each user.
  • the wireless communication system can be RF, Wifi, RFID, Bluetooth, without this list being limiting.
  • the wireless communication system in a preferred embodiment is configured to also send data to the gaze tracking device, for example to control the markers to associate them with a particular identifier.
  • the electronic device can be connected by a wired or wireless link to a third party computer.
  • the electronic device can thus exchange spatialization and interaction data with a third party computer.
  • the electronic device can be arranged to be connected, via the computer or directly, to a screen, in particular to display under the action of the processor images or messages.
  • the electronic device may comprise an interface, in particular in the form of a set of inputs and outputs integrated into the housing, for example a USB or FireWire port, an audio, video or other output.
  • the interface or the wireless communication system may allow the exchange of data between the electronic device and various third party systems, including a computer, screen, projector, sound diffusion system, robot, industrial installation ...
  • the electronic device, except during the calibration phase, can be stationary in the environment, in particular fixed to a stationary object of the environment.
  • the electronic device may also be integral with an object moving in the environment, in particular an object arranged to transmit in real time to the electronic device its moving characteristics.
  • "Object" within the meaning of the invention means a geometric shape belonging to the environment and of known modeling, corresponding or not to a physical object.
  • the electronic device is for example integral with a screen, the latter being fixed or mobile.
  • Each user can be associated with a coding pattern, by means of the marker of the gaze tracking device that he carries.
  • the marker can form a pattern that allows the electronic device to both identify and geospatialize the user.
  • the marker can be rigid.
  • the marker may comprise at least one light source emitting in the visible, infrared and / or near infrared range.
  • the light source may in particular be a point light source such as an LED or a side-lighted optical fiber.
  • the emitted light can be polarized linearly or circularly.
  • the light can be modulated in amplitude or frequency.
  • the optical characteristics (wavelength, polarization, modulation) of the markers correspond to the optical characteristics of the camera of the electronic device.
  • the marker may be encoded by amplitude or frequency modulation of the light sources.
  • the marker may be encoded by the geometric pattern formed by a particular arrangement of the light sources.
  • the arrangement of the light sources allows the processor to exploit the projection invariance property which results in measurable invariant values in space and in 2D projections.
  • This property, described in particular by Meer in "Efficient Invariant Representations" [International Journal of Computer Vision 26(2), 137-152 (1998)] and by Bergamasco in "Pi-Tag: A Fast Image-Based Marker Design Based on Projective Invariants" [Machine Vision and Applications, August 2013, Volume 24, Issue 6, pp 1295-1310], is particularly advantageous for simplifying and accelerating the algorithms for recognizing and reconstructing the marker patterns in space. Indeed, depending on the complexity of the sought patterns, this recognition can be very expensive in time and computing power.
  • the image received from the camera comprises a large number of light spots among which the processor must recognize the markers.
  • the marker preferably comprises at least four light sources, in particular at least four collinear point light sources or five coplanar point light sources of which no triplet of points is collinear.
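The cross-ratio of four collinear points is one such projective invariant: it keeps the same value in any 2D projection of the marker, so candidate spot quadruples can be matched against a known pattern before any pose reconstruction. A minimal sketch, assuming Euclidean pixel distances and an arbitrary tolerance (not specified by the patent):

```python
import numpy as np

def cross_ratio(p1, p2, p3, p4):
    """Cross-ratio of four collinear points; invariant under any projective (camera) view."""
    d = lambda a, b: np.linalg.norm(np.asarray(a, dtype=float) - np.asarray(b, dtype=float))
    return (d(p1, p3) * d(p2, p4)) / (d(p1, p4) * d(p2, p3))

def matches_pattern(points, reference_value, tol=0.02):
    """True if a candidate quadruple of image spots matches a known marker pattern."""
    return abs(cross_ratio(*points) - reference_value) < tol
```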
  • the marker may include a non-point light source.
  • the marker may include a side-illuminated optical fiber.
  • the fiber may be partially obscured and in particular form light spots and / or light segments.
  • the marker may in particular form a pattern comprising at least two non-collinear line segments. These segments may or may not form simple structures, in particular polygonal ones (square, rectangle).
  • the subject of the invention is a system of interaction by the gaze between one or more users and a real environment in which they move, the system comprising:
  • a portable eye tracking device identifiable by a visual marker, worn by each user
  • At least one electronic device as described above.
  • the interaction system may comprise a plurality of electronic devices configured to exchange data with each other over a wireless link.
  • the interaction system may comprise an interaction link established between the user's gaze and a visual representation of a digital content, displayed in a known format by a display device of the environment, for example on a screen.
  • the interaction system may comprise an interaction link established between the user's gaze and a zone of the space previously defined as "zone of interest".
  • the "zone viewed" can be determined by the processor, in particular among the predefined areas of interest.
  • the interaction system may include an audio device for transmitting an audio message to the user.
  • the audio device can be worn by the user.
  • the audio device can be a headset or an earpiece.
  • the triggering of the transmission of the audio message and / or its content may be linked at least to the direction of the user's gaze relative to the environment and / or the zone of the environment viewed by the user, and in particular, be linked at least to the existence of an interaction link established between the user's gaze and an area of interest.
  • the interaction system may comprise a button, also called a push button, intended to be actuated by the user to trigger or stop the transmission of the sound message.
  • the interaction system may comprise a light indicator visible to the user, in particular worn by the user, which may be illuminated to signal to the user that he is looking at an area of interest for which an explanatory audio message exists and can be listened to if the user requests it, for example by acting on the push button.
  • the interaction system may include a switch having an on state for activating the audio device and / or the user's portable eye tracking device and an off state for disabling the audio device and / or the portable device for monitoring the gaze of the user.
  • the push button may, for example, trigger the transmission of a sound message relating to an area of interest if, at the same time, it is detected that the user is looking at the area of interest, the switch is in the "on" state, and the user actuates the push button, for example by pressing on it.
  • When the switch is in the "off" state, the audio device and/or the portable eye tracking device is disabled; no audible message can be emitted even when the user presses the push button.
  • the interaction system may comprise an audio device as described above, per user. It may also include a switch per user, and / or a pushbutton per user, and / or a light indicator per user, as described above.
  • Each zone of interest can be identified by a unique identifier known to the memory.
  • the memory may include data associated with at least one area of interest.
  • the zone of interest is for example defined by a part of the space defined by the environment, in particular all or part of a modeled material element, in particular a wall, an opening such as a door, a piece of furniture, an object, a switch, a screen, a light, etc.
  • the area of interest may be mobile and identifiable, or not, by a visual marker.
  • the area of interest may belong to a mobile agent having its own operation, moving in the environment, for example a robot, a drone or a vehicle.
  • the agent may comprise a set of internal sensors (gyroscope, accelerometer, GPS, etc.) expressing a direction and / or a position relative to a specific internal repository.
  • the agent comprises, for example, a robotic arm or a steerable camera, which it controls internally and of which it knows the position and orientation characteristics expressed in its own reference system.
  • the agent comprises a wireless communication means configured to exchange information with the electronic device.
  • the agent comprises a visual marker, in particular carrying a coding pattern and observable by the camera, to enable the electronic device to recognize the agent and determine its position and/or its orientation with respect to the reference frame of the electronic device.
  • the marker of the agent may or may not be of the same type as the marker of the gaze tracking device.
  • the area of interest can be fixed in the environment, that is to say having no intrinsic mobility.
  • a fixed area of interest may not have a visual marker.
  • the environment comprises for example a screen having one or more areas of interest.
  • the area of interest may be a point, an area surrounding a point, a surface or a volume.
  • the zone of interest is for example defined by a rectangular zone such as a frame or a screen anchored on a wall or a table, by a three-dimensional physical object, or by a volume around a marker serving, for example, to schematize a robot or a vehicle carrying the marker.
  • the area of interest may be virtual, for example defined by part or all of the display area of a screen, or by a virtual object displayed on a screen.
  • the electronic device can be connected to a computer giving the instantaneous position of the virtual object which for example appears on the screen, moves and disappears.
  • one or more interaction rules can be associated with the area of interest.
  • the subject of the invention is a portable device for monitoring the gaze, in particular intended to exchange with an electronic device according to the invention, as defined previously, comprising:
  • a frame, particularly of glasses type, for fixing on the head of a user,
  • an on-board computer for determining the direction of the user's gaze in the reference frame of the head
  • a marker having at least four light sources forming a visual identification pattern.
  • the invention also relates to a portable eye tracking device for exchanging with an electronic device according to the invention, comprising: a frame, particularly of glasses type, for fixing on the head of a user,
  • an on-board computer for determining the direction of the user's gaze in the reference frame of the head
  • a marker comprising a lateral illumination optical fiber forming a visual identification pattern.
  • the subject of the invention is a portable eye tracking assembly, in particular for exchanging with an electronic device according to the invention, comprising:
  • one or more snap-fastening and / or clamping systems intended to fix, in particular in a removable manner, at least a part of the portable eye-tracking assembly, in particular the entire portable eye-tracking assembly, on a frame, in particular of eyeglass type, and / or on one or more lenses worn by the frame, for attachment to the head of a user,
  • an on-board computer for determining the direction of the user's gaze in the reference frame of the head
  • a marker comprising in particular at least four light sources or a lateral illumination optical fiber, forming a visual identification pattern
  • the portable eye tracking assembly forming a gaze tracking device, in particular a device according to the invention, when attached to the frame and / or the lens.
  • the portable eye tracking device and/or the portable eye tracking assembly may be, prior to its implementation, calibrated according to the user, for example by asking the user to fixate one or more predefined points at specific times, to follow a moving point, or to fixate a fixed point while making a circular movement of the head.
  • This calibration can be renewed after each fixing of the portable eye tracking assembly on the frame and / or before each use of the portable eye tracking device.
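A minimal sketch of such a user calibration, assuming (this is not specified in the patent) an affine mapping from the measured pupil centre to gaze angles, fitted by least squares over the fixations on the predefined points:

```python
import numpy as np

def fit_gaze_calibration(pupil_xy, target_angles):
    """Fit an affine map pupil position -> gaze angles from fixations on predefined points.

    pupil_xy:      (N, 2) pupil-centre coordinates measured by the eye camera
    target_angles: (N, 2) known gaze angles of the fixated calibration points
    """
    A = np.hstack([pupil_xy, np.ones((len(pupil_xy), 1))])   # add a constant term
    coeffs, *_ = np.linalg.lstsq(A, target_angles, rcond=None)
    return coeffs                                             # shape (3, 2)

def gaze_from_pupil(pupil_xy, coeffs):
    """Apply the fitted calibration to a new pupil measurement."""
    return np.append(pupil_xy, 1.0) @ coeffs
```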
  • the portable eye tracking device and / or the portable eye tracking assembly may include one or more optical sensors.
  • This or these optical sensors may be one or more cameras, in particular one or two cameras, which may be infrared cameras. In the case of several optical sensors, they can be arranged facing the same eye or facing two different eyes once the frame is in place on the head of the user.
  • the optical sensor (s) may be associated with one or more LEDs, notably infrared ones.
  • Calibration may include orientation of the optical sensor (s) to the eye (s), obtaining at least one image, image quality control, and indication to the user that the calibration is correctly carried out, in particular by issuing an audible signal or lighting a warning light.
  • the portable eye tracking device and / or the portable eye tracking assembly may have a plurality of markers such that at least one of them remains visible from the electronic device in the various possible orientation and tracking configurations. user's position in the environment.
  • the portable eye-tracking device and/or the portable eye-tracking unit may comprise two markers fixedly placed relative to one another, for example on the right and left sides of the front part of the glasses serving as a frame.
  • the portable eye tracking device and / or the portable eye tracking assembly may include a battery.
  • the portable gaze tracking device and/or the portable eye tracking assembly may include a transmitter, in particular a wireless one, to transmit to the electronic device information relating to the direction of the gaze of the user.
  • the onboard computer may comprise an electronic circuit fixed or intended to be fixed on the frame.
  • the on-board computer may comprise an electronic circuit fixed or intended to be fixed on the frame and a portable base module carried by the user elsewhere than on the frame.
  • all the portable eye tracking assembly is fixed on the frame.
  • the entire portable eye tracking assembly is fixed to the frame, with the exception of the basic module.
  • the computer and / or one or more optical sensor (s) and / or a battery and / or wireless transmitter may be as described in PCT / IB2013 / 052930.
  • the frame and / or the lens or glasses may be manufactured to accommodate the computer and / or the marker and / or the optical sensor (s) and / or the battery and / or transmitter without wire and / or the one or more latching and / or clamping systems.
  • the mount may in particular include notches provided for the positioning and / or fixing of the snap-fastening system (s) and / or clamping system (s).
  • the portable eye tracking device may comprise a portable eye tracking assembly according to the invention.
  • the portable eye tracking assembly can be mono- or binocular, that is, it is intended to attach to one or two lenses respectively, and / or to the part of the frame that surrounds them.
  • the portable eye tracking assembly can be made to fit a frame and / or one or more glasses of a particular shape.
  • the portable eye tracking assembly can be manufactured to adapt to any type of frame and / or glass (s).
  • the portable eye tracking assembly can be fixed directly to the frame and / or to one or more lenses only.
  • the portable eye tracking unit may in a second variant be attached directly to a frame and / or one or more glasses, and on at least one other device.
  • the portable eye tracking unit may in a third variant be attached indirectly to a frame and / or to one or more lenses via another device.
  • Said other device of the second and third variants may, for example, be an element of smart glasses or of a head-mounted display (HMD) system. Fixing can in particular be carried out via a sensor or a display on the frame or glasses.
  • the portable eye tracking assembly can be attached to the frame and / or the lens or glasses directly or indirectly, in particular via at least one intermediate piece.
  • When the portable eye tracking assembly comprises a clamping system, the latter may or may not be provided with screws.
  • the frame may comprise a front part and two branches.
  • the portable eye tracking assembly may include a snap and / or clamping system attaching to the front of the frame and / or one or more glasses.
  • the portable eye tracking assembly may include a plurality of snap-fastening and / or clamping systems attaching to the front portion of the frame.
  • the portable eye tracking assembly may comprise at least one snap-fastening and / or clamping system attaching to the front part of the frame and / or to one or more glasses and at least one locking system and / or clamping device attaching to at least one branch.
  • the portable eye tracking assembly may comprise at least two latching systems and / or clamping connected by a wire, for example a power supply wire and / or information exchange.
  • the portable eye tracking assembly may comprise a latching and/or clamping system comprising at least one lower relief, in which the bottom of the frame and/or of at least one glass is placed, and at least one upper relief, in which the top of the frame and/or of at least one glass is placed.
  • the portable eye tracking assembly may at least partly conform to the shape of at least a portion of the frame and / or lens (s).
  • the latching and / or clamping system may comprise at least two arms so as to grip a glass and / or the frame between them. One of the two arms can be tilted.
  • the portable eye tracking device may include one or more lenses.
  • the portable eye tracking device can instead be devoid of glasses.
  • the lens or glasses may be transparent, corrective or not, possibly solar.
  • the lens or glasses may be black, especially when the user is visually impaired or blind.
  • the frame may in particular be an eyeglass frame, a visor of a helmet onto which information is projected, or an apparatus fixed to the face by means of a mask, for example of the night-vision apparatus type.

Method of interaction by the gaze
  • the subject of the invention is a method of interaction by the gaze between a user and a real environment in which he moves, using a system according to the invention as defined above.
  • the method can comprise the following steps:
  • a) receiving, by the electronic device, information transmitted by the gaze tracking device worn by the user,
  • b) acquiring, by the camera, at least one image of a marker of the portable gaze tracking device,
  • c) determining by the processor, from the information, the image and the data stored in the memory, the direction of the user's gaze relative to the environment and/or the zone of the environment viewed by the user.
  • the electronic device processes in real time the information received from the gaze tracking device.
  • the information received in step a) provides, for example, information on the direction of the gaze of the user in a reference frame of the gaze tracking device.
  • the information may include an identifier of the portable eye tracking device.
  • the identifier may correspond to the coding pattern of the marker, the memory storing for example a correspondence table between the identifiers and the coding patterns.
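As a hedged illustration of such a correspondence table (the structure and values are assumptions, not taken from the patent), the memory could hold a simple mapping from coding pattern to user identifier:

```python
# Hypothetical correspondence table: marker coding pattern -> user identifier.
PATTERN_TO_USER = {
    "cr-1.33": "user-01",
    "cr-1.52": "user-02",
}

def identify_user(coding_pattern):
    """Return the user identifier associated with a recognized coding pattern, if known."""
    return PATTERN_TO_USER.get(coding_pattern)
```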
  • Each user can correspond to a unique identifier transmitted to the electronic device by the eye tracking device that it carries.
  • the information can also give a temporal indication of the instant (for example date and time) of measurement of the direction of gaze.
  • Step b) of acquiring at least one image of a marker of the portable device may comprise image filtering.
  • the processor can thus use a decoding algorithm known to those skilled in the art to differentiate the marker or markers from other light spots of the image.
  • Four collinear points always form a line segment.
  • the identification of sets of points aligned with the image received from the camera makes it possible to select the right segments of the space.
  • the processor recognizes the combinatorics of four luminous points that correspond to a known pattern and must therefore respect a proportionality constraint depending on the geometric pattern.
  • the electronic device by image processing performed by the processor, identifies the markers on the image, recognizes the known patterns present and reconstructs the position and orientation of the eye tracking device in the reference system of the electronic device.
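A sketch of this pose-reconstruction step, assuming OpenCV's PnP solver is used (the patent does not name a specific algorithm) and a non-degenerate LED layout such as the five-coplanar-points variant: the known LED geometry of the identified marker and the detected image spots yield the tracker pose in the electronic device's reference frame.

```python
import cv2
import numpy as np

def tracker_pose_from_marker(marker_points_3d,   # known LED layout of the marker (non-degenerate), in metres
                             image_points_2d,    # detected spots in the camera image, in pixels
                             camera_matrix, dist_coeffs):
    """Recover the gaze-tracker pose in the camera (electronic-device) frame."""
    ok, rvec, tvec = cv2.solvePnP(
        np.asarray(marker_points_3d, dtype=np.float64),
        np.asarray(image_points_2d, dtype=np.float64),
        camera_matrix, dist_coeffs)
    if not ok:
        return None
    R, _ = cv2.Rodrigues(rvec)   # rotation matrix of the marker in the device frame
    return R, tvec
```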
  • the electronic device has access at any time to a digital map of the environment in which it is located (geometry, geography) or to the geometric structure of the object to which it may be secured, in particular a screen. Moreover, it knows its instantaneous position and orientation in this environment.
  • the processor can thus, for each piece of information received from the gaze tracking device providing the direction of the user's gaze in a reference frame of the gaze tracking device at a given instant, process the corresponding image observed by the camera and determine, from the information and the data from the memory, the direction of the user's gaze relative to the environment.
  • the processor may determine by calculation the area viewed by the user.
  • This "zone viewed" can be an area of the environment to which the gaze of the user points or an area of the environment to which the user's eyes converge in 3D, or be calculated more complex.
  • the processor can determine if an area of interest is viewed by the user, especially if it has a surface or a volume cut by the direction of gaze or, when the direction of gaze is defined by a cone of vision, if the area of interest has a non-zero intersection with the volume of the cone.
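As an illustration of this "zone viewed" test, a minimal sketch assuming each zone of interest is stored as an axis-aligned box in the environment frame (the representation and all names are assumptions):

```python
import numpy as np

def ray_hits_box(origin, direction, box_min, box_max):
    """Slab test: does the gaze ray intersect an axis-aligned zone of interest?"""
    origin = np.asarray(origin, dtype=float)
    direction = np.asarray(direction, dtype=float)
    direction = np.where(direction == 0.0, 1e-12, direction)   # avoid division by zero
    t1 = (np.asarray(box_min, dtype=float) - origin) / direction
    t2 = (np.asarray(box_max, dtype=float) - origin) / direction
    t_near = np.max(np.minimum(t1, t2))
    t_far = np.min(np.maximum(t1, t2))
    return t_far >= max(t_near, 0.0)

def zone_viewed(eye_pos, gaze_dir, zones):
    """Return the identifier of the first zone of interest cut by the gaze direction, if any."""
    for zone_id, (box_min, box_max) in zones.items():
        if ray_hits_box(eye_pos, gaze_dir, box_min, box_max):
            return zone_id
    return None
```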
  • the electronic device can put in interaction relation different users or users with areas of interest, in particular objects, present in the environment.
  • the method preferably comprises a step of calculation by the processor, as a function of the area viewed and of at least one programmed rule, of at least one result materialized in particular by a change of state of a logic output, the transmission of a datum or the stopping of its transmission, a visual, audio, mechanical or electrical effect.
  • a scheduled rule determines a result when a condition is met.
  • a rule consists, for example, when the user looks at a predefined zone, in triggering the sending of a message, for example an audio message, in particular to a headset or an earpiece worn by the user.
  • Another rule is, for example, when the user does not look at a predefined area for a defined time interval, to trigger another effect.
  • Another rule consists, for example, when the user looks at a predefined area, in triggering the lighting of an indicator light visible to the user, for example present on a case worn by the user.
  • Another rule consists, for example, when the user looks at a predefined area and expresses his interest in receiving information, in particular audio information, in particular by actuating a button, in triggering the broadcast of said information.
  • Another rule consists, for example, when the user is looking at an area other than a predefined area, information, in particular sound, is being transmitted to him, and the user expresses his interest in no longer receiving the sound message, in particular by actuating a button, in stopping the sound message.
  • Another rule consists, for example, when the user is looking at a predefined zone, information, in particular sound, is being transmitted to him, and the user expresses his interest in no longer receiving the sound message, in particular by actuating a button, in stopping the sound message and allowing it to be resumed for a certain period of time, in particular less than or equal to 30 minutes, if the user signals his interest in the resumption, in particular by actuating a button, particularly when the user is looking at the predefined area at the time of actuation.
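A hedged sketch of how such programmed rules could be evaluated at each cycle; the rule content (light indicator, push button, switch, 30-minute resume window) follows the rules above, while the data structures and event names are assumptions:

```python
import time

RESUME_WINDOW_S = 30 * 60   # a stopped message may be resumed within 30 minutes (rule above)

def evaluate_rules(viewed_zone, switch_on, button_pressed, state, zones):
    """Illustrative evaluation of the gaze/button/switch rules described above."""
    if not switch_on:
        return []                          # "off" state: no audible message may be emitted
    events = []
    zone = zones.get(viewed_zone)
    if zone is not None and "audio" in zone:
        events.append("light_on")          # an explanatory message exists for the zone viewed
    if not button_pressed:
        return events
    # Button pressed: decide between stopping, resuming and starting a message.
    if state["playing"] is not None:
        state["paused_zone"], state["paused_at"] = state["playing"], time.time()
        state["playing"] = None
        events.append("stop_audio")        # user no longer wants the current message
    elif (state.get("paused_zone") == viewed_zone
          and time.time() - state.get("paused_at", 0.0) <= RESUME_WINDOW_S):
        state["playing"] = viewed_zone
        events.append("resume_audio")      # resume the message stopped earlier on this zone
    elif zone is not None and "audio" in zone:
        state["playing"] = viewed_zone
        events.append("play_audio")        # start the explanatory message for this zone
    return events
```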
  • the method may comprise the calculation by the processor, as a function of the area viewed and of several rules programmed, in particular as defined above, of at least one result materialized in particular by a change of logic output state, the transmission of a data item or stopping of its transmission, a visual, audio, mechanical or electrical effect, the effect being exogenous or endogenous to the user.
  • the effect of the interaction between the gaze and the environment may be exogenous to the user, for example projection on a wall, sound broadcasting by loudspeakers, switching equipment off or on, etc., or endogenous to the user, via a wireless communication with the user, for example a visual and/or sound broadcast on a screen and/or headset worn by the user.
  • the processor can generate a result depending on whether the area of interest is viewed by the user or not. This result being for example materialized by a change of state of a logic output, the transmission of data or the stopping of its transmission, a visual, audio, mechanical or electrical effect.
  • This result can in particular be materialized by the transmission, or the stopping of the transmission, of a sound message transmitted exogenously or endogenously to the user.
  • the interface or the wireless communication system of the electronic device makes it possible, depending on the result, to trigger an action on a third-party system, in particular a screen, a projector, a sound distribution system, a robot, an industrial installation, etc.
  • the method according to the invention can in particular be used to interact with screens present in the environment of the user.
  • the method and the device according to the invention can be used to select an area or the totality of a screen present in the environment, for example a window or an object moving on the screen, or one of several screens, using the gaze alone or in combination with a mouse.
  • Such use may especially be particularly useful in the case where the environment has several screens visible to the user.
  • the method and the device according to the invention can make it possible to select the desired screen, which can make it possible to accelerate and facilitate the passage from one screen to another.
  • the method according to the invention can be used to make a virtual object appear or disappear, to move it, or to visually emphasize, for example by highlighting, blinking or changing colour, an area of interest on a screen that has been viewed by the user, or an area of interest of the screen that has not yet been viewed by the user, until the user changes the direction of his gaze.
  • the electronic device can display images or patterns on a screen.
  • the method may further comprise a step of defining and / or updating areas of interest stored in the memory and / or interaction rules. This step can be implemented via a screen connected directly to an interface of the electronic device or via a third computer connected to the electronic device.
  • This step can be performed remotely on a computer connected to the electronic device through a suitable software editor.
  • the editor displays, for example, a graphical representation of the three-dimensional model of the environment on a screen, in particular a screen present in the user's environment, with all the conventional functions of rotation, translation, scaling, etc.
  • the user can, for example, select with a mouse and define the areas of interest of the environment from the displayed model.
  • An interaction rule comprises a condition for an interaction to occur and a result of the interaction depending on whether or not the condition is fulfilled.
  • all the geometric and topological characteristics, the identifier and the interaction rules of each area of interest are transmitted to the memory of the electronic device for storage.
  • the data in the memory are, for example, accessible in the form of an XML digital coding.
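As a hedged illustration of such an XML coding (element and attribute names are assumptions, not defined by the patent), a zone of interest with its identifier, geometry and one interaction rule could be serialized with Python's standard library as follows:

```python
import xml.etree.ElementTree as ET

# Hypothetical zone of interest with one interaction rule, serialized as XML.
zone = ET.Element("zone_of_interest", id="painting_03")
geometry = ET.SubElement(zone, "geometry", type="box")
geometry.text = "1.20 0.80 1.50  2.10 1.60 1.55"   # min/max corners in the environment frame (metres)
ET.SubElement(zone, "rule", condition="gaze_and_button", result="play_audio", audio="painting_03.mp3")

print(ET.tostring(zone, encoding="unicode"))
```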
  • the electronic device can operate autonomously without being connected to the third computer.
  • the editor can also display a virtual space and similarly allow to define and / or modify an area of interest corresponding to a virtual object, which can be displayed on a screen of the environment.
  • an interaction rule can be associated with such an area of interest.
  • the subject of the invention is a method of calibrating an electronic device according to the invention as defined above, for use in an environment enabling the electronic device to calculate its position in a reference frame of the environment, especially with respect to an object of said environment, the method comprising at least the following steps:
  • acquiring, by the camera of the electronic device, a first image of a marker placed in a fixed calibration position in the environment and of a reference object of the environment,
  • fixing the electronic device to the reference object,
  • the marker remaining fixed in the calibration position, acquiring by the camera of the electronic device secured to the reference object a second image of the marker.
  • the electronic device may be intended to be fixed in normal operation to a reference object of the environment, in particular an environment screen.
  • an object provided with a marker, for example a gaze tracking device held in the hand or placed on a stand, is placed in a fixed position close to the reference object.
  • the camera is then put in a first mode, with a wide sensitivity range, and arranged to observe both the reference object and the marker.
  • the reference object is preferably of a simple geometrical form.
  • the processor, knowing the intrinsic characteristics of the camera, can, by known mathematical methods, for example those described in 2006 in "Real-Time Model-Based SLAM Using Line Segments" by Andrew P. Gee and Walterio Mayol-Cuevas, reconstruct the position and the orientation of the reference object in a reference frame specific to the electronic device.
  • the camera also sees the marker, allowing the processor in the same way to reconstruct the position and the orientation of the marker in the same reference system of the electronic device.
  • the processor then deduces the position and the relative orientation of the marker and the reference object relative to each other.
  • the camera is then fixedly and securely placed on the screen; a filter is also preferably placed on the camera to facilitate the observation of the marker.
  • the acquisition of a second image by the camera makes it possible to recognize the new position and orientation of the marker in the reference frame of the camera, that is to say of the electronic device.
  • the processor of the electronic device knowing the position and the relative orientation of the marker with respect to the reference object, the position and the relative orientation of the electronic device and the marker, then calculates by triangulation the position and orientation of the electronic device with respect to the reference object.
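The patent describes this last computation as a triangulation; one way to express it, shown below as an assumption about the implementation, is as a composition of rigid transforms: the marker pose relative to the reference object (from the first image) composed with the inverse of the marker pose in the relocated device's frame (from the second image) gives the device pose relative to the reference object.

```python
import numpy as np

def invert(R, t):
    """Invert a rigid transform (R, t)."""
    return R.T, -R.T @ t

def compose(R1, t1, R2, t2):
    """Compose two rigid transforms: apply (R2, t2) first, then (R1, t1)."""
    return R1 @ R2, R1 @ t2 + t1

def device_pose_wrt_reference(R_obj_marker, t_obj_marker,    # marker pose in the reference-object frame (image 1)
                              R_dev_marker, t_dev_marker):   # marker pose in the relocated device frame (image 2)
    """Pose of the electronic device expressed in the reference-object frame."""
    R_marker_dev, t_marker_dev = invert(R_dev_marker, t_dev_marker)
    return compose(R_obj_marker, t_obj_marker, R_marker_dev, t_marker_dev)
```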
  • the operational phase can begin.
  • the electronic device reconstructs the position and orientation of the wearer of the gaze tracking device in its own reference frame, combines these data with the information received from the gaze tracking device, and then translates them into a gaze direction and position in the environment reference frame.
  • the electronic device remains here secured to the reference object, the latter being fixed or mobile in the environment.
  • In a variant, the interaction system calibration method is performed using the hardware and method used for scanning the environment, simultaneously with obtaining a digital model of the environment.
  • the calibration / modeling method can thus use 3D scanning techniques using depth cameras operating in a near-infrared mode, compatible with the emission wavelength of the markers, for example of the Kinect® type.
  • a marker, for example placed on a gaze tracking device, is stationary in the environment in the calibration position during the scan of the room.
  • the scan apparatus reconstructs the 3D space, while seeing the marker in the environment and can therefore position this tracking device relative to the digital model.
  • the electronic device that is to be calibrated is placed stationary in its position of use, in particular secured to a reference object. It sees and reconstructs its relative position and orientation with respect to the marker in the calibration position.
  • the subject of the invention is also a method for monitoring the interest shown, in a given location, by one or more visitors equipped with a gaze tracking device, including storage, by the memory of the gaze interaction system according to the invention, of data associated with each visitor.
  • the method may include an identification of one or more areas of interest, the data stored for each visitor relating at least in part to each area of interest.
  • the method may include an export to a data processing system.
  • FIG. 1 schematically and partially illustrates an electronic device according to the invention
  • FIG. 2 schematically illustrates the electronic device of FIG. 1 in an environment
  • FIG. 3A is a schematic and partial perspective view of a gaze tracking device belonging to the interaction system of FIG. 2;
  • FIGS. 3B and 3C illustrate variants of the gaze tracking device
  • FIG. 4 is a block diagram illustrating a method of interaction by the gaze, in accordance with the invention
  • FIG. 5 illustrates a step of definition and / or updating of areas of interest of an interaction method according to the invention
  • FIG. 6 is a block diagram of a method for calibrating a system according to the invention.
  • FIGS. 7A and 7B show an example of an environment during the implementation of the calibration method of FIG. 6,
  • FIG. 8 represents the implementation of an alternative calibration method
  • FIG. 9 represents an example of a portable eye tracking assembly according to the invention, fixed on a frame,
  • FIG. 10 illustrates a variant of a portable eye tracking assembly according to the invention fixed on an old prototype of "Google Glass",
  • FIG. 11 illustrates another example of a portable eye tracking assembly according to the invention as well as a frame and lenses to which it attaches,
  • FIG. 12 schematically illustrates the fixing of the assembly of FIG. 11 on the frame and the lenses of FIG. 11,
  • FIG. 13 represents another example of a portable eye tracking assembly according to the invention, fixed on a glass
  • FIG. 14A illustrates the portable eye tracking assembly of FIG. 13 in isolation
  • FIG. 14B represents a cross-section along XIV-XIV of a portion of the portable eye tracking assembly of FIG. 14A
  • FIG. 15 illustrates a variant of a portable eye tracking assembly according to the invention, fixed on a "Google Glass” model comprising a sensor, the portable assembly being fixed on the opposite side to the sensor and being worn by a user.
  • FIG. 16 represents detail A of FIG. 15,
  • FIG. 17 represents, in isolation and partially, the portable eye tracking assembly of FIG. 15, seen from the side
  • FIG. 18 represents a variant of a portable eye tracking assembly according to the invention, fixed on a "Google Glass” model comprising a sensor, on the same side as the sensor, and
  • FIG. 19 illustrates the portable eye tracking assembly of FIG. 18, in isolation.
  • FIG. 1 shows an electronic device 100 in the form of a housing 110 comprising a camera 30, a processor 80 and a memory 60 as well as a receiver 20 for communicating wirelessly with a device for tracking the gaze 50 worn by a user.
  • the receiver 20 is in the form of a wireless input and output communication system, having the possibility of transmitting information to the gaze tracking device and/or to other external peripherals.
  • the housing also includes a power supply 99. It is advantageous to have a power supply 99 integrated in the device housing, as in the example illustrated, when it is desired to move the latter in particular to scan the environment.
  • In a variant, the device is powered by a supply attached to the housing but external to it, or is directly connected to the mains.
  • the camera 30, for example a CMOS camera, is at least partially integrated into the housing 110. It comprises here, protruding from the housing 110, an optical light polarization system 38 and a bimodal light filtering system 37 using a removable filter 36. Thus, in a first mode, the filter is absent and the camera is sensitive to the wavelengths of the visible range. In the second mode, corresponding to normal operation, the filter 36 is in place to allow only IR or near-IR wavelengths to pass.
  • FIG. 2 illustrates a system of interaction by eye 500 between a user U carrying a gaze tracking device 50 on the head and an environment E which has several areas of interest 700.
  • the interaction system by the gaze 500 includes the electronic device 100 of Figure 1, arranged so that the camera 30 can see the markers 1 and 1' of the gaze tracking device 50.
  • the interaction system by the gaze 500 includes an audio device 701 for transmitting an audio message to the user, a push button 702 intended to be operated by the user to trigger or stop the transmission of the sound message, a light indicator 703 visible to the user to signal to the user that he is looking at an area of interest for which an explanatory audio message exists and can be listened to if the user requests it by operating the push button and a switch 704 having an "on" state to activate the audio device 701 and / or the user's portable eye tracking device 50 and an off state for disabling the audio apparatus 701 and / or the portable eye tracking device 50 of the user.
  • the gaze tracking device 50 is shown in isolation in Figure 3A. It is in the form of spectacles intended to be worn by the user, with branches 14 resting on the ears and a central portion 12 resting on the nose; the lenses 13 of the glasses may include an anti-reflective coating.
  • the gaze tracking device 50 comprises, in the example described, two infrared LEDs 16 disposed in the central portion 12, on either side of the nose, each directed towards one of the eyes of the user, as well as cameras, not shown, which can detect infrared radiation and are oriented towards each of the eyes of the user, to acquire images of the latter.
  • the gaze tracking device 50 also comprises an electronic circuit 17 making it possible to process the images acquired by its cameras and a wireless transmitter 25 to transmit to the electronic device 100 information 11 relating to the direction of the gaze, this electronic circuit 17 and the wireless transmitter 25 being housed for example in a branch 14 of the device 50.
  • the gaze tracking device 50 further comprises an autonomous power supply 59, arranged for example in the other branch 14 and giving it sufficient autonomy so that it does not have to be recharged for an acceptable duration, for example several hours, or even a whole day.
  • Information 11 preferably includes both data concerning the gaze direction in the device 50's own reference frame at a given instant and data enabling the user to be identified.
  • the gaze tracking device 50 also comprises two markers 1 and 1', each having four point light sources in the form of infrared LEDs placed on the front part of the glasses so as to be visible from the outside and forming a pattern that identifies the device 50, and therefore the user U.
  • the markers 1 and 1' are arranged on each side, right and left, of the central portion 12.
  • the identifier transmitted in the information 11 corresponds to the coding pattern of the markers 1 and 1'.
  • In a variant, the gaze tracking device comprises a single coding marker with four collinear LEDs and a second, non-coding marker, comprising for example one or two LEDs, used to facilitate the determination of the orientation of the gaze tracking device.
  • the variant gaze tracking device 50 illustrated in FIG. 3B comprises a single marker comprising five coplanar infrared LEDs, no triplet of which is collinear.
  • the eye tracking device 50 illustrated in FIG. 3C comprises a marker 1 comprising two lateral-illuminated optical fibers.
  • Marker 1 has two non-collinear parallel segments forming in the illustrated example two opposite sides of a rectangle.
  • the gaze tracking device 50 carried by the user transmits, in particular at a frequency between 30 and 200 Hz, information 11 providing information on the direction of gaze relative to a reference frame specific to the gaze tracking device 50.
  • the information 11 also contains an identifier making it possible to recognize the user wearing the gaze tracking device 50, in particular corresponding to the coding pattern of one or more markers 1, 1' of the gaze tracking device 50.
  • the camera 30 observes an image 33 of the marker of the gaze tracking device 50 in the environment E.
  • the image 33 gives an overview of the environment.
  • the image 33 also provides a representation of the marker 1 for both identifying the user and knowing the position and orientation of the device 50 relative to the electronic device 100.
  • Step 920 can take place just before or just after step 910.
  • In a variant, steps 910 and 920 are simultaneous.
  • In step 930, the processor 80 analyzes the image 33 and deduces the position and orientation of the gaze tracking device 50, and, combining these with the information 11 and data stored in the memory 60, determines in an environment reference frame E the direction of the gaze of the user U and which zone 600 of the environment E is viewed by the user. The processor 80 determines in particular whether the zone viewed corresponds to a zone of interest 700.
  • the processor 80 thus calculates which zone 600 is viewed in the environment, and checks whether this zone corresponds to a zone of interest 700.
  • the electronic device 100 determines in step 940 at least one result by applying at least one programmed rule. If one or more interaction conditions are fulfilled, a result is sent to the interface 90 or via the wireless connection 95 to trigger an action.
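The following sketch illustrates one possible way of implementing steps 930 and 940: the gaze ray is expressed in the environment frame E using the pose of the device 50, intersected with candidate zones, and a programmed rule is applied when a zone of interest 700 is hit. The zone and rule representations are assumptions made for the example, not part of the patent.

```python
# Illustrative sketch of steps 930-940; zone representation and rule format are assumed.
import numpy as np

def viewed_zone(R_env_dev, t_env_dev, gaze_dir_dev, zones):
    """R_env_dev, t_env_dev: pose of the tracking device 50 in the environment frame E.
    gaze_dir_dev: gaze direction from information 11, in the device's own frame.
    zones: list of (zone_id, plane_point, plane_normal, contains) describing zones
    of interest 700 stored in memory 60; contains() tests a 3D point."""
    origin = np.asarray(t_env_dev, dtype=float)
    direction = R_env_dev @ np.asarray(gaze_dir_dev, dtype=float)  # ray in frame E
    for zone_id, p0, n, contains in zones:
        denom = float(n @ direction)
        if abs(denom) < 1e-9:
            continue                      # gaze ray parallel to the zone's plane
        lam = float(n @ (p0 - origin)) / denom
        if lam <= 0:
            continue                      # intersection behind the user
        hit = origin + lam * direction
        if contains(hit):
            return zone_id                # zone 600 viewed; may be a zone of interest 700
    return None

def apply_rules(zone_id, rules, trigger):
    """rules: mapping zone_id -> callable producing a result for interface 90 / link 95."""
    if zone_id in rules:
        trigger(rules[zone_id]())         # step 940: send the result, trigger an action
```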
  • the steps 910 to 940 are carried out entirely on board, that is to say that the calculations are carried out by the processor 80 of the electronic device 100, and the communications between the electronic device 100 and the gaze tracking device 50, or with one or more output devices, take place over the wireless connection 95.
  • part of the data processing is performed by a third-party computer with which the processor 80 exchanges data.
  • the method preferably comprises a step 960 for defining or updating the memory 60 by means of a software editor.
  • This step is done for example via an interface of the electronic device or via a third-party computer connected to the electronic device 100, as illustrated in Figure 5.
  • the electronic device 100 is connected via a computer 300 to a screen 350 to display a representation E0 of the digitized model of the environment E.
  • the editor makes it possible to select zones 750 displayed on the screen 350 to define areas of interest 700 of the environment and associated interaction rules.
  • the third-party computer 300 transfers the spatial and interaction data thus defined or updated to the memory 60, and the electronic device 100 can then operate autonomously.
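Purely as an illustration of the kind of spatial and interaction data the editor could write to the memory 60, a possible layout is sketched below; the key names, the JSON encoding and the file name are assumptions.

```python
# Purely illustrative layout of the spatial and interaction data stored in memory 60;
# key names and the JSON encoding are assumptions, not specified by the patent.
import json

interaction_data = {
    "environment": "showroom_ground_floor",
    "zones_of_interest": [
        {"id": "painting_12", "polygon": [[0.0, 1.0, 2.0], [1.2, 1.0, 2.0],
                                          [1.2, 2.0, 2.0], [0.0, 2.0, 2.0]]},
    ],
    "rules": [
        {"zone": "painting_12", "condition": "dwell_ms >= 800",
         "action": {"type": "audio", "track": "painting_12_intro"}},
    ],
}

with open("memory60.json", "w") as f:
    json.dump(interaction_data, f, indent=2)
```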
  • Before using a system 500, the electronic device 100 must be calibrated in order to calculate its position in a reference frame of the environment E, so that it can be used in the environment E.
  • Fig. 7A illustrates a first step 810 of the block diagram of Fig. 6, corresponding to an example of a calibration method.
  • a marker 1 is placed in the environment E close to a reference object 780, for example a screen 730 on which a pattern, a picture or a color known to the electronic device has been displayed, for example a uniform color as illustrated, facilitating the recognition of the active area of the screen by image processing means.
  • the camera 30 records a first image 34 of the marker 1 and the screen 730, making it possible to calculate in a reference frame of the electronic device 100 the orientation and the position of the marker 1 and the screen 730.
  • in step 820, the electronic device 100 is fixed on the screen 730 while the marker 1 is not moved.
  • in step 830, the camera 30 takes a second image of the environment on which the marker 1 is visible. Since the latter has moved in the reference frame of the electronic device, the processor 80 deduces, from the change of the orientation and the position of the marker 1 in the reference frame of the electronic device 100, the coordinates of the marker 1 in a reference frame linked to the environment E.
  • the electronic device 100 remains attached to the reference object 780, here the screen 730, and the gaze tracking device 50 can then be moved; the processor 80 can calculate at any time the position and the orientation of the gaze tracking device 50 in the reference frame linked to the environment E.
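A possible formulation of the calibration computation of steps 810 to 830 is sketched below using homogeneous transforms; the notation and the choice of anchoring the environment frame to the device's final, fixed position are assumptions, the patent only stating that the change of the marker's apparent pose lets the processor 80 express coordinates in the environment frame.

```python
# Sketch of the calibration computation of steps 810-830 with 4x4 homogeneous transforms.
import numpy as np

def make_T(R, t):
    """4x4 homogeneous transform from a 3x3 rotation R and a translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = np.asarray(t).ravel()
    return T

def calibrate(pose_810, pose_830):
    """pose_810, pose_830: (R, t) poses of marker 1 in the frame of the electronic
    device 100, before (step 810) and after (step 830) it is fixed on object 780.
    The environment frame E is chosen here as the device's final, fixed frame."""
    T_dev0_m = make_T(*pose_810)             # marker -> initial device frame
    T_env_m = make_T(*pose_830)              # marker -> E (device fixed on 780)
    # Rigid motion undergone by the device between the two shots (marker unmoved):
    T_env_dev0 = T_env_m @ np.linalg.inv(T_dev0_m)
    # Anything measured in the device's initial frame can now be mapped into E.
    return T_env_m, T_env_dev0
```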
  • FIG. 8 illustrates another example of calibration of an interaction system 500 using a third camera 330, a depth camera in the example illustrated.
  • the calibration is implemented during the 3D scan of the environment, the camera 30 of the electronic device 100 being fixed on the reference object 780.
  • the environment E is for example a closed space.
  • the electronic device is not attached to a screen but to another reference object of the environment.
  • FIG. 9 represents an example of a portable eye tracking assembly 70 according to the invention fixed on a frame 120.
  • the frame has a front portion 121 and two branches, right 14a and left 14b.
  • the front portion 121 has an upper right portion 121c, an upper left portion 121d, a lower right portion 121e, a lower left portion 121f and a central portion 121g.
  • the portable eye tracking assembly 70 comprises a plurality of snap-fastening and/or clamping systems 71, namely a right lateral piece 71a snap-fastened to the right branch 14a, a left lateral piece 71b fixed by clamping to the left branch 14b, an upper right piece 71c snap-fastened to the upper right portion 121c of the front portion, an upper left piece 71d snap-fastened to the upper left portion 121d of the front portion, a lower right piece 71e snap-fastened to the lower right portion 121e of the front portion, and a lower left piece 71f snap-fastened to the lower left portion 121f of the front portion.
  • Electrical wires connect some pieces together, for example the upper right piece 71c to the right lateral piece 71a, in particular to ensure their power supply.
  • the portable assembly comprises a marker 1, itself comprising eight light sources 15, forming a visual identification pattern.
  • Six light sources 15 are disposed at the front portion 121 of the frame 120. Specifically, two light sources 15 are disposed in each of the upper right 121c and upper left 121d portions, and one light source 15 is disposed in each of the lower right 121e and lower left 121f portions.
  • the portable eye tracking assembly 70 includes an onboard computer (not shown) for determining the direction of the user's gaze in the reference frame of the head, a wireless transmitter (not shown) for transmitting to the electronic device 100 information 11 relating to the direction of the user's gaze and a battery housed in the right side piece 71a.
  • the lower left 71f and right 71e parts are each provided with a video sensor and an infrared LED (not shown), facing the eye, allowing observation of the latter.
  • the infrared LED provides lighting adapted to the sensor, the latter being specifically an infrared camera.
  • the eye tracking assembly 70, when mounted on the frame 120, forms a gaze tracking device 50.
  • the eye tracking assembly 70 can be adapted to a very wide range of frames and glasses, in particular thanks to the various pieces 71a to 71f.
  • a calibration phase may be provided after the setting up of the gaze tracking assembly 70 on the frame 120.
  • FIG. 10 illustrates a variant of a portable eye tracking assembly according to the invention, fixed on an old prototype of Google Glass.
  • the portable eye tracking assembly 70 of FIG. 10 comprises fewer latching and/or clamping systems 71 than that of FIG. 9.
  • the various latching and/or clamping systems are a left lateral piece 71b fixed by latching to the left branch 14b, a lower right piece 71e fixed by clamping to the lower right portion 121e of the front portion of the frame 120 and to the right lens 13a, a lower left piece 71f fixed by clamping to the lower left portion 121f of the front portion of the frame 120 and to the left lens 13b, and a central piece 71g clamped to the central part of the central portion 121g of the frame 120.
  • the portable assembly 70 comprises a marker 1 itself comprising three light sources 15 forming a visual identification pattern.
  • the portable eye tracking assembly 70 is here attached to a Google Glass, but can accommodate a very wide range of frames and lenses.
  • the various parts 71c, 71d, 71e and 71f of Figure 9 are replaced by a single piece 71h.
  • the front portion 121 of the frame 120 has only an upper right portion 121c, an upper left portion 121d and a central portion 121g.
  • Right 13a and left 13b lenses are attached to the frame.
  • the portable eye tracking assembly 70 comprises two latching and/or clamping systems 71, namely a right lateral piece 71a identical to that of FIG. 9, snap-fastened to the right branch 14a, and a piece 71h snap-fastened and clamped to the frame 120 and to the lenses 13a and 13b.
  • the piece 71h has a lower right groove 18e and a lower left groove 18f, each U-shaped, in which the lenses 13a and 13b are placed, two upper right grooves 19c and 19c', each in the shape of an inverted U, which attach to the upper right portion 121c of the frame 120, and two upper left grooves 19d and 19d', each in the shape of an inverted U, which are fixed on the upper left portion 121d of the frame 120, as shown in Figure 12.
  • the portable eye tracking assembly 70 is adapted to a range of existing frames and lenses.
  • FIGs 13, 14A and 14B illustrate a monocular variant of portable eye tracking assembly 70 mounted on glasses.
  • the portable eye tracking assembly 70 is fixed on a single lens 13b.
  • the portable eye tracking assembly 70 comprises a latching and clamping system 71 in the form of a piece comprising two uprights 2a and 2b which are fixed on either side of the lens 13b, at least one of the uprights 2b being tiltable so that the lens 13b can be clamped between the two uprights 2a and 2b, and an upper groove 19d in the shape of an inverted U in which is housed a flexible material M matching the shape of the lens 13b.
  • FIG. 15 illustrates an example of a portable eye tracking assembly 70 attached to a Google Glass frame 120 having a touch-sensitive sensor 51.
  • the frame 120 has a central portion 121g and two branches 14a and 14b.
  • the portable eye tracking assembly 70 is fixed on the side opposite the touch-sensitive sensor 51.
  • the touch-sensitive sensor 51 is located, in the example of Figure 15, on the side of the right eye and the portable eye tracking assembly 70 is thus fixed on the side of the left eye.
  • the portable eye tracking assembly 70 comprises a clamping system 71 comprising two parts, internal 3a and external 3b, clamped together and around the branch 14b by means of at least one screw 4.
  • the portable eye tracking assembly 70 comprises a third part 5 comprising a housing 6 and a rigid arm 7 advancing towards the nearest eye, here the left eye.
  • the outer part 3b comprises a slideway 9 into which the housing 6 engages.
  • the latter is fixed to the outer part 3b by means of at least one screw 4, as illustrated in FIGS. 16 and 17.
  • the portable eye tracking assembly 70 comprises a marker 1 in the form of two light sources 15 along the arm 7 and two light sources on the housing 6.
  • the end 8 of the arm 7 houses an optical sensor (not shown) facing the eye.
  • the housing 6 houses an onboard computer, a battery and a wireless transmitter (not shown).
  • the dimensions of the housing 6 are a function, in particular, of the size of the on-board battery.
  • the portable eye tracking assembly 70 can be positioned in an adjustable manner, in particular in height and along the branch, or in a non-adjustable variant in which, once fixed, it is no longer adjustable.
  • the chosen configuration must be known to the device, for example by means of parameters measured manually by conventional means, and/or a rule and/or an automatic calibration procedure.
  • the portable eye tracking assembly 70 is fixed this time on the same side as the touch-sensitive sensor 51 of the Google Glass.
  • the portable eye tracking assembly 70 comprises a snap-fastening system 71 comprising a hook 72 attached to the branch 14b of the frame 120 and to the touch-sensitive sensor 51.
  • the hook 72 is placed towards the inner side of the branch 14b, in other words towards the user.
  • the latching system 71 is designed to avoid covering the area of the touch-sensitive sensor 51 of the Google Glass, so as to allow its use.
  • Example 1 - digital platforms environment
  • This example corresponds to a variant in which the electronic device is fixed on a screen.
  • the method replaces the use of the mouse, the keyboard or the touch screen when using digital platforms (tablets, computer screens, etc.) for persons with reduced or no upper-limb mobility (disabled people, post-stroke patients, ...).
  • Example 2 - museum or exhibition environment, including showroom
  • the user or users are one or more visitors, each provided with a gaze tracking device with a frame, and an audio device, for example headphones or earpieces.
  • the audio device is portable and communicates wirelessly with a central information system.
  • the eye interaction system comprises one or more electronic devices fixed in one or more rooms.
  • the museum or exhibition venue is equipped with as many electronic devices as needed.
  • areas of interest have been identified, such as paintings, sculptures, objects, or parts of the exhibited elements, for example the smile of the Mona Lisa.
  • the system determines precisely where the user is looking and then sends him, for example, an audio commentary on the work or on the detail viewed.
  • the user or users can act on an on/off switch and a push button, which can be embedded on the frame or on an additional housing carried in the hand, communicating by wire or wirelessly with the electronic card.
  • the gaze interaction system can include a light indicator per user, visible to that user.
  • the indicator light can be worn by the user.
  • the indicator light can be a low-intensity LED directed towards the user's eye in order to be able to send a signal to him.
  • the gaze tracking device can be enabled or disabled at the discretion of the user through the switch.
  • the indicator light comes on, or in the absence of indicator light a sound signal is emitted.
  • the message stops when the user presses the button again, regardless of the behavior of his gaze during this period of time. If at this time the user is still looking at the same area of interest, the audio device goes into a state called "pause” for a certain time. Otherwise the audio device goes into a state called "stop".
  • the user can resume reading the message by looking at the same area of interest and pressing the push button. Playback then resumes at the point where it stopped just before, and not at the beginning of the message.
  • in the "stop" state, the user cannot resume playback where it had stopped.
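A minimal sketch of the play/pause/stop behaviour described in the preceding points is given below; the state names follow the text, while the class layout and method names are assumptions made for the example.

```python
# Minimal sketch of the audio behaviour (play / pause / stop driven by the push
# button and the gaze). State transitions follow the description above; everything
# else is an illustrative assumption.
class AudioGuide:
    def __init__(self):
        self.state = "stop"
        self.position_s = 0.0      # playback position within the current message
        self.current_zone = None

    def on_button(self, gazed_zone):
        if self.state == "playing":
            # Message stops; "pause" only if the user still looks at the same zone
            self.state = "pause" if gazed_zone == self.current_zone else "stop"
            if self.state == "stop":
                self.position_s = 0.0
        elif self.state == "pause" and gazed_zone == self.current_zone:
            self.state = "playing"  # resume where playback stopped, not at the start
        elif gazed_zone is not None:
            self.current_zone = gazed_zone
            self.position_s = 0.0
            self.state = "playing"  # start the commentary for the newly viewed zone
```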
  • the user can continue his visit, in particular by looking at the same or another known area of interest, the indicator light illuminating and the audio message being broadcast under the conditions described above.
  • An audio message can also be transmitted to the user so as to guide his gaze towards an area of interest. Once the user's gaze is correctly positioned, an audio commentary relating to the area of interest can be transmitted to the user.
  • This type of equipment can be manufactured at low cost and in multiple copies.
  • the system makes it possible to simultaneously manage several users with identifiable gaze tracking devices.
  • the gaze interaction system thus makes it possible to improve the visitor experience of a place such as a museum or an exhibition.
  • the complete visual journey of the user can be recorded.
  • An assessment of the visit may be provided at the exit of the museum or exhibition site to the visitor or to the institution for statistical purposes. For example, the following data can be extracted: the works viewed at greatest length, whether or not the labels were read, the time spent in front of each work, etc.
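As an illustration of how such statistics could be derived from the recorded visual journey, the following sketch aggregates hypothetical (timestamp, zone) gaze samples; the record format and field names are assumptions.

```python
# Illustrative aggregation of the recorded visual journey into visit statistics;
# the (timestamp_s, zone_id) sample format is an assumption.
from collections import defaultdict

def visit_summary(journey):
    """journey: chronological list of (timestamp_s, zone_id or None) gaze samples."""
    time_per_zone = defaultdict(float)
    for (t0, zone), (t1, _) in zip(journey, journey[1:]):
        if zone is not None:
            time_per_zone[zone] += t1 - t0
    most_viewed = sorted(time_per_zone.items(), key=lambda kv: kv[1], reverse=True)
    return {"time_per_work_s": dict(time_per_zone), "most_viewed": most_viewed}
```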
  • Example 3 - store or supermarket environment
  • a system according to the invention can be used in behavioral consumer studies in stores or supermarkets, in particular to improve the layout of products on shelves.
  • Electronic devices according to the invention are distributed in the merchandise departments.
  • the subjects of the study are provided with a gaze tracking device and are totally free in their movements in the environment, not knowing a priori what the 'sensitive' areas of the experiment are, thus making their behavior 'more normal'.
  • the evaluation metrics of subjects in this type of protocol are very sensitive to the degree of acceptance and sincerity of the subject towards the experiment.
  • the implementation of the experiment for each subject is simple and fast.
  • the gaze tracking device is light and the system used does not require an additional recording box or an onboard scene camera, for example, unlike other existing solutions on the market, which are more cumbersome for the people tested.
  • An industrial hangar environment contains several agents, each provided with a marker.
  • One or more electronic devices are placed in such a way as to cover as much of the environment as possible. They calculate the positions and directions of movement of the various agents and of the users provided with eye tracking devices. For example, if it is determined that a collision between two agents or between an agent and a user is imminent, the system warns the users (alarm signal) or stops the agents.
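A simple way such a collision check could be implemented is sketched below, assuming constant-velocity extrapolation of the tracked positions; the thresholds and the prediction model are illustrative assumptions, not specified by the patent.

```python
# Sketch of the collision check suggested above: linear extrapolation of each tracked
# position, warn or stop when two trajectories come too close within a time horizon.
import numpy as np

def imminent_collision(p_a, v_a, p_b, v_b, horizon_s=3.0, min_dist=1.0):
    """p_*: current position (m), v_*: velocity (m/s) of an agent or user."""
    rel_p = np.asarray(p_b, dtype=float) - np.asarray(p_a, dtype=float)
    rel_v = np.asarray(v_b, dtype=float) - np.asarray(v_a, dtype=float)
    denom = float(rel_v @ rel_v)
    # time of closest approach, clamped to [0, horizon_s]
    t_star = 0.0 if denom < 1e-9 else max(0.0, min(horizon_s, -float(rel_p @ rel_v) / denom))
    closest = np.linalg.norm(rel_p + t_star * rel_v)
    return closest < min_dist        # if True: raise an alarm or stop the agent
```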
  • the gaze tracking device worn by the user may differ from the illustrated eye tracking devices.
  • the markers may be different, in particular in the nature, the number and/or the positioning of the light sources.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Ophthalmology & Optometry (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • User Interface Of Digital Computer (AREA)
  • Position Input By Displaying (AREA)
EP14798990.9A 2013-10-14 2014-10-14 Verfahren zur interaktion durch benutzerblicke und zugehörige vorrichtung Withdrawn EP3058444A1 (de)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
FR1359975A FR3011952B1 (fr) 2013-10-14 2013-10-14 Procede d'interaction par le regard et dispositif associe
PCT/IB2014/065307 WO2015056177A1 (fr) 2013-10-14 2014-10-14 Procédé d'interaction par le regard et dispositif associé

Publications (1)

Publication Number Publication Date
EP3058444A1 true EP3058444A1 (de) 2016-08-24

Family

ID=50289739

Family Applications (1)

Application Number Title Priority Date Filing Date
EP14798990.9A Withdrawn EP3058444A1 (de) 2013-10-14 2014-10-14 Verfahren zur interaktion durch benutzerblicke und zugehörige vorrichtung

Country Status (4)

Country Link
US (2) US10007338B2 (de)
EP (1) EP3058444A1 (de)
FR (1) FR3011952B1 (de)
WO (1) WO2015056177A1 (de)

Families Citing this family (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109521397B (zh) 2013-06-13 2023-03-28 巴斯夫欧洲公司 用于光学地检测至少一个对象的检测器
CN106662636B (zh) 2014-07-08 2020-12-25 巴斯夫欧洲公司 用于确定至少一个对象的位置的检测器
JP6637980B2 (ja) 2014-12-09 2020-01-29 ビーエーエスエフ ソシエタス・ヨーロピアBasf Se 光学検出器
CN107438775B (zh) 2015-01-30 2022-01-21 特里纳米克斯股份有限公司 用于至少一个对象的光学检测的检测器
GB2539009A (en) * 2015-06-03 2016-12-07 Tobii Ab Gaze detection method and apparatus
US10607401B2 (en) 2015-06-03 2020-03-31 Tobii Ab Multi line trace gaze to object mapping for determining gaze focus targets
WO2017012986A1 (en) * 2015-07-17 2017-01-26 Trinamix Gmbh Detector for optically detecting at least one object
FR3041231B1 (fr) 2015-09-18 2017-10-20 Suricog Systeme portatif comportant un support deformable
FR3041230B1 (fr) 2015-09-18 2022-04-15 Suricog Procede de determination de parametres anatomiques
US10139901B2 (en) * 2016-07-05 2018-11-27 Immersv, Inc. Virtual reality distraction monitor
EP3491675B1 (de) 2016-07-29 2022-11-16 trinamiX GmbH Optischer sensor und detektor zur optischen detektion
WO2018077870A1 (en) 2016-10-25 2018-05-03 Trinamix Gmbh Nfrared optical detector with integrated filter
US11428787B2 (en) 2016-10-25 2022-08-30 Trinamix Gmbh Detector for an optical detection of at least one object
KR102452770B1 (ko) 2016-11-17 2022-10-12 트리나미엑스 게엠베하 적어도 하나의 대상체를 광학적으로 검출하기 위한 검출기
JP2019046438A (ja) * 2017-09-05 2019-03-22 株式会社Jvcケンウッド 評価装置、及び評価方法
US11393251B2 (en) 2018-02-09 2022-07-19 Pupil Labs Gmbh Devices, systems and methods for predicting gaze-related parameters
WO2019154511A1 (en) 2018-02-09 2019-08-15 Pupil Labs Gmbh Devices, systems and methods for predicting gaze-related parameters using a neural network
EP3750028B1 (de) 2018-02-09 2022-10-19 Pupil Labs GmbH Vorrichtungen, systeme und verfahren zur vorhersage von blickbezogenen parametern
US10928900B2 (en) * 2018-04-27 2021-02-23 Technology Against Als Communication systems and methods
US11003244B2 (en) * 2018-08-27 2021-05-11 University Of Rochester System and method for real-time high-resolution eye-tracking
US11676422B2 (en) 2019-06-05 2023-06-13 Pupil Labs Gmbh Devices, systems and methods for predicting gaze-related parameters
US11036453B1 (en) * 2019-10-28 2021-06-15 Rockwell Collins, Inc. Bezel embedded head tracking fiducials

Family Cites Families (33)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7338A (en) * 1850-05-07 James l
US3017806A (en) * 1960-03-15 1962-01-23 Stolper & Voice Optical Co Inc Eyeglass frames of the semirimless type
US4568159A (en) 1982-11-26 1986-02-04 The United States Of America As Represented By The Secretary Of The Navy CCD Head and eye position indicator
US5016282A (en) 1988-07-14 1991-05-14 Atr Communication Systems Research Laboratories Eye tracking image pickup apparatus for separating noise from feature portions
IL138831A (en) * 2000-10-03 2007-07-24 Rafael Advanced Defense Sys An information system is operated by Mabat
AU2003215240A1 (en) * 2002-02-14 2003-09-04 Illumina, Inc. Automated information processing in randomly ordered arrays
US6824265B1 (en) * 2003-03-31 2004-11-30 Wesley Stephen Harper Illuminated safety and work glasses
US20050105041A1 (en) * 2003-10-02 2005-05-19 Ira Lerner Interchangeable eyewear assembly
KR20060131775A (ko) * 2003-11-26 2006-12-20 라파엘 아마먼트 디벨롭먼트 오쏘리티 엘티디. 헬멧 위치 측정 시스템, 헬멧 어셈블리, 및 동공 응시 방향계산 방법
JP4914019B2 (ja) 2005-04-06 2012-04-11 キヤノン株式会社 位置姿勢計測方法及び装置
US8941589B2 (en) * 2008-04-24 2015-01-27 Oblong Industries, Inc. Adaptive tracking system for spatial input devices
JP4777182B2 (ja) * 2006-08-01 2011-09-21 キヤノン株式会社 複合現実感提示装置及びその制御方法、プログラム
US20090196460A1 (en) 2008-01-17 2009-08-06 Thomas Jakobs Eye tracking system and method
US8941590B2 (en) 2008-04-24 2015-01-27 Oblong Industries, Inc. Adaptive tracking system for spatial input devices
US8155387B2 (en) 2008-10-13 2012-04-10 International Business Machines Corporation Method and system for position determination using image deformation
KR101383235B1 (ko) * 2010-06-17 2014-04-17 한국전자통신연구원 시선 추적을 이용한 좌표 입력 장치 및 그 방법
US8830329B2 (en) 2010-10-07 2014-09-09 Sony Computer Entertainment Inc. 3-D glasses with camera based head tracking
US8854282B1 (en) * 2011-09-06 2014-10-07 Google Inc. Measurement method
JP5762892B2 (ja) * 2011-09-06 2015-08-12 ビッグローブ株式会社 情報表示システム、情報表示方法、及び情報表示用プログラム
US20130100015A1 (en) * 2011-10-25 2013-04-25 Kenneth Edward Salsman Optical input devices
US20150097772A1 (en) * 2012-01-06 2015-04-09 Thad Eugene Starner Gaze Signal Based on Physical Characteristics of the Eye
US9024844B2 (en) * 2012-01-25 2015-05-05 Microsoft Technology Licensing, Llc Recognition of image on external display
US20140247286A1 (en) * 2012-02-20 2014-09-04 Google Inc. Active Stabilization for Heads-Up Displays
US8970571B1 (en) * 2012-03-13 2015-03-03 Google Inc. Apparatus and method for display lighting adjustment
JP5912059B2 (ja) * 2012-04-06 2016-04-27 ソニー株式会社 情報処理装置、情報処理方法及び情報処理システム
FR2989482B1 (fr) 2012-04-12 2022-12-23 Marc Massonneau Procede de determination de la direction du regard d'un utilisateur.
US20130297460A1 (en) * 2012-05-01 2013-11-07 Zambala Lllp System and method for facilitating transactions of a physical product or real life service via an augmented reality environment
GB2501767A (en) * 2012-05-04 2013-11-06 Sony Comp Entertainment Europe Noise cancelling headset
US20130293530A1 (en) * 2012-05-04 2013-11-07 Kathryn Stone Perez Product augmentation and advertising in see through displays
US9262680B2 (en) * 2012-07-31 2016-02-16 Japan Science And Technology Agency Point-of-gaze detection device, point-of-gaze detecting method, personal parameter calculating device, personal parameter calculating method, program, and computer-readable storage medium
US9250445B2 (en) * 2012-08-08 2016-02-02 Carol Ann Tosaya Multiple-pixel-beam retinal displays
US20140085198A1 (en) * 2012-09-26 2014-03-27 Grinbath, Llc Correlating Pupil Position to Gaze Location Within a Scene
US9928652B2 (en) * 2013-03-01 2018-03-27 Apple Inc. Registration between actual mobile device position and environmental model

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
None *
See also references of WO2015056177A1 *

Also Published As

Publication number Publication date
WO2015056177A1 (fr) 2015-04-23
US20160224110A1 (en) 2016-08-04
FR3011952B1 (fr) 2017-01-27
FR3011952A1 (fr) 2015-04-17
US20180275755A1 (en) 2018-09-27
US10007338B2 (en) 2018-06-26

Similar Documents

Publication Publication Date Title
WO2015056177A1 (fr) Procédé d'interaction par le regard et dispositif associé
US10423837B2 (en) Method and apparatus for a wearable computer
CN106415444B (zh) 注视滑扫选择
KR102417177B1 (ko) 인사이드-아웃 위치, 사용자 신체 및 환경 추적을 갖는 가상 및 혼합 현실을 위한 머리 장착 디스플레이
US10643389B2 (en) Mechanism to give holographic objects saliency in multiple spaces
CA2984147C (en) Privacy-sensitive consumer cameras coupled to augmented reality systems
CN105431763B (zh) 在佩戴移动设备时跟踪头部移动
US10565797B2 (en) System and method of enhancing user's immersion in mixed reality mode of display apparatus
CN104603675B (zh) 图像显示设备、图像显示方法和记录介质
US11269406B1 (en) Systems and methods for calibrating eye tracking
JP5456791B2 (ja) 空間領域の映像に対する人の注視点を決定するためのシステム及びその方法
US20150262424A1 (en) Depth and Focus Discrimination for a Head-mountable device using a Light-Field Display System
KR20180096434A (ko) 가상 이미지 표시 방법, 저장 매체 및 이를 위한 전자 장치
US9123272B1 (en) Realistic image lighting and shading
US20160033770A1 (en) Head-mounted display device, control method of head-mounted display device, and display system
US11841502B2 (en) Reflective polarizer for augmented reality and virtual reality display
CN109923499B (zh) 便携式眼睛追踪装置
US10528128B1 (en) Head-mounted display devices with transparent display panels for eye tracking
KR20150092226A (ko) 헤드 탑재형 디스플레이 자원 관리
WO2016130533A1 (en) Dynamic lighting for head mounted device
US11353955B1 (en) Systems and methods for using scene understanding for calibrating eye tracking
KR20200082109A (ko) 비주얼 데이터와 3D LiDAR 데이터 융합 기반 계층형 특징정보 추출 및 응용 시스템
US11070789B1 (en) Switchable fringe pattern illuminator
US11450113B1 (en) Method and apparatus for a wearable computer
US20180267601A1 (en) Light Projection for Guiding a User within a Physical User Area During Virtual Reality Operations

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20160517

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

DAX Request for extension of the european patent (deleted)
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

17Q First examination report despatched

Effective date: 20180723

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION HAS BEEN WITHDRAWN

18W Application withdrawn

Effective date: 20230927