WO2023213663A1 - Method of determining a point of regard by electrooculography under non-stationary head pose and position conditions - Google Patents

Method of determining a point of regard by electrooculography under non-stationary head pose and position conditions

Info

Publication number
WO2023213663A1
WO2023213663A1 (PCT/EP2023/061035)
Authority
WO
WIPO (PCT)
Prior art keywords
eye movement
sequence
models
regard
movement models
Prior art date
Application number
PCT/EP2023/061035
Other languages
English (en)
Inventor
Nathaniel BARBARA
Tracey CAMILLERI
Kenneth CAMILLERI
Original Assignee
L-Università Ta' Malta
Priority date
Filing date
Publication date
Application filed by L-Università Ta' Malta
Publication of WO2023213663A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013Eye tracking input arrangements
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/24Detecting, measuring or recording bioelectric or biomagnetic signals of the body or parts thereof
    • A61B5/316Modalities, i.e. specific diagnostic methods
    • A61B5/398Electrooculography [EOG], e.g. detecting nystagmus; Electroretinography [ERG]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/015Input arrangements based on nervous system activity detection, e.g. brain waves [EEG] detection, electromyograms [EMG] detection, electrodermal response detection

Definitions

  • This invention relates to a method of determining a point of regard and an apparatus for determining a point of regard.
  • EOG electrooculography
  • a method of determining a point of regard comprising the steps of: measuring an actual eye movement of an eye using electrooculography, and generating at least one electrooculography signal corresponding to the measured actual eye movement; using the or each electrooculography signal to build a sequence of eye movement models that corresponds to actual possible eye movements, wherein each eye movement model in the sequence of eye movement models is selected from a plurality of candidate eye movement models, wherein the plurality of candidate eye movement models represents different types of eye movements; estimating a gaze angle based on the sequence of eye movement models; and determining the point of regard based at least partially on the estimated gaze angle.
  • the human eye is capable of many different types of movements for different purposes. For example, a fixation may occur when focusing on a particular point in space, a saccade may occur when shifting focus from one point to another, a vestibulo-ocular reflex may occur to compensate for a head movement and particular characteristics of eye movement are also associated with blinking.
  • a gaze angle of a user, i.e., the angle at which an eye is oriented relative to the user's face
  • a sequence of eye movement models which may be representative of a plurality of different types of eye movements.
  • the method involves transitioning between different models so that a user's gaze angle may be estimated according to the type of eye movement actually being performed rather than using a single model which may be accurate for one type of eye movement but inaccurate for other types of eye movement.
  • the user's point of regard, i.e., the point in space that the user is looking at, may also be determined.
  • the actual eye movements may be measured continuously using electrooculography. Therefore, the sequence of eye movement models may also be continuously updated.
  • the gaze angle, and subsequently the point of regard, may be estimated/determined recursively based on the most up-to-date sequence of eye movement models and, hence, the user's gaze angle and point of regard may be estimated/determined continuously.
  • the method may further comprise the step of measuring an actual head pose and/or an actual head position, wherein the step of determining the point of regard may be additionally based on the measured actual head pose and/or the measured actual head position.
  • a user's actual head pose, i.e., the head's orientation in three-dimensional space
  • a user's actual head position, i.e., the head's translational position in three-dimensional space
  • the point of regard may be determined by combining this information with the estimated gaze angle.
  • the user's actual head pose and position essentially represent the starting point of the user's line of sight.
  • the determined gaze angle then represents the direction of that line of sight. If the contents of the user's field of view are known, then the starting point and direction of the user's line of sight may be sufficient to determine a point of regard. For example, the user may be looking at a fixed screen, the position of which may be known relative to the user's actual head pose and position.
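  • As an illustration of the fixed-screen example above, the following Python sketch (not prescribed by this disclosure) takes the measured head position as the starting point of the line of sight, orients that line of sight using the head pose and the estimated gaze angles, and intersects the resulting ray with a known screen plane. The gaze-direction parameterisation and the coordinate conventions are assumptions made for the example.

```python
import numpy as np

def point_of_regard_on_screen(head_position, head_rotation, gaze_h, gaze_v,
                              screen_origin, screen_normal):
    """Intersect a gaze ray with a known screen plane.

    head_position: (3,) translational head position in world coordinates.
    head_rotation: (3, 3) rotation matrix describing the head pose.
    gaze_h, gaze_v: estimated horizontal and vertical gaze angles (radians),
                    expressed relative to the head.
    screen_origin, screen_normal: a point on the screen plane and its unit
                                  normal, both in world coordinates.
    """
    head_position = np.asarray(head_position, dtype=float)
    screen_origin = np.asarray(screen_origin, dtype=float)
    screen_normal = np.asarray(screen_normal, dtype=float)

    # Gaze direction in head coordinates; looking along +z when both angles are 0.
    d_head = np.array([np.tan(gaze_h), np.tan(gaze_v), 1.0])
    d_head /= np.linalg.norm(d_head)

    # Rotate into world coordinates; the head pose orients the line of sight.
    d_world = head_rotation @ d_head

    # Ray-plane intersection: p = head_position + t * d_world.
    denom = d_world @ screen_normal
    if abs(denom) < 1e-9:
        return None  # line of sight is parallel to the screen plane
    t = ((screen_origin - head_position) @ screen_normal) / denom
    if t < 0:
        return None  # the screen plane lies behind the user
    return head_position + t * d_world
```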
  • the orientation of both of the user's eyes may be estimated and the verging point of those gaze angles may also be determined in order to determine the user's point of regard in three-dimensional space.
  • the user's actual head pose and/or position may be fixed such that it is not necessary to measure the actual head pose and/or position in order to calculate the point of regard.
  • the user may be resting his or her head on a chin rest or the user's head may be secured to a static head brace.
  • the user's actual head pose and/or position may be immaterial to the user's point of regard.
  • the user may be viewing a virtual or augmented reality in which the user's actual head pose and/or position has no effect on what is being viewed and, hence, the point of regard.
  • the step of using the or each electrooculography signal to build a sequence of eye movement models that corresponds to actual possible eye movements may comprise the steps of: populating a sequence tree with a plurality of candidate eye movement models based on the or each electrooculography signal; determining an activity probability for each sequence of candidate eye movement models forming part of the sequence tree, each activity probability being representative of the likelihood that the measured actual eye movement includes the type or types of eye movement represented by the respective sequence of candidate eye movement models; selecting a sequence of candidate eye movement models with the highest activity probability as the sequence of candidate eye movement models that corresponds to actual possible eye movements.
  • a first EOG signal corresponding to an actual eye movement measured for an initial interval of time may be generated.
  • a sequence tree may be populated with a plurality of candidate eye movement models based on an initial electrooculography signal so that the sequence tree comprises a branch per candidate eye movement model.
  • An activity probability for each candidate eye movement model may be determined wherein each activity probability represents the likelihood that the measured actual eye movement corresponding to the initial EOG signal is the same type of eye movement as that represented by the respective candidate eye movement model.
  • the or each EOG signal may be generated continuously and, at respective instants of time, the sequence tree may be populated with a plurality of candidate eye movement models based on the or each most recently generated EOG signal.
  • a new branch per candidate eye movement model may be added to the end of each existing branch so that the tree grows with each time interval, wherein the newest branches are based on the or each most up-to-date EOG signal.
  • a plurality of candidate eye movement models added to the sequence tree at a particular time instant may be referred to from here on as a sample.
  • an activity probability may be determined for each sequence of candidate eye movement models rather than just the candidate eye movement models forming part of the most recent sample.
  • Each activity probability may therefore be representative of the likelihood that the measured actual eye movement includes the type or types of eye movement represented by the respective sequence of candidate eye movement models. So, if a particular sequence of candidate eye movement models is representative of the measured actual eye movement comprising a sequence of transitions between the various eye movement models, the respective activity probability may represent the likelihood of such transitions occurring according to the timings predicted by the respective sequence of candidate eye movement models.
  • each activity probability that is determined may be dependent on known characteristics of eye movement event occurrence, such as the typical duration of saccades, for example.
  • the determined activity probabilities may also be based on other factors. For example, determined activity probabilities may be at least partially dependent on the correlation of the relevant one or more EOG signals to historical EOG signals corresponding to the respective types of eye movement, or the classification of one or more specific characteristics of an EOG signal and the degree to which those characteristics are associated with the respective types of eye movement.
  • the sequence of candidate eye movement models with the highest activity probability may be selected as the sequence of eye movement models corresponding to the actual eye movement and it is this sequence on which the estimation of gaze angle, and hence the determination of point of regard, may be based.
  • a user's point of regard may need to be determined over a prolonged period of time.
  • If the sequence tree is continually populated without limitation, then the processing power required to calculate the increasing number of activity probabilities associated with every possible sequence forming part of the sequence tree may become impractical.
  • the step of selecting a sequence of candidate eye movement models with the highest activity probability as the sequence of eye movement models that corresponds to actual possible eye movements may comprise pruning the sequence tree if the sequence tree meets a pruning condition or if one or more of a plurality of pruning conditions is met.
  • the step of pruning the sequence tree may comprise eliminating all sequences of candidate eye movement models except the sequence of candidate eye movement models with the highest activity probability. Therefore, immediately following a pruning occurrence, the sequence tree consists of a single sequence of candidate eye movement models and the activity probability of that single remaining sequence may be considered to equal 1.
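  • A minimal Python sketch of the tree-growing, scoring and pruning scheme described above follows. The per-sample likelihoods and the transition penalty are placeholders for this example only; in this disclosure the activity probabilities are derived from the electrooculography signals and from known characteristics of eye movement event occurrence.

```python
MODELS = ("FIX", "SAC", "BLI", "VOR")

def transitions(seq):
    return sum(1 for a, b in zip(seq, seq[1:]) if a != b)

def grow(tree):
    """Populate one further sample: extend every branch with every candidate
    model, discarding hypotheses with more than one model transition (the
    description above assumes at most one transition between prunings)."""
    return [s + (m,) for s in tree for m in MODELS if transitions(s + (m,)) <= 1]

def activity_probability(seq, likelihoods):
    """Toy stand-in for the activity probability: the joint likelihood of
    each sample under its hypothesised model, with a small penalty per model
    transition standing in for the known timing characteristics of eye
    movement events."""
    p = 1.0
    for t, model in enumerate(seq):
        p *= likelihoods[t][model]
        if t > 0 and model != seq[t - 1]:
            p *= 0.1  # hypothetical transition penalty
    return p

def select_and_prune(tree, likelihoods):
    """Select the highest-probability sequence and eliminate all others;
    the survivor's activity probability is then taken to equal 1."""
    return [max(tree, key=lambda s: activity_probability(s, likelihoods))]

# Usage: start from a freshly pruned tree (a single fixation model), grow
# for three samples, then prune back to the winning hypothesis.
tree = [("FIX",)]
likelihoods = [{"FIX": 0.9, "SAC": 0.05, "BLI": 0.03, "VOR": 0.02}] * 4
for _ in range(3):
    tree = grow(tree)
tree = select_and_prune(tree, likelihoods)
```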
  • the method may further comprise the step of determining a probability of head movement based on the measured actual head pose and the measured actual head position. Since certain eye movements are carried out when one moves their head, one or more pruning conditions may be dependent on the determined probability of head movement.
  • Examples of possible pruning conditions include:
  • 1) Since the last pruning occurrence, has the sequence tree been populated with a number of samples equal to or exceeding a minimum pruning threshold? 2) Does the sequence of candidate eye movement models with the highest activity probability have an activity probability exceeding 0.5, or has the sequence tree been populated with a number of samples equal to or exceeding double the minimum pruning threshold? 3) If the sequence of candidate eye movement models with the highest activity probability comprises a transition to a new candidate eye movement model, has the newly active candidate eye movement model been active for a number of samples greater than a minimum new model threshold? 4) If the sequence of candidate eye movement models with the highest activity probability comprises one or more candidate eye movement models representative of a blink, has an estimated eyelid aperture angle dropped below an eyelid aperture angle threshold?
  • the method may comprise the step of overwriting the sequence of eye movement models with the highest activity probability with a sequence of eye movement models having the highest activity probability while also having a predetermined characteristic.
  • One or more overwriting conditions may be determined to prevent sequences of candidate eye movement models that meet no pruning conditions from being selected repeatedly to the extent that the sequence tree is caused to grow excessively large.
  • one or more overwriting conditions may be dependent on the determined probability of head movement.
  • Examples of possible overwriting conditions and associated overwriting actions include:
  • If the sequence of candidate eye movement models with the highest activity probability comprises a blink-to-fixation transition and the saccade model was active just before the blink model became active, re-estimate the states during this saccade-blink interval assuming that only the blink model is active during this interval; that is, the transition to the saccade model does not occur but instead a transition to the blink model occurs directly.
  • the types of eye movements may comprise one or more of fixations, saccades, blinks and vestibulo-ocular reflexes.
  • the plurality of candidate eye movement models may comprise one or more candidate eye movement models respectively representing one or more of fixations, saccades, blinks and vestibulo-ocular reflexes.
  • Embodiments of the invention may also be augmented with any number of different candidate eye movement models, including candidate eye movement models representative of types of eye movement other than those listed above, such as smooth pursuits.
  • EOG signals corresponding to a measured actual eye movement may be considered as comprising three components - a noise component, a baseline component and an ocular movement component.
  • the noise component may be ignored while the ocular movement component is the component on which eye movement modelling may be based.
  • EOG signals corresponding to a measured actual eye movement also suffer from a phenomenon referred to as baseline drift, which is a gradual drift of the baseline component over time.
  • baseline drift is largely unrelated to an eye's movements and can instead be caused by factors such as electrode polarisation or changes in contact impedance due to the user moving and/or sweating.
  • the baseline component must be estimated and separated from the ocular movement component. Accordingly, accurate eye movement modelling relies on accurate estimation of the baseline component.
  • Some known methods mitigate baseline drift by requiring the EOG data to be zero-centred. For example, a user of such methods may have to return his or her gaze to, and spend the majority of the time gazing at, a pre-determined fixed position (typically aligned with the primary gaze position of the user) after attending to a particular action. This results in recorded EOG signals which correspond to gazing at such a pre-determined fixed position with occasional brief excursions away from this position, that is, possessing zero-centred characteristics.
  • the method may further comprise the step of estimating a baseline component of the or each electrooculography signal based on the sequence of eye movement models.
  • the baseline component may be estimated in real time and the EOG signal may be interpreted accordingly so that the ocular movement component, on which gaze angle estimation is based, is substantially free from disruption or distortion due to baseline drift.
  • the steps of estimating a gaze angle based on the sequence of eye movement models and estimating a baseline component of the or each electrooculography signal based on the sequence of eye movement models may be carried out simultaneously and recursively.
  • each estimation of the gaze angle may be based on a real-time estimation of the baseline component.
  • each estimation of the baseline component may be based on a real-time estimation of the gaze angle.
  • the simultaneous and recursive estimations of the gaze angle and baseline component may be carried out using any suitable estimation technique.
  • the steps of estimating a gaze angle based on the sequence of eye movement models and estimating a baseline component of the or each electrooculography signal based on the sequence of eye movement models may be carried out using a Dual Kalman Filter.
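  • As a toy illustration of the dual estimation structure (and not the parameterisation used in this disclosure), the following scalar sketch alternates a state filter, which updates a gaze-angle estimate while treating the current baseline estimate as known, with a parameter filter, which updates the baseline estimate while treating the current gaze-angle estimate as known. All model constants and the synthetic signal are assumed values.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic EOG trace: a constant gaze angle of 0.2 rad, a slowly drifting
# baseline, and measurement noise. All numbers are illustrative.
n, gain = 400, 10.0
eog = gain * 0.2 + 0.002 * np.arange(n) + rng.normal(0.0, 0.3, n)

def kalman_step(m, P, q, h, r, innovation):
    """One predict/update cycle of a scalar Kalman filter with
    random-walk dynamics."""
    P = P + q                    # predict
    S = h * P * h + r            # innovation variance
    K = P * h / S                # Kalman gain
    return m + K * innovation, (1.0 - K * h) * P

x, Px = 0.0, 1.0                 # gaze-angle estimate and variance
b, Pb = 0.0, 1.0                 # baseline estimate and variance
for y in eog:
    # State filter: treat the current baseline estimate as known.
    x, Px = kalman_step(x, Px, 1e-4, gain, 0.09, y - (gain * x + b))
    # Parameter filter: treat the current gaze estimate as known.
    b, Pb = kalman_step(b, Pb, 1e-5, 1.0, 0.09, y - (gain * x + b))
```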
  • the estimated gaze angle comprises one or more of a horizontal gaze angle, a vertical gaze angle and an eyelid aperture angle.
  • the horizontal gaze angle may be considered as the degree to which an eye is looking left or right, relative to the user's head.
  • the vertical gaze angle may be considered as the degree to which an eye is looking up or down, relative to the user's head.
  • the eyelid aperture angle may be considered as the openness of eyelids covering the eye.
  • a battery model may be used to map the horizontal gaze angle, the vertical gaze angle and/or the eyelid aperture angle to the electrooculography signal.
  • the battery model may be replaced by a suitable alternative.
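  • A hedged sketch of such a mapping is given below as a plain linear observation model, eog = C @ angles + baseline; the battery model itself and its coefficients are not reproduced here, and in practice the matrix C would be obtained by calibration.

```python
import numpy as np

# Hypothetical linear observation matrix mapping (horizontal gaze angle,
# vertical gaze angle, eyelid aperture angle), in degrees, to three EOG
# channels in microvolts. The values are illustrative placeholders.
C = np.array([[12.0, 0.0, 0.0],    # horizontal bipolar channel
              [ 0.0, 9.0, 4.0],    # vertical bipolar channel
              [ 1.5, 7.0, 6.0]])   # a monopolar channel

def predicted_eog(angles_deg, baseline):
    """Predicted EOG channel values for a given set of angles."""
    return C @ np.asarray(angles_deg, dtype=float) + baseline
```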
  • the point of regard may be constrained to a two-dimensional plane. Accordingly, the method of determining a point of regard may be considered as a method of determining a point at which an eye's gaze intersects a two-dimensional plane. Such embodiments of the invention may have particular use if a user's point of regard is to be determined in order to navigate a graphical user interface displayed on a screen.
  • an apparatus for determining a point of regard comprising: an electrooculography device configured to measure an actual eye movement of an eye using electrooculography and generate at least one electrooculography signal corresponding to the measured actual eye movement; and a point of regard determination device programmed to: use the or each electrooculography signal to build a sequence of eye movement models that corresponds to actual possible eye movements, wherein each eye movement model in the sequence of eye movement models is selected from a plurality of candidate eye movement models, wherein the plurality of candidate eye movement models represents different types of eye movements; estimate a gaze angle based on the sequence of eye movement models; and determine the point of regard based at least partially on the estimated gaze angle.
  • the point of regard determination device may comprise at least one processor and at least one memory including computer program code.
  • the at least one memory and computer program code may be configured to, with the at least one processor, cause the point of regard determination device to carry out the actions it is programmed for.
  • the point of regard determination device may be, may include, may communicate with or may form part of one or more of an electronic device, a portable electronic device, a portable telecommunications device, a microprocessor, a mobile phone, a personal digital assistant, a tablet, a phablet, a desktop computer, a laptop computer, a server, a cloud computing network, a smartphone, a smartwatch, smart eyewear, and a module for one or more of the same.
  • the electrooculography device may comprise one or more monopolar electrodes, the or each monopolar electrode configured to measure a potential of the eye.
  • the electrooculography device may also be configured to generate at least one monopolar electrooculography signal based on the or each potential measured by the or each monopolar electrode.
  • the apparatus may be configurable to work with any number of monopolar electrodes and the electrooculography device may be configured to simultaneously generate any number of monopolar electrooculography signals based on the potentials simultaneously measured by the monopolar electrodes. Further, the point of regard determination device may be programmed to use any number of monopolar electrooculography signals to build the sequence of eye movement models that corresponds to the actual eye movement.
  • the or each monopolar electrode may be placeable anywhere in the periocular region of a user to measure a potential of the eye.
  • the exact placement of the or each monopolar electrode may be immaterial to how the apparatus functions as the point of regard determination device may use the monopolar electrode in conjunction with the potential obtained from the electrode for generating an EOG signal. However, the positioning of the or each monopolar electrode may affect the performance of the apparatus in determining point of regard.
  • the electrooculography device may comprise one or more pairs of bipolar electrodes, the or each pair of bipolar electrodes configured to measure a potential of the eye.
  • the electrooculography device may also be configured to generate at least one bipolar electrooculography signal based on the or each potential measured by the or each bipolar electrode.
  • the apparatus may be configurable to work with any number of pairs of bipolar electrodes and the electrooculography device may be configured to simultaneously generate any number of bipolar electrooculography signals based on the potentials simultaneously measured by the bipolar electrodes. Further, the point of regard determination device may be programmed to use bipolar electrooculography signals to build the sequence of eye movement models that corresponds to the actual eye movement.
  • the apparatus may comprise two pairs of bipolar electrodes. Both pairs may be placeable in the periocular region of a user with one pair arranged in vertical alignment with the eye and the other pair arranged in horizontal alignment with the eye.
  • the apparatus may comprise four pairs of bipolar electrodes, so that two pairs of bipolar electrodes may be arranged to measure potentials of one eye of the user and the other two pairs of bipolar electrodes may be arranged to measure potentials of the user's other eye.
  • the apparatus may also comprise a combination of monopolar and bipolar electrodes, and the electrooculography device may be configured to generate both monopolar and bipolar electrooculography signals based on the potentials measured by the respective electrodes.
  • the apparatus may comprise a plurality of monopolar electrodes that may be placed around the eye or eyes of a user and the electrooculography device may be configured to generate electrooculography signals based on the potentials measured by the monopolar electrodes. Each electrooculography signal may then be processed as a monopolar channel, thereby generating a monopolar electrooculography signal.
  • a pair of electrooculography signals may be processed as a bipolar channel which, in simple terms, involves taking the electrooculography signal from one electrode and subtracting it from the electrooculography signal of another electrode to provide a bipolar electrooculography signal.
  • a pair of monopolar electrodes may be used as separate monopolar electrodes or may be considered as a bipolar electrode and may be used as such to generate bipolar electrooculography signals.
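  • The bipolar channel arithmetic described above amounts to a single subtraction per sample; a minimal sketch follows (the function name is illustrative).

```python
import numpy as np

def bipolar_channel(eog_a, eog_b):
    """Form a bipolar EOG signal by subtracting one monopolar electrode's
    signal from another's; components common to both electrodes, such as
    shared drift, partially cancel."""
    return np.asarray(eog_a, dtype=float) - np.asarray(eog_b, dtype=float)
```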
  • the apparatus may comprise a screen for displaying a graphical user interface, wherein the point of regard determination device is programmed to determine a point of regard constrained to a two-dimensional plane corresponding to the screen.
  • the apparatus may be configured to determine the user's point of regard as a point at which the user's gaze intersects a two-dimensional plane, wherein the two-dimensional plane and the screen are co-planar.
  • the apparatus may be configured to determine the user's point of regard only when it is within the dimensions of the screen. Alternatively, if the user's actual point of regard is beyond the dimensions of the screen, the apparatus may determine a point on the screen that is closest to the determined point of regard of the user and consider this closest point as being the user's point of regard for the purpose of navigating the graphical user interface displayed on the screen.
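  • For an axis-aligned rectangular screen, the closest-point behaviour described above reduces to clamping each coordinate; a sketch follows, with an assumed screen coordinate convention (origin at one corner).

```python
def constrain_to_screen(x, y, width, height):
    """Return the closest on-screen point to a determined point of regard
    (x, y); points already on the screen are returned unchanged."""
    return min(max(x, 0.0), width), min(max(y, 0.0), height)
```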
  • the apparatus may further comprise a head measurement device configurable to measure an actual head pose and an actual head position of a user of the apparatus.
  • Suitable head measurement devices may include an electromagnetic-based tracking system (such as the 3D Guidance TrakSTAR™), or optical-based solutions (such as Vicon®), among others.
  • the point of regard determination device may be programmed to determine the point of regard based on the measured actual head pose and the measured actual head position in addition to the estimated gaze angle.
  • a user's point of regard may be determined accurately, even if the user moves and/or tilts his or her head.
  • the user's actual head pose and/or position may be fixed such that it is not necessary to measure the actual head pose and/or position in order to calculate the point of regard.
  • the user may be resting his or her head on a chin rest or the user's head may be secured to a static head brace.
  • the user's actual head pose and/or position may be immaterial to the user's point of regard.
  • the user may be viewing a virtual or augmented reality in which the user's actual head pose and/or position has no effect on what is being viewed and, hence, the point of regard.
  • the apparatus, particularly the point of regard determination device, is capable of functioning under non-stationary head pose and position conditions or under stationary head pose and position conditions, that is, either with or without measuring the actual head pose and the actual head position.
  • the point of regard determination device may be further programmed to: populate a sequence tree with a plurality of candidate eye movement models based on the or each electrooculography signal; determine an activity probability for each sequence of candidate eye movement models forming part of the sequence tree, each activity probability being representative of the likelihood that the measured actual eye movement includes the type or types of eye movement represented by the respective sequence of candidate eye movement models; select a sequence of candidate eye movement models with the highest activity probability as the sequence of eye movement models that corresponds to actual possible eye movements.
  • the point of regard determination device may also be further programmed to: prune the sequence tree if the sequence tree meets a pruning condition; overwrite the sequence of eye movement models having the highest activity probability with a sequence of eye movement models having the highest activity probability amongst those having a predetermined characteristic, if the sequence tree meets an overwriting condition; determine a probability of head movement based on the measured actual head pose and the measured actual head position, wherein the overwriting condition is dependent on the probability of head movement; estimate a baseline component of the or each electrooculography signal based on the sequence of candidate eye movement models; and/or estimate the gaze angle and the baseline component simultaneously and recursively, optionally using a Dual Kalman Filter.
  • a computer-implemented method of determining a point of regard comprising the steps of: receiving at least one electrooculography signal corresponding to a measured actual eye movement; using the or each electrooculography signal to build a sequence of eye movement models that corresponds to actual possible eye movements, wherein each eye movement model in the sequence of eye movement models is selected from a plurality of candidate eye movement models, wherein the plurality of candidate eye movement models represents different types of eye movements; estimating a gaze angle based on the sequence of eye movement models; and determining the point of regard based at least partially on the estimated gaze angle.
  • a computer program comprising computer code configured to perform the computer-implemented method of any one of the third aspect of the invention and its embodiments.
  • Figure 1 shows schematically an apparatus according to a first embodiment of the invention and a method according to a second embodiment of the invention
  • Figure 2 shows schematically a sequence tree
  • Figure 3 shows schematically an apparatus according to a third embodiment of the invention and a method according to a fourth embodiment of the invention
  • Figure 4 shows schematically the apparatus and method of Figure 3
  • Figure 5 shows schematically a pruning mechanism flowchart
  • Figure 6 shows schematically steps of the method shown in Figure 3 carried out using a Dual Kalman Filter.
  • An apparatus according to a first embodiment of the invention is shown in Figure 1 and is designated generally by the reference numeral 2.
  • the apparatus 2 is for determining a point of regard and comprises an electrooculography device 6 and a point of regard determination device 8.
  • the apparatus 2 is suitable for carrying out a method 100 according to a second embodiment of the invention.
  • the method 100 comprises the steps of: 130) measuring an actual eye movement of an eye using electrooculography; 135) generating at least one electrooculography signal corresponding to the measured actual eye movement; 140) using the or each electrooculography signal to build a sequence of eye movement models that corresponds to actual possible eye movements, wherein each eye movement model in the sequence of eye movement models is selected from a plurality of candidate eye movement models, wherein the plurality of candidate eye movement models represents different types of eye movements; 150) estimating a gaze angle based on the sequence of eye movement models; and 160) determining the point of regard based at least partially on the estimated gaze angle.
  • the electrooculography device 6 is configured to carry out steps 130 and 135.
  • the point of regard determination device 8 is configured to carry out steps 140, 150 and 160.
  • step 140 comprises the steps of: 141) populating a sequence tree with a plurality of candidate eye movement models based on the or each electrooculography signal; 142) determining an activity probability for each sequence of candidate eye movement models forming part of the sequence tree, each activity probability being representative of the likelihood that the measured actual eye movement includes the type or types of eye movement represented by the respective sequence of candidate eye movement models; and 143) selecting a sequence of candidate eye movement models with the highest activity probability as the sequence of eye movement models that corresponds to actual possible eye movements.
  • Figure 2 shows an example of a sequence tree 14 populated in step 141.
  • a candidate eye movement model may be built using one of four modelling techniques corresponding to four different types of eye movement: fixations (FIX), saccades (SAC), blinks (BLI) and vestibulo-ocular reflexes (VOR).
  • the sequence tree is populated with a plurality of candidate eye movement models based on one or more electrooculography signals measured at that time instant such that the sequence tree comprises a branch per candidate eye movement model.
  • the sequence tree 14 starts at time instant 0, at which point the user's actual eye movement is either known or considered to be a fixation. This may be because the apparatus's determination of point of regard was initiated at time instant 0 and the user's initial eye movement is assumed to be a fixation, i.e., the activity probability of the fixation eye movement model (p(F)) is assumed to equal 1. Alternatively, a previous sequence tree may have been pruned at time instant 0 and p(F) at time instant 0 is therefore assumed to equal 1.
  • the sequence tree 14 is populated with a plurality of candidate eye movement models representing fixations, saccades, blinks and vestibulo-ocular reflexes.
  • Each of the four resulting branches of the sequence tree, at time instant 1, represents a sequence of candidate eye movement models.
  • the top branch represents a sequence of two consecutive fixation eye movement models whereas the bottom branch represents a sequence comprising a fixation eye movement model followed by a vestibulo-ocular reflex eye movement model.
  • An activity probability is determined for each sequence of candidate eye movement models forming part of the sequence tree 14.
  • Each activity probability is representative of the likelihood that the measured actual eye movement includes the type or types of eye movement (FIX/SAC/BLI/VOR) represented by the respective sequence of candidate eye movement models.
  • the sequence tree 14 is populated with a further plurality of candidate eye movement models. Sequences which feature more than one model transition are not considered, as shown. This is acceptable because the sequence tree is pruned regularly (described in further detail below), with the interval between any two successive pruning events being short enough to assume that only a single model transition occurs in this interval. This also reduces the computational demand, which is advantageous from a practical point of view.
  • An activity probability for each sequence of candidate eye movement models is also determined at each time instant.
  • the sequence of candidate eye movement models with the highest activity probability is generally selected as the sequence of eye movement models corresponding to actual possible eye movements and it is this sequence on which the estimation of gaze angle and the determination of point of regard is based.
  • When the actual eye movement is a saccade, for example, the determined activity probability for sequences of candidate eye movement models featuring the fixation (or blink or VOR) model being active is expected to be very close to 0, whereas those sequences of candidate eye movement models featuring the saccade model being active are expected to have a higher probability.
  • If the sequence tree 14 is continually populated without limitation, then the processing power required to calculate the increasing number of activity probabilities associated with every possible sequence forming part of the sequence tree may become impractical.
  • the activity probability determined for certain sequences will become increasingly low as it becomes more and more apparent from the one or more EOG signals that those sequences are not representative of the actual eye movement. Hence, it becomes increasingly redundant to determine an activity probability for many of the sequences of candidate eye movement models.
  • step 143 may comprise pruning the sequence tree if the sequence tree meets a pruning condition or if one or more of a plurality of pruning conditions is met by the sequence tree.
  • the step of pruning the sequence tree comprises eliminating all sequences of candidate eye movement models except, generally, the sequence of candidate eye movement models with the highest activity probability. Therefore, immediately following a pruning occurrence, the sequence tree consists of a single candidate eye movement model and the activity probability of that candidate eye movement model may be considered to equal 1. Such a situation is represented by time instant 0 in Figure 2.
  • In Figure 3, an apparatus 202 and a method 300 are shown which are similar to the apparatus 2 and the method 100 shown in Figure 1.
  • the apparatus 202 also comprises a head measurement device 216 configurable to measure an actual head pose and an actual head position of a user of the apparatus 202.
  • the point of regard determination device 208 is programmed to determine the point of regard based on the measured actual head pose and the measured actual head position in addition to the estimated gaze angle. Accordingly, step 360 of the method 300 comprises determining the point of regard based on the estimated gaze angle, the measured actual head pose and the measured actual head position.
  • the electrooculography device 206 comprises a plurality of monopolar electrodes 222, 224, 225.
  • At least one monopolar electrode 222 may be configured to generate an electrooculography signal processed as a monopolar channel, thereby generating a monopolar electrooculography signal.
  • at least one pair of monopolar electrodes 224, 225 may be configured to generate pairs of electrooculography signals processed as a bipolar channel (one electrooculography signal is essentially subtracted from the other), thereby generating a bipolar electrooculography signal.
  • Each monopolar electrode 222, 224, 225 is configured to measure a potential of the eye 4. Specifically, a first pair of monopolar electrodes 224, 225 may be placed adjacent to the outer canthi and a second pair of monopolar electrodes 224, 225 may be mounted superiorly and inferiorly to one ocular globe.
  • the electrooculography device 206 is configured to generate at least one monopolar electrooculography signal 212 based on the potential measured by the monopolar electrode 222 and at least one bipolar electrooculography signal 213 based on the potential measured by each pair of monopolar electrodes 224, 225.
  • the apparatus 202 also comprises a screen 226 for displaying a graphical user interface.
  • the point of regard determination device 208 is programmed to determine a point of regard constrained to a two-dimensional plane corresponding to the screen 226. The determined point of regard may be displayed on the screen 226, similarly to the cursor of a computer mouse.
  • Figure 4 shows the apparatus 202 and, in particular, shows in more detail the method step 340 to be carried out by the point of regard determination device 208.
  • Step 340 itself comprises the following steps: 141) populating a sequence tree with a plurality of candidate eye movement models based on the or each electrooculography signal; 142) determining an activity probability for each sequence of candidate eye movement models forming part of the sequence tree, each activity probability being representative of the likelihood that the measured actual eye movement includes the type or types of eye movement represented by the respective sequence of candidate eye movement models; and 143) selecting a sequence of candidate eye movement models with the highest activity probability.
  • Upon completion of step 143, as well as step 350 being carried out to estimate a gaze angle for that time instant (similarly to step 150 of the method 100 shown in Figure 1), step 141 is repeated so that method steps are initiated for estimating the gaze angle at the next time instant.
  • Figure 5 shows a pruning mechanism flowchart 18.
  • the pruning mechanism flowchart comprises a plurality of pruning conditions and a plurality of overwriting conditions.
  • L: the sequence tree depth, that is, the number of samples with which the sequence tree has been populated since the last pruning occurrence.
  • d: a minimum pruning threshold, that is, a predetermined minimum number of samples that must elapse before pruning can occur.
  • p_max,t: the highest activity probability at the current time instant, t.
  • s_t*: the sequence of candidate eye movement models with the highest activity probability.
  • the point of regard determination device 208 may determine whether a sufficient combination of pruning conditions is fulfilled by the sequence tree, according to the pruning mechanism flowchart.
  • According to a first pruning condition 71, pruning 345 may occur if L ≥ d. In other words, pruning may occur if, since the last pruning occurrence, the sequence tree has been populated with a number of samples equal to or exceeding the minimum pruning threshold. If the first pruning condition 71 is not met, no pruning occurs and the sequence tree is populated with a new sample. If the first pruning condition 71 is met, then a second pruning condition 72 is considered.
  • According to the second pruning condition 72, pruning 345 may occur if p_max,t > 0.5 or L ≥ 2d. In other words, pruning may occur if either: the sequence of candidate eye movement models with the highest activity probability has an activity probability exceeding 0.5; or, since the last pruning occurrence, the sequence tree has been populated with a number of samples equal to or exceeding double the minimum pruning threshold.
  • the second pruning condition 72 is not met, no pruning occurs and the sequence tree is populated with a new sample. If the second pruning condition 72 is met, then a third pruning condition 73 is considered.
  • pruning 345 may occur if the sequence of candidate eye movement models with the highest activity probability comprises no transitions between different types of eye movement. If all three of the first, second and third pruning conditions 71, 72, 73 are met, the sequence tree is pruned and the next sample is used to begin populating a new sequence tree.
  • If the sequence of candidate eye movement models with the highest activity probability comprises a transition between different types of eye movement, such that the third pruning condition 73 is not met, pruning is still possible without the sequence tree being populated with a further sample. Rather, a fourth pruning condition 74 is considered.
  • According to the fourth pruning condition 74, pruning 345 may occur if, within s_t*, the newly active candidate eye movement model has been active for a number of samples exceeding a minimum new model threshold, N_sw.
  • If the fourth pruning condition 74 is met, a fifth pruning condition 75 is considered. Conversely, if the fourth pruning condition 74 is not met, then a sixth pruning condition 76 is considered instead. According to the fifth pruning condition 75, pruning 345 may occur if s_t* comprises no transition to a blink eye movement model.
  • If the fifth pruning condition 75 is met, the sequence tree is pruned and the next sample is used to begin populating a new sequence tree.
  • a re-estimation of the gaze angle, eyelid aperture and baseline component may be required prior to the pruning.
  • re-estimation of the gaze angle, eyelid aperture and baseline component may occur if s_t* comprises a blink-to-fixation transition and the saccade eye movement model was active just before the blink model became active.
  • the gaze angle, eyelid aperture and baseline component in the sequence tree are re-estimated based on an assumption that there was no transition to the saccade eye movement model and that the transition to the blink eye movement model occurred directly.
  • the sequence tree is pruned and the next sample is used to begin populating a new sequence tree.
  • the sequence tree is pruned without re-estimation being carried out and the next sample is used to begin populating a new sequence tree.
  • If the fifth pruning condition 75 is not met, pruning 345 is still possible without the sequence tree being populated with a further sample. Rather, a seventh pruning condition 77 is considered.
  • According to the seventh pruning condition 77, pruning may occur if the estimated eyelid aperture angle has dropped below a blink threshold. Blinks are characterised by a decreasing trajectory of the eyelid aperture angle during the lowering of the eyelids, before the eyelid aperture angle starts increasing again as the eyelids start to re-open. The seventh pruning condition 77 thereby ensures that the estimated eyelid aperture angle is truly indicative of a blink having occurred before pruning is allowed.
  • If the seventh pruning condition 77 is met, the sequence tree is pruned 345 and the next sample is used to begin populating a new sequence tree. Meanwhile, if the seventh pruning condition 77 is not met, pruning 345 is still possible without the sequence tree being populated with a further sample. Rather, an eighth pruning condition 78 is considered.
  • the eighth pruning condition 78 is based on N_sw (the minimum new model threshold mentioned above), c_b and an associated sample-count threshold. Here, c_b is the number of failed attempts at meeting the seventh pruning condition 77, as counted by a first counter 81, that is, the number of samples for which the blink model has been active but the estimated eyelid aperture angle has not dropped below the blink threshold.
  • the purpose of the eighth pruning condition 78 is to curb excessive growth of the sequence tree.
  • c_b may instead be required to exceed a different value predetermined as being a suitable threshold for curbing excessive growth of the sequence tree.
  • If the determined probability of head movement is greater than or equal to 0.5, a first overwriting condition 91 is selected. Conversely, if the determined probability of head movement is less than 0.5, a second overwriting condition 92 is selected.
  • Under the first overwriting condition 91, the sequence tree is pruned by: selecting the sequence of candidate eye movement models with the highest activity probability from any sequences forming part of the sequence tree which feature a transition to the vestibulo-ocular reflex (VOR) eye movement model, wherein the VOR eye movement model is active for more than N_sw samples; overwriting s_t* with the newly selected sequence; and pruning 345 the sequence tree according to the new s_t*.
  • VOR vestibulo-ocular reflex
  • Under the second overwriting condition 92, the sequence tree is pruned by: eliminating sequences which either feature (i) a transition to the blink or VOR model, or (ii) a transition between different types of candidate eye movement model in which the newly active candidate eye movement model is active for less than N_sw samples; selecting, from the remaining sequences of candidate eye movement models, the sequence of candidate eye movement models with the highest activity probability; overwriting s_t* with the newly selected sequence; and pruning 345 the sequence tree according to the new s_t*.
  • failed attempts at meeting the fourth pruning condition 74 are counted by a second counter 82, the count equalling c_sw. If c_sw is less than or equal to 10, no pruning occurs and the sequence tree is populated with a new sample. However, if c_sw is more than 10, pruning will occur without the sequence tree being populated with a further sample, but only once s_t* has been overwritten according to a third overwriting condition 93.
  • the purpose of the sixth pruning condition 76 is to curb excessive growth of the sequence tree.
  • c_sw may instead be required to exceed a different value predetermined as being a suitable threshold for curbing excessive growth of the sequence tree.
  • Under the third overwriting condition 93, the sequence tree is pruned by: selecting the sequence of candidate eye movement models with the highest activity probability from any sequences which do not feature a transition between different candidate eye movement models; overwriting s_t* with the newly selected sequence; and pruning 345 the sequence tree according to the new s_t*.
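  • The pruning mechanism of Figure 5 can be summarised as a single decision function, sketched below in Python. The return strings stand in for the actions described above; the placement of the re-estimation check and the threshold c_b_limit are judgment calls based on this description, and counter updates and the overwriting actions of conditions 91 to 93 are elided.

```python
def pruning_decision(L, d, p_max, n_transitions, new_model_samples, N_sw,
                     transition_to_blink, blink_to_fixation,
                     saccade_before_blink, eyelid_below_blink_threshold,
                     c_b, c_b_limit, c_sw, p_head_movement):
    """Condensed rendering of the Figure 5 pruning mechanism."""
    if L < d:                                      # first pruning condition 71
        return "populate new sample"
    if not (p_max > 0.5 or L >= 2 * d):            # second pruning condition 72
        return "populate new sample"
    if n_transitions == 0:                         # third pruning condition 73
        return "prune"
    if new_model_samples > N_sw:                   # fourth pruning condition 74
        if not transition_to_blink:                # fifth pruning condition 75
            if blink_to_fixation and saccade_before_blink:
                return "re-estimate states, then prune"
            return "prune"
        if eyelid_below_blink_threshold:           # seventh pruning condition 77
            return "prune"
        if c_b > c_b_limit:                        # eighth pruning condition 78
            # Overwrite s_t* under overwriting condition 91 if
            # p_head_movement >= 0.5, otherwise under condition 92, then prune.
            return "overwrite, then prune"
        return "populate new sample"
    if c_sw > 10:                                  # sixth pruning condition 76
        return "overwrite (condition 93), then prune"
    return "populate new sample"
```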
  • Step 350 comprises the steps of: 351) estimating a gaze angle based on the sequence of eye movement models; and 352) estimating a baseline component of the or each electrooculography signal based on the sequence of eye movement models.
  • step 350 comprises the use of a Dual Kalman Filter (DKF) to carry out steps 351 and 352 simultaneously and recursively.
  • the use of a Dual Kalman Filter is well known in the art. However, for completeness, the use of the Dual Kalman Filter in this embodiment of the invention is described briefly below.
  • the Dual Kalman Filter comprises two filters: a parameter filter 353 which is used to estimate the baseline component and a plurality of state filters 354 which are used for gaze angle estimation.
  • Baseline component information is fed from the parameter filter 353 to the state filters 354, of which there are four in this embodiment of the invention.
  • a first state filter 354 models fixations
  • a second state filter 354 models saccades
  • a third state filter 354 models blinks
  • a fourth state filter 354 models vestibulo-ocular reflexes.
  • gaze angle and eyelid opening information is fed from the state filters 354 to the parameter filter 353.
  • the plurality of state filters 354 generate the candidate eye movement models used to populate the sequence trees referred to above with respect to Figures 3 to 5.
  • This process is carried out recursively through a prediction stage and an update stage, and hence the state vector is estimated at each time instant, k.
  • the state vector of each of the state filters consists of a horizontal gaze angle, a vertical gaze angle and an eyelid aperture angle, as well as the difference of these three states.
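  • A sketch of one such state filter bank follows. The six-dimensional state vector matches the description above, while the transition structure, noise magnitudes and observation mapping are illustrative assumptions rather than the parameterisation of this disclosure. At each time instant, every candidate model's filter runs its prediction and update stages, and its innovation likelihood can feed the activity probabilities used to rank branches of the sequence tree.

```python
import numpy as np

# Six-dimensional state: horizontal gaze angle, vertical gaze angle, eyelid
# aperture angle, and the differences of these three states. Transition
# matrices and noise levels below are illustrative placeholders.
def make_model(q_diff):
    F = np.eye(6)
    F[:3, 3:] = np.eye(3)                   # each angle advances by its difference
    Q = np.diag([1e-6] * 3 + [q_diff] * 3)  # process noise on the differences
    return F, Q

STATE_FILTERS = {
    "FIX": make_model(1e-6),   # fixation: states barely change
    "SAC": make_model(1e-1),   # saccade: rapid gaze-angle change
    "BLI": make_model(5e-2),   # blink: eyelid aperture dominated
    "VOR": make_model(5e-2),   # vestibulo-ocular reflex
}

def predict(m, P, F, Q):
    """Prediction stage at time instant k."""
    return F @ m, F @ P @ F.T + Q

def update(m, P, y, H, R):
    """Update stage: y is the EOG measurement vector with the baseline
    estimate from the parameter filter already removed; H maps the six
    states to the EOG channels (e.g. via an observation model of the kind
    sketched earlier) and R is the measurement noise covariance."""
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    return m + K @ (y - H @ m), (np.eye(6) - K @ H) @ P
```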

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • General Physics & Mathematics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Biomedical Technology (AREA)
  • General Health & Medical Sciences (AREA)
  • Ophthalmology & Optometry (AREA)
  • Medical Informatics (AREA)
  • Neurology (AREA)
  • Dermatology (AREA)
  • Biophysics (AREA)
  • Pathology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Neurosurgery (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Eye Examination Apparatus (AREA)

Abstract

The invention relates to a method of determining a point of regard, the method comprising the steps of: measuring an actual eye movement of an eye (4) using electrooculography, and generating at least one electrooculography signal (112) corresponding to the measured actual eye movement; using the or each electrooculography signal (112) to build a sequence of eye movement models that corresponds to actual possible eye movements, each eye movement model in the sequence of eye movement models being selected from a plurality of candidate eye movement models, the plurality of candidate eye movement models representing different types of eye movements; estimating a gaze angle based on the sequence of eye movement models; and determining the point of regard based at least partially on the estimated gaze angle.
PCT/EP2023/061035 2022-05-03 2023-04-26 Method of determining a point of regard by electrooculography under non-stationary head pose and position conditions WO2023213663A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
GB2206429.9 2022-05-03
GB2206429.9A GB2618335B (en) 2022-05-03 2022-05-03 A method of determining a point of regard using electrooculography under non-stationary head pose and position conditions

Publications (1)

Publication Number Publication Date
WO2023213663A1 (fr)

Family

ID=81943973

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2023/061035 WO2023213663A1 (fr) Method of determining a point of regard by electrooculography under non-stationary head pose and position conditions

Country Status (2)

Country Link
GB (1) GB2618335B (fr)
WO (1) WO2023213663A1 (fr)

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5726916A (en) * 1996-06-27 1998-03-10 The United States Of America As Represented By The Secretary Of The Army Method and apparatus for determining ocular gaze point of regard and fixation duration

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5649061A (en) * 1995-05-11 1997-07-15 The United States Of America As Represented By The Secretary Of The Army Device and method for estimating a mental decision
US20150126845A1 (en) * 2013-11-05 2015-05-07 The Research Foundation For The State University Of New York Wearable head-mounted, glass-style computing devices with eog acquisition and analysis for human-computer interfaces
US20160364881A1 (en) * 2015-06-14 2016-12-15 Sony Computer Entertainment Inc. Apparatus and method for hybrid eye tracking
US20170180882A1 (en) * 2015-12-22 2017-06-22 Oticon A/S Hearing device comprising a sensor for picking up electromagnetic signals from the body
US20180341328A1 (en) * 2017-05-23 2018-11-29 Stichting Imec Nederland Method and a system for monitoring an eye position
US20200337653A1 (en) * 2018-01-18 2020-10-29 Neurable Inc. Brain-computer interface with adaptations for high-speed, accurate, and intuitive user interactions
US20190370650A1 (en) * 2018-06-01 2019-12-05 The Charles Stark Draper Laboratory, Inc. Co-adaptation for learning and control of devices
US20200097076A1 (en) * 2018-09-21 2020-03-26 Neurable Inc. Human-computer interface using high-speed and accurate tracking of user interactions
CN109976525A * 2019-03-27 2019-07-05 Shanghai University User interface interaction method and apparatus, and computer device

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN118113158A * 2024-03-18 2024-05-31 北京极溯光学科技有限公司 Gaze tracking method, apparatus, device and storage medium

Also Published As

Publication number Publication date
GB2618335B (en) 2024-07-24
GB202206429D0 (en) 2022-06-15
GB2618335A (en) 2023-11-08

Similar Documents

Publication Publication Date Title
CA2910552C System and method for probabilistic object tracking over time
CN106662917B Eye tracking calibration system and method
WO2020015468A1 Image transmission method and apparatus, terminal device, and storage medium
JP5227340B2 Control method based on voluntary eye signals, particularly for photography
US7091928B2 Intelligent eye
US11017257B2 Information processing device, information processing method, and program
JP2021533462A Depth plane selection for multi-depth-plane display systems by user categorization
US20160029883A1 Eye tracking calibration
US20140226131A1 Systems and methods of eye tracking calibration
WO2020042542A1 Method and apparatus for acquiring eye movement control calibration data
CN109032351B Fixation point function determination method, fixation point determination method and apparatus, and terminal device
WO2023213663A1 Method of determining a point of regard by electrooculography under non-stationary head pose and position conditions
CN109976535B Calibration method, apparatus, device and storage medium
KR101638095B1 Method for providing a user interface through a head-mounted display using gaze recognition and bio-signals, apparatus using same, and computer-readable recording medium
CN109254662A Mobile device operation method and apparatus, computer device, and storage medium
CN109976528B Method for adjusting a gaze region based on head movement, and terminal device
KR101571848B1 Hybrid interface device based on electroencephalography and eye movement, and control method therefor
KR20220060163A Apparatus and method for detecting objects of interest and providing user visual-perception metadata based on three-dimensional gaze point information
CN113534945A Method, apparatus, device and storage medium for determining eye-tracking calibration coefficients
CN104898823B Method and apparatus for controlling movement of a visual target
CN109144262B Eye-movement-based human-computer interaction method, apparatus, device and storage medium
EP2888716B1 Target object angle determination using multiple cameras
CN114429670A Pupil detection method, apparatus, device and storage medium
US11853472B2 Modify audio based on physiological observations
CN109189222B Human-computer interaction method and apparatus based on detecting pupil diameter changes

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23722362

Country of ref document: EP

Kind code of ref document: A1