WO2013026725A1 - Method for gaze-controlled text size control, and methods for gaze-based measuring of a text reading speed and of a number of visual saccades per text line - Google Patents

Method for gaze-controlled text size control, and methods for gaze-based measuring of a text reading speed and of a number of visual saccades per text line

Info

Publication number
WO2013026725A1
Authority
WO
WIPO (PCT)
Prior art keywords
signal
text
line
gaze signal
gaze
Prior art date
Application number
PCT/EP2012/065758
Other languages
French (fr)
Inventor
Arnaud Leroy
Julien Fleureau
Philippe Guillotel
Original Assignee
Thomson Licensing
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Thomson Licensing
Priority to EP12745496.5A (granted as EP2745188B1)
Priority to US14/239,360 (granted as US9256285B2)
Publication of WO2013026725A1


Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 - Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F 3/013 - Eye tracking input arrangements
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 40/00 - Handling natural language data
    • G06F 40/10 - Text processing
    • G06F 40/103 - Formatting, i.e. changing of presentation of documents
    • G06F 40/109 - Font handling; Temporal or kinetic typography


Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • Computational Linguistics (AREA)
  • General Health & Medical Sciences (AREA)
  • Human Computer Interaction (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

For gaze-controlled text size control of a display, the invention proposes to probe, sample and record a user's horizontal gaze signal; to subject the gaze signal to a subband filter bank or wavelet transform; to detect line delimiters in the gaze signal; to derive a reading speed; to determine, as a number of saccades per text line, the number of locations where the gaze signal has sudden high slope portions surrounded on both sides by portions of markedly smaller slope; to detect, based on the reading speed and the number of saccades, a too small font size status or a too big font size status; and to initiate a corresponding font size change. Parts of this method can be used for gaze-based measuring of a text reading speed and for gaze-based measuring of a number of saccades.

Description

Method for gaze-controlled text size control, and methods for gaze-based measuring of a text reading speed and of a number of visual saccades per text line
Field of the invention
The present invention relates to human-machine interfaces, in particular to text size control on display devices.
Background of the invention
Recording and estimating the gaze path of a user watching a screen is a mature technology opening new perspectives in terms of Human-Machine Interfaces. Such captures have until now mainly been achieved using infrared video technologies in commercial systems.
[5] purports to describe a laptop computer product with integrated eye control, taking advantage of the reflection of infrared sources on the user's eyes to estimate the current gaze orientation.
Other, more experimental systems are physiologically based on the recording of the corneo-retinal potential by means of electrodes positioned around the eye. Two electrodes are generally used to record the horizontal movements, two others capture the vertical motions, and a last one is used as a reference. Figure 1 shows an example of a horizontal capture setup.
Young et al [1] have purportedly shown that the captured signals, namely ElectroOculoGram (EOG) signals, are linearly correlated to the eye motions. Several commercial or academic systems embed dedicated amplifiers to measure and record the associated signal. Examples include the "BIOPAC" systems as generic biomedical amplifiers, the "BlueGain EOG Amplifier" developed by Cambridge Research Systems, and an eye-movement tracking system proposed by Deng [2].
Even though such systems were historically used mainly for medical purposes [3], recent developments in video games and entertainment [4] prove their potential as a new way for users to interact with a machine.
The Boston College "EagleEyes" Project [6] is an example of taking advantage of the EOG to help users with severe physical disabilities to control a computer.
In [7], Bulling et al propose to use EOG signals to recognize users' activities by analyzing their eyes movements. Horizontal EOGs are processed with dedicated wavelet transforms and help to determine if the user is reading, writing or browsing while s/he is in front of her/his computer.
With the development of e-books and the improvement of TV screens, which are now able to display texts and web pages satisfactorily, it becomes apparent that reading comfort may not always be optimal and depends, among other factors, on the size of the text font used in the display. To adjust font size to individual users' needs, [5] requires an active interaction of the user with the machine, like a deliberate click on an icon, or a specific eye motion to zoom on some part of a screen. The Single Line Reader algorithm implementation in [8] also makes use of deliberate head movements to control the speed and scrolling direction of a single line text display. An improvement of ease of user interaction is thus desirable.
Invention
The present invention proposes a gaze-based way to improve the user experience when watching multimedia content comprising text. More precisely, gaze information is used to automatically adapt the text font size to enhance the user comfort. The invention is based on having recognized that reading a text with a too small font requires more time and effort for a user than reading optimally-sized text, whereas reading a text with a too large font requires the gaze to move with larger amplitudes, leading to correspondingly greater eyestrain. According to the present invention, font size control is achieved by analyzing the user's eye movements. This method is passive from the user's point of view, in the sense that it does not require any active user manipulation for font size change. Eye movement characteristics are recorded while users are reading a text, and are evaluated to automatically adapt the font size and thus enhance the visual comfort and the user experience.
A method for gaze-controlled text size control according to the present invention comprises the following steps:
A user's horizontal gaze signal is probed, sampled and recorded. The sampling is performed at a predefined sampling frequency.
The horizontal gaze signal may be amplified and is processed for determining and analyzing the horizontal eye movements as further described in the following. For the processing and analyzing, one may assume a reading context where a user sequentially reads a justified text from the left to the right and from the top to the bottom of a display screen. It can additionally be assumed that line returns, i.e. a repeated reading of a same line of text, or line jumps, i.e. the skipping of lines between lines that are read, do not occur. The horizontal gaze signal may be calibrated so that the amplitude range from 0 to 1 in the normalized signal matches the width of the entire display screen. In the following description, it is assumed that the arrangement of the electrodes at the head, together with the polarity of the probed gaze signal, the amplification and the calibration, cooperate in such a way that a value of 0 (zero) for the calibrated gaze signal corresponds to a gaze that is directed to the left border of the text block being read, and a value of 1 (one) for the calibrated gaze signal corresponds to a gaze that is directed to the right border of the text block. Transformation of these assumptions to other setups, like a change of signal polarity or a change of reading direction, is straightforward.
The horizontal gaze signal, optionally calibrated, is then subjected to a subband filter bank transform into several frequency bands, or to a wavelet transform on several levels of detail. In the transformed horizontal gaze signal, line delimiters are detected. This can be achieved by locating pieces of the transformed horizontal gaze signal where selected ones of the frequency bands or wavelet levels of detail are below a first threshold.
Then, for each pair of consecutive line delimiters, a reading speed is derived from the distance in samples of the line delimiters, in relation to the sampling frequency of the
horizontal gaze signal. This reading speed is a momentary value, and conceptually relates to the portion of the gaze signal that is enclosed between the line delimiters.
The horizontal gaze signal between the pair of consecutive line delimiters describes the eye movement while reading a current line of text. From this signal, a number of saccades in this line is determined by counting those locations, where the gaze signal has a sudden high slope portion surrounded on both sides by portions of markedly smaller slope. This analysis may be performed by comparing different frequency bands or time- frequency components of the transformed horizontal gaze signal. Saccades are elementary movements of the eye while scanning and reading a text.
If the number of saccades is above a second threshold, or if the reading speed is below a third threshold, this is detected as an indicator that the currently used font size is too small, and an increase of the font size is initiated.
On the other hand, if the number of saccades is less than a fourth threshold, this is detected as an indicator that the currently used font size is too big, and a decrease of the font size is initiated.
These steps are repeated for every line of text, i.e. for all pieces of the horizontal gaze signal between consecutive line delimiters.
Drawings
Exemplary embodiments of the invention are illustrated in drawings and are explained in more detail in the following description.
In the figures:
Fig. 1 shows an example of a horizontal EOG capture setup.
Fig. 2 shows an example calibrated horizontal EOG signal, as it arises while a user is reading one complete line of text, together with one wavelet level of detail signal thereof.
Fig. 3 shows a raw EOG signal in a setting with a very small font size.
Fig. 4 shows a raw EOG signal in a setting with an "optimal" font size.
Fig. 5 shows a raw EOG signal in a setting with a very big font size.
Fig. 6 shows a calibrated horizontal EOG signal together with the sum of the third to fifth level of detail signals thereof.
Exemplary embodiments
An example implementation of the method according to the present invention is described in the following. The ElectroOculoGram signal, also denoted as EOG signal, is used as a gaze signal.
Figure 1 shows an example of a horizontal EOG capture setup. Around the eyes of a user's head 100, two horizontal electrodes 101, 102 are attached to the left and right temple, and a reference electrode also referred to as ground electrode 103 is attached to the middle of the forehead.
From the electrodes 101, 102, 103, a horizontal EOG signal of the user is recorded at a sample frequency Fs of e.g. 200 Hz. As an alternative to being directly attached to the user's head, the electrodes could also be embedded in a dedicated device (e.g. eyeglasses) which touches the user's head at or near the desired positions during use.
Figure 2 shows, as a function of a sample index 201, an example calibrated horizontal EOG signal 202, as it arises while a user is reading one complete line of text; together with a third level of detail signal D3 thereof 203.
The horizontal EOG signal is amplified using an appropriate device (e.g. commercial instrumentation amplifiers for
physiological recording) and is then processed. The processing step aims at determining and analyzing the horizontal eye movements, and comprises the following steps:
First step:
Calibration of the horizontal EOG signal s into a calibrated signal sn, in such a way that an amplitude range of 1.0 in the calibrated signal sn corresponds to the entire text width.
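The patent does not spell out how this calibration is obtained. As a minimal sketch only (assuming NumPy, and assuming a short calibration phase that yields raw EOG values s_left and s_right while the user fixates the left and right border of the text block), a linear mapping could look as follows; the function name calibrate and its parameters are illustrative, not part of the patent:

```python
import numpy as np

def calibrate(s, s_left, s_right):
    """Map the raw horizontal EOG signal s to a calibrated signal sn in which
    an amplitude range of 1.0 spans the entire text width (0 = left border,
    1 = right border of the text block being read).

    s_left and s_right are assumed raw EOG values observed while the user
    looks at the left and right text border; the patent does not prescribe
    how they are measured.
    """
    s = np.asarray(s, dtype=float)
    return (s - s_left) / (s_right - s_left)
```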
Second step:
The calibrated signal sn is subjected to a wavelet transform with a spline wavelet, on 5 levels of detail. The level of detail signals are named D1, ..., D5. For the wavelet transform, the "à trous" algorithm [9], or stationary wavelet transform, can advantageously be used.
The wavelet transform is an advantageous approach to process EOG signals, because it allows a fast multi-bandpass filtering and constitutes a convenient way to identify fast transitions in the signal, especially the fast transitions that occur when the line of sight jumps to the beginning of a next line.
A dyadic wavelet transform may be used. However, other filtering techniques may also be used to perform a similar processing. As the core of the wavelet transform, a cubic spline wavelet may be used. In the "à trous" algorithm, no subsampling is applied to the signal, but the filter responses are upsampled instead with zero padding. This entails, among other things, that the level of detail signals all have the same length as the original signal.
Conceptually, after such a wavelet transform, the first level of detail signal Dl contains the upper half of the normalized frequency range, corresponding to pi/2 to pi. The second level of detail signal D2 contains the second-lowest quarter of the normalized frequency range, corresponding to pi/4 to pi/2. The third level of detail signal D3 contains the second-lowest eighth of the normalized frequency range, corresponding to pi/8 to pi/4. The fourth level of detail signal D4 contains the second-lowest sixteenth of the normalized frequency range, corresponding to pi/16 to pi/8, and so on.
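In compact form (with the normalized angular frequency running from 0 to pi), the band covered by the j-th level of detail signal can be summarized as:

```latex
D_j \;\longleftrightarrow\; \left[\frac{\pi}{2^{j}},\; \frac{\pi}{2^{j-1}}\right],
\qquad j = 1, 2, 3, \ldots
```

which reproduces the assignments listed above (D1: pi/2 to pi, D2: pi/4 to pi/2, D3: pi/8 to pi/4, and so on).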
In a typical embodiment, it may be found empirically that the sum of the third level of detail signal D3 plus the fourth level of detail signal D4 plus the fifth level of detail signal D5 constitutes the most useful frequency band to do the evaluations described here. This sum signal D3+D4+D5 may therefore also be termed the "informative signal". The first and second level of detail signals D1 and D2 may be found to contain mostly
recording noise, and the sixth and higher level of detail signals D6, D7, ... may be found to contain mainly physiological drift components.
As the impulse response core to be used in the wavelet transform, one may use:
- a lowpass forward filter, commonly denoted as h[n], of length four, where the coefficients h[n]/sqrt(2) are (0.125; 0.375; 0.375; 0.125);
- a lowpass backward filter, commonly denoted as h~[n], of length four, where the coefficients h~[n]/sqrt(2) are (0.125; 0.375; 0.375; 0.125);
- a highpass forward filter, commonly denoted as g[n], of length two, where the coefficients g[n]/sqrt(2) are (-0.5; 0.5); and
- a highpass backward filter, commonly denoted as g~[n], of length six, where the coefficients g~[n]/sqrt(2) are (-0.03125; -0.21875; -0.6875; 0.6875; 0.21875; 0.03125). These are also termed quadratic spline filters.
A dyadic wavelet transform, used on signal blocks of 512
samples, has a total of 9 levels. Of these, the last level signal D9 contains the very lowest frequencies. This shows that, in such a context, an informative signal composed of D3+D4+D5 conceptually has a bandpass character.
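Purely as an illustration, the stationary ("à trous") decomposition with the quadratic spline filters listed above can be sketched in Python/NumPy as follows. The function names a_trous_details and informative_signal are not from the patent, and the 'same'-mode convolution is just one plausible way to keep every detail signal aligned with, and as long as, the input; boundary handling and exact filter alignment may differ in a production implementation:

```python
import numpy as np

# Quadratic spline analysis filters; the description lists them divided by sqrt(2).
SQRT2 = np.sqrt(2.0)
H = SQRT2 * np.array([0.125, 0.375, 0.375, 0.125])  # lowpass forward filter h[n]
G = SQRT2 * np.array([-0.5, 0.5])                   # highpass forward filter g[n]

def _upsample(filt, step):
    """Insert step-1 zeros between filter taps (the 'holes' of the a trous scheme)."""
    up = np.zeros((len(filt) - 1) * step + 1)
    up[::step] = filt
    return up

def a_trous_details(sn, levels=5):
    """Return the detail signals [D1, ..., Dlevels] of the calibrated EOG signal sn.

    No subsampling is applied, so every detail signal has the same length as sn.
    """
    approx = np.asarray(sn, dtype=float)
    details = []
    for j in range(levels):
        g_j = _upsample(G, 2 ** j)   # detail (highpass) branch at level j+1
        h_j = _upsample(H, 2 ** j)   # approximation (lowpass) branch at level j+1
        details.append(np.convolve(approx, g_j, mode="same"))
        approx = np.convolve(approx, h_j, mode="same")
    return details

def informative_signal(sn):
    """Sum D3 + D4 + D5, the band the description terms the 'informative signal'."""
    d = a_trous_details(sn, levels=5)
    return d[2] + d[3] + d[4]
```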
Figure 2 shows, as a function of a sample index 201, an example calibrated horizontal EOG signal 202. That the signal is
calibrated can be seen from the fact that the signal comprises amplitudes in the range of about -0.7 to +0.3, corresponding to an amplitude range of 1. Figure 2 also shows a medium level of detail component 203 of the example calibrated horizontal EOG signal 202 which corresponds to the level D3 thereof.
Third step:
A currently read line Li is detected as being a portion of the gaze signal delimited by two line delimiters Li0 and Li1. The line delimiters are defined as those time samples where sd=D3+D4+D5, i.e. the sum of the third, fourth and fifth level of detail signals of the wavelet transformed signal, is under a fifth threshold Tline, and where additionally, in a time window of a width Wline preceding the time sample, no other line delimiters exist. In the example setting, the fifth threshold Tline equals -1, and Wline equals the number of samples corresponding to a duration of 0.5 seconds, typically. Figure 6 shows, as a function of the sample index 601, a calibrated horizontal EOG signal 602
together with the sum 603 of the third to fifth level of detail signals thereof. The sum signal 603 has values smaller than the fifth threshold Tline=-1 (marked 604) only during those portions where, at a line wrap, the gaze quickly moves back to the beginning of the next line.
Fourth step:
A current reading speed Vi is calculated from the sampling frequency Fs and the sample indexes Li0, Li1 of the line delimiters surrounding the current line, as Vi=Fs/(Li1-Li0). The current reading speed can be measured in lines per second.
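Continuing the sketch above, the third and fourth steps might be expressed roughly as below. The defaults mirror the example setting (Tline = -1, Wline = 0.5 s, Fs = 200 Hz); sd is the informative signal D3+D4+D5, and all names are illustrative:

```python
def detect_line_delimiters(sd, fs=200.0, t_line=-1.0, w_line_s=0.5):
    """Third step: a sample is a line delimiter when the informative signal sd
    drops below t_line and no other delimiter was found within the preceding
    window of w_line_s seconds."""
    w_line = int(round(w_line_s * fs))
    delimiters = []
    for n in range(len(sd)):
        if sd[n] < t_line and (not delimiters or n - delimiters[-1] > w_line):
            delimiters.append(n)
    return delimiters

def reading_speed(li0, li1, fs=200.0):
    """Fourth step: momentary reading speed Vi = Fs / (Li1 - Li0), in lines per second."""
    return fs / (li1 - li0)
```

Each pair of consecutive indexes returned by detect_line_delimiters then plays the role of (Li0, Li1) for the current line.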
Fifth step:
A positive saccade count Sip is counted on the interval
[Li0, Li1] as the number of time samples where a second highest frequency component D4 of the wavelet transformed signal is above a sixth threshold Tsaccade, and where additionally, in a time window of a width Wsaccade preceding the time sample, no other saccade time sample exists.
A negative saccade count Sin is counted on the interval
[Li0, Li1] as the number of time samples where a second highest frequency component D4 of the wavelet transformed signal is below a threshold of (-1)*Tsaccade, and where additionally, in a time window of the width Wsaccade preceding the time sample, no other saccade time sample exists.
In the example setting, Tsaccade equals 0.02 typically; Wsaccade equals the number of samples corresponding to 0.2 seconds, typically; and a time sample is considered as a saccade time sample if its magnitude is greater than Tsaccade, in other words if its value is either below (-1)*Tsaccade or above Tsaccade. Then, a number of saccades Si in the currently read line is calculated as the difference between the positive saccade count Sip and the negative saccade count Sin:
Si=Sip-Sin
This calculation takes care of the fact that, while reading a text, the gaze sometimes jumps back and forth to re-read a portion of text, in order to confirm the meaning of something that was perhaps read too hastily in the first instance.
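Along the same lines, the fifth step could be sketched as follows, with the defaults of the example setting (Tsaccade = 0.02, Wsaccade = 0.2 s); d4 is the detail signal D4 from the wavelet sketch above, and the function name is again only illustrative:

```python
def count_saccades(d4, li0, li1, fs=200.0, t_saccade=0.02, w_saccade_s=0.2):
    """Fifth step: count saccade time samples of D4 on the interval [li0, li1].

    A sample is a saccade time sample when |D4| exceeds t_saccade and no other
    saccade time sample lies within the preceding window of w_saccade_s seconds.
    Returns (sip, sin, si) with si = sip - sin.
    """
    w_saccade = int(round(w_saccade_s * fs))
    s_pos, s_neg = 0, 0     # positive saccade count Sip, negative saccade count Sin
    last = None             # index of the most recent saccade time sample
    for n in range(li0, li1 + 1):
        if abs(d4[n]) > t_saccade and (last is None or n - last > w_saccade):
            if d4[n] > t_saccade:
                s_pos += 1
            else:
                s_neg += 1
            last = n
    return s_pos, s_neg, s_pos - s_neg
```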
Sixth step:
If the number of saccades Si is above a second threshold Nmax, or if the reading speed Vi is below a third threshold Vmin, this is detected as an indicator that the currently used font size is too small, and an increase of the font size, e.g. to a next bigger available font size, is initiated. This will be denoted as a too small font size status in the following. In this, the second threshold Nmax equals 20 typically, and the third threshold Vmin equals 0.05 lines per second, typically.
On the other hand, if the number of saccades Si is less than a fourth threshold Nmin, this is detected as an indicator that the currently used font size is too big, and a decrease of the font size, e.g. to a next smaller available font size, is initiated. This will be denoted as a too big font size status in the following. In this, the fourth threshold Nmin equals 15 typically.
In other words, if Si≥Nmax (Nmax=20 typically) or Vi≤Vmin (Vmin=0.05 lines per second typically), the font size is increased by one step. Else, if Si≤Nmin (Nmin=15 typically), the font size is decreased by one step.
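The sixth step's decision rule, together with the per-line repetition of the seventh step described next, might be tied together as in the following sketch. The thresholds default to the typical values given above (Nmax = 20, Nmin = 15, Vmin = 0.05 lines per second); the helper functions are the ones sketched earlier, and the +1/-1/0 return convention is an illustrative choice, not the patent's wording:

```python
def font_size_action(si, vi, n_max=20, n_min=15, v_min=0.05):
    """Sixth step: +1 = increase the font size by one step (too small font size
    status), -1 = decrease it by one step (too big font size status), 0 = keep."""
    if si >= n_max or vi <= v_min:
        return +1
    if si <= n_min:
        return -1
    return 0

def process_recording(sn, fs=200.0):
    """Seventh step: repeat the third to sixth steps for every consecutive text
    line of the calibrated gaze signal sn and collect one action per line."""
    details = a_trous_details(sn, levels=5)
    sd = details[2] + details[3] + details[4]      # informative signal D3+D4+D5
    delimiters = detect_line_delimiters(sd, fs)
    actions = []
    for li0, li1 in zip(delimiters[:-1], delimiters[1:]):
        vi = reading_speed(li0, li1, fs)
        _, _, si = count_saccades(details[3], li0, li1, fs)   # details[3] is D4
        actions.append(font_size_action(si, vi))
    return actions
```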
Seventh step:
The third to sixth steps are repeated for every consecutive text line. Each time, the text line index i is increased by 1.
Figure 3 shows, as a function of the sample index 301, an uncalibrated EOG signal 302 in a setting with a very small font size. Figure 4 shows, as a function of the sample index 401, an uncalibrated EOG signal 402 in a setting with an "optimal" font size. Figure 5 shows, as a function of the sample index 501, an uncalibrated EOG signal 502 in a setting with a very big font size. The methods according to this invention allow switching iteratively from an extreme configuration (very small or very big font size) to the optimal one. Note that the notion of "optimal" font size may be user-dependent and can be adjusted by allowing the user to modify the thresholds Nmin, Nmax, and Vmin.
An advantage of this invention is that it improves the visual comfort on media like computers, TVs or e-books. This leads to reduced eyestrain, because the eyes do not move more than necessary and because the deciphering phenomenon is limited. User satisfaction is increased because the size of the font is automatically adapted. And it provides a better understanding of the text content because of a good fluidity while reading.
References:
[1] Young LR, Sheena D (1988): Eye-movement measurement techniques. In Encyclopedia of Medical Devices and Instrumentation, ed. JG Webster, pp. 1259-1269, John Wiley, New York.
[2] L. Y. Deng, C. Hsu, T. Lin, J. Tuan, Y. Chen: EOG-Based Signal Detection And Verification For HCI. In 2009 International Conference on Machine Learning and Cybernetics, Volume 6, pp. 3342-3348.
[3] International Society for Clinical Electrophysiology of Vision (ISCEV), "Visual Electrodiagnostics - A Guide To Procedures", http://www.isce.org/standards/proceduresguide.html.
[4] H. Miyashita, M. Hayashi, K. Okada: Implementation of EOG-based Gaze Estimation in HMD with Head-tracker. In 18th International Conference on Artificial Reality and Telexistence (ICAT 2008).
[5] Tobii, "Tobii unveils the world's first eye-controlled laptop", http://www.tobii.com/en/eye-tracking-integration/global/news-and-events/press-releases/tobii-unveils-the-worlds-first-eye-controlled-laptop/.
[6] EagleEyes Project, Boston College, http://www.bc.edu/schools/csom/eagleeyes/faq.html
[7] A. Bulling, J. A. Ward, H. Gellersen, G. Troster: Eye Movement Analysis for Activity Recognition Using Electrooculography. In IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 33, no. 4, pp. 741-753, April 2011.
[8] Single Line Reader, L. E. L. Mizutan, T. Nakajima, Graduate School of Educational Informatics, Tohoku University, Japan, http://www.cmsoft.com.br/slr/.
[9] M. J. Shensa: The Discrete Wavelet Transform: Wedding the A Trous and Mallat Algorithms. In IEEE Transactions on Signal Processing, Vol. 40, No. 10, pp. 2464-2482, October 1992.

Claims

1. A method for gaze-controlled text size control, comprising the steps of:
- probing, sampling and recording a user's horizontal gaze
signal at a predefined sampling frequency;
- subjecting the horizontal gaze signal to a frequency or
wavelet transform on several levels;
- detecting, in the transformed horizontal gaze signal, line delimiters;
- deriving, for each pair of consecutive line delimiters
enclosing the transformed horizontal gaze signal of a current line, a reading speed from the distance in samples of the pair of line delimiters, in relation to the sampling frequency of the horizontal gaze signal;
- determining, from the transformed horizontal gaze signal of the current line, a number of saccades in the current line, by counting those locations, where the gaze signal has a sudden high slope portion surrounded on both sides by portions of markedly smaller slope;
- detecting a too small font size status if the number of
saccades is above a second threshold or if the reading speed is below a third threshold, and detecting a too big font size status if the number of saccades is less than a fourth
threshold;
- initiating an increase of the font size if the too small font size status was detected, and initiating a decrease of the font size if the too big font size status was detected.
2. A method for gaze-based measuring of a text reading speed, comprising the steps of:
- probing, sampling and recording a user's horizontal gaze
signal at a predefined sampling frequency;
- subjecting the horizontal gaze signal to a frequency or
wavelet transform on several levels;
- detecting, in the transformed horizontal gaze signal, line delimiters by locating pieces of the transformed horizontal gaze signal where a high frequency content is below a first threshold;
- deriving, for each pair of consecutive line delimiters
enclosing the transformed horizontal gaze signal of a current line, the text reading speed from the distance in samples of the pair of line delimiters, in relation to the sampling frequency of the horizontal gaze signal.
3. A method for gaze-based measuring of a number of visual saccades per text line, comprising the steps of:
- probing, sampling and recording a user's horizontal gaze
signal;
- subjecting the horizontal gaze signal to a frequency or
wavelet transform on several levels;
- detecting, in the transformed horizontal gaze signal, line delimiters ;
- determining, from a portion of the transformed horizontal gaze signal enclosed by two consecutive ones of the line delimiters, the number of visual saccades in the text line, by counting those locations, where the gaze signal has a sudden high slope portion surrounded on both sides by portions of markedly smaller slope.
4. A method according to Claim 1 or Claim 3, wherein the step of determining the number of visual saccades comprises determining a positive saccade count, determining a negative saccade count, and calculating the number of visual saccades as the difference between the positive saccade count and the negative saccade count.
5. A method according to one of the previous claims, wherein the horizontal gaze signal, before detecting line delimiters, is calibrated in such a way that an amplitude difference of 1 in the calibrated signal matches the width of the text being read.
PCT/EP2012/065758 2011-08-19 2012-08-10 Method for gaze-controlled text size control, and methods for gaze-based measuring of a text reading speed and of a number of visual saccades per text line WO2013026725A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
EP12745496.5A EP2745188B1 (en) 2011-08-19 2012-08-10 Method for gaze-controlled text size
US14/239,360 US9256285B2 (en) 2011-08-19 2012-08-10 Method for gaze-controlled text size control, and methods for gaze-based measuring of a text reading speed and of a number of visual saccades per text line

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
EP11290378.6 2011-08-19
EP11290378 2011-08-19

Publications (1)

Publication Number Publication Date
WO2013026725A1 true WO2013026725A1 (en) 2013-02-28

Family

ID=46640053

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2012/065758 WO2013026725A1 (en) 2012-08-10 Method for gaze-controlled text size control, and methods for gaze-based measuring of a text reading speed and of a number of visual saccades per text line

Country Status (2)

Country Link
EP (1) EP2745188B1 (en)
WO (1) WO2013026725A1 (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103294201A (en) * 2013-06-27 2013-09-11 深圳市中兴移动通信有限公司 Mobile terminal and gesture controlling method thereof
CN103399636A (en) * 2013-07-30 2013-11-20 深圳市中兴移动通信有限公司 Method and device for adjusting terminal font through eye motion
WO2015091228A1 (en) * 2013-12-20 2015-06-25 Koninklijke Philips N.V. Control device and method for controlling a display
US9898077B2 (en) 2013-09-18 2018-02-20 Booktrack Holdings Limited Playback system for synchronised soundtracks for electronic media content
US10289295B2 (en) 2014-12-18 2019-05-14 International Business Machines Corporation Scroll speed control for document display device

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0816984A2 (en) * 1996-06-25 1998-01-07 Sun Microsystems, Inc. Method and apparatus for eyetrack-driven information retrieval
WO2003050658A2 (en) * 2001-12-12 2003-06-19 Eyetools Techniques for facilitating use of eye tracking data
US6873314B1 (en) * 2000-08-29 2005-03-29 International Business Machines Corporation Method and system for the recognition of reading skimming and scanning from eye-gaze patterns
EP2050389A1 (en) * 2007-10-18 2009-04-22 ETH Zürich Analytical device and method for determining eye movement
WO2010018459A2 (en) * 2008-08-15 2010-02-18 Imotions - Emotion Technology A/S System and method for identifying the existence and position of text in visual media content and for determining a subject's interactions with the text

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0816984A2 (en) * 1996-06-25 1998-01-07 Sun Microsystems, Inc. Method and apparatus for eyetrack-driven information retrieval
US6873314B1 (en) * 2000-08-29 2005-03-29 International Business Machines Corporation Method and system for the recognition of reading skimming and scanning from eye-gaze patterns
WO2003050658A2 (en) * 2001-12-12 2003-06-19 Eyetools Techniques for facilitating use of eye tracking data
EP2050389A1 (en) * 2007-10-18 2009-04-22 ETH Zürich Analytical device and method for determining eye movement
WO2010018459A2 (en) * 2008-08-15 2010-02-18 Imotions - Emotion Technology A/S System and method for identifying the existence and position of text in visual media content and for determining a subject's interactions with the text

Non-Patent Citations (13)

* Cited by examiner, † Cited by third party
Title
"EagleEyes Project", BOSTON COLLEGE
"Visual Electrodiagnostics - A Guide To Procedures", INTERNATIONAL SOCIETY FOR CLINICAL ELECTROPHYSIOLOGY OF VISION
A. BULLING; J. A. WARD; H. GELLERSEN; G. TROSTER: "Eye Movement Analysis for Activity Recognition Using Electrooculography", IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, vol. 33, no. 4, April 2011 (2011-04-01), pages 741 - 753
ANDREAS BULLING ET AL: "Eye Movement Analysis for Activity Recognition Using Electrooculography", IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, IEEE SERVICE CENTER, LOS ALAMITOS, CA, US, vol. 33, no. 4, 1 April 2011 (2011-04-01), pages 741 - 753, XP011373525, ISSN: 0162-8828, DOI: 10.1109/TPAMI.2010.86 *
ANDREAS BULLING ET AL: "Robust Recognition of Reading Activity in Transit Using Wearable Electrooculography", 19 May 2008, PERVASIVE COMPUTING; [LECTURE NOTES IN COMPUTER SCIENCE], SPRINGER BERLIN HEIDELBERG, BERLIN, HEIDELBERG, PAGE(S) 19 - 37, ISBN: 978-3-540-79575-9, XP019110792 *
D. BEYMER ET AL.: "An eye tracking study of how font size and type influence online reading", BCS-HCI '08 PROCEEDINGS OF THE 22ND BRITISH HCI GROUP ANNUAL CONFERENCE ON PEOPLE AND COMPUTERS: CULTURE, CREATIVITY, INTERACTION, vol. 2, 31 December 2008 (2008-12-31), British Computer Society Swinton, UK, pages 15 - 18, XP002689101, ISBN: 978-1-906124-06-9 *
DENG L Y ET AL: "EOG-based signal detection and verification for HCI", MACHINE LEARNING AND CYBERNETICS, 2009 INTERNATIONAL CONFERENCE ON, IEEE, PISCATAWAY, NJ, USA, 12 July 2009 (2009-07-12), pages 3342 - 3348, XP031518307, ISBN: 978-1-4244-3702-3 *
H. MIYASHITA; M. HAYASHI; K.OKADA: "Implementation of EOG-based Gaze Estimation in HMD with Head-tracker", 18TH INTERNATIONAL CONFERENCE ON ARTIFICIAL REALITY AND TELEXISTENCE, 2008
L. Y. DENG; C. HSU; T. LIN; J. TUAN; Y. CHEN: "EOG-Based Signal Detection And Verification For HCI", INTERNATIONAL CONFERENCE ON MACHINE LEARNING AND CYBERNETICS, vol. 6, 2009, pages 3342 - 3348
M. J. SHENSA: "The Discrete Wavelet Transform: Wedding the A Trous and Mallat Algorithms", IEEE TRANSACTIONS ON SIGNAL PROCESSING, vol. 40, no. 10, October 1992 (1992-10-01), pages 2464 - 2482
SINGLE LINE READER; L. E. L. MIZUTAN; T. NAKAJIMA, GRADUATE SCHOOL OF EDUCATIONAL INFORMATICS - TOHOKU UNIVERSITY - JAPAN, Retrieved from the Internet <URL:http://www.cmsoft.com.br/slr>
TOBII, TOBII UNVEILS THE WORLD'S FIRST EYE-CONTROLLED LAPTOP, Retrieved from the Internet <URL:http://www.tobii.com/en/eye-tracking-integration/global/news-and-events/press-releases/tobii-unveils-the-worlds-first-eye-controlled-laptop>
YOUNG LR; SHEENA D: "Encyclopedia of Medical Devices and Instrumentation", 1988, JOHN WILEY, article "Eye-movement measurement techniques", pages: 1259 - 1269

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103294201A (en) * 2013-06-27 2013-09-11 深圳市中兴移动通信有限公司 Mobile terminal and gesture controlling method thereof
CN103399636A (en) * 2013-07-30 2013-11-20 深圳市中兴移动通信有限公司 Method and device for adjusting terminal font through eye motion
US9898077B2 (en) 2013-09-18 2018-02-20 Booktrack Holdings Limited Playback system for synchronised soundtracks for electronic media content
WO2015091228A1 (en) * 2013-12-20 2015-06-25 Koninklijke Philips N.V. Control device and method for controlling a display
US10049645B2 (en) 2013-12-20 2018-08-14 Koninklijke Philips N.V. Control device and method for optimizing a presentation style of specific content for a specific user
US10289295B2 (en) 2014-12-18 2019-05-14 International Business Machines Corporation Scroll speed control for document display device
US10318141B2 (en) 2014-12-18 2019-06-11 International Business Machines Corporation Scroll speed control for document display device

Also Published As

Publication number Publication date
EP2745188B1 (en) 2018-07-25
EP2745188A1 (en) 2014-06-25

Similar Documents

Publication Publication Date Title
US9256285B2 (en) Method for gaze-controlled text size control, and methods for gaze-based measuring of a text reading speed and of a number of visual saccades per text line
EP2081100B1 (en) Adjusting device for brain wave identification method, adjusting method and computer program
EP2745188B1 (en) Method for gaze-controlled text size
US8333475B2 (en) Electro-oculography measuring device, ophthalmological diagnosis device, eye-gaze tracking device, wearable camera, head-mounted display, electronic eyeglasses, electro-oculography measuring method, and recording medium
CN111343917B (en) Method for hosting mobile access to high resolution electroencephalogram data
Bulling et al. It's in your eyes: towards context-awareness and mobile HCI using wearable EOG goggles
Deng et al. EOG-based Human–Computer Interface system development
US9535499B2 (en) Method and display apparatus for providing content
JP4399515B1 (en) Apparatus, method, and program for adjusting method of identifying electroencephalogram signal
Olsson Real-time and offline filters for eye tracking
US20120191542A1 (en) Method, Apparatuses and Service for Searching
CN109875583B (en) Fatigue driving detection system and method based on AR technology
WO2016209435A1 (en) Electrode contact quality
JP2018153469A (en) Information display apparatus, biosignal measurement system, and program
Kunze et al. How much do you read? counting the number of words a user reads using electrooculography
JP2013180076A (en) Analysis supporting device and program
CN112070141B (en) SSVEP asynchronous classification method integrating attention detection
KR100947639B1 (en) System for outputting multimedia contents using brain wave according to emotion in real time and method therefor
US20150078728A1 (en) Audio-visual work story analysis system based on tense-relaxed emotional state measurement and analysis method
JP3629047B2 (en) Information processing device
CN104793743A (en) Virtual social contact system and control method thereof
Martín-Pascual et al. Using electroencephalography measurements and high-quality video recording for analyzing visual perception of media content
Donley et al. Analysing the Quality of Experience of multisensory media from measurements of physiological responses
KR102094936B1 (en) Method for Enhancing Reliability of BCI System
KR102306111B1 (en) Method and apparatus for eog-based eye tracking protocol using baseline drift removal algorithm for long-term eye movement detection

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 12745496

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 2012745496

Country of ref document: EP

NENP Non-entry into the national phase

Ref country code: DE

WWE Wipo information: entry into national phase

Ref document number: 14239360

Country of ref document: US