WO2018046957A2 - Reading system, text display method and apparatus - Google Patents

Reading system, text display method and apparatus

Info

Publication number
WO2018046957A2
Authority
WO
WIPO (PCT)
Prior art keywords
text
reading
data
scrolling
reader
Prior art date
Application number
PCT/GB2017/052656
Other languages
English (en)
Other versions
WO2018046957A3 (fr)
Inventor
Howard MOSHTAEL
Baljean Dhillon
Ian Underwood
Original Assignee
The University Court Of The University Of Edinburgh
Lothian Health Board
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by The University Court Of The University Of Edinburgh, Lothian Health Board
Publication of WO2018046957A2
Publication of WO2018046957A3

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013Eye tracking input arrangements
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/16Devices for psychotechnics; Testing reaction times ; Devices for evaluating the psychological state
    • A61B5/163Devices for psychotechnics; Testing reaction times ; Devices for evaluating the psychological state by tracking eye movement, gaze, or pupil change
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/0485Scrolling or panning
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F40/00Handling natural language data
    • G06F40/10Text processing
    • G06F40/103Formatting, i.e. changing of presentation of documents
    • G06F40/109Font handling; Temporal or kinetic typography
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09BEDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B17/00Teaching reading
    • G09B17/003Teaching reading electrically operated apparatus or devices
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09BEDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B21/00Teaching, or communicating with, the blind, deaf or mute
    • G09B21/001Teaching or communicating with blind persons
    • G09B21/008Teaching or communicating with blind persons using visual presentation of the information for the partially sighted
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • GPHYSICS
    • G16INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16HHEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H40/00ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
    • G16H40/60ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices
    • G16H40/63ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices for local operation
    • GPHYSICS
    • G16INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16HHEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H50/00ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • G16H50/20ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for computer-aided diagnosis, e.g. based on medical expert systems
    • GPHYSICS
    • G16INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16HHEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H50/00ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • G16H50/30ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for calculating health indices; for individual health risk assessment
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B2505/00Evaluating, monitoring or diagnosing in the context of a particular type of medical care
    • A61B2505/07Home care
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/40Detecting, measuring or recording for evaluating the nervous system
    • A61B5/4076Diagnosing or monitoring particular conditions of the nervous system
    • A61B5/4082Diagnosing or monitoring movement diseases, e.g. Parkinson, Huntington or Tourette
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/40Detecting, measuring or recording for evaluating the nervous system
    • A61B5/4076Diagnosing or monitoring particular conditions of the nervous system
    • A61B5/4088Diagnosing or monitoring cognitive diseases, e.g. Alzheimer, prion diseases or dementia
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/68Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
    • A61B5/6801Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient specially adapted to be attached to or worn on the body surface
    • A61B5/6802Sensor mounted on worn items
    • A61B5/6803Head-worn items, e.g. helmets, masks, headphones or goggles
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/72Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B5/7271Specific aspects of physiological measurement analysis
    • A61B5/7275Determining trends in physiological measurement data; Predicting development of a medical condition based on physiological measurements, e.g. determining a risk factor
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2340/00Aspects of display data processing
    • G09G2340/14Solving problems related to the presentation of information to be displayed
    • G09G2340/145Solving problems related to the presentation of information to be displayed related to small screens
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2354/00Aspects of interface with display user

Definitions

  • the present invention relates to a text display method and apparatus.
  • the present invention also relates to a reading system for obtaining and processing reading data, and a method of using the reading system.
Background

  • Reading on-screen electronic text may be one of the most widespread and significant human-machine interactions.
  • a proliferation of miniature direct-view display screens, for example in devices such as smart watches, has renewed interest in alternative methods of text presentation to a traditional page-at-a-time display.
  • Proposed text presentation methods may address how to present text so as to facilitate effective reading when there is insufficient space on the screen for a full paragraph.
  • the low vision community may seek a wider selection of text presentation formats to optimise use of their residual vision.
  • One proposed text presentation method is rapid serial visual presentation (RSVP).
  • An intrinsic feature of the RSVP technique may be that parafoveal processing is suppressed.
  • Parafoveal processing refers to the way that readers use their off-central (parafoveal) vision to access information. It has been demonstrated that lack of parafoveal processing may adversely affect comprehension during RSVP reading.
  • In RSVP, one word is displayed at a time in sequence.
  • a reader may be unable to regress to a previously-viewed word.
  • Prevention of regressions may adversely affect comprehension during RSVP reading.
  • a further proposed text presentation method is continuously scrolling text.
  • a line of text may move smoothly across a screen from right to left.
  • Continuously scrolling text may display sentences rather than single words.
  • Continuously scrolling text may maintain a benefit of parafoveal preview used in normal reading.
  • Parafoveal preview may be particularly useful for readers with central vision loss.
  • continuously scrolling text may require smooth pursuit eye movements that are not required in normal (non-scrolling) reading.
  • Continuously scrolling text may increase fixation and/or pursuit times when compared with static text. When scrolled too quickly, words may become blurred and difficult to read.

Summary
  • a text display method comprising scrolling a line of text across a screen, the scrolling comprising: at least one pause in which the line of text does not move; and at least one text movement in which the line of text moves; wherein at least one parameter of the scrolling is determined based on measured eye movement data from at least one test subject, such that the scrolling mimics eye movement of the at least one test subject.
  • Text may be scrolled across the screen in a single visible line.
  • the line of text may move between text positions.
  • a text position may be a position of the line of text on the screen.
  • the line of text may move between a series of text positions.
  • the line of text may move in a series of text movements.
  • the determining of the at least one parameter of the scrolling may comprise selecting and/or controlling at least one parameter of the scrolling.
  • measured eye movement data may provide a natural and/or intuitive text display since the scrolling method is based on natural eye movements.
  • a text display method based on measured eye movement data may be more intelligible to at least some readers than a text display that is not based on measured eye movement data.
  • the text display method may provide a faster reading speed than some other scrolling display methods, for example continuous scrolling.
  • the reader's eyes may not have to make smooth pursuit movements.
  • the text display method may provide a faster reading speed than some other display methods, for example display methods that do not use scrolling.
  • a faster speed may result from the scrolled text moving on the screen, which may reduce the amount of eye movement required from a user reading the text.
  • the text display method may be appropriate for use on small screens, for example screens of smart phones or smart watches.
  • the text movement may move the line of text horizontally and/or vertically.
  • the text movement may move the line of text from right to left.
  • the text movement may move the text from left to right.
  • the text movement may move the text vertically upwards.
  • the text movement may move the text vertically downwards.
  • the at least one parameter of the scrolling may comprise a duration or text position of at least one pause.
  • the duration or text position of the at least one pause may be based on a duration or text position of a corresponding fixation in the eye movement data and/or on a word frequency and/or on a word length.
  • Values for at least one parameter of the scrolling may be determined by at least one feature of the text to be displayed.
  • at least one parameter of the scrolling may be determined by at least one of: text length, sentence length, word length, word frequency, or word familiarity.
  • Values for at least one parameter of the scrolling may be determined according to a type of scrolling (for example, whether it is intended to mimic eye movements closely or to smooth over them, or whether it is intended to mimic a population or an individual).
  • Values for at least one parameter of the scrolling may be determined according to a type of text (for example, whether the text comprises single sentences or a paragraph, or is complex or simple). Values for at least one parameter of the scrolling may be determined according to a language used.
  • Values for at least one parameter of the scrolling may be determined according to a condition of a reader to whom the text is displayed, for example whether the reader is normally sighted, has vision loss, is dyslexic or has a neurodegenerative disease. Values for at least one parameter of the scrolling may be determined in accordance with a reading task to be performed, for example skim reading or in depth reading. Values for at least one parameter of the scrolling may be determined in accordance with user preference.
  • the pause may be a pause on a particular word in the line of text.
  • the corresponding fixation may be a fixation on the same word in the same line of text, or in different text.
  • the duration or text position of the corresponding fixation in the eye movement data may be an average duration or text position in eye movement data from a plurality of test subjects.
  • a duration of the or each pause may be at least 30 ms, optionally at least 50 ms, further optionally at least 100 ms, further optionally at least 200 ms.
  • a duration of the or each pause may be less than 1 second, optionally less than 500 ms, further optionally less than 300 ms.
  • Different pause durations and/or positions may be used for different words, for example words of different length, frequency, complexity or familiarity.
  • Pause duration may be determined based on an equation linking pause duration to word length and word frequency. Pause duration may be moderated based on a speed setting.
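  • As an illustration of how such an equation might look (a hypothetical sketch, not the patent's own formula, with invented coefficients), a linear model could map word length and log word frequency to a pause duration and then apply a speed setting:

```python
import math

def pause_duration_ms(word: str, words_per_million: float, speed_factor: float = 1.0,
                      base_ms: float = 180.0, per_char_ms: float = 12.0,
                      per_log_freq_ms: float = 25.0) -> float:
    """Hypothetical linear model: longer and rarer words receive longer pauses.

    base_ms, per_char_ms and per_log_freq_ms are invented coefficients; in
    practice they would be fitted to measured eye movement data.
    """
    log_freq = math.log10(max(words_per_million, 0.01))
    duration = base_ms + per_char_ms * len(word) - per_log_freq_ms * log_freq
    # Moderate by a reader-selected speed setting and keep the result within a
    # plausible range (the description above suggests roughly 30 ms to 1 s).
    return min(max(duration / speed_factor, 30.0), 1000.0)

# Example: a rare, long word receives a longer pause than a common, short one.
print(pause_duration_ms("the", words_per_million=50000.0))
print(pause_duration_ms("parafoveal", words_per_million=0.5))
```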
  • the at least one parameter of the scrolling may comprise a duration or length of at least one text movement.
  • the duration or length of the at least one text movement may be based on a duration or length of a corresponding saccade in the eye movement data.
  • the text movement may itself be referred to as a saccade or text saccade.
  • the duration of a text movement may be referred to as a saccade duration.
  • the length of a text movement may be referred to as a saccade length.
  • the length of the at least one text movement may comprise a distance over which the text moves while performing the text movement. The distance may be measured as, for example, a distance in millimetres and/or a number of characters by which the text moves.
  • a length of the or each text movement may be at least 1 character, optionally at least 3 characters, further optionally at least 5 characters, further optionally at least 7 characters.
  • the length of the or each text movement may be less than 10 characters, optionally less than 8 characters, further optionally less than 6 characters.
  • Different text movement lengths may be used for different words, for example words of different length, frequency, complexity or familiarity.
  • a duration of the or each text movement may be between 10 ms and 100 ms, optionally between 20 ms and 50 ms, further optionally between 20 ms and 35 ms.
  • the text movement may be a text movement between particular words in the line of text, and the corresponding saccade may be a saccade between the same words in the same line of text, or in different text.
  • the duration or length of the corresponding saccade in the eye movement data may be an average duration or length in eye movement data from a plurality of test subjects.
  • the length of the saccade may comprise a distance from a first point in the text on which a user's eye is fixed at the start of the saccade to a second point in the text on which the user's eye is fixed at the end of the saccade.
  • the text display method may mimic a natural pattern of eye movement, which may result in the text display method seeming natural to a user.
  • the text may be displayed in a manner that takes into account the actual content of the text (for example, the sentence) being displayed, for example word lengths or word frequencies.
  • the text display apparatus may utilise data on how a word or sentence is read, either by a population or by at least one individual.
  • the text display apparatus may be considered smart.
  • the text display apparatus may be considered to adapt to the sentences that it is provided with and/or a user that it is presenting to.
  • a value for at least one parameter of the scrolling may vary within the line of text. For example, different pause durations may be used for different words or portions of words in the line of text. In some circumstances, different parameter values (for example, different pause durations, text movement lengths and/or different text movement durations) may be used for different words in a sentence, for example for adjacent words.
  • different parameter values may be used for words of different lengths. For example, a longer pause duration may be used for longer words. Different parameter values may be used based on the complexity or familiarity of words. For example, a longer pause duration may be used for more complex and/or less familiar words. Different parameter values may be used based on the frequency at which each word occurs, for example how often each word appears in a language corpus. The text display may therefore appear to speed up and slow down even within a single sentence.
  • a duration of the or each text movement may be such that the text appears to move instantaneously from one position to a further position.
  • the text may move between adjacent frames, such that the text is shown at a first position on a first frame, and is shown at a second position on a second frame which directly follows the first frame.
  • the screen may be configured to display a fixed number of frames per second, for example 25, 30, 50, 60, 120 or 240 frames per second.
  • a frame refresh time may be 1 second divided by the number of frames per second.
  • a duration of the or each pause may be an integer multiple of the frame refresh time.
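  • A minimal sketch of snapping a pause duration to an integer number of display frames, assuming an illustrative 60 frames per second:

```python
def quantise_to_frames(pause_ms: float, frames_per_second: int = 60) -> float:
    """Round a pause duration to the nearest whole number of display frames."""
    frame_time_ms = 1000.0 / frames_per_second          # e.g. ~16.7 ms at 60 fps
    n_frames = max(1, round(pause_ms / frame_time_ms))  # always show at least one frame
    return n_frames * frame_time_ms

print(quantise_to_frames(250.0))  # 250 ms -> 15 frames -> 250.0 ms at 60 fps
```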
  • the at least one parameter of the scrolling may comprise a scrolling speed.
  • the at least one parameter of the scrolling may comprise a regression parameter.
  • the regression parameter may comprise at least one of a number of words by which to regress, a number of characters by which to regress, a number of regressions.
  • the text display method may further comprise modifying the at least one parameter of the scrolling based on an input received from a reader.
  • the reader may input a reading speed.
  • the modifying of the at least one parameter of the scrolling may comprise modifying at least one pause duration and/or at least one text movement duration in dependence on the input reading speed.
  • the modifying of the at least one parameter of the scrolling may comprise scaling at least one pause duration and/or at least one text movement duration in inverse proportion to a change in reading speed.
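  • A possible sketch of this scaling, assuming the scrolling is represented as a list of pause/movement steps and the reader requests a new reading speed in words per minute (the data structure is illustrative, not taken from the patent):

```python
from dataclasses import dataclass
from typing import List

@dataclass
class ScrollStep:
    pause_ms: float      # how long the text stays still
    movement_ms: float   # how long the following text movement takes
    movement_chars: int  # how far the text moves, in characters

def rescale_for_reading_speed(steps: List[ScrollStep],
                              old_wpm: float, new_wpm: float) -> List[ScrollStep]:
    """Scale pause and movement durations in inverse proportion to the change
    in requested reading speed (words per minute); distances are unchanged."""
    factor = old_wpm / new_wpm
    return [ScrollStep(s.pause_ms * factor, s.movement_ms * factor, s.movement_chars)
            for s in steps]

steps = [ScrollStep(250.0, 0.0, 7), ScrollStep(300.0, 0.0, 8)]
slower = rescale_for_reading_speed(steps, old_wpm=250.0, new_wpm=125.0)  # durations double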
  • the method may further comprise displaying on the screen a reading position indicator.
  • the reading position indicator may be indicative of a position on the screen on which a reader of the line of text is intended to fixate while reading.
  • the method may further comprise displaying on the screen a progress indicator that is indicative of a position in the displayed text.
  • the scrolling of the line of text may be such that the centre of each word of the line of text is positioned at the reading position.
  • the scrolling of the line of text may be such that the position in each word that is positioned at the reading position is dependent on a corresponding pause position in the eye movement data.
  • the reading position indicator may be positioned in dependence on a reader's vision. For a reader with central vision loss, the reading position indicator may be positioned differently from the position used for a reader without central vision loss.
  • the method may further comprise obtaining the measured eye movement data from the at least one test subject by measuring eye movement of the at least one test subject while the at least one test subject is reading at least one sample text.
  • the at least one sample text may be displayed as static text.
  • the eye movement data may comprise a set of saccades and fixations made by the at least one test subject while reading the at least one sample text.
  • the at least one sample text may be a standardised text, for example a text used in a standardised vision test.
  • the eye movement data may be measured from a single test subject, and the at least one scrolling parameter may be based only on the data from that single test subject.
  • the eye movement data of the single test subject may be used to determine at least one scrolling parameter for a text display that is to be displayed to that subject at a subsequent time.
  • a tailored set of eye movement data may be obtained.
  • the test subject may find text displayed using parameters based on their own eye movement data to be easier to read than text displayed using parameters based on someone else's eye movement data, or on averaged eye movement data.
  • the measured eye movement data may be obtained from each of a plurality of test subjects reading the same at least one sample text.
  • the method may further comprise processing the measured eye movement data to obtain average eye movement data.
  • the determining of the at least one parameter of the scrolling may be based on the average eye movement data.
  • the measured eye movement data may comprise an eye movement corpus.
  • the measured eye movement data may be processed to obtain a range of eye movement data. By averaging data from a plurality of test subjects, eye movement data may be obtained that is representative of the eye movement data of a population.
  • the measured eye movement data may be used to display text in a way that may be suitable for use by a range of readers, for example if the eye movement data is found to be consistent across a particular population of readers.
  • Measured eye movement data for words in the at least one sample text may be extrapolated to words that are not in the at least one sample text. For example, a pause duration for a word in the line of text that is not part of the at least one sample text may be determined based on fixation durations for other words that are part of the at least one sample text, for example by determining trends in fixation duration against word length and/or word frequency.
  • a method of determining text display parameters for a scrolling text display, comprising: obtaining measured eye movement data from at least one test subject; and determining, based on the measured eye movement data, at least one parameter to be used in scrolling text across a screen, such that the scrolling mimics eye movement of the at least one test subject.
  • the measured eye movement data may be eye movement data of a population.
  • a text display apparatus comprising a screen and a processor, wherein the processor is configured to scroll a line of text across the screen, the scrolling comprising: at least one pause in which the line of text does not move; and at least one movement in which the line of text moves between text positions; wherein at least one parameter of the scrolling is determined based on measured eye movement data from at least one test subject, such that the scrolling mimics eye movement of the at least one test subject.
  • the text display apparatus may comprise a head-mounted display system. Text may be displayed on one or more screens of the head-mounted display system.
  • the head-mounted display system may comprise smart glasses.
  • the head-mounted display system may comprise a smartphone headset.
  • a head-mounted display system may benefit readers who have difficulty in maintaining a consistent head position, or readers who have difficulty in holding a hand-held device such as a tablet or e-reader.
  • the text display apparatus may comprise at least one of a computer, a portable computer, a laptop, a tablet, an e-reader, a smartphone, a handheld reading device, a handheld computing device, an assistive reading device, a television.
  • the screen may comprise at least one of a laptop screen, a tablet screen, a stand-alone screen, a television.
  • the text display apparatus may be further configured to record reading data.
  • the reading data may comprise eye movement data, blink rate data or pupillary response data.
  • the text display apparatus may comprise a detector configured to record the eye movement data, blink rate data or pupillary response data.
  • the detector may be part of a head-mounted display system.
  • the text display apparatus may be portable.
  • the text display apparatus may be handheld.
  • the text display apparatus may be an apparatus used by a reader for normal reading, for example for reading books, magazines, emails or websites.
  • a reading system comprising: an apparatus comprising a screen and a processor configured to display a sequence of text to a reader in a reading process; and at least one processing resource configured to: obtain reading data representative of at least one property of the text and/or the reading process, and to process the reading data to identify changes in the reading data over time that may be indicative of visual or neurological deterioration and/or the presence of at least one condition in the reader.
  • the at least one processing resource may process the reading data automatically. Gathering feedback on a user's reading over an extended period of time may allow a clinician to determine whether there has been a deterioration in the user's sight or neurological function.
  • the user may be a user with a known visual or neurological condition.
  • the user may be a user who is considered to be at risk of developing a visual or neurological condition.
  • the reading system may be portable.
  • the reading system may be handheld.
  • the reading system may comprise an apparatus used by a reader for normal reading, for example for reading books, magazines, emails or websites. Obtaining reading data using a portable device may be convenient for a reader.
  • the apparatus comprising the screen and the processor may be portable.
  • the apparatus comprising the screen and the processor may be handheld.
  • the apparatus comprising the screen and the processor may be a text display apparatus.
  • the system may comprise a communication resource configured to transmit the identified changes and/or the reading data to a remote location.
  • the remote location may comprise a remote computing apparatus.
  • the remote computing apparatus may comprise a remote server.
  • the remote computing apparatus may comprise a remote workstation.
  • the processing resource may be located in the remote location.
  • the system may comprise or be associated with a data store configured to store the reading data.
  • the data store may be local to the screen and/or processor.
  • the data store may be remote.
  • the remote computing apparatus may be part of a medical records system.
  • the changes may be stored as part of the reader's medical records.
  • Data on the remote computing apparatus may be accessible by medical professionals.
  • the system may be configured to obtain the reading data in a home or office environment.
  • the system may be configured to obtain the reading data in the reader's normal or everyday environment.
  • the apparatus may be used by the reader in the reader's home.
  • the apparatus may be used by the reader for routine reading.
  • By processing the reading data, for example automatically, subtle changes may be identified that may not be apparent to the reader themselves.
  • Reading data may be acquired much more often than with traditional vision testing. Changes in the reading data may be identified in a time period that is much shorter than an interval between traditional vision tests. Patients having known visual or neurological conditions may be monitored more frequently than if the patient were to be monitored by testing in a clinic.
  • Readers in an at-risk population may be monitored more often than would be the case with traditional vision tests. By using the system to provide automated monitoring, it may be the case that a larger at-risk population may undergo regular monitoring than would be the case if a medical professional were required to administer the monitoring.
  • the processing resource may be further configured to issue an alert based on one or more of the identified changes.
  • the processing resource may be configured to issue the alert to the reader.
  • the processing resource may be configured to issue the alert to a medical professional.
  • the processing resource may compare one or more parameters of the reading data to a threshold value, and issue the alert based on the comparison.
  • the threshold value may comprise a baseline value for the reader.
  • the threshold value may comprise a normal value for a population.
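  • One plausible sketch of such a comparison, using an invented 20% tolerance around the reader's baseline value; a deployed system would use clinically chosen thresholds or a normative range:

```python
def check_reading_parameter(value: float, baseline: float,
                            relative_tolerance: float = 0.2) -> bool:
    """Return True (raise an alert) if the current value deviates from the
    reader's baseline by more than the given relative tolerance."""
    return abs(value - baseline) > relative_tolerance * baseline

# Example: preferred text size has grown from 12 pt (baseline) to 16 pt.
if check_reading_parameter(value=16.0, baseline=12.0):
    print("Alert: preferred text size has increased markedly; consider a vision check.")
```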
  • the processor may be configured to obtain the reading data from normal reading activities of the reader, the normal reading activities comprising reading text from the screen.
  • Reading data may be obtained from the reader's daily activities, for example reading a novel or a news article. Obtaining reading data from normal reading activities may be more convenient for the reader than performing dedicated vision tests. Better reader compliance may be achieved.
  • the reading data may be representative of the reader's normal reading performance. Reading data may be acquired in an environment in which the reader normally reads, for example under light levels usually used by the reader.
  • the processor may be configured to receive a selection of an item of content from the reader and to display at least part of the item of content on the screen.
  • the obtaining of the reading data may comprise obtaining reading data relevant to the reader's reading of at least part of the item of content.
  • the item of content may be any item that may be read by the reader in normal reading, for example a book, newspaper or magazine article, or website.
  • the processor may be further configured to deliver vision tests to the user via the screen.
  • the obtaining of the reading data may comprise obtaining reading data from the vision tests.
  • the vision tests may comprise at least one of perimetry tests, reading speed tests, reading acuity tests, contrast visibility tests, contrast sensitivity tests and pupillary response tests.
  • the vision tests may comprise standardised and reproducible tests. Reading data obtained from vision tests may be interpreted using standard techniques. By providing the vision tests on an apparatus that may be used in the reader's home (for example, for routine reading), the tests may be provided much more often than tests in a surgery or hospital.
  • the vision tests may comprise tests of screen visibility on the particular screen used.
  • the changes in the reading data over time may comprise changes in performance in the vision tests.
  • the reading data may comprise at least one of text size, typeface, contrast, colour contrast, reading speed, reading duration, luminance.
  • the text size may comprise a reading test-type equivalent size.
  • the text size may be measured in terms of visual angle instead of letter height.
  • the at least one property of the text may comprise at least one of text size, typeface, contrast, colour contrast, reading speed, luminance.
  • the reading data may comprise at least one of an interocular presentation parameter, a bilateral presentation parameter.
  • the reading data may comprise different parameter values for each eye.
  • the reading data may comprise a location of presentation.
  • a reader may select a location of presentation to locate text on a preferred retinal locus.
  • a change in location of presentation over time may be indicative of a change in the extent of the reader's vision.
  • the reading data may comprise eye movement data, blink rate data or pupil response data.
  • the apparatus may further comprise a detector configured to measure eye movement data, blink rate data or pupil response data and to deliver the measured eye movement data, blink rate data or pupil response data to the processor.
  • the apparatus may be the text display apparatus.
  • the reading data may be obtained while the reader is reading text displayed using the text display method.
  • the visual or neurological deterioration may be due to, or the at least one condition may comprise, at least one of: amblyopia, dyslexia, macular disease, glaucoma, cognitive decline, dementia, a neurodegenerative disease, motor neurone disease, multiple sclerosis, Parkinson's disease, stroke.
  • the at least one condition may comprise a condition that causes at least one of optic neuritis, optic neuropathy, macular oedema.
  • the at least one condition may comprise at least one of multiple sclerosis, glaucoma, diabetes, retinal vein occlusion, retinal dystrophy, uveitis, macular degeneration, age-related macular degeneration.
  • Changes in the reading data over time that may be indicative of visual or neurological deterioration and/or the presence of at least one condition in the reader may comprise at least one of larger print size, increased illumination, increased contrast, a change in colour contrast preference in one eye, a change in colour contrast preference in both eyes.
  • Changes in the reading data over time that may be indicative of visual or neurological deterioration and/or the presence of at least one condition in the reader may comprise at least one of a change in pupil response, a change in smooth pursuit eye movements, a change in saccadic eye movements.
  • Changes in the reading data over time that may be indicative of visual or neurological deterioration and/or the presence of at least one condition in the reader may comprise changes compared to baseline data.
  • Reading data may be compared to normative datasets, for example to normative age- and gender-matched datasets.
  • Changes in the reading data over time that may be indicative of visual or neurological deterioration and/or the presence of at least one condition in the reader may comprise changes resulting in parameter values outside a normative range.
  • the at least one processing resource may comprise a first processing resource and a second processing resource at a different location from the first processing resource.
  • the first processing resource may be configured to obtain reading data representative of the at least one property of the text and/or the reading process.
  • the second processing resource may be configured to process the reading data to identify changes in the reading data over time that may be indicative of visual or neurological deterioration and/or the presence of at least one condition in the reader.
  • the second processing resource may be located at the or a remote location.
  • the apparatus may comprise at least one of a computer, a portable computer, a laptop, a tablet, an e-reader, a smartphone, a handheld reading device, a handheld computing device, an assistive reading device, a head-mounted display system, smart glasses, a smartphone headset.
  • a system comprising at least one processing resource configured to obtain reading data representative of at least one property of a sequence of text displayed to a reader in a reading process and/or of the reading process, and to process the reading data to identify changes in the reading data over time that may be indicative of visual or neurological deterioration and/or the presence of at least one condition in the reader.
  • the system may be configured to obtain reading data from a plurality of readers.
  • the system may be accessible by medical professionals.
  • a reading system comprising: an apparatus comprising a screen and a processor configured to display a sequence of text to a reader in a reading process; and at least one processing resource configured to obtain reading data representative of at least one property of the text and/or the reading process, and to process the reading data to identify parameter values in the reading data that may be indicative of visual or neurological deterioration and/or the presence of at least one condition in the reader.
  • the at least one condition may comprise a condition that causes at least one of optic neuritis, optic neuropathy, macular oedema.
  • the at least one condition may comprise at least one of multiple sclerosis, glaucoma, diabetes, retinal vein occlusion, retinal dystrophy, uveitis, macular degeneration, age-related macular degeneration.
  • a method comprising obtaining reading data from a reader; automatically processing the reading data to identify changes in the reading data over time that may be indicative of visual or neurological deterioration in the reader; and automatically communicating the identified changes to the reader or to a further user.
  • a computer program product comprising computer readable instructions that are executable by a processor to perform a method as claimed or described herein.
  • a method or system substantially as described herein with reference to the accompanying drawings.
  • features in one aspect may be provided as features in any other aspect as appropriate.
  • features of a method may be provided as features of an apparatus and vice versa.
  • Any feature or features in one aspect may be provided in combination with any suitable feature or features in any other aspect.
  • Figure 1 is a schematic illustration of a data gathering apparatus in accordance with an embodiment
  • Figure 2 is a schematic illustration of a text display apparatus in accordance with an embodiment
  • Figure 3 is a flow chart illustrating in overview a method of an embodiment
  • Figure 4 is an illustration of one frame of a text presentation method in accordance with an embodiment
  • Figures 5a, 5b and 5c illustrate successive frames of a text presentation method in accordance with an embodiment
  • Figure 6 is a schematic illustration of a text display apparatus in accordance with an embodiment
  • Figure 7 is a flow chart illustrating in overview a method of an embodiment
  • Figure 8 is a schematic illustration of a reading data collection apparatus in accordance with an embodiment
  • Figure 9 is a schematic illustration of a reading data collection apparatus in accordance with an embodiment
  • Figure 10 is a plot of total reading time against word frequency
  • Figure 11 is a plot of total reading time against word length
  • Figure 12 is a plot of total reading time against number of fixations per word
  • Figure 13 is a plot of mean reading speed
  • Figure 14 is a flow chart illustrating in overview a text display method in accordance with an embodiment
  • Figure 15 is a plot of reading speed using optical magnifier against reading speed using dynamic text on smart glasses
  • Figure 16 is a plot of the number of participants that achieved their fastest reading speed from each of four methods (static, RSVP, horizontal smooth scrolling, biomimetic scrolling);
  • Figure 17 is a plot of mean reading speed for each of four text presentation methods (static, RSVP, horizontal smooth scrolling, biomimetic scrolling);
  • Figure 18 is a plot of responses to the question, 'Compared to reading large print from paper, did you find reading from the display to be... ' ;
  • Figure 19 is a plot of responses to the question, 'Did you prefer reading from the display or from paper?'.
  • Figure 1 is a schematic illustration of a data gathering apparatus 10 in accordance with an embodiment.
  • Figure 2 is a schematic illustration of a text display apparatus in accordance with the embodiment.
  • the data gathering apparatus of Figure 1 is used to gather eye movement data from a plurality of test subjects.
  • the eye movement data from the plurality of test subjects is used to determine parameters for a text presentation method, biomimetic scrolling, that is used to present text on a display screen of the text display apparatus of Figure 2.
  • a single apparatus may be used as both a data gathering apparatus 10 and text display apparatus 20.
  • the data gathering apparatus 10 and/or text display apparatus 20 may comprise a plurality of connected apparatuses.
  • the data gathering apparatus 10 comprises a display screen 12, computing apparatus 14, camera 16, and head support 19. In some embodiments, no head support 19 is used. In further embodiments, any eye movement detector may be used in place of camera 16.
  • the computing apparatus 14 is a desktop computer, for example a PC.
  • the computing apparatus 14 comprises a processor 18.
  • the computing apparatus 14 may be any suitable computing device.
  • the display screen 12 is integrated into the computing apparatus 14.
  • the data gathering apparatus 10 comprises an SR Research EyeLink 1000 Desktop mount system with a 2000 Hz camera upgrade. In other embodiments, any data gathering apparatus may be used that is configured to measure and record eye movement data.
  • the processor 18 is configured to display text on the display screen 12.
  • the text comprises a plurality of individual test sentences.
  • the test sentences are displayed one by one.
  • Each test sentence is displayed as a static display.
  • any sample text in any suitable text format may be used.
  • a test subject is positioned with their head supported by the head support 19 and their eyes facing the display screen 12.
  • the test subject is instructed to read the text displayed on the display screen 12.
  • the camera 16 records movement of one or both eyes of the test subject while the test subject is reading the displayed text. Eye movement data from camera 16 is stored by the processor 18.
  • eye movement data is recorded for a large number of test subjects (for example, 100 or more test subjects), each reading a large number of test sentences (for example, 100 or more sentences). In other embodiments, any number of test subjects may be used. Any suitable text may be used for testing.
  • a fixation may be an interval in which the subject's eyes stay in one position, fixating on a point in the text.
  • a saccade may be an interval in which the subject's eyes move along the text.
  • the motion of the subject's eyes may also include one or more backwards saccades, in which the subject's eyes return to a previously-read part of the text. Backwards saccades may be referred to as regressions, and may occur about 10% to 15% of the time. Eye movements in reading may serve to direct light from target words onto the fovea, which may be the area of the retina with the highest visual acuity.
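  • A common way to obtain fixations and saccades from a recorded gaze trace is a velocity-threshold segmentation; the sketch below shows this standard approach in simplified one-dimensional form and is not necessarily the method used by the apparatus described here:

```python
from typing import List, Tuple

def split_fixations_and_saccades(x_positions: List[float], sample_rate_hz: float,
                                 velocity_threshold: float = 30.0) -> List[Tuple[str, int, int]]:
    """Simplified velocity-threshold (I-VT style) segmentation of a horizontal
    gaze trace into ('fixation' | 'saccade', start_sample, end_sample) runs.

    x_positions are horizontal gaze coordinates (e.g. in degrees or characters);
    velocity_threshold is in the same units per second and is illustrative only.
    """
    labels = []
    for i in range(1, len(x_positions)):
        velocity = abs(x_positions[i] - x_positions[i - 1]) * sample_rate_hz
        labels.append("saccade" if velocity > velocity_threshold else "fixation")

    runs, start = [], 0
    for i in range(1, len(labels)):
        if labels[i] != labels[start]:
            runs.append((labels[start], start, i))
            start = i
    if labels:
        runs.append((labels[start], start, len(labels)))
    return runs
```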
  • the eye movement data recorded for each of the test subjects is processed by the processor 18 to determine for each test sentence a plurality of fixations and saccades, each having associated parameters as described below.
  • the processor 18 determines a fixation position and a fixation duration.
  • the fixation position is the position in the text that the subject is looking at during the fixation, for example the letter that the subject is looking at.
  • the fixation duration is the length of time for which the subject's eyes are substantially still while looking at the fixation position. In some circumstances, a typical length of fixation may be around 250 ms, but there may be variation depending on text legibility, linguistic difficulty, reading ability and the aim of the reader.
  • the processor 18 determines a saccade duration and a saccade length.
  • the saccade duration is the length of time during which the subject's eyes moved.
  • the saccade length is the distance in the text by which the subject's eyes moved, for example a number of letters. In some circumstances, a typical saccade may last around 20 to 35 ms and may span 7 letters in English.
  • a total viewing duration for a word may be determined by taking the sum of all fixation durations on that word, regardless of whether the fixations on the word are regressions or not. In normal reading, not every word may be fixated. Some words may be skipped. Some words may be refixated. Skipping probability, fixation probability and refixation probability may be calculated for each word.
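  • A sketch of how total reading time and skipping/fixation/refixation probabilities could be aggregated per word from fixation events across trials (the data layout is assumed purely for illustration):

```python
from collections import defaultdict
from typing import Dict, List, Tuple

def per_word_statistics(trials: List[List[Tuple[int, float]]], n_words: int) -> List[Dict]:
    """Aggregate per-word reading statistics across trials (e.g. across subjects).

    Each trial is a list of (word_index, fixation_duration_ms) events in reading order.
    """
    totals = defaultdict(float)      # summed fixation durations per word
    fixated_in = defaultdict(int)    # trials with at least one fixation on the word
    refixated_in = defaultdict(int)  # trials with two or more fixations on the word

    for trial in trials:
        counts = defaultdict(int)
        for word_index, duration_ms in trial:
            totals[word_index] += duration_ms
            counts[word_index] += 1
        for word_index, count in counts.items():
            fixated_in[word_index] += 1
            if count >= 2:
                refixated_in[word_index] += 1

    n_trials = len(trials)
    return [{
        "word_index": w,
        "mean_total_reading_time_ms": totals[w] / n_trials,
        "fixation_probability": fixated_in[w] / n_trials,
        "skipping_probability": 1 - fixated_in[w] / n_trials,
        "refixation_probability": refixated_in[w] / n_trials,
    } for w in range(n_words)]
```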
  • a pattern of fixations may be plotted on a one dimensional plot which has horizontal sentence position as its axis.
  • an analysis of eye movement may be simplified by considering each eye movement to be either a fixation or a saccade. Smooth pursuit eye movements may not be considered. Smooth pursuit eye movements are generally used to track a moving object, so may not be used to read static text. Finer eye movements such as microsaccades (involuntary saccades during fixation), drifts (slow curved movements between microsaccades), and tremors (very fast small oscillations superimposed on drifts) may not be considered.
  • the analysis of eye movement data may be simplified by considering only horizontal eye movements. In practice, fixations may occur above or below a line of text being read.
  • the analysis of eye movement data may be simplified by defining the motion of saccades using start and end positions and saccade duration, without considering precise movement of eye motions between fixations.
  • a saccade taking the shortest route between the start and end positions may be assumed.
  • the processor 18 aggregates the eye movement data from the plurality of test subjects. In the present embodiment, the processor 18 determines, for each test sentence, average values for fixation position, fixation duration, saccade duration, and saccade length. In other embodiments, different parameters may be determined. Any suitable statistical measures may be determined for each parameter, for example range, mean, and standard deviation.
  • correlations and linear regressions are calculated for a population in order to translate word characteristics (for example, word frequency and word length) that are available for any given text into parameters of biomimetic scrolling.
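  • A generic least-squares sketch of such a regression, fitting total reading time against word length and log word frequency so that pause durations can be predicted for words outside the sample texts (the example data are made up):

```python
import numpy as np

def fit_reading_time_model(word_lengths, log_word_frequencies, total_reading_times_ms):
    """Fit total reading time ~ a + b * word_length + c * log_frequency by least
    squares. A generic regression sketch, not the patent's own analysis."""
    X = np.column_stack([np.ones(len(word_lengths)), word_lengths, log_word_frequencies])
    coefficients, *_ = np.linalg.lstsq(X, np.asarray(total_reading_times_ms), rcond=None)
    return coefficients  # [intercept_ms, ms_per_character, ms_per_unit_log_frequency]

# Example with made-up aggregated corpus data.
lengths = [3, 5, 7, 9, 11]
log_freqs = [4.5, 3.8, 2.9, 2.1, 1.4]
times_ms = [190, 230, 280, 330, 390]
print(fit_reading_time_model(lengths, log_freqs, times_ms))
```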
  • FIG. 2 is a schematic illustration of a text display apparatus 20 comprising a display screen 22 and computing apparatus 24.
  • the computing apparatus 24 is a desktop computer, for example a PC.
  • the computing apparatus 24 comprises a processor 28 configured to display text on display screen 22 using biomimetic scrolling.
  • the computing apparatus 24 may be any suitable computing device, for example a laptop, tablet, smartphone, or e-reader.
  • the computing apparatus 24 may be an assistive reading device, for example a device that is configured to provide text display for a reader with low vision.
  • the computing apparatus 24 may be portable or handheld.
  • the display screen 22 is integrated into the computing apparatus 24.
  • processor 28 displays text on display screen 22 using a biomimetic scrolling text presentation method.
  • the biomimetic scrolling method is configured to mimic natural eye movements made when reading.
  • Biomimetic scrolling may also be referred to as saccadic scrolling, since the text display of biomimetic scrolling may mimic saccadic movement of the eye.
  • Parameters of the biomimetic scrolling method are determined using the eye movement data obtained using the data gathering apparatus 10 of Figure 1.
  • Biomimetic text presentation software may be installed on processor 28, for example a biomimetic text presentation app.
  • the biomimetic text presentation software may be configured to instruct the processor 28 to perform each of a set of actions described below.
  • Figure 3 is a flow chart illustrating a method of an embodiment.
  • the camera 16 measures eye movement data for a plurality of test subjects reading a plurality of sentences.
  • the processor 18 translates the eye movement data obtained by the camera 16 into a series of saccades and fixations.
  • the processor 28 of the text display apparatus 20 uses the measured saccades and fixations to determine a set of biomimetic scrolling parameters for text display.
  • the biomimetic scrolling parameters are determined by processor 18 of the data gathering apparatus 10, or by any other suitable computing device.
  • the processor 28 displays a sentence to a reader on display screen 22 using the determined biomimetic scrolling parameters.
  • FIG 4 is a schematic illustration of an example of a biomimetic scrolling text display displayed on display screen 22.
  • a line of text 40 scrolls intermittently from right to left.
  • the intermittent scrolling is configured to mimic natural eye movement.
  • the intermittent scrolling comprises a set of pauses and a set of text movements.
  • the text movements may be called saccades or text saccades, by analogy with the saccades made by the eye when reading.
  • a length and duration of a text movement may be referred to as a saccade length and saccade duration.
  • a reader is instructed to look at a reading position on the line of text 40.
  • the reading position is a position on which it is intended that a reader should fixate while reading the text as it is displayed by biomimetic scrolling.
  • the reading position is indicated by a pair of arrows 42, which may be referred to as reading position indicators.
  • the reading position is a position on the text that is located between the arrows 42.
  • any suitable reading position indicator may be used.
  • a color or size of the displayed text may be used to indicate a part of the text on which the reader should fixate.
  • the line of text 40 is displayed such that the arrows 42 point to a first position in the text, which may be referred to as a first pause position.
  • the line of text 40 remains with the arrows 42 pointing to the first pause position for a first pause duration.
  • the line of text 40 then moves horizontally from right to left in a first text movement.
  • the line of text 40 moves from right to left by a number of letters.
  • the number of letters may be referred to as the first text movement length, or a first saccade length.
  • the line of text 40 moves to a position in which the arrows 42 point to a second pause position in the text.
  • the line of text may move in any suitable direction.
  • the line of text may be in a language that is read from right to left, and the line of text may therefore move from left to right.
  • the line of text may be in a language that is read vertically and the line of text may move vertically upwards or downwards.
  • the time taken to move the line of text 40 between the first pause position and second pause position may be referred to as the first text movement duration, or a first saccade duration. If the first text movement duration is zero, the line of text 40 moves directly from the first position to second position without assuming any intermediate positions. If the text movement duration is non-zero, the line of text 40 scrolls smoothly from the first position to the second position. The time taken to scroll from the first position to the second position is the first text movement duration. The line of text 40 remains with the arrows 42 pointing to the second position for a second pause duration. The line of text then continues to be moved by the processor 28 in a sequence of alternating text movements and pauses.
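  • The sketch below illustrates this alternating pause/movement behaviour as a toy console demonstration, assuming zero-duration text movements and a hypothetical pause schedule of (character index, pause duration) pairs; it prints the line at each successive pause position and is an illustration only, not the patent's implementation:

```python
import time

def biomimetic_scroll(sentence: str, pause_schedule, reading_column: int = 20) -> None:
    """Print a sentence as intermittently scrolling text.

    pause_schedule is a list of (character_index_in_sentence, pause_seconds)
    pairs, e.g. derived from measured fixation positions and durations. Each
    text movement here is instantaneous (zero movement duration): the line
    simply jumps so that the next pause position sits under the reading
    position marker.
    """
    marker = " " * reading_column + "vv"          # crude reading position indicator
    for char_index, pause_s in pause_schedule:
        offset = reading_column - char_index      # shift text so pause position aligns
        line = " " * max(offset, 0) + sentence[max(-offset, 0):]
        print(marker)
        print(line)
        print()
        time.sleep(pause_s)

# Example with a hypothetical schedule (one pause near the middle of each word).
text = "Reading scrolling text can feel natural"
schedule = [(3, 0.25), (12, 0.30), (19, 0.22), (24, 0.20), (28, 0.25), (35, 0.28)]
biomimetic_scroll(text, schedule)
```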
  • the biomimetic scrolling method mimics the eye movement of a reader.
  • the eye of the reader is intended to be held still while the text moves relative to the eye in a sequence of pauses and text movements that may mimic the fixations and saccades that may occur during normal reading of static text.
  • each text movement duration is set at zero.
  • Each pause position is set to be in the middle of a word, which may be referred to as the optimal viewing position (OVP) for that word.
  • the OVP is the position within a word at which a reader should fixate.
  • the pause positions may alternatively be set at the preferred viewing location (PVL), or at any other position determined using the eye movement data.
  • in the present embodiment, there is a pause at every word of the line of text 40.
  • the text movement lengths are determined using the distances between words.
  • the pause duration for each pause is based on the eye movement data. In other embodiments, there may be some words that are not paused on, or a single word may contain more than one pause.
  • the pause duration is chosen to equal the total reading time of that word in the eye movement data, which may be defined as the sum of all fixation durations on that word, averaged over all of the plurality of subjects.
  • a pause duration may be, for example, between 50 and 500 ms.
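  • A minimal sketch of how such pause durations might be derived from an eye movement corpus is given below, assuming the corpus is available as (subject, word index, fixation duration) records; the record layout, the function name and the 50-500 ms clamp applied here are illustrative assumptions rather than the described implementation.

```python
from collections import defaultdict

def pause_durations_from_corpus(fixation_records, clamp_s=(0.05, 0.5)):
    """Pause duration per word = total reading time on that word (sum of fixation
    durations), averaged over all subjects, then clamped to roughly 50-500 ms.
    `fixation_records` is an iterable of (subject_id, word_index, fixation_duration_s)."""
    summed = defaultdict(float)        # (subject, word) -> summed fixation time on that word
    subjects, words = set(), set()
    for subject, word, duration in fixation_records:
        summed[(subject, word)] += duration
        subjects.add(subject)
        words.add(word)
    pauses = {}
    for word in sorted(words):
        mean_total = sum(summed[(s, word)] for s in subjects) / len(subjects)
        pauses[word] = min(max(mean_total, clamp_s[0]), clamp_s[1])
    return pauses

# Example: two subjects reading a three-word sentence (durations in seconds).
records = [(1, 0, 0.21), (1, 1, 0.18), (1, 1, 0.09), (1, 2, 0.30),
           (2, 0, 0.19), (2, 1, 0.22), (2, 2, 0.28)]
print(pause_durations_from_corpus(records))   # {0: 0.2, 1: 0.245, 2: 0.29}
```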
  • One parameter that can be adjusted in some embodiments is a number of pauses per word. There may be zero (corresponding to a skipped word), one (corresponding to a fixated word) or two (corresponding to a refixated word). Use of more pauses may be possible, but may be likely to be used only by low vision users reading highly magnified text.
  • each pause position is based on an average fixation position in the eye movement data.
  • one of the average fixation positions determined from the eye movement data may be between the 'i' and 'g' of 'delight'.
  • the processor 28 may cause the text to pause at a position such that the arrows 42 point to a pause position between the 'i' and 'g' of 'delight'.
  • text movement durations are based on measured saccade durations. For example an average saccade duration for a given saccade may be used to determine a text movement duration for a corresponding text movement. In some embodiments, a text movement duration may be, for example, between 20 and 50 ms.
  • parameters of the scrolling are based on average values for those parameters in eye movement data from a plurality of test subjects.
  • parameters of the scrolling are determined by directly translating fixations and saccades from a single test subject.
  • Each fixation made by the test subject is translated into a pause in the text display, the pause duration being the same as the fixation duration and the pause position being at the same point in the text as the intended reading position (i.e. the point in the text on which the test subject fixated is placed between the arrows 42).
  • Each saccade made by the test subject is translated into a text movement, the text movement duration being the same as the saccade duration.
  • eye movement data from an individual is used to determine parameters for a text display to be displayed to that individual.
  • the text display may be tailored to an individual.
  • text movement durations and/or pause durations are based on frame rates of the display screen 12.
  • the display screen 12 may have a fixed or programmable frame rate, which may also be referred to as a screen refresh rate.
  • the frame rate may be expressed as frames per second, or Hz.
  • a movement of the text may be performed between one displayed frame and the next displayed frame.
  • Each frame may be displayed for a frame refresh time which is 1 second divided by the number of frames per second.
  • a pause in the text display may be of one or more frame refresh times.
  • a frame rate may be 25, 30, 50, 60, 120 or 240 frames per second.
  • a frame refresh time may correspondingly be, for example, 1/25 s, 1/30 s, 1/50 s, 1/60 s, 1/120 s or 1/240 s.
  • a pause may be an integer multiple of the frame refresh time.
  • a maximum pause length may be set. For example, a maximum pause time of 1 second may be used.
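  • By way of illustration, the snapping of a pause (or movement) duration to a whole number of frame refresh times, with a maximum pause length, could be sketched as follows; the function name and the choice of rounding are assumptions.

```python
def quantise_to_frames(duration_s, frame_rate_hz, max_pause_s=1.0):
    """Snap a duration to an integer multiple of the frame refresh time (1 / frame rate),
    using at least one frame and capping at an optional maximum pause length."""
    refresh_s = 1.0 / frame_rate_hz
    n_frames = max(1, round(duration_s / refresh_s))
    return min(n_frames * refresh_s, max_pause_s)

# Example: a 230 ms pause on a 60 Hz screen becomes 14 frames (~233 ms);
# a 2 s pause is capped at the 1 s maximum.
print(quantise_to_frames(0.230, 60))   # ~0.2333
print(quantise_to_frames(2.0, 60))     # 1.0
```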
  • the text displayed to a reader using the text display apparatus of Figure 2 comprises the same sentences as were displayed to the test subjects using the data gathering system of Figure 1.
  • Parameters for a given sentence may be determined using eye movement data measured for that sentence. For example, an average fixation duration for a given word may be used to determine a pause duration for that word.
  • the text displayed to the reader on the text display apparatus 20 of Figure 2 comprises different sentences from the sentences that were used to collect the eye movement data.
  • the processor 28 may determine values for the biomimetic scrolling parameters from the eye movement data using any suitable method. For example, pause positions or durations obtained for a given word may be used for that word even when it is used in a different context. Trends may be obtained from the eye movement data and applied to different texts. Eye movement data may be extrapolated to sentences for which eye movement data has not previously been measured. In some embodiments, the eye movement data is used to determine a relationship between fixation duration and word length. In general, longer words may correspond to longer fixations.
  • If a word is to be displayed that is not part of the text used in testing, the length of the word and the determined relationship between fixation duration and word length are used to determine a pause duration for that word.
  • the eye movement data is used to determine a relationship between fixation duration and word frequency. A reader's gaze may dwell longer upon low-frequency (less common) words than on high-frequency (more common) words. If a word is to be displayed that is not part of the text used in testing, the frequency of the word and the determined relationship between fixation duration and word frequency are used to determine a pause duration for that word.
  • each line of text 40 that is displayed comprises a single sentence.
  • the screen 22 is then cleared before displaying a next sentence.
  • the line of text 40 may comprise one or more sentences, or may comprise a fraction of a sentence. Any length of text may be scrolled, for example depending on the use of the text.
  • a pause length on the last word of a sentence may be increased relative to other pauses.
  • Data on the transition between sentences may be incorporated. For example, eye movement data may be analysed to determine a fixation time at the end of a sentence. The fixation time at the end of the sentence may be translated into a corresponding pause time.
  • Text display (for example, a number of sentences displayed at a time) may be influenced by the content and context of the reading materials, or by individual user-driven preferences.
  • the display of text by the text display apparatus 20 may be considered to provide a smart method of text presentation.
  • Some known methods do not take into account the actual content of the text (for example, sentence) being displayed, for example word length or word frequency. Instead, those methods may use parameters that simply designate a size and rate of a step (for example, a number of characters).
  • the scrolling method may take into account the content of the sentence. In deciding how to take into account the sentence content, it may utilise data on how those words or that sentence is read (either by a population or by that individual). It may adapt to the sentences it is fed and the user it is presenting to.
  • the reader who is reading text on screen 22 is given control of a number of display parameters including one or more biomimetic scrolling parameters.
  • the reader may change other reading parameters, for example text size, typeface, contrast, or image enhancement parameters.
  • the reader may change a number of sentences displayed.
  • input from the reader and/or data relating to the reader is used to provide a bespoke text presentation for that reader.
  • the processor 28 displays on the screen 22 a user interface through which the reader may control the controllable parameters (for example, by selecting from a list or using a dial or slider).
  • the user interface may be provided by biomimetic text presentation software, for example a biomimetic text presentation app. User-driven preference may be delivered through the app.
  • the reader may change values of one or more biomimetic scrolling parameters relative to initial values that were determined based on the eye movement data. The user may increase or decrease pause durations.
  • the reader inputs a value on a percentage scale and the computing apparatus 24 changes all the pause durations in accordance with the percentage scale, for example by changing all pause durations to 90% or 110% of their original length.
  • the user may change text movement durations.
  • the reader may change text movement durations by a percentage.
  • the reader may change text movement durations from zero to non-zero, for example by inputting a value for the text movement durations.
  • the reader may be given control over scrolling speed.
  • a change in the speed setting, measured in words per minute (wpm), proportionally increases or decreases the pause duration on each word.
  • the change in the speed setting also increases or decreases text movement durations.
  • a range of pause durations is scaled according to an overall scrolling speed which is set by the user. For a scrolling speed of 360 wpm, pause durations may be, for example, between 50 ms and 350 ms. If scrolling speed is decreased, pause durations may be increased proportionally.
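  • The proportional rescaling described above might be sketched as follows; the function name is hypothetical, and applying the same factor to text movement durations is shown as an option since, in some embodiments, movement durations are also scaled.

```python
def rescale_for_speed(pause_durations_s, current_wpm, new_wpm, movement_durations_s=None):
    """Proportionally rescale pause durations (and optionally text movement durations)
    when the reader changes the overall scrolling speed in words per minute."""
    factor = current_wpm / new_wpm                     # halving the speed doubles each pause
    scaled_pauses = [d * factor for d in pause_durations_s]
    if movement_durations_s is None:
        return scaled_pauses
    return scaled_pauses, [d * factor for d in movement_durations_s]

# Example: pauses chosen for 360 wpm, reader slows the display to 180 wpm.
print(rescale_for_speed([0.05, 0.20, 0.35], current_wpm=360, new_wpm=180))
# -> [0.1, 0.4, 0.7]
```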
  • the reader may change pause positions and/or text movement lengths.
  • the reader may change biomimetic scrolling parameters for individual words, or based on a position of a word in a sentence (for example, pausing longer at the end of a sentence).
  • the reader may change the relationship between pause duration and word length, or the relationship between pause duration and word frequency.
  • the reader may be given control of different parameters.
  • parameter values may be selected by someone other than the reader, for example by a medical professional. Any suitable user interface may be used to allow control of parameters.
  • data collected on eye movement parameters is used to define the horizontal movement of a line of text past a particular point which serves as the location of fixation for a steady gaze.
  • the series of fixations and saccades performed while reading static text is reverse engineered to move the text by biomimetic scrolling.
  • the movement of text mimics eye movements.
  • the method of Figure 3 maintains the regular line of text view which is familiar to users, and may include the benefit of parafoveal preview.
  • the eye is intended to remain steady. There may be no blurring effects from moving text.
  • biomimetic scrolling may result in a display that is natural and intuitive to readers since it mimics the eye's natural movement when reading.
  • reading speed may be increased over some other reading methods, for example by reducing the amount of eye movement required.
  • an amount of eye movement used to read text displayed using biomimetic scrolling may be less than an amount of eye movement used to read static text.
  • an amount of smooth pursuit eye movement used to read text displayed using biomimetic scrolling may be less than an amount of smooth pursuit eye movement used to read continuously scrolling text.
  • Figures 5a to 5c are illustrations of successive frames of a text display of a further embodiment.
  • One implementation of biomimetic scrolling is to use arrows 42 to direct the gaze to a fixation point on the screen.
  • the text is organized into a single line 40 which moves past the arrows 42 and pauses at various points and for various durations according to the pre-defined settings.
  • the settings are as follows:
  • Pause duration: the length of time the sentence pauses between the arrows 42. This is varied according to the word of focus, with shorter, more common words requiring a shorter pause duration.
  • Pause position: the position in the sentence which pauses between the arrows 42. There can be zero, one or more pauses on a single word, depending on its commonness and length.
  • Saccade duration: the length of time between frames. This can either be set to zero for instantaneous transition or to a duration of the order of milliseconds.
  • Each fixation position, x, may be converted into a sentence position, s, such that the fixation position is located between the arrows.
  • Each sentence in an eye movement corpus begins with 6 spaces and ends with a full stop, and each character has the same width due to the use of a monospaced font.
  • s = s₀ - (3F/5)x (Equation 1)
  • s₀ is the initial position from which point the sentence moves to the left when reading left-to-right.
  • character width equals 3 fifths of font size.
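  • A minimal sketch of the conversion of Equation 1 is given below, assuming that x is the fixation position measured in characters from the start of the sentence, that F is the font size (so that 3F/5 is the character width of the monospaced font), and that positions are in pixels; these readings of the symbols, and the function name, are assumptions.

```python
def sentence_position(x_chars, s0_px, font_size_px):
    """Equation 1: s = s0 - (3F/5) * x. Converts a fixation position x (in characters,
    including the 6 leading spaces) into a horizontal sentence position s in pixels,
    taking the monospaced character width to be three fifths of the font size F."""
    char_width_px = 0.6 * font_size_px
    return s0_px - char_width_px * x_chars

# Example: a fixation on character 10 of a 40-pt sentence with initial position s0 = 800 px.
print(sentence_position(10, s0_px=800, font_size_px=40))   # 800 - 24 * 10 = 560.0
```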
  • Scrolling speed: a multiplicative factor to modulate pause and saccade durations.
  • the user interface also incorporates an interactive progress indicator 44 to illustrate the position in the text at which it is currently located. Through interacting with this progress indicator, the text position can also be moved.
  • Figures 5a, 5b and 5c show successive positions of the line of text 40 and corresponding displays of the interactive progress bar 44.
  • the reader is provided with a regression button (not shown) which provides a control that may simulate natural regressions.
  • when the regression button is pressed, the sentence steps back by one word.
  • when the reader presses the regression button, the sentence steps back from its current pause position to a previous pause position.
  • the eye movement data may be used to determine a regression parameter, for example a number of words by which to regress, a number of characters by which to regress, or a number of regressions.
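  • One possible sketch of the regression control is given below; the class and method names are hypothetical, and stepping back by a fixed number of pause positions is just one of the regression parameters mentioned above.

```python
class RegressionControl:
    """Track the index of the current pause so that a regression button press can
    step the sentence back to an earlier pause position."""
    def __init__(self, pause_positions):
        self.pause_positions = pause_positions   # e.g. character indices of each pause
        self.index = 0

    def advance(self):
        """Move to the next pause position (called as the text scrolls forward)."""
        if self.index + 1 < len(self.pause_positions):
            self.index += 1
        return self.pause_positions[self.index]

    def regress(self, steps=1):
        """On a regression button press, step back by `steps` pause positions (at most to the start)."""
        self.index = max(0, self.index - steps)
        return self.pause_positions[self.index]

ctrl = RegressionControl([3, 9, 15, 22])
ctrl.advance(); ctrl.advance()          # now at pause position 15
print(ctrl.regress())                   # steps back to the previous pause position, 9
```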
  • Non-continuous scrolling of text may mimic the natural eye movements of saccades and fixations. Eye tracking data may be incorporated into text presentation. Regression functionality may be introduced as a user control.
  • text is displayed to the reader using biomimetic scrolling.
  • the text is displayed on a display screen 22 coupled to a computing apparatus 24 comprising a processor 28.
  • the biomimetic scrolling text is displayed to the reader on one or more display screens 52 that form part of a head-mounted display system 50.
  • An example of a head-mounted display system 50 is schematically illustrated in Figure 6.
  • Displaying text on a head-mounted display system may be convenient for readers who may have difficulty using a hand-held device such as a tablet or e-reader, for example some readers having Parkinson's disease, MS, or neurodegenerative diseases that may impair the ability to use conventional hand-held devices. Displaying text on a head-mounted display system may also be convenient for applications such as speed-reading, autocue, voice to text for translation, or providing text cues for sensory deficits, for example deafness.
  • Head-mounted display system 50 comprises a small screen 52 held near to a reader's eye 60, and at least one optical component 54. The screen 52 is configured to display an image comprising the scrolling text display.
  • the at least one optical component 54 is configured to magnify and focus the image from the screen 52 onto the reader's eye 60.
  • Head-mounted display system 50 further comprises a computing apparatus 56 comprising a processor 58.
  • the processor 58 is configured to control the text display that is displayed on screen 52.
  • the computing apparatus 56 is integrated into the head-mounted display system.
  • the computing apparatus 56 is connected to the head-mounted display system 50 by a wireless or wired connection.
  • the computing apparatus 56 is a smartphone and the text display is controlled using an app on the smartphone.
  • the at least one optical component 54 magnifies and focuses the image such that it appears to be displayed at a distance further from the reader's eye 60 than the distance at which the screen 52 is placed.
  • a virtual screen 62 may be considered to be present at the apparent distance of the image. The apparent distance of the virtual screen may be at a comfortable distance from the eye.
  • Dotted lines 64 show a path of light from the small screen 52 to the eye 60.
  • Dashed lines 66 show a virtual path of light from the virtual screen 62 as perceived by the reader.
  • in Figure 6, only one screen 52, optical component 54 and eye 60 are shown.
  • the reader uses two eyes 60 to view a text display.
  • the head-mounted display system 50 comprises a respective screen 52 and respective optical component 54, or set of optical components 54, for each eye 60.
  • a single screen 52 is used for both eyes. Different parts of the single screen 52 may be used to provide the same image, or different images, to each eye 60.
  • the head-mounted display system 50 provides an opaque text display, such that the reader cannot see their surroundings while reading the text display.
  • the head-mounted display system 50 provides an at least partially transparent text display, such that the reader may be able to see some or all of their surroundings in addition to the text display.
  • a head-mounted display system having at least partially transparent display may be referred to as smart glasses.
  • a head-mounted display system comprises a smartphone headset.
  • the smartphone headset comprises a smartphone and a support frame configured to support a display screen of the smartphone at a predetermined distance from the reader's eyes.
  • the smartphone display screen is configured to display an image comprising a text display.
  • the smartphone display screen may be configured to display two images, one for each eye.
  • a reader who is reading text on screen or screens 52 of head-mounted display system 50 may be given control of certain scrolling parameters and/or other display parameters, for example biomimetic scrolling parameters, reading speed, text size, typeface, contrast, or image enhancement parameters.
  • a reader may be given control of retinal locus or location of presentation. For example, a patient with central vision loss may wish to align the text display to a preferred retinal locus rather than to the fovea.
  • a location of content displayed on the screen may be tailored to an individual reader.
  • parameters may be controllable by a person who is not the reader. For example, parameters may be set by a medical professional.
  • a head-mounted display system 50 may be capable of presenting asymmetric visual stimuli.
  • the different input provided to each eye may be in dependence on ocular dominance.
  • the input provided to each eye is controllable by the reader or by another individual.
  • Some readers with certain conditions, for example AMD, may use different, and possibly changing, settings for each eye, for example different contrast settings.
  • a text display on a head-mounted display system 50 may be used as a mobility or navigational aid. With text displayed on a head-mounted display system 50, a reader may not have to hold a book or to keep their head still in order to read. This may be useful for patients with certain conditions affecting their hand or head movement, for example Parkinson's.
  • Using a head-mounted display system 50, text can be presented in the reader's field of view even if the reader's field of view is limited, for example by central vision loss.
  • a text display apparatus is configured to monitor a reader's reading and to acquire reading data representative of one or more reading parameters.
  • the reading data may be representative of at least one property of the text that is displayed.
  • the reading data may be representative of at least one property of a reading process.
  • the text display apparatus, or a further apparatus, is configured to analyse the reading data to identify changes in the reading data that may be representative of visual or neurological deterioration in the reader, or of the presence of a condition.
  • the condition may be a visual or neurological condition.
  • the condition may be a condition that has not previously been diagnosed in the reader.
  • software, for example an app, serves as both a person-specific reading tool and a diagnostic for monitoring.
  • A flow chart representative of a method of monitoring reading is illustrated in overview in Figure 7.
  • reading is monitored using the text display apparatus 20 of Figure 2.
  • any suitable apparatus or combination of apparatuses may be used.
  • the apparatus may be any apparatus comprising a screen on which text can be displayed.
  • the text may or may not be displayed by biomimetic scrolling.
  • the apparatus is configured to present vision tests to the reader via the screen.
  • the text display apparatus 20 is configured to display text that is selected by the reader for reading.
  • the user-selected text may comprise any text that the reader wants to read, for example text of a letter, book, newspaper or magazine article, website, or instruction manual.
  • the text display apparatus 20 is used by the reader for reading at home, in the office, or in other non-medical environments.
  • the processor 28 records reading data while the reader is reading user-selected text.
  • the reading data comprises the reading speed, text size, typeface and contrast used by the reader, and the length of time that the reader spent reading.
  • Each of the recorded parameters is a parameter that is changeable by the reader.
  • the processor 28 records the reading data over time.
  • the processor 28 may record the reading data over an extended time. For example, the processor 28 may record reading data for a week, for a month, or longer.
  • Reading data may be acquired during all the time in which a reader is reading, or a portion of the time in which the reader is reading. Reading data may be acquired continuously while the reader is reading, or may be sampled.
  • the processor 28 processes the recorded reading data to identify changes in the reading data. For example, over time the reader may have changed a display speed at which they are reading, or a text size used for reading.
  • the processor 28 determines whether any change identified at stage 72 may be indicative of visual deterioration. In other embodiments, the processor 28 may additionally or alternatively determine whether any change identified at stage 72 may be indicative of neurological deterioration. The processor 28 may determine whether any change identified at stage 72 is indicative of the presence of a condition, for example a condition that has not previously been determined to be present in the reader.
  • changes that may be indicative of deteriorating vision include an overall increase in text size with time, an overall change in contrast with time, an overall decrease in reading speed with time, and an overall decrease in time spent reading with time (for example, reading time per session, per day or per week).
  • Changes that are identified by the processor 28 may be changes that are considered not to be indicative of deterioration. Changes that may not be indicative of deterioration may include, for example, periodic changes to the parameters to account for variations in light levels or to account for the reader's tiredness. In some such cases, parameters may change within a day but may not result in a net change over a longer time period such as a week or a month.
  • any appropriate changes may be identified as being indicative of possible deterioration or of the possible presence of a condition.
  • the processor 28 informs the reader of any changes that may be indicative of deteriorating vision. For example, the processor 28 may email the reader, or the processor 28 may display a warning screen the next time the reader uses the text display apparatus 20. In other embodiments, any suitable alert may be used. The alert may be triggered when one or more parameters of reading data cross a threshold value or change by more than a threshold value.
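  • The kind of trend check and alert described above might be sketched as follows; the session fields, window size and thresholds are illustrative assumptions, not values taken from the described embodiments.

```python
from statistics import mean

def deterioration_alerts(sessions, recent_n=20, thresholds=None):
    """Compare the most recent reading sessions with earlier ones and flag changes that
    might indicate deteriorating vision. `sessions` is a chronological list of dicts,
    e.g. {'text_size': 14, 'reading_speed_wpm': 220, 'minutes_read': 35}."""
    thresholds = thresholds or {
        'text_size': +2,           # sustained increase in text size (points)
        'reading_speed_wpm': -30,  # sustained drop in reading speed (wpm)
        'minutes_read': -15,       # sustained drop in time spent reading (minutes)
    }
    if len(sessions) <= recent_n:
        return []                  # not enough history to compare against a baseline
    baseline, recent = sessions[:-recent_n], sessions[-recent_n:]
    alerts = []
    for key, threshold in thresholds.items():
        change = mean(s[key] for s in recent) - mean(s[key] for s in baseline)
        if (threshold > 0 and change >= threshold) or (threshold < 0 and change <= threshold):
            alerts.append(f"{key} changed by {change:+.1f}; possible deterioration")
    return alerts

# Example: a reader whose settings drift after 30 stable sessions.
history = [{'text_size': 12, 'reading_speed_wpm': 240, 'minutes_read': 40}] * 30 \
        + [{'text_size': 16, 'reading_speed_wpm': 190, 'minutes_read': 20}] * 20
print(deterioration_alerts(history))
```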
  • the processor 28 informs a further person of any changes that may be indicative of deteriorating vision, instead of or in addition to informing the reader.
  • the processor 28 may inform the reader's doctor, optician, nurse or any suitable medical professional.
  • the processor 28 may inform the reader's relative or caregiver.
  • the processor 28 may prompt the reader, medical professional or other person to arrange a medical appointment to test whether the reader's vision has actually deteriorated.
  • a community eyecare evaluation may be scheduled.
  • An appointment at a specialist eye unit may be made or accelerated.
  • any deterioration is notified to the reader or to another individual.
  • alternatively, only deterioration that takes reading performance below a threshold level is notified. For example, a notification may be made if the reading data indicates that a reading performance of an individual has dropped below what is considered to be normal for their age and gender.
  • the processor 28 transmits data to a central server (not shown) at a remote location using a communication resource (not shown).
  • the remote location may be, for example, in a different room, building, street, or town from the reader.
  • the data transmitted to the central server includes the changes that may be indicative of deteriorating vision.
  • the data transmitted to the central server includes some or all of the reading data recorded for the reader.
  • data is transmitted to any suitable remote location, for example to any suitable remote computing apparatus.
  • the text display apparatus is configured to record reading data but may not process the reading data.
  • the text display apparatus sends the reading data to a remote computing apparatus, for example the central server.
  • a processing resource of the remote computing apparatus is configured to process the reading data to determine changes that may be indicative of a visual or neurological deterioration or condition.
  • the reading data may be processed by any suitable processing resource.
  • the central server is configured to receive and store reading data from a plurality of readers.
  • the central server may be part of a medical records system.
  • the central server may be located in a medical facility that is remote from the reader. Data on the central server may be accessible by one or more medical professionals.
  • any suitable reading data may be recorded, which may or may not include values for parameters listed above. Any suitable changes in the reading data may be used to indicate a possibility of visual or neurological deterioration or of a condition.
  • the reading data is obtained during normal reading activity by the reader, for example reading activity in the reader's home.
  • the reader is not required to participate in particular reading tests.
  • Reading data may be recorded during all of the reader's reading activities. Alternatively, reading data is recorded periodically, for example at intervals determined by the processor 28. In each case, the reading data is recorded while the reader is reading user-selected content. In other embodiments, some or all of the reading data is obtained while the reader is participating in vision tests, for example vision tests that are delivered on the text display apparatus 20 of Figure 2.
  • processor 28 of the text display apparatus 20 of Figure 2 is configured to present one or more types of vision test to the reader.
  • the processor 28 may be configured to present the vision tests to the reader on a regular basis, for example to present to the reader a weekly vision test or set of vision tests. In some embodiments, the processor 28 is configured to present one or more vision tests to the reader when requested by the reader or by another individual (for example, a medical professional).
  • the processor 28 processes the reading data obtained from the vision tests (for example, from weekly vision tests) to identify changes and determine whether the changes may be indicative of deteriorating vision. In other embodiments, the processor 28 processes the reading data to identify changes that may be indicative of a neurological deterioration, or of the presence of a condition. In some embodiments, the processor 28 processes the reading data obtained from the vision tests in combination with reading data obtained during normal reading of user-selected texts.
  • a vision test may comprise, for example, a reading speed test, a reading acuity test, a contrast visibility test, a contrast sensitivity test, a perimetry test, or a pupillary response test.
  • the vision tests are designed to correspond to standardised tests that measure aspects of the reader's vision in a standardised way.
  • the vision tests may be designed to be independent of the testing medium (for example, the screen 22 of the test display apparatus 20) and to yield an absolute measurement of vision (for example, acuity or contrast sensitivity).
  • the vision tests are screen visibility tests that are designed to measure the performance of the reader's vision in seeing or reading from the particular display used. Screen visibility tests may not yield an absolute measurement of vision. However, repeated use of screen visibility tests may allow changes in a reader's ability to read from a particular screen to be identified. Screen visibility tests may test, for example, a text size or contrast level that is the limit of what an individual reader can read on the screen that they are using. A screen visibility test may also test an extent of visibility, which may measure how much of the screen is visible to the reader. In the present embodiment, the processor 28 identifies changes in the reader's test performance over time.
  • based on the identified changes, the processor 28 determines whether the reader may have experienced visual (or neurological) deterioration or may have developed a condition. Results of repeated tests may be compared to baseline results. In some embodiments, the reader is notified any time that their test performance has decreased. In some embodiments, a reader is notified only if their test performance drops below a threshold test performance, for example a test performance that is considered to be normal for their age and gender. In the present embodiment, the processor 28 conducts a reading speed test by presenting sentences to the reader at different speeds and testing the reader's comprehension. If the reader's maximum measured reading speed has decreased from one week to the next, the processor 28 informs the patient, medical professional, or another person that the reader's vision may have deteriorated. In other embodiments, performance in any suitable vision tests may be measured.
  • the processor 28 may transmit vision test results to the central server.
  • the vision test results may be stored in the reader's medical records.
  • changes in reading data are used to warn of possible visual deterioration.
  • changes in reading data may be used for other purposes.
  • a change in one or more display parameters that are selected by the reader may be used to automatically change other display parameters that are not directly changeable by the reader.
  • reading data may be recorded during the normal reading of the reader without the reader having to participate in many (or any) vision tests. Acquiring data during normal reading may be convenient for the reader, since it may not involve any disruption to the reader's normal activities. Acquiring data during normal reading may allow a large amount of data to be acquired. Acquiring data during normal reading may allow passive monitoring of an at-risk population, or of the general population.
  • the apparatus used to acquire reading data is the same text display apparatus that is used routinely by the reader for reading, for example the reader's desktop computer, laptop, tablet, e-reader or assistive reading device.
  • reading data is only acquired for readers with known visual or neurological conditions.
  • any user of the text display apparatus may be informed of changes that may indicate visual or neurological deterioration or presence of a condition, whether or not that user is already known to have a visual or neurological condition.
  • the reader is instructed to use a further apparatus for reading or for performing vision tests that is not the reader's normal reading apparatus.
  • the further apparatus is a dedicated test apparatus.
  • the reader may be instructed to use the further apparatus at specified times.
  • reading data may be obtained in a reader's own home. Reading data may be obtained more often than traditional vision tests, for example vision tests performed in a surgery or hospital by an optician or ophthalmologist. Acquiring reading data regularly (whether from normal reading or from repeated testing) may allow possible deterioration in vision to be detected more quickly than would be the case using traditional eye testing.
  • the reader may be tested in an environment more similar to the environment in which the reader usually reads.
  • the processor 28 is configured to identify changes in reading data (for example changes in parameters used or in results of vision tests) and to notify the reader or other individual of possible deterioration based on those changes.
  • the processor 28 may notify a reader or other individual of a possible eye condition based on the reading data, even if there has been no change in the reading data. For example, the processor 28 may notify the reader based on an initial set of reading data or on reading data that is consistent with time. The processor 28 may issue a warning if the reader's reading data indicates unusually poor vision for the reader's age and gender. The processor 28 may issue a warning if the reading data is inconsistent with what is known about the patient's visual and/or neurological conditions, for example if a reader who is not expected to have a visual condition performs poorly in a vision test.
  • the processor 28 is configured to identify parameter values that may be indicative of at least one condition based on reading data.
  • the reading data may be data that is obtained at a single time.
  • the reading data may be data that is obtained over time.
  • the processor 28 may be configured to identify parameter values indicative of a condition which causes optic neuritis, for example multiple sclerosis.
  • the processor 28 may be configured to identify parameter values indicative of a condition which causes optic neuropathy, for example glaucoma.
  • the processor 28 may be configured to identify parameter values indicative of a condition which causes macular oedema, for example diabetes, retinal vein occlusion, retinal dystrophy, drug toxicity, uveitis.
  • the processor 28 may be configured to identify parameter values that are indicative of AMD.
  • Optic neuritis, optic neuropathy or macular oedema may affect reading, may be painless, and unless picked up early and treated may lead to progression and irreversible sight loss.
  • the processing of reading data by the processor 28, for example data from routine reading, may in some circumstances allow earlier detection of potentially serious conditions.
  • the text display apparatus 20 is used to acquire reading data.
  • the reading data comprises values for parameters that may be set by the user, such as text size or contrast.
  • an apparatus is used that is configured to acquire further types of reading data, for example eye movement data, blink rate data or pupil response data.
  • Figure 8 is a schematic illustration of a reading system comprising reading test apparatus 80 that is configured to acquire reading data which comprises eye movement data, and a remote server 89.
  • the reading test apparatus 80 comprises a screen 82, a computing apparatus 84 comprising a processor 88 and a communication resource 87, and a camera 86.
  • the reading test apparatus 80 may be portable.
  • the reading test apparatus 80 may be useable wherever a user would like to read, for example at home or in the office.
  • the reading test apparatus 80 may be of a size that is convenient for the reader to use.
  • the reading test apparatus 80 may be handheld.
  • the camera 86 may be replaced by any eye movement detector.
  • the reading test apparatus 80 also comprises a light sensor, which may be configured to mitigate against variations in ambient lighting.
  • the reading test apparatus 80 is configured to transmit reading data to remote server 89.
  • processor 88 displays user-selected text on screen 82.
  • the camera 86 records the reader's eye movements while reading and sends eye movement data to the processor 88.
  • processor 88 displays a vision test to the user on screen 82.
  • the camera 86 records the reader's eye movements while participating in the vision test and sends eye movement data to the processor 88.
  • the processor 88 processes the eye movement data and determines whether there is any aspect of the reader's eye movement data that may be indicative of deteriorating vision. In some embodiments, the processor 88 processes the eye movement data in combination with other reading data. In other embodiments, the eye movement data is processed in a processing resource of the remote server 89, or in any other suitable processing resource.
  • Communication resource 87 sends data from the processor 88 to the remote server 89.
  • the communication resource 87 may comprise, for example, a wireless antenna or internet connection.
  • the data may represent the changes in the reading data, or may comprise the reading data itself.
  • the reading test apparatus 80 comprises a detector (for example, a camera) configured to measure pupil response.
  • the recorded reading data may comprise pupil response data.
  • Vision tests delivered to the user may comprise pupillary response tests.
  • the reading test apparatus 80 comprises a detector (for example, a camera) configured to measure blink rate.
  • the recorded reading data may comprise blink rate data.
  • a single camera may be used to measure two or more of eye movement, pupil response and blink rate.
  • the apparatus of Figure 8 may provide continuous or periodic monitoring of reading data.
  • eye movement data, blink rate data and/or pupil response data may also be recorded while the reader is not reading.
  • any type or types of reading data may be recorded.
  • Multi-modal testing may be used. Multi-modal testing may provide better or faster indications of changes that may indicate deterioration, or may allow a larger number of possible changes to be assessed (for example, changes in eye movement in addition to changes in parameter settings).
  • Specific components of eye movement data, for example pursuit and saccadic characteristics, may be mapped at baseline and compared with normative age- and gender-matched datasets to highlight deviations from optimality and change over time.
  • Eye movement signatures may be compared.
  • Cognitive decline eye movement signatures may be determined. Results may be compared with normal populations and in aging.
  • Reading test-type equivalent size, luminance, contrast and colour contrast may be used as descriptors for the reading data.
  • diseases affecting the macula may lead to a preference for larger print size, increased illumination and contrast, and shifting preference in colour contrast for each eye and both together. Changes in print size, illumination, contrast and/or colour contrast may be considered to be indicative of the possible presence or progress of macular disease.
  • any appropriate action may be taken if changes indicative of possible deterioration are identified.
  • the identification of changes that may indicate deterioration triggers a community eyecare evaluation to determine whether referral is needed.
  • identification of changes that may indicate deterioration may result in access to a specialist eye unit being fast-tracked.
  • a threshold may be set for each eye, for example a threshold of optotype characteristics such as text size.
  • the thresholds may be used as a metric against which change may be measured. An alert may be issued if thresholds are crossed.
  • for certain parameters (for example, optotype characteristics), a threshold may be set for triggering optometric service evaluation.
  • a camera for measuring eye movement data, blink rate data, pupil response data or any other suitable data is mounted on a pair of glasses, on a head-mounted display system, or on another wearable device.
  • a head-mounted display system is used as a reading test apparatus. Data may be recorded for one or both eyes. In some circumstances, different input may be provided to each eye. Vision tests may be presented separately to each eye.
  • FIG. 9 is a schematic illustration of an embodiment of a head-mounted display device 90.
  • the head-mounted display device 90 comprises a commercially available head-mounted display device which in this embodiment is an Epson Moverio 200.
  • the head-mounted display device is used to display an image, for example an image comprising text.
  • the head-mounted display device 90 comprises a right screen 91 and a left screen 92; a backward -facing camera and/or sensor 94, and a forward-facing camera and/or sensor 96.
  • a processor receives an input, for example a text input, a voice to text input, or an OCR (optical character recognition) input.
  • the processor processes the input into a format for presentation, for example using biomimetic scrolling and/or image enhancement.
  • an algorithm app prepares a user interface and provides displays to right and left screens 104, resulting in an image display 110 being visible to a user of the head-mounted display device 90.
  • component parts may include a display module, camera (forward and back facing), wire and wireless Bluetooth connections, software app user interface, image recognition and OCR input feed.
  • a suite of presentations may use sample input and a user-directed algorithm presented using an app incorporating biometric scrolling and a series of forced choice options to determine preferred parameters including, for example, text font and size, colour contrast, luminance, scrolling speed, interocular and bilateral presentation parameters. These characteristics may form a baseline dataset for detecting change over time.
  • a method may be provided for reading text using modified scrolling text presentation in health and disease.
  • Algorithms and app designs may be provided for individualising text presentation for PC, tablet, phone and head-mounted displays.
  • There may be utilisation of data derived from the apparatus of Figure 9 as a platform for diagnostics in developmental, ocular, cognitive and neurological disease.
  • a method and read-out may be used for exploiting and enhancing near visual function capability as an enabling aid, visual rehabilitation device, pre-clinical diagnostic, monitoring tool and/or point of care self-assessment test-kit which may then be integrated with optometry, primary care and hospital-based ophthalmic and systemic paper-based and electronic datasets.
  • An apparatus and method may provide self-testing of visual function, point of care psychophysics evaluation, tele-diagnostics, virtual clinic environments and/or any medical, non-medical, gaming and daily activity which utilises text-based information with stereo-multimedia or holographic displays.
  • Software may be written and integrated.
  • the software may have an algorithm and an app-user interface to navigate preferred text presentation in a stepwise and staircase design which may allow reading capability to be maximised.
  • Baseline characteristics of parameters defined by neuro-ophthalmic and ocular determinants may be of diagnostic and monitoring value as measures of eye and/or CNS (central nervous system) structure and function.
  • Data may be captured in real time. The captured data may reflect measurable physiological processes governing one or more of reading, blink-rate, macular function and pupillary responses indicative of conscious, subconscious and autonomic activity and responses to the cognitive challenge in text scanning and reading.
  • a device may advance reading capability in both health and disease and may be a tool to acquire deep phenotyping data in preclinical detection and disease monitoring in eye and neurological disorders including, for example, amblyopia, dyslexia, macular disease, glaucoma, cognitive decline, dementia, neurodegenerative diseases including motor neurone disease, multiple sclerosis, Parkinson's disease and stroke.
  • a developer kit for the head-mounted display device was a platform used to write text presentation software. A strategy to allow user preference was adopted.
  • a display may be tailored to specific needs of a user, for example a patient.
  • a display may be tailored in terms of preferred retinal locus, image enhancement, colour contrast and screen location of a presentation.
  • Input to each eye may be modulated as ocular dominance alters.
  • Ocular dominance, or other parameters, may change over time for any patient with changing vision, for example with AMD progression.
  • Text may be displayed on a reader's preferred fixation position, which in some circumstances may shift over time with disease progression.
  • Text may be presented by reverse engineering eye-tracking used by a healthy population. An improvement in reading speed from biomimetic scrolling has been demonstrated in a clinical trial.
  • biomimetic scrolling and/or head-mounted display may be extended to a wider population of patients with sight-limiting eye and neurological diseases, for example glaucomatous visual field loss, stroke-related visual neglect and hemianopia, Parkinson's disease and MS or neurodegenerative diseases which may impair retinal or optic nerve function, disrupt ocular motility and/or impair the ability to use conventional hand-held devices.
  • a platform may be adapted, for example using a next model Moverio 300 that incorporates light sensors and cameras.
  • the apparatus may mitigate against variations in ambient lighting (which may optimise reading capability), capture data in diagnostics and/or monitoring (for example, pupillary responses and eye-tracking movements) and/or adapt a patient's self-selected optimal reading algorithm to sense change over time in neuro-ophthalmic functioning.
  • the apparatus may be used as a method to detect pre-clinical disease for cognitive decline and dementia. Cognitive decline and dementia may impact pupil responses, smooth pursuit eye movements and/or saccadic eye movements.
  • Sampling data as a domiciliary point of care instrument in the course of an individual using the apparatus as a non-intrusive wearable piece of technology may allow the apparatus to have a function as a continuous monitoring device for at-risk populations.
  • the apparatus may have application in hands-free reading, for example head-mounted display autocue for delivering presentations, voice to text for real-time translation, and/or providing text cues for other sensory deficits, for example deafness. Further benefit may also be realised in the reading capability of children with dyslexia, especially those who show a specific response to spectral hue and intensity. Individuals having amblyopia (a cause of uniocular poor vision) who require eye occlusion therapy may benefit from the apparatus's capability to present asymmetric visual stimuli.
  • Biomimetic scrolling may provide a rapid reading technology that may be capable of enhancing function for normally-sighted and/or sight impaired individuals using digital display.
  • the forms of digital display used may include, for example, large-screen, laptop, tablet, phone and/or head-mounted display hardware.
  • a further embodiment is now described of a method comprising gathering eye movement data and displaying text using biomimetic scrolling, the biomimetic scrolling using parameters based on the eye movement data.
  • Biomimetic scrolling may translate average eye movement behaviour into the movement of a line of text. Biomimetic scrolling may seek to emulate the success of methods such as rapid serial visual presentation in boosting reading speed, whilst maintaining the familiar line of text appearance. In the embodiment described below, the parameters of biomimetic scrolling are described and then tested in a reading speed study with 30 participants. Reading with biomimetic scrolling is found to enable all participants to read faster than with continuous scrolling, on average by a factor of 5, and almost half to read faster than with rapid serial visual presentation, with a statistically equal average. Biomimetic scrolling may combine a benefit of speed reading that may be present in RSVP with a benefit of parafoveal preview that may be present in continuous scrolling.
  • Eye movements while reading have been studied, particularly with the advent of high-accuracy eye-trackers. Eye movement research is applied by characterizing the movement of the gaze as it proceeds through a sentence, then designing a text presentation format in which a line of text moves with these characteristics. An effect may be that, whilst the eye remains steady, the text moves through the fixation point in a way comparable to that of regular reading.
  • Figure 4 is an illustration of one frame of the text presentation method, with the rectangular outline defining the boundaries of the screen 22, and the arrows 42 and text 40 showing what is displayed on the screen.
  • the arrows 42 define the location for the gaze to fixate.
  • the method of text presentation is referred to as biomimetic scrolling.
  • the method is based upon eye movement data.
  • a simple version of biomimetic scrolling may be to directly translate the chain of fixations and saccades of an individual reading a particular sentence into the movement of that sentence.
  • the sentence is initially positioned such that the first fixation position is between the arrows, then after pausing there for the duration of the first fixation the sentence takes the duration of the first saccade to move to the second fixation position, and so on until the final fixation.
  • Figure 14 shows an algorithm for biomimetic scrolling according to the data on the eye movements of an individual reading a line of text.
  • Si is the sentence position corresponding to the i-th fixation; di is the duration for which the sentence pauses in position Si; ti is the i-th step duration.
  • the processor 18 displays a sentence in a position Si. There is then a delay of di seconds at stage 134.
  • the processor 18 determines whether i equals the number of fixations for the sentence. If so, the process ends at stage 140. If not, i is incremented to i+1 at stage 152.
  • the processor moves the sentence to position Si over the course of ti seconds.
  • the flow chart then returns to stage 134. Stages 134, 136, 150 and 152 are repeated until the sentence is completed.
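  • The loop of Figure 14 might be sketched as follows, with the rendering reduced to a callback and the movement over ti seconds simplified to a delay; the function name and the example values are assumptions.

```python
import time

def biomimetic_scroll(positions, pause_durations, step_durations, show):
    """Figure 14 loop: show the sentence at position S_i, pause for d_i seconds (stage 134),
    end after the last fixation (stages 136 and 140), otherwise move to S_(i+1),
    taking t_i seconds for the move (stages 152 and 150)."""
    n = len(positions)
    show(positions[0])                          # sentence initially displayed at S_1
    for i in range(n):
        time.sleep(pause_durations[i])          # pause for d_i
        if i == n - 1:
            break                               # number of fixations reached: end
        if step_durations[i] > 0:
            time.sleep(step_durations[i])       # simplified: a delay instead of a smooth move
        show(positions[i + 1])                  # sentence now at the next position
    # In a real implementation `show` would reposition the rendered line of text.

biomimetic_scroll([560, 512, 440], [0.25, 0.20, 0.30], [0.0, 0.0, 0.0],
                  show=lambda s: print(f"sentence at {s}px"))
```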
  • Eye movements were recorded with an SR Research EyeLink 1000 Desktop mount system. It was equipped with a 2000 Hz camera upgrade, allowing for binocular recordings at a sampling rate of 1000 Hz for each eye. Data from the right eye were analyzed. The experiment was implemented in SR Research Experiment Builder. Sentences were presented in black bold 11-pt Courier New font on a grey background on the horizontal centerline of the monitor (800 x 600 resolution). Participants were seated 90 cm in front of a 21-inch monitor with the head positioned on a chin and forehead rest. At the straight-ahead viewing position letters subtended 0.28° of visual angle (center-to-center spacing). A 9-point system-controlled EyeLink calibration procedure was used.
  • Gaze raw data were parsed into sequences of fixations and saccades using SR Research Data Viewer, using the default parameters. Those data were converted into an interest area report, which provides a columnar output of eye movement data for each word in a sentence, separately for each participant.
  • Some parameters of biomimetic scrolling are presented. There may be many viable combinations of these parameters. Different parameters, or different combinations of parameters, may suit various contexts. A particular set of parameters has been chosen for testing.
  • the pause duration may be considered to be equivalent to a fixation duration in normal (non-scrolled) reading.
  • pause duration was chosen to equal the total reading time of a given word.
  • the total reading time of a word was defined as the sum of all fixation durations on that word, averaged over all participants in the eye movement corpus. This was possible as the sentences used in the reading speed study were the same as those used in the eye movement corpus. A method is proposed to allow extrapolation of the eye movement data to sentences for which eye movement data has not previously been measured.
  • Figure 10 is a plot of total reading time against word frequency, averaged over the 67 participants of the eye movement corpus. Word frequencies are expressed on a logarithmic Zipf scale which ranges from around 1 (very low-frequency) to around 7 (very high-frequency). In Figure 10, words included in multiple sentences are included multiple times with the same Zipf value but not necessarily with identical reading times.
  • a line of best fit was calculated using the method of least squares, giving a gradient of -61 ± 1 ms/Zipf and an intercept of 567 ± 6 ms. This linear relation may provide a way to estimate the total reading time, and hence a suitable pause duration, from the word frequency.
  • Figure 11 is a plot of total reading time against word length, averaged over the 67 participants of the eye movement corpus.
  • the line of best fit was calculated using the method of least squares, giving a gradient of 37 ± 1 ms/letter and an intercept of 45 ± 3 ms. This may provide a further way to estimate total reading time and hence to choose an appropriate pause duration, this time based on word length.
  • Word length and word frequency may be correlated. In some embodiments, only word length (and not word frequency) is used to set pause duration. In some embodiments, only word frequency (and not word length) is used to set pause duration.
  • a combination of word length and word frequency is used to set pause duration.
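  • A sketch combining the two fitted relations quoted above is given below; the gradients and intercepts are those reported for the corpus, but the plain averaging of the two estimates, the function names and the example Zipf value are illustrative assumptions.

```python
def reading_time_from_frequency(zipf):
    """Estimated total reading time (ms) from word frequency on the Zipf scale
    (least-squares fit quoted above: gradient -61 ms/Zipf, intercept 567 ms)."""
    return 567.0 - 61.0 * zipf

def reading_time_from_length(n_letters):
    """Estimated total reading time (ms) from word length
    (fit quoted above: gradient 37 ms/letter, intercept 45 ms)."""
    return 45.0 + 37.0 * n_letters

def pause_duration_ms(word, zipf):
    # Averaging the two estimates is only one possible way of combining length and frequency.
    return 0.5 * (reading_time_from_frequency(zipf) + reading_time_from_length(len(word)))

print(pause_duration_ms("delight", zipf=4.1))   # roughly (316.9 + 304.0) / 2 ≈ 310 ms
```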
  • a parameter of biomimetic scrolling may be a number of times that the text will pause on a particular word.
  • Another parameter of biomimetic scrolling may be the position along the word at which to pause.
  • the pause position may be set at an optimal viewing position (OVP): the position where readers should fixate.
  • the pause position may alternatively be set at a preferred viewing location (PVL).
  • Findings from single-word and text reading studies suggest that word center may be the optimal position for word processing.
  • the preferred viewing location, on the other hand, may be slightly left of word center. Both the OVP and PVL may be affected by word length.
  • the pause position is set according to data from the eye movement corpus.
  • Saccade duration may be another biomimetic scrolling parameter that may be adjusted.
  • the mean saccade duration of the eye movement corpus is 36 ms. As this is so short, the saccade duration is set to zero, meaning instantaneous movement between words.
  • the pause duration and saccade duration together set a speed with which the text scrolls.
  • scrolling speed may be a parameter over which the user would have control; it is set as a multiplicative factor to proportionally modulate pause duration.
  • a change in the speed setting measured in words per minute (wpm) proportionally increases or decreases the pause duration on each word.
  • biomimetic scrolling may enable users with normal vision to read at a faster rate than continuous scrolling, as biomimetic scrolling may be more similar to natural eye movements than continuous scrolling.
  • as biomimetic scrolling and RSVP may both remove the need for eye movements, reading speeds using biomimetic scrolling may be expected to be similar to reading speeds using RSVP.
  • a study was carried out with 30 participants (different to those from the eye movement study), aged between 19 and 62, with a split of 16 males and 14 females.
  • Sentences that were between 12 and 14 words long, with a total number of characters between 68 and 76, were chosen. Text was displayed in 40-pt Courier New font, with white letters on a black background, and viewed from about 50 cm on an LCD screen with a horizontal width of 34 cm and a brightness of 210 cd/m² for white.
  • the Python programming language, with the Kivy library, was used to implement the text presentation strategies and user interface, and to record the data.
  • the order of presentation methods was randomized. Each method was set to begin at a speed of 120 wpm. At the end of each sentence a multiple-choice comprehension question was asked, with three choices plus a fourth option for the participant to indicate that they did not know the correct answer. The same speed level was used twice, then increased to the next level, with successive levels separated by 120 wpm. Testing ended when the participant selected "I don't know" four times in a row, or when the speed reached 1920 wpm for biomimetic scrolling and RSVP or 600 wpm for continuous scrolling. These ceiling levels were high enough not to constrain any participants. The maximum reading speed was taken as the maximum speed level at which both questions were correctly answered (this procedure is sketched below).
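  • A minimal sketch of this testing procedure follows; the run_trial callback, which presents one sentence and returns the participant's answer, is a placeholder assumption.

    def run_staircase(run_trial, ceiling_wpm):
        # Return the maximum speed (wpm) at which both questions at a level
        # were answered correctly. `run_trial(wpm)` presents one sentence at
        # `wpm` and returns 'correct', 'incorrect' or 'dont_know'.
        speed, best, streak = 120, 0, 0
        while speed <= ceiling_wpm:
            answers = []
            for _ in range(2):                 # the same level is used twice
                answer = run_trial(speed)
                answers.append(answer)
                streak = streak + 1 if answer == 'dont_know' else 0
                if streak == 4:                # four "I don't know" in a row
                    return best
            if all(a == 'correct' for a in answers):
                best = speed
            speed += 120                       # levels separated by 120 wpm
        return best

    # Example with a dummy participant who reads correctly up to 480 wpm:
    dummy = lambda wpm: 'correct' if wpm <= 480 else 'dont_know'
    print(run_staircase(dummy, ceiling_wpm=1920))   # -> 480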
  • Biomimetic scrolling is a form of line-stepping text presentation in which the sentence moves in a sequence of saccades and pauses with respect to a fixation point on the screen.
  • the choice of length and duration of these saccades and pauses is made with reference to eye movement data in order to mimic normal reading.
  • biomimetic scrolling may save time by reducing or removing eye movements. This speeds up reading by removing both the time needed to make the saccade and the time needed to program it, typically approximately 180 to 250 ms.
  • the time taken to program a saccade may refer to the cognition time required to decide in which direction to make the saccade and where to make the next fixation.
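  • The following Python sketch, given as an illustration rather than as the implementation used in the studies, builds the sequence of text shifts and pauses for a sentence. It assumes a monospaced font, a single pause per word at the word centre, and pause durations taken from the word-length fit above; the helper names are assumptions.

    def scrolling_schedule(sentence, pause_ms_for):
        # Yield (offset_in_chars, pause_ms) steps: at each step the text is
        # shifted so that the current word's pause position sits at the
        # fixation point, then held for the pause duration.
        position = 0                                   # index of the word's first letter
        for word in sentence.split():
            pause_point = position + len(word) / 2.0
            yield (-pause_point, pause_ms_for(word))   # negative: text moves left
            position += len(word) + 1                  # advance past the word and a space

    for offset, pause in scrolling_schedule("The cat sat on the mat",
                                            lambda w: 45.0 + 37.0 * len(w)):
        print(f"shift text by {offset:+.1f} characters, hold for {pause:.0f} ms")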
  • Reading speed is just one measure of the success of a text presentation method. Subjective measures of user comfort and satisfaction may also be used to determine the comparative potential of the biomimetic scrolling technique. In addition, only a basic level of comprehension was tested after reading a single sentence. The effectiveness of biomimetic scrolling under sustained reading conditions and its effect on comprehension compared to normal reading is yet to be determined.
  • Biomimetic scrolling may enable reading at almost 5 times the rate of continuous scrolling. It may directly incorporate knowledge about reading and may provide an alternative to RSVP as a speed reading technique.
  • head-mounted display systems were evaluated for macular degeneration.
  • the macula is a part of the retina. The macula makes up the central 17° to 18° of the visual field. That is about the length of a hand viewed with an outstretched arm.
  • Macular disease may result in a loss of central vision.
  • Age-related macular degeneration (AMD) is the most common type of macular disease. AMD is the leading cause of blindness in much of the Western world. Peripheral vision may be unaffected.
  • Low vision aids currently available may include a cane, audio description, sensory substitution, optical or electronic magnification, or vision enhancing glasses. Low vision aids may be divided into two categories: a first category of aids that translate visual information into alternative sensory information, such as sound or touch (sensory substitution) and a second category of aids that adapt visual information to render it more visible to the user.
  • a head-mounted display system has been described above with reference to Figure 6.
  • Smart glasses may provide augmented reality. Smart glasses may provide a partially transparent display.
  • An example used in the evaluation of head-mounted display systems is the Epson Moverio BT-200, which has a field of view of 23°.
  • Smart glasses may act like normal glasses in that sight is not blocked.
  • a smartphone headset may be used for virtual reality.
  • One famous budget version, Google Cardboard, comprises a smartphone with a cardboard support frame.
  • An example used in the evaluation of head-mounted displays was a Homido with an LG G3 smartphone.
  • the LG G3 smartphone has one of the highest pixel densities at 538 ppi, so pixels may be barely visible even when used close to the eye.
  • a field of view may be 100°.
  • a giant, bright display may be provided with little more than a smartphone.
  • High tech aids for low vision have been reviewed, for example in the review paper: Moshtael H, Aslam T, Underwood I, Dhillon B. High Tech Aids Low Vision: A Review of Image Processing for the Visually Impaired. Transl Vis Sci Technol. 2015;4(4).
  • the evaluation of head-mounted display systems had several aims. The evaluation was intended to test the display itself, not use it to its maximum potential. The evaluation aimed to answer several questions. How much of the display can a user see and how well can they see it? How well can they read from the display in terms of speed and text size? What is the user's subjective opinion of the display and headset? Smart glasses and a smartphone based headset were tested. A perimetry test was adapted to measure an extent of visibility.
  • a Radner reading test was adapted for smart glasses to measure reading speed.
  • a questionnaire was used to gather subjective information about visibility, comfort and prospective use. In a trial showing indicators at a plurality of screen positions, participants saw at least 45% of the points, except for two participants who were registered blind. All but one could read from both the smart glasses and the smartphone headset. 70% found it easier or as easy to read from the smart glasses as from large print on paper. 91% found it easier to read from the smartphone headset than from large print on paper.
  • a range of text presentation methods may be used.
  • a range of options may be provided to suit a wide variety of needs.
  • a size setting may be set according to reading acuity. Contrast may be suited to macular disease. Dynamic presentation options, like scrolling, may be used. RSVP may be used.
  • biomimetic scrolling is hypothesised to allow faster reading rates than continuous scrolling, at least in some circumstances.
  • a text presentation method in accordance with an embodiment was tested on the normally sighted and found to match the speed of RSVP and increase the speed of scrolling text by a factor of approximately five.
  • a question of interest is whether reading could be enhanced if text were presented in a way that mimicked natural eye movements during reading, that is, in the way patients read before acquiring a visual impairment.
  • An investigation of dynamic methods of text presentation on those with macular disease is performed.
  • RSVP, scrolling and biomimetic scrolling are compared to static text.
  • the primary outcome measure is reading speed.
  • the subjective preference of participants is considered. Through testing these methods, and adjusting the size and colour contrast of the text to the participant's preference, the text is then individually tailored.
  • This tailored text, presented on the digital display on the smart glasses, is next compared to reading with the participant's habitual optical aid.
  • Just 9 out of 23 participants (39%) have had experience using electronic low vision aids, and even fewer use them habitually.
  • all participants have used optical low vision aids, and most use them habitually.
  • optical low vision aids, usually a magnifier of a strength appropriate to the user's eyesight with a built-in light, are frequently provided by the low vision clinic at the Princess Alexandra Eye Pavilion, Edinburgh.
  • the perceptual span is defined as the width of the window of characters used in a fixation to plan the subsequent saccade. Not all the characters in the perceptual span can necessarily be recognized, so the visual span is defined to count the number of recognized characters in a fixation. In English, the perceptual span for the normally sighted is 4 characters left of fixation and 15 characters right of it, and the visual span is around 10 characters. Forward saccade length is an indirect measure of perceptual span, and was found to decrease from 7.5 letters in control subjects to between 1 and 4 letters in subjects with age-related maculopathy. To compound this issue, the use of magnified text decreases the field of view.
  • the number of pauses per word was set to a constant of 1 as it was assumed that each word could be recognized during this single pause. This is not necessarily a reasonable assumption for subjects with AMD, especially for longer words and for patients requiring larger text size. Therefore, for the partially sighted study, multiple pauses along the length of a single word were allowed.
  • the number of pauses, p, was primarily determined by word length, with longer words taking more pauses.
  • Word frequency is measured according to the logarithmic Zipf scale and was obtained from the SUBTLEX-UK word frequency database for British English.
  • an additive term in the pause duration was linearly scaled to word frequency, with a gradient of m and a constant of c.
  • the viewing position, that is the position along the word at which the word paused between the arrows, needed to be defined to permit multiple pauses.
  • for n pauses, the viewing positions were centrally positioned in each 1/n section of the word.
  • for a single pause point, the viewing position was set as the centre of the word; for two pause points, the viewing positions were at the one-quarter and three-quarter positions; for three pause points, the viewing positions were at the one-sixth, one-half and five-sixths positions (a sketch of this scheme is given after these bullets).
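  • A minimal Python sketch of this multiple-pause scheme follows. Only the general form is specified above, so the rule of roughly one pause per six letters, the symbolic parameters m and c, and the helper names are placeholder assumptions for illustration.

    import math

    def number_of_pauses(word, letters_per_pause=6):
        # Assumed rule: longer words take more pauses, roughly one pause
        # per `letters_per_pause` letters (placeholder assumption).
        return max(1, math.ceil(len(word) / letters_per_pause))

    def viewing_positions(n_pauses):
        # Fractions along the word, centred in each 1/n section of the word.
        return [(2 * i - 1) / (2 * n_pauses) for i in range(1, n_pauses + 1)]

    def pause_duration_ms(base_ms, zipf, m, c):
        # Pause duration with an additive term linearly scaled to the Zipf
        # word frequency; m and c are left as parameters, as above.
        return base_ms + (m * zipf + c)

    print(viewing_positions(1))   # [0.5]
    print(viewing_positions(2))   # [0.25, 0.75]
    print(viewing_positions(3))   # [0.1666..., 0.5, 0.8333...]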
  • One of the most important settings for enhancing reading speed may be the text size.
  • a trade-off is required between a size large enough so that it is easily legible, but small enough that it does not decrease field of view.
  • the font size was chosen as the critical print size, the minimum text size at which reading speed is at a maximum. After being shown a sample sentence, the participant was given the choice to increase or decrease this size at their preference.
  • Letter spacing, word spacing and line spacing are three other parameters of text presentation. Increasing spacing is intended to reduce the effects of crowding, whereby text or objects that are close together are difficult to recognise. A study on letter spacing found that increasing the spacing did not increase reading speed in central vision loss. A study on word and line spacing found that both double word spacing and double line spacing achieved the highest reading speed in macular disease. However, another study found that line spacing did not improve reading speed in AMD. Standard letter, word and line spacing were used in our study.
  • sentences are displayed in one of the following display options: static (non-dynamic), RSVP, horizontal continuous scrolling ('leading' or 'Times Square' style), and biomimetic scrolling.
  • the order of dynamic display options was randomised.
  • the sentences from the Radner Reading Chart were used with permission from Wolfgang Radner.
  • the sentences are standardized in terms of their difficulty and syntactical structure, and the number and length of words used.
  • Oral reading speed was the primary outcome measure for assessing the effectiveness of the visual aids. Reading speed using the participant's habitual optical magnifier was then measured using the Radner reading chart. Reading speed was measured at the smallest print size readable with the magnifier without straining (down to a minimum letter size of 0.5 M) as well as at the size above, and the maximum of these speeds was used. Participants wore their habitual reading correction (if any).
  • the reading speed achieved with this tailored approach on the smart glasses is compared, in Figure 15, with the reading speed achieved using the optical aid habitually used by the individual. It shows a scatter plot of the reading speed achieved using the optical aid against that achieved with dynamic text on the smart glasses. The line of equal speed is also plotted, with points above the line indicating a faster speed for the smart glasses.
  • Participants 115 and 116 did not bring an optical aid with them to the study as they reported that they did not use one to read. Therefore, their results reading from paper at their critical print size are used instead. Participants 112, 118, 120, 125 and 129 had vision too poor to read from the smart glasses display and thus are not included in this plot.
  • Figure 16 shows the number of participants that achieved their fastest reading speed out of each of the four methods of static, RSVP, horizontal smooth scrolling and biomimetic scrolling. This plot is included in order to illustrate the range of preferences across the sample, suggesting that the inclusion of a range of options will assist more individuals than a 'one size fits all' approach. It does not, however, take into account the size of the reading speed differences.
  • the mean reading speed for each text presentation method is shown in Figure 17.
  • Figure 17 is a plot of mean reading speed for the four text presentation methods of static, RSVP, horizontal smooth scrolling and biomimetic scrolling. Although biomimetic scrolling has the highest reading speed, the differences between each method are small.
  • each eye is presented with its own display screen or portion of display screen using a head-mounted display system, for example smart glasses.
  • Different presentations may be provided to each eye, or text may be presented to only one eye, with the screen or portion of screen for the other eye being turned off.
  • a reader may prefer to have text presented only to their stronger eye.

Abstract

The invention relates to a reading system comprising: an apparatus comprising a screen and a processor configured to display a sequence of text to a reader in a reading process; and at least one processing resource configured to obtain reading data representative of at least one property of the text and/or of the reading process, and to process the reading data in order to identify changes in the reading data over time that may be indicative of visual or neurological deterioration and/or of the presence of at least one condition in the reader.
PCT/GB2017/052656 2016-09-09 2017-09-11 Système de lecture, procédé d'affichage de texte et appareil WO2018046957A2 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
GB1615382.7 2016-09-09
GBGB1615382.7A GB201615382D0 (en) 2016-09-09 2016-09-09 A text display method and apparatus

Publications (2)

Publication Number Publication Date
WO2018046957A2 true WO2018046957A2 (fr) 2018-03-15
WO2018046957A3 WO2018046957A3 (fr) 2018-04-19

Family

ID=57234569

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/GB2017/052656 WO2018046957A2 (fr) 2016-09-09 2017-09-11 Système de lecture, procédé d'affichage de texte et appareil

Country Status (2)

Country Link
GB (1) GB201615382D0 (fr)
WO (1) WO2018046957A2 (fr)

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030038754A1 (en) * 2001-08-22 2003-02-27 Mikael Goldstein Method and apparatus for gaze responsive text presentation in RSVP display
US9888842B2 (en) * 2012-05-31 2018-02-13 Nokia Technologies Oy Medical diagnostic gaze tracker
US20170258319A1 (en) * 2014-11-27 2017-09-14 Koninklijke Philips N.V. System and method for assessing eyesight acuity and hearing ability

Cited By (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11972043B2 (en) 2014-06-19 2024-04-30 Apple Inc. User detection by a computing device
US11670423B2 (en) 2017-11-12 2023-06-06 Bioeye Ltd. Method and system for early detection of neurodegeneration using progressive tracking of eye-markers
CN109567817A (zh) * 2018-11-19 2019-04-05 北京育铭天下科技有限公司 一种阅读能力评估方法及系统及其辅助装置
CN109567817B (zh) * 2018-11-19 2023-12-12 北京育铭天下科技有限公司 一种阅读能力评估方法及系统及其辅助装置
US11749132B2 (en) 2018-11-21 2023-09-05 International Business Machines Corporation Enhanced speed reading with eye tracking and blink detection
EP3956756A4 (fr) * 2019-06-18 2022-06-15 Samsung Electronics Co., Ltd. Procédé et appareil de gestion d'opérations sur des données présentées sur une unité d'affichage
US11782592B2 (en) 2019-06-18 2023-10-10 Samsung Electronics Co., Ltd. Method and apparatus for managing display of readable data presented in a portion of a display
US11224339B2 (en) 2019-07-16 2022-01-18 International Business Machines Corporation Dynamic eye condition self-diagnosis
CN110444065B (zh) * 2019-08-13 2022-01-11 深圳市沃特沃德软件技术有限公司 辅助阅读方法、装置、存储介质及智能设备
CN110444065A (zh) * 2019-08-13 2019-11-12 深圳市沃特沃德股份有限公司 辅助阅读方法、装置、存储介质及智能设备
CN112764599A (zh) * 2019-11-01 2021-05-07 北京搜狗科技发展有限公司 一种数据处理方法、装置和介质
WO2021123022A1 (fr) * 2019-12-19 2021-06-24 Sanofi Dispositif et procédé de dispositif de suivi oculaire
US11918382B2 (en) 2020-04-13 2024-03-05 International Business Machines Corporation Continual background monitoring of eye health
WO2022223924A1 (fr) 2021-04-20 2022-10-27 Scale-1 Portal Systeme de traitement de troubles neurovisuels ou vestibulaires et procede de commande d'un tel systeme
FR3121834A1 (fr) * 2021-04-20 2022-10-21 Scale-1 Portal Système de traitement de troubles neurovisuels ou vestibulaires et procédé de commande d’un tel système
CN113655934A (zh) * 2021-08-30 2021-11-16 咪咕数字传媒有限公司 电子文件显示方法、装置、计算设备及计算机存储介质
WO2023122307A1 (fr) * 2021-12-23 2023-06-29 Thomas Jefferson University Systèmes et méthodes de génération d'examens neuro-ophtalmiques
GB2619735A (en) 2022-06-14 2023-12-20 Brightpage Tech Ltd Reading method and apparatus
GB202208734D0 (en) 2022-06-14 2022-07-27 Brightpage Tech Ltd Reading method and apparatus
CN117058748A (zh) * 2023-09-07 2023-11-14 杭州励普科技有限公司 一种基于深度阅读识别的电子文档阅读方法和系统

Also Published As

Publication number Publication date
GB201615382D0 (en) 2016-10-26
WO2018046957A3 (fr) 2018-04-19

Similar Documents

Publication Publication Date Title
WO2018046957A2 (fr) Système de lecture, procédé d'affichage de texte et appareil
US20240099575A1 (en) Systems and methods for vision assessment
Carter et al. Best practices in eye tracking research
US11806079B2 (en) Display system and method
JP5302193B2 (ja) ヒト状態推定装置およびその方法
JP5912351B2 (ja) 自閉症診断支援システム及び自閉症診断支援装置
US20150282705A1 (en) Method and System of Using Eye Tracking to Evaluate Subjects
Harvey et al. Reading with peripheral vision: A comparison of reading dynamic scrolling and static text with a simulated central scotoma
Hougaard et al. Testing of all six semicircular canals with video head impulse test systems
Schafer et al. Glaucoma affects viewing distance for recognition of sex and facial expression
Calabrèse et al. A vision enhancement system to improve face recognition with central vision loss
Wang et al. Understanding How Low Vision People Read Using Eye Tracking
Feis et al. Reading eye movements performance on iPad vs print using a visagraph
Fried-Oken et al. Human visual skills for brain-computer interface use: a tutorial
Wallis et al. Characterization of field loss based on microperimetry is predictive of face recognition difficulties
US10779726B2 (en) Device and method for determining eye movements by tactile interface
Pratt et al. Scotoma visibility and reading rate with bilateral central scotomas
Bindiganavale et al. Development and preliminary validation of a virtual reality approach for measurement of torsional strabismus
Wong Instantaneous and Robust Pupil-Based Cognitive Load Measurement for Eyewear Computing
EP4197425A1 Détermination d'une performance visuelle d'un œil d'une personne
US20230259203A1 (en) Eye-gaze based biofeedback
Asfaw Analysis of natural eye movements to assess visual field loss in glaucoma
Cañadas Suárez et al. Eye Tracking in Optometry: A Systematic Review
Moshtael et al. Saccadic scrolling: Speed reading strategy based on natural eye movements
WO2023196186A1 (fr) Systèmes et procédés améliorés pour tester la vision périphérique

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17767914

Country of ref document: EP

Kind code of ref document: A2

NENP Non-entry into the national phase in:

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17767914

Country of ref document: EP

Kind code of ref document: A2