WO2017174114A1 - Improving readability of content displayed on a screen - Google Patents

Improving readability of content displayed on a screen

Info

Publication number
WO2017174114A1
Authority
WO
WIPO (PCT)
Prior art keywords
gaze
displacement
change
viewer
displayed
Prior art date
Application number
PCT/EP2016/057384
Other languages
French (fr)
Inventor
Tomas JÖNSSON
Tommy Arngren
Stefan WÄNSTEDT
Peter ÖKVIST
Original Assignee
Telefonaktiebolaget Lm Ericsson (Publ)
Priority date
Filing date
Publication date
Application filed by Telefonaktiebolaget Lm Ericsson (Publ) filed Critical Telefonaktiebolaget Lm Ericsson (Publ)
Priority to EP16715490.5A priority Critical patent/EP3440532B1/en
Priority to US15/107,348 priority patent/US9811161B2/en
Priority to PCT/EP2016/057384 priority patent/WO2017174114A1/en
Publication of WO2017174114A1 publication Critical patent/WO2017174114A1/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013Eye tracking input arrangements
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/0304Detection arrangements using opto-electronic means
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0346Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00Geometric image transformation in the plane of the image
    • G06T3/40Scaling the whole image or part thereof
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/50Depth or shape recovery
    • G06T7/55Depth or shape recovery from multiple images
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/97Determining parameters from multiple pictures
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G5/22Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the display of characters or indicia using display control signals derived from coded signals representing the characters or indicia, e.g. with a character-code memory
    • G09G5/24Generation of individual character patterns
    • G09G5/26Generation of individual character patterns for modifying the character dimensions, e.g. double width, double height
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2200/00Indexing scheme relating to G06F1/04 - G06F1/32
    • G06F2200/16Indexing scheme relating to G06F1/16 - G06F1/18
    • G06F2200/163Indexing scheme relating to constructional details of the computer
    • G06F2200/1637Sensing arrangement for detection of housing movement or orientation, e.g. for controlling scrolling or cursor movement on the display of an handheld computer
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/048Indexing scheme relating to G06F3/048
    • G06F2203/04806Zoom, i.e. interaction techniques or interactors for controlling the zooming operation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30196Human being; Person
    • G06T2207/30201Face
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2320/00Control of display operating conditions
    • G09G2320/02Improving the quality of display appearance
    • G09G2320/0261Improving the quality of display appearance in the context of movement of objects on the screen or movement of the observer relative to the screen
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2340/00Aspects of display data processing
    • G09G2340/04Changes in size, position or resolution of an image
    • G09G2340/045Zooming at least part of an image, i.e. enlarging it or shrinking it
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2340/00Aspects of display data processing
    • G09G2340/14Solving problems related to the presentation of information to be displayed
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2380/00Specific applications
    • G09G2380/10Automotive applications

Definitions

  • the invention relates to a device for improving readability of content displayed to a viewer, a method of improving readability of content displayed to a viewer, a corresponding computer program, and a corresponding computer program product.
  • the human eye is able to track the movements or acceleration patterns of a moving object, such as a screen displaying content, up to a certain frequency and amplitude, thus managing to maintain proper focus on desired parts of the screen.
  • a person gazing at a screen for viewing content is able to maintain readability up to a certain frequency and amplitude of change in gaze.
  • the term 'readability' relates to the ease with which a viewer can understand a text displayed on the screen. It encompasses 'legibility', which is a measure of how easily a reader can distinguish individual letters or characters from each other.
  • displayed content may include not only text but also other graphical objects such as images, pictures, buttons, icons, and the like.
  • US 2013/0009962 A1 discloses solutions for automatically adjusting displayed information by selecting font sizes or a zoom level based on settings relating to the user, such as the user's vision or age, as well as ambient light, screen settings, and the displayed information, e.g., foreground and background colors of the displayed information.
  • a handheld device such as a smartphone, or a device mounted in a vehicle.
  • a device for improving readability of content displayed to a viewer comprises a screen for displaying the content to the viewer, a motion sensor for measuring a displacement of the device, and an eye tracker for measuring a change in gaze of the viewer when gazing at the screen.
  • the device is operative to adjust the displayed content so as to improve the readability of the displayed content.
  • the displayed content is adjusted in response to determining that the change in gaze and the displacement of the device are out-of-sync.
  • a method of improving readability of content displayed to a viewer is provided.
  • the method is performed by a device and comprises measuring a displacement of the device, measuring a change in gaze of the viewer when gazing at a screen comprised in the device, and adjusting the displayed content so as to improve the readability of the displayed content.
  • the displayed content is adjusted in response to determining that the change in gaze and the displacement of the device are out-of-sync.
  • a computer program comprises computer-executable instructions for causing a device to perform the method according to the second aspect of the invention, when the computer-executable instructions are executed on a processing unit comprised in the device.
  • a computer program product comprises a computer- readable storage medium which has the computer program according to the third aspect of the invention embodied therein.
  • the invention makes use of an understanding that an improved user experience for users of devices having screens for displaying content is achieved if measures for adjusting the displayed content to enhance its readability are only taken in response to detecting an actual deterioration in readability as experienced by the viewer, i.e., the person or user gazing at the screen of the device.
  • embodiments of the invention adjust the displayed content in response to determining that the measured change in gaze of the viewer and the measured displacement of the device are out-of-sync.
  • the invention is based on the understanding that the human eye is able to follow displacements of an object up to a certain frequency and/or amplitude, with a certain phase difference which is dependent on frequency.
  • the change in gaze lags behind the displacement of the device, as the eyes attempt to remain focused on the content displayed on the screen. In other words, the viewer's eye or eyes try to stay in-sync with the device when being displaced.
  • when the eye or eyes of the viewer cannot cope with following the displacement of the device, e.g., because of a large device acceleration or an increase in frequency of the device's vibration in the case of a repetitive or quasi-periodic displacement, the eye or eyes of the viewer lag behind the displacement of the device to such an extent that focus on the displayed content can no longer be maintained.
  • the change in gaze of the viewer and the displacement of the device are out-of-sync.
  • readability may be negatively affected to an extent which makes it impossible for the viewer to understand displayed text or interpret displayed content other than text.
  • the change in gaze and the displacement of the device need to be measured repeatedly, and preferably periodically or continuously, resulting in time-dependent signals, i.e., time-series of data for the gaze, or the change thereof, and the position of the device, or the change thereof, respectively.
  • Embodiments of the invention are particularly advantageous for devices which are designed for handheld operation, such as mobile phones, smartphones, tablet computers or tablets, gaming consoles, media players, and laptops, as well as devices comprising screens which are mounted in vehicles, e.g., in the dashboard of a car. This is the case since these classes of devices are subject to accelerations, shocks, or sudden displacement, while being gazed at.
  • it may be determined that the change in gaze and the displacement of the device are out-of-sync by deriving a phase difference between the measured change in gaze and the measured displacement of the device, and determining that the change in gaze and the displacement of the device are out-of-sync if the derived phase difference, in particular an absolute value thereof, exceeds a threshold value.
  • the threshold value may either be configured by a user of the device or learned, as is described further below.
  • the displayed content is adjusted so as to improve readability by at least one of increasing a font size of displayed text, enlarging one or more displayed graphical objects, and increasing a zoom level of the displayed content.
  • the change in gaze of the viewer is measured by image processing a series of images captured by a camera which is configured for imaging the eye or eyes of the viewer. In particular, this may be a camera which is mounted on the same face of the device as the screen, i.e., a front-facing camera.
  • the change in gaze of the viewer is measured by detecting infra-red light using an infra-red light detector which is comprised in the device.
  • the infra-red light is originating from an infra-red light source comprised in the device and being reflected by the eye or eyes of the viewer.
  • Fig. 1 illustrates a device for improving readability of content displayed to a viewer, in accordance with an embodiment of the invention.
  • Fig. 2 schematically illustrates improving readability of content displayed to a viewer, in accordance with embodiments of the invention.
  • Fig. 3 exemplifies a measured displacement of a device for improving readability of content displayed to a viewer, the corresponding measured change in gaze, and the derived phase difference, in accordance with embodiments of the invention.
  • Fig. 4 shows an embodiment of the processing means comprised in the device for improving readability of content displayed to a viewer.
  • Fig. 5 shows another embodiment of the processing means comprised in the device for improving readability of content displayed to a viewer.
  • Fig. 6 shows a flow chart illustrating a method of improving readability of content displayed to a viewer, in accordance with an embodiment of the invention.
  • In Fig. 1, an embodiment 100 of the device for improving readability of content displayed to a viewer 110 is shown.
  • device 100 in Fig. 1 is illustrated as a smartphone, i.e., a device for handheld operation
  • the invention may alternatively be embodied in other types of devices such as tablets, gaming consoles, media players, and laptops, or in devices which are mounted in a vehicle, e.g., in or on the dashboard of a car.
  • These types of devices are characterized by being subjected to sudden accelerations, e.g., because the user of a handheld device is walking or riding a bus, in response to which they are displaced relative to viewer 110, or rather the viewer's eye(s) 111.
  • Such displacements necessitate a change in gaze 112, i.e., the direction into which viewer 110 is looking.
  • Device 100 comprises a screen 101 for displaying content to viewer 110, e.g., text 121, an image 122, or other content such as user-interface elements (buttons and keys).
  • Device 100 further comprises a motion sensor 102 for measuring a displacement of device 100, and an eye tracker 103 for measuring a change in gaze 112 of viewer 110 when gazing at screen 101.
  • embodiments of device 100 measure the displacement of device 100 and the change in gaze 112 repeatedly, and preferably periodically or continuously.
  • the displacement of device 100 and/or the change in gaze 112 are only measured when it is assessed to be likely that the readability of the displayed content is affected.
  • Such an assessment may, e.g., be based on contextual and/or environmental data such as ambient light or sudden accelerations detected by motion sensor 102 or any other sensor comprised in device 100.
  • Motion sensor 102 may be based on any type of sensor which is suitable for measuring a displacement of device 100, e.g., an accelerometer, a gyroscope, a magnetometer, a pedometer, and the like.
  • the output of motion sensor 102 is a time-dependent signal, i.e., a time-series of data d(t), reflecting the displacement of device 100.
  • Embodiments of device 100 may utilize scalar displacement values d(t) reflecting displacement along a current direction of displacement.
  • a time-series of pairs of scalar values may be utilized, e.g., pairs of values reflecting the displacement of device 100 within a plane defined by screen 101.
  • a time-series of triplets of scalar values, i.e., vectors, may be utilized, reflecting the current displacement of device 100 in three-dimensional space.
  • Eye tracker 103 may utilize a camera comprised in device 100 which is configured for imaging the eye 111 or eyes of viewer 110, such as a front-facing camera 103 which most modern smartphones and tablets are provided with, for measuring the change in gaze 112. This may be achieved by image processing a series of images captured by camera 103, as is known in the art.
  • US 2015/0002392 A1 and US 2015/0346818 A1 disclose solutions for eye tracking and detecting micro eye movements based on images captured with a mobile device, such as a smartphone.
  • eye tracker 103 may comprise an infra-red light source and an infra-red light detector.
  • the change in gaze 112 of viewer 110 is measured based on infra-red light originating from the infra-red light source, which infra-red light is reflected by the eye 111 or eyes of viewer 110 and subsequently detected by the infra-red light detector. Based on the measured changes in reflections over time, information about eye rotation and the related change in gaze 112 may be extracted, as is known in the art.
  • the output of eye tracker 103 is a time-dependent signal, i.e., a time-series of scalar values g(t) reflecting the change in gaze 112 of viewer 110.
  • the time-series of data g(t) may, e.g., reflect an angle of the direction of gaze 112 relative to a reference axis defined in relation to the head of viewer 110, or a change thereof.
  • the time-series of data g(t) may reflect a change in the point of gaze 113, i.e., the point of focus of the eye(s) 111 of viewer 110 on screen 101.
  • device 100 is operative to determine 205 that the change in gaze 112 and the displacement of device 100 are out-of-sync and, in response thereto, adjust 206 the displayed content 121/122 so as to improve its readability.
  • the determination 205 that the change in gaze 112 and the displacement of device 100 are out-of-sync may be achieved in a number of ways. For instance, device 100 may be operative to determine 205 that the change in gaze 112 and the displacement of device 100 are out-of-sync by deriving a phase difference between the measured change in gaze and the measured displacement.
  • the phase difference p(t) may be derived by any known method which is suitable for establishing the instantaneous phase difference between two time-dependent signals. As an example, this may be accomplished by calculating the Hilbert transform for each of the signals d(t) and g(t), i.e., their analytic representations d(t) + j·H{d(t)} and g(t) + j·H{g(t)}, from which the instantaneous phase angles can be obtained.
  • Hilbert transform is commonly known in the field of signal processing where it is used for deriving an analytic expression of a signal, thereby extending the signal into the complex plane.
  • discrete function such as a time-series of measured values
  • the discrete Hilbert transform is typically used.
  • the instantaneous phase difference p(t) between the change in gaze 112 and the displacement of device 100 can be calculated as the difference between the instantaneous phase angle p_d(t) of the measured displacement d(t) and the instantaneous phase angle p_g(t) of the measured change in gaze g(t): p(t) = p_d(t) - p_g(t).
  • the derived phase difference p(t) may be expressed in units of degrees or radians.
  • the upper diagram in Fig. 3 illustrates a possible displacement of device 100 by means of a time-series of data d(t), which may reflect the time-dependent displacement of device 100 as a scalar value having a unit of length.
  • the middle diagram in Fig. 3 illustrates a corresponding simulated change in gaze 112 by means of a time-series of data g(t), reflecting the change in direction of gaze 112 relative to a reference axis, or the change in point of gaze 113, as a scalar value having a unit of angle or length, respectively.
  • the instantaneous phase difference p(t) is obtained, which is shown in the lower diagram in Fig. 3.
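The Hilbert-transform-based derivation of p(t) described above can be sketched in Python. This is a minimal illustration using numpy only; the function names and the toy signals are ours, not from the patent.

```python
import numpy as np

def analytic_signal(x):
    """Discrete Hilbert transform via the FFT: returns the analytic
    signal x + j*H{x}, extending the real signal into the complex plane."""
    n = len(x)
    spectrum = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = 1.0
    if n % 2 == 0:
        h[n // 2] = 1.0
        h[1:n // 2] = 2.0
    else:
        h[1:(n + 1) // 2] = 2.0
    return np.fft.ifft(spectrum * h)

def phase_difference(d, g):
    """Instantaneous phase difference p(t) = p_d(t) - p_g(t), wrapped
    to (-pi, pi], between displacement d(t) and change in gaze g(t)."""
    p_d = np.angle(analytic_signal(d))
    p_g = np.angle(analytic_signal(g))
    return np.angle(np.exp(1j * (p_d - p_g)))

# Toy signals: the gaze follows a 5 Hz vibration with a 0.3 rad lag
t = np.linspace(0.0, 1.0, 1000, endpoint=False)
d = np.sin(2 * np.pi * 5 * t)
g = np.sin(2 * np.pi * 5 * t - 0.3)
p = phase_difference(d, g)  # approximately 0.3 rad throughout
```

For pure sinusoids with an integer number of cycles, as here, the FFT-based analytic signal is exact, so p(t) recovers the imposed 0.3 rad lag.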
  • based on the derived phase difference p(t), it may be determined whether the measured change in gaze 112 and the measured displacement of device 100 are out-of-sync by comparing the phase difference p(t) to a threshold value p_max. More specifically, if the instantaneous phase difference p(t) exceeds p_max, the change in gaze and the displacement are considered out-of-sync.
  • the phase difference p(t) may alternatively be expressed in terms of its absolute value |p(t)|.
  • the threshold value p_max for the derived phase difference may either be configured by a user of device 100 or learned. For instance, if a user of device 100 repeatedly stops gazing at screen 101 when the derived phase difference p(t) has reached about the same value, it may be concluded that the user stops viewing content 121/122 which is displayed on screen 101 for the reason that the readability has deteriorated to an extent which makes reading impossible. By storing a history of values for the instantaneous phase difference when the viewer stops gazing at screen 101, a suitable threshold value p_max for the derived phase difference may be established as an average value, a lower bound, or by performing a statistical analysis of the stored values.
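The learning of p_max from a stored history can be sketched as follows. This shows one statistical choice among those the text allows (mean minus one standard deviation, as a conservative lower bound); the function name and the default value are ours, not from the patent.

```python
import statistics

def learn_p_max(stop_history, default=1.0):
    """Learn the threshold p_max from phase-difference values recorded
    each time the viewer stopped gazing at the screen. Here: the mean
    of the stored values minus one standard deviation."""
    if len(stop_history) < 3:
        return default  # too little history: fall back to a configured value
    mean = statistics.mean(stop_history)
    spread = statistics.stdev(stop_history)
    return max(mean - spread, 0.0)

# The viewer repeatedly gave up reading at a phase difference of about 1.2 rad
p_max = learn_p_max([1.1, 1.3, 1.2, 1.25, 1.15])
```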
  • signals g(t) and d(t) may need to be filtered so as to remove noise and limit the range of frequency components present in the signals. In practice, this may be achieved by applying Finite Impulse Response (FIR) filters or Infinite Impulse Response (IIR) filters to signals g(t) and d(t).
  • FIR Finite Impulse Response
  • IIR Infinite Impulse Response
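A simple FIR pre-filtering of the sensor signals might look like the sketch below. The windowed-sinc design and all parameter values are illustrative choices of ours, not values from the patent.

```python
import numpy as np

def fir_lowpass(x, num_taps=31, cutoff=0.1):
    """Windowed-sinc FIR low-pass filter (Hamming window). cutoff is
    given as a fraction of the sampling rate."""
    n = np.arange(num_taps) - (num_taps - 1) / 2.0
    taps = np.sinc(2.0 * cutoff * n) * np.hamming(num_taps)
    taps /= taps.sum()  # normalize for unity gain at DC
    return np.convolve(x, taps, mode='same')

# A slow 3 Hz displacement component with 120 Hz sensor noise, sampled
# at 500 Hz; the filter keeps the former and suppresses the latter.
t = np.linspace(0.0, 1.0, 500, endpoint=False)
clean = np.sin(2 * np.pi * 3 * t)
noisy = clean + 0.5 * np.sin(2 * np.pi * 120 * t)
filtered = fir_lowpass(noisy)
```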
  • alternatively, a phase detector or phase comparator may be employed, as is known in the art. These may be implemented as analog circuits which generate a voltage signal representing the phase difference between two signals.
  • a type-II phase detector is sensitive only to the relative timing of the edges of the input signals and produces a constant output proportional to the phase difference when both signals are at the same frequency.
  • device 100 may be operative to measure at least one of an amplitude and a frequency of the displacement of device 100, and to determine that the change in gaze 112 and the displacement of device 100 are out-of-sync if at least one of the measured amplitude and the measured frequency of the displacement of the device exceeds a corresponding threshold value. In this case, rather than deriving a phase difference, the determination is based on the measured displacement alone.
  • the amplitude and/or frequency of the displacement of device 100 may be extracted from the measured displacement d(t).
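One way to extract amplitude and frequency from d(t) is via the FFT, as sketched below. The function name and the threshold values in the example are hypothetical, not from the patent.

```python
import numpy as np

def dominant_amplitude_frequency(d, fs):
    """Extract the amplitude and frequency of the dominant component of
    the measured displacement d(t) via the FFT; fs is the sampling rate."""
    n = len(d)
    spectrum = np.fft.rfft(d - np.mean(d))  # ignore the DC offset
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    k = int(np.argmax(np.abs(spectrum)))
    amplitude = 2.0 * np.abs(spectrum[k]) / n
    return amplitude, float(freqs[k])

# A 4 mm displacement oscillating at 8 Hz, sampled at 200 Hz for 2 s;
# the amplitude/frequency thresholds below are purely illustrative.
fs = 200.0
t = np.arange(0.0, 2.0, 1.0 / fs)
d = 0.004 * np.sin(2 * np.pi * 8.0 * t)
amp, freq = dominant_amplitude_frequency(d, fs)
out_of_sync = amp > 0.003 or freq > 6.0
```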
  • the threshold values for amplitude and/or frequency may either be configured by a user of device 100 or learned. For instance, similar to what is described above, by storing a history of values for the amplitude and/or frequency when the viewer stops gazing at screen 101, suitable threshold values may be established as average values, lower bounds, or by performing a statistical analysis of the stored values.
  • Device 100 may be operative to adjust the displayed content 121/122 so as to improve readability by at least one of increasing a font size of displayed text 121, enlarging one or more displayed graphical objects, such as picture 122, and increasing a zoom level of the displayed content.
  • the zoom level may either be increased for the entire displayed content or for parts of the content. The latter may be achieved by using an effect which resembles a magnifying glass, as is known in the art. If the entire displayed content is enlarged by applying a zoom level, point of gaze 113 is preferably used as a fixed point so as to avoid shifting the part of the displayed content which viewer 110 gazes at.
  • the displayed content may be adjusted gradually with increasing derived phase difference.
  • an increase in font size or a zoom factor which is applied to the displayed content may be dependent on the extent to which the derived phase difference p(t) exceeds the threshold value p_max. As an example, this may be achieved by selecting an increase in font size or a zoom factor which is proportional to (|p(t)| - p_max).
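A gradual, proportional adjustment of this kind can be sketched as follows. The gain and the size cap are illustrative choices of ours, not values from the patent.

```python
def adjusted_font_size(base_size, p, p_max, gain=4.0, max_size=36.0):
    """Gradual adjustment: grow the font size in proportion to how far
    |p(t)| exceeds the threshold p_max, capped at max_size."""
    excess = max(abs(p) - p_max, 0.0)
    return min(base_size + gain * excess, max_size)

size_in_sync = adjusted_font_size(12.0, 0.5, 1.0)  # → 12.0 (no change)
size_mild = adjusted_font_size(12.0, 1.5, 1.0)     # → 14.0
size_severe = adjusted_font_size(12.0, 10.0, 1.0)  # → 36.0 (capped)
```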
  • device 100 may be operative to adjust the displayed content 121/122 in response to determining that the change in gaze 112 and the displacement of device 100 have been out-of-sync for a specified period of time. Thereby, rapid changes in the displayed content are avoided in situations where the derived phase difference p(t) oscillates around the threshold value p_max.
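The "out-of-sync for a specified period of time" condition amounts to a debounce, which might be sketched like this. The class and parameter names are ours, not from the patent.

```python
class OutOfSyncDebouncer:
    """Report out-of-sync only after the condition has held for
    hold_time seconds, so the displayed content is not re-adjusted
    whenever p(t) briefly oscillates around p_max."""

    def __init__(self, hold_time):
        self.hold_time = hold_time
        self.since = None  # timestamp when the condition became true

    def update(self, out_of_sync_now, now):
        if not out_of_sync_now:
            self.since = None  # condition cleared: restart the clock
            return False
        if self.since is None:
            self.since = now
        return (now - self.since) >= self.hold_time

deb = OutOfSyncDebouncer(hold_time=0.5)
deb.update(True, 0.0)              # just started -> False
deb.update(True, 0.3)              # held 0.3 s   -> False
sustained = deb.update(True, 0.6)  # held 0.6 s   -> True
```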
  • Device 100 may further be operative to adjust the displayed content 121/122 based on settings configured by a user of device 100. These settings may, e.g., define minimum values, maximum values, preferred values, or the like, for a zoom level, font sizes, and so forth. Further optionally, the content may be adjusted based on other criteria such as ambient light, contrast or brightness settings of screen 101, or the displayed content 121/122, such as fore- and background colors.
  • the functionality described above is provided by processing means 104 comprised in device 100.
  • Embodiments of processing means 104 are described in the following, with reference to Figs. 4 and 5.
  • Processing means 400 comprises a processing unit 402, such as a general purpose processor, and a computer-readable storage medium 403, such as a Random Access Memory (RAM), a Flash memory, or the like.
  • processing means 400 comprises one or more interfaces 401 ('I/O' in Fig. 4) for controlling and/or receiving information from motion sensor 102, eye tracker 103, such as a front-facing camera, and screen 101.
  • interface(s) 401 may be arranged for receiving time-series of data d(t) and g(t), reflecting the displacement of device 100 and the change in gaze 112, respectively, or information from which d(t) and g(t) can be derived.
  • Memory 403 contains computer-executable instructions 404, i.e., a computer program, for causing device 100 to perform in accordance with embodiments of the invention as described herein, when computer-executable instructions 404 are executed on processing unit 402.
  • processing means 500 comprises one or more interfaces 501 ('I/O' in Fig. 5) for controlling and/or receiving information from motion sensor 102, eye tracker 103, such as a front-facing camera, and screen 101.
  • processing means 500 further comprises a displacement module 502, a gaze module 503, and an adjustment module 504.
  • displacement module 502 is configured for measuring a displacement of device 100
  • gaze module 503 is configured for measuring a change in gaze 112 of viewer 110 when gazing at screen 101
  • adjustment module 504 is configured for adjusting, in response to determining that the change in gaze 112 and the displacement of device 100 are out-of-sync, the displayed content so as to improve its readability
  • Interface(s) 401 and 501, and modules 502-504, as well as any additional modules comprised in processing means 500, may be implemented in hardware, in software executed by a processing unit, or in a combination thereof.
  • Method 600 comprises measuring 601 a displacement of the device, measuring 602 a change in gaze of the viewer when gazing at a screen comprised in the device, and, in response to determining that the change in gaze and the displacement of the device are out-of-sync, adjusting 605 the displayed content so as to improve the readability of the displayed content.
  • the determining that the change in gaze and the displacement of the device are out-of-sync may, e.g., comprise deriving 603 a phase difference between the measured change in gaze and the measured displacement of the device, and determining 604 that the change in gaze and the displacement of the device are out-of-sync if the derived phase difference exceeds a threshold value.
  • the determining that the change in gaze and the displacement of the device are out-of-sync may comprise measuring at least one of an amplitude and a frequency of the displacement of the device, and determining that the change in gaze and the displacement of the device are out-of-sync if at least one of the measured amplitude and the measured frequency of the displacement of the device exceeds a corresponding threshold value.
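The steps of method 600 described above can be sketched as a small control function. The four callables stand in for the device's motion sensor, eye tracker, signal processing, and screen control; their names, and the stubbed values in the toy run, are ours, not from the patent.

```python
def method_600(measure_displacement, measure_gaze_change,
               derive_phase_difference, adjust_content, p_max):
    """Sketch of method 600 with its steps 601-605."""
    d = measure_displacement()         # step 601: measure displacement
    g = measure_gaze_change()          # step 602: measure change in gaze
    p = derive_phase_difference(d, g)  # step 603: derive phase difference
    if abs(p) > p_max:                 # step 604: out-of-sync?
        adjust_content()               # step 605: e.g., raise the zoom level
        return True
    return False

# Toy run with stubbed measurements: a derived phase difference of
# 1.4 rad against p_max = 1.0 triggers an adjustment.
adjusted = method_600(lambda: [0.0], lambda: [0.0],
                      lambda d, g: 1.4, lambda: None, p_max=1.0)
```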
  • Method 600 may comprise additional, or modified, steps in accordance with what is described throughout this disclosure.
  • Method 600 may be performed by a device for handheld operation, e.g., a mobile phone, a smartphone, a tablet, a gaming console, a media player, or a laptop, or by a device mounted in a vehicle, e.g., in or on the dashboard of a car.
  • An embodiment of method 600 may be implemented as software, such as computer program 404, to be executed by a processing unit comprised in the device, whereby the device is operative to perform in accordance with embodiments of the invention described herein.
  • the person skilled in the art realizes that the invention by no means is limited to the embodiments described above. On the contrary, many modifications and variations are possible within the scope of the appended claims.

Abstract

A device (100), such as a smartphone or tablet, for improving readability of content (121, 122) displayed to a viewer (110) is provided. The device comprises a screen (101) for displaying the content to the viewer, a motion sensor (102) for measuring a displacement of the device, and an eye tracker, such as a front-facing camera (103), for measuring a change in gaze (112) of the viewer when gazing at the screen. The device is operative to adjust the displayed content so as to improve its readability in response to determining that the change in gaze and the displacement of the device are out-of-sync. The out-of-sync condition may be determined by deriving a phase difference between the measured change in gaze and the measured displacement of the device, and comparing the derived phase difference to a threshold value. The displayed content is adjusted by at least one of increasing a font size of displayed text (121), enlarging one or more displayed graphical objects (122), and increasing a zoom level of the displayed content.

Description

IMPROVING READABILITY OF CONTENT DISPLAYED ON A SCREEN Technical field The invention relates to a device for improving readability of content displayed to a viewer, a method of improving readability of content displayed to a viewer, a corresponding computer program, and a corresponding computer program product. Background
With the increasing use of screen-based devices, in particular handheld devices such as mobile phones, smartphones, and tablet computers or tablets, the issue of screen readability has become more prominent, for at least the following reasons. Firstly, the size of screens which are provided as displays with this class of devices is limited so as to enable users to accommodate their devices in a pocket of their shirt, jacket, or trousers, or to carry them in a briefcase. As a consequence, the size of displayed content is limited. In particular, this limits the range of font sizes which are reasonable for displaying text on the screen of such a device. Secondly, readability is negatively affected by inadvertent displacements of the device, such as vibrations or sudden movements, relative to the user's eye or eyes gazing at the screen. The issue of inadvertent displacements or vibrations has emerged due to the increasing use of handheld devices while walking, driving in a car, riding the bus or train, and the like.
The human eye is able to track the movements or acceleration patterns of a moving object, such as a screen displaying content, up to a certain frequency and amplitude, thus managing to maintain proper focus on desired parts of the screen. Hence, a person gazing at a screen for viewing content is able to maintain readability up to a certain frequency and amplitude of change in gaze.
In the present context, the term 'readability' relates to the ease with which a viewer can understand a text displayed on the screen. It
encompasses 'legibility', which is a measure of how easily a reader can distinguish individual letters or characters from each other. More generally, displayed content may include not only text but also other graphical objects such as images, pictures, buttons, icons, and the like.
Known solutions which address the issue of readability of devices comprising screens for displaying content are based on utilizing user settings for adjusting a font size which is used for displaying text or a zoom factor which is applied to enlarge parts of the content, or the entire content, displayed on the screen.
US 2013/0009962 A1 discloses solutions for automatically adjusting displayed information by selecting font sizes or a zoom level based on settings relating to the user, such as the user's vision or age, as well as ambient light, screen settings, and the displayed information, e.g., foreground and background colors of the displayed information.
While known solutions take certain contextual or environmental circumstances such as ambient light into account, it is not assessed whether the user viewing content displayed on a screen of the device in fact suffers from inferior readability before adjusting the displayed content or information.
Summary
It is an object of the invention to provide an improved alternative to the above techniques and prior art.
More specifically, it is an object of the invention to provide an improved solution for improving the readability of content which is displayed to a viewer, e.g., text or other graphical objects which is/are displayed on the screen of a handheld device, such as a smartphone, or a device mounted in a vehicle.
These and other objects of the invention are achieved by means of different aspects of the invention, as defined by the independent claims. Embodiments of the invention are characterized by the dependent claims.
According to a first aspect of the invention, a device for improving readability of content displayed to a viewer is provided. The device comprises a screen for displaying the content to the viewer, a motion sensor for measuring a displacement of the device, and an eye tracker for measuring a change in gaze of the viewer when gazing at the screen. The device is operative to adjust the displayed content so as to improve the readability of the displayed content. The displayed content is adjusted in response to determining that the change in gaze and the displacement of the device are out-of-sync.
According to a second aspect of the invention, a method of improving readability of content displayed to a viewer is provided. The method is performed by a device and comprises measuring a displacement of the device, measuring a change in gaze of the viewer when gazing at a screen comprised in the device, and adjusting the displayed content so as to improve the readability of the displayed content. The displayed content is adjusted in response to determining that the change in gaze and the displacement of the device are out-of-sync.
According to a third aspect of the invention, a computer program is provided. The computer program comprises computer-executable instructions for causing a device to perform the method according to an embodiment of the second aspect of the invention, when the computer-executable instructions are executed on a processing unit comprised in the device.
According to a fourth aspect of the invention, a computer program product is provided. The computer program product comprises a computer-readable storage medium which has the computer program according to the third aspect of the invention embodied therein.
The invention makes use of an understanding that an improved user experience for users of devices having screens for displaying content is achieved, if measures for adjusting the displayed content for the purpose of enhancing the readability of the displayed content are only taken in response to detecting an actual deterioration in readability as experienced by the viewer, i.e., the person or user gazing at the screen of the device. As is described herein, embodiments of the invention adjust the displayed content in response to determining that the measured change in gaze of the viewer and the measured displacement of the device are out-of-sync.
The invention is based on the understanding that the human eye is able to follow displacements of an object up to a certain frequency and/or amplitude, with a certain phase difference which is dependent on frequency. Typically, the change in gaze lags behind the displacement of the device, as the eyes attempt to remain focused on the content displayed on the screen. In other words, the viewer's eye or eyes try to stay in-sync with the device when being displaced. If the eye or eyes of the viewer cannot cope following the displacement of the device, e.g., because of a large device acceleration or an increase in frequency of the device's vibration in the case of a repetitive or quasi-periodic displacement, the eye or eyes of the viewer lag behind the displacement of the device to such an extent that focus on the displayed content can no longer be maintained. As a result, the change in gaze of the viewer and the displacement of the device are out-of-sync. In such scenario, readability may be negatively affected to an extent which makes it impossible for the viewer to understand displayed text or interpret displayed content other than text.
It is to be understood that, in order to assess whether the change in gaze and the displacement of the device are in-sync or out-of-sync, the change in gaze and the displacement of the device need to be measured repeatedly, and preferably periodically or continuously, resulting in time-dependent signals, i.e., time-series of data for the gaze, or the change thereof, and the position of the device, or the change thereof, respectively.
Embodiments of the invention are particularly advantageous for devices which are designed for handheld operation, such as mobile phones, smartphones, tablet computers or tablets, gaming consoles, media players, and laptops, as well as devices comprising screens which are mounted in vehicles, e.g., in the dashboard of a car. This is the case since these classes of devices are subject to accelerations, shocks, or sudden displacement, while being gazed at.
According to an embodiment of the invention, it is determined that the change in gaze and the displacement of the device are out-of-sync by deriving a phase difference between the measured change in gaze and the measured displacement of the device, and determining that the change in gaze and the displacement of the device are out-of-sync if the derived phase difference, in particular an absolute value thereof, exceeds a threshold value. The threshold value may either be configured by a user of the device or learned, as is described further below.
According to an embodiment of the invention, it is determined that the change in gaze and the displacement of the device are out-of-sync by measuring at least one of an amplitude and a frequency of the displacement of the device, and determining that the change in gaze and the displacement of the device are out-of-sync if the measured amplitude of the displacement of the device exceeds a corresponding threshold value, the measured frequency of the displacement of the device exceeds a corresponding threshold value, or a combination of both.
According to an embodiment of the invention, the displayed content is adjusted so as to improve readability by at least one of increasing a font size of displayed text, enlarging one or more displayed graphical objects, and increasing a zoom level of the displayed content.

According to an embodiment of the invention, the change in gaze of the viewer is measured by image processing a series of images captured by a camera which is configured for imaging the eye or eyes of the viewer. In particular, this may be a camera which is mounted on the same face of the device as the screen, i.e., a front-facing camera.
According to an embodiment of the invention, the change in gaze of the viewer is measured by detecting infra-red light using an infra-red light detector which is comprised in the device. The infra-red light is originating from an infra-red light source comprised in the device and being reflected by the eye or eyes of the viewer. By analyzing changes in the detected reflections it is possible to extract information about eye rotation, and thereby a change in gaze.
Even though advantages of the invention have in some cases been described with reference to embodiments of the first aspect of the invention, corresponding reasoning applies to embodiments of other aspects of the invention.
Further objectives of, features of, and advantages with, the invention will become apparent when studying the following detailed disclosure, the drawings and the appended claims. Those skilled in the art realize that different features of the invention can be combined to create embodiments other than those described in the following.
Brief description of the drawings

The above, as well as additional objects, features and advantages of the invention, will be better understood through the following illustrative and non-limiting detailed description of embodiments of the invention, with reference to the appended drawings, in which:
Fig. 1 illustrates a device for improving readability of content displayed to a viewer, in accordance with an embodiment of the invention.

Fig. 2 schematically illustrates improving readability of content displayed to a viewer, in accordance with embodiments of the invention.
Fig. 3 exemplifies a measured displacement of a device for improving readability of content displayed to a viewer, the corresponding measured change in gaze, and the derived phase difference, in accordance with embodiments of the invention.
Fig. 4 shows an embodiment of the processing means comprised in the device for improving readability of content displayed to a viewer.
Fig. 5 shows another embodiment of the processing means comprised in the device for improving readability of content displayed to a viewer.
Fig. 6 shows a flow chart illustrating a method of improving readability of content displayed to a viewer, in accordance with an embodiment of the invention.
All the figures are schematic, not necessarily to scale, and generally only show parts which are necessary in order to elucidate the invention, wherein other parts may be omitted or merely suggested.
Detailed description

The invention will now be described more fully hereinafter with reference to the accompanying drawings, in which certain embodiments of the invention are shown. This invention may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein. Rather, these embodiments are provided by way of example so that this disclosure will be thorough and complete, and will fully convey the scope of the invention to those skilled in the art.
In Fig. 1, an embodiment 100 of the device for improving readability of content displayed to a viewer 110 is shown. Whereas device 100 in Fig. 1 is illustrated as a smartphone, i.e., a device for handheld operation, it will be appreciated that the invention may alternatively be embodied in other types of devices such as tablets, gaming consoles, media players, and laptops, or in devices which are mounted in a vehicle, e.g., in or on the dashboard of a car. These types of devices are characterized by being subjected to sudden accelerations, e.g., because the user of a handheld device is walking or riding a bus, in response to which they are displaced relative to viewer 110, or rather the viewer's eye(s) 111. Such displacements necessitate a change in gaze 112, i.e., the direction into which viewer 110 is looking.
Device 100 comprises a screen 101 for displaying content to viewer 110, e.g., text 121, an image 122, or other content such as user-interface elements (buttons and keys). Device 100 further comprises a motion sensor 102 for measuring a displacement of device 100, and an eye tracker 103 for measuring a change in gaze 112 of viewer 110 when gazing at screen 101. It will be appreciated that embodiments of device 100 measure the displacement of device 100 and the change in gaze 112 repeatedly, and preferably periodically or continuously. Optionally, the displacement of device 100 and/or the change in gaze 112 are only measured when it is assessed to be likely that the readability of content 121/122 displayed on screen 101 is at risk. Such an assessment may, e.g., be based on contextual and/or environmental data such as ambient light or sudden accelerations detected by motion sensor 102 or any other sensor comprised in device 100.
Motion sensor 102 may be based on any type of sensor which is suitable for measuring a displacement of device 100, e.g., an accelerometer, a gyroscope, a magnetometer, a pedometer, and the like. The output of motion sensor 102 is a time-dependent signal, i.e., a time-series of data d(t), reflecting the displacement of device 100. Embodiments of device 100 may utilize scalar displacement values d(t) reflecting displacement along a current direction of displacement. Alternatively, a time-series of pairs of scalar values may be utilized, e.g., pairs of values reflecting the displacement of device 100 within a plane defined by screen 101. As yet a further alternative, a time-series of triplets of scalar values, i.e., vectors, may be utilized, reflecting the current displacement of device 100 in three-dimensional space.
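The disclosure leaves open how the time-series d(t) is obtained from the raw output of motion sensor 102. Purely as a hedged illustration, a scalar displacement signal could be approximated from accelerometer samples by numerical double integration; the function name and the crude mean-removal drift correction below are assumptions for the sketch, not part of the patent:

```python
import numpy as np

def displacement_from_accel(a, fs):
    """Approximate a scalar displacement signal d(t), in metres, from
    accelerometer samples a (m/s^2) taken at fs Hz, by integrating twice.
    Mean removal is a crude stand-in for proper high-pass drift filtering."""
    dt = 1.0 / fs
    a = np.asarray(a, dtype=float) - np.mean(a)
    v = np.cumsum(a) * dt          # first integration: velocity
    v -= np.mean(v)                # remove the integration offset
    return np.cumsum(v) * dt       # second integration: displacement

# Sanity check: a(t) = -(2*pi*5)^2 sin(2*pi*5*t) should integrate back
# to approximately sin(2*pi*5*t), i.e. a displacement of unit amplitude.
fs = 500
t = np.arange(0, 2.0, 1.0 / fs)
a = -(2 * np.pi * 5) ** 2 * np.sin(2 * np.pi * 5 * t)
d = displacement_from_accel(a, fs)
```

In a real device the raw signal would additionally need filtering to suppress sensor bias and gravity before integration.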
Eye tracker 103 may utilize a camera comprised in device 100 which is configured for imaging the eye 111 or eyes of viewer 110, such as a front-facing camera 103 which most modern smartphones and tablets are provided with, for measuring the change in gaze 112. This may be achieved by image processing a series of images captured by camera 103, as is known in the art. As an example, US 2015/0002392 A1 and US 2015/0346818 A1 disclose solutions for eye tracking and detecting micro eye movements based on images captured with a mobile device, such as a smartphone. As an alternative, eye tracker 103 may comprise an infra-red light source and an infra-red light detector. In this case, the change in gaze 112 of viewer 110 is measured based on infra-red light originating from the infra-red light source, which infra-red light is reflected by the eye 111 or eyes of viewer 110 and subsequently detected by the infra-red light detector. Based on the measured changes in reflections over time, information about eye rotation and the related change in gaze 112 may be extracted, as is known in the art. The output of eye tracker 103 is a time-dependent signal, i.e., a time-series of scalar values g(t) reflecting the change in gaze 112 of viewer 110. The time-series of data g(t) may, e.g., reflect an angle of the direction of gaze 112 relative to a reference axis defined in relation to the head of viewer 110, or a change thereof. Alternatively, the time-series of data g(t) may reflect a change in the point of gaze 113, i.e., the point of focus of the eye(s) 111 of viewer 110 on screen 101.
As is schematically illustrated in Fig. 2, device 100 is operative to determine 205 that the change in gaze 112 and the displacement of device 100 are out-of-sync and, in response thereto, adjust 206 the displayed content 121/122 so as to improve its readability. The determination 205 that the change in gaze 112 and the displacement of device 100 are out-of-sync may be achieved in a number of ways. For instance, device 100 may be operative to determine 205 that the change in gaze 112 and the displacement of device 100 are out-of-sync by deriving a phase difference p(t) between the change in gaze g(t), measured by eye tracker 103, and the displacement d(t) of the device, measured by motion sensor 102. Then, it is determined 205 that the change in gaze 112 and the displacement of device 100 are out-of-sync if the derived phase difference p(t), or an absolute value thereof, |p(t)|, exceeds a threshold value for a maximal phase difference.
The phase difference p(t) may be derived by any known method which is suitable for establishing the instantaneous phase difference between two time-dependent signals. As an example, this may be accomplished by calculating the Hilbert transform for each of the signals d(t) and g(t), i.e.,

$$H(d)(t) = \frac{1}{\pi}\,\mathrm{p.v.}\!\int_{-\infty}^{\infty}\frac{d(\tau)}{t-\tau}\,d\tau$$

and

$$H(g)(t) = \frac{1}{\pi}\,\mathrm{p.v.}\!\int_{-\infty}^{\infty}\frac{g(\tau)}{t-\tau}\,d\tau$$
respectively, where "p.v." denotes the Cauchy principal value. The Hilbert transform is commonly known in the field of signal processing where it is used for deriving an analytic expression of a signal, thereby extending the signal into the complex plane. For a discrete function, such as a time-series of measured values, the discrete Hilbert transform is typically used.
Subsequent to calculating the Hilbert transforms H(d)(t) and H(g)(t) for the measured signals d(t) and g(t), their respective instantaneous phase angles can be extracted. This may, e.g., be achieved by utilizing a polar representation of the complex Hilbert transforms, in which a complex number $z = x + yi$ is expressed using its absolute value $|z| = \sqrt{x^2 + y^2}$ and its argument $\varphi$, commonly referred to as 'phase', as $z = |z|e^{i\varphi}$.
Finally, the instantaneous phase difference p(t) between the change in gaze 112 and the displacement of device 100 can be calculated as the difference between the instantaneous phase angle $\varphi_d(t)$ of the measured displacement d(t) and the instantaneous phase angle $\varphi_g(t)$ of the measured change in gaze g(t):

$$p(t) = \varphi_g(t) - \varphi_d(t).$$
The derived phase difference p(t) may be expressed in units of degrees or radians.
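In software, the computation described above can be sketched with NumPy/SciPy; this is a non-authoritative illustration (the disclosure does not mandate any particular library), and the per-sample wrapping of the difference into (-pi, pi] is a design choice of the sketch:

```python
import numpy as np
from scipy.signal import hilbert  # FFT-based analytic signal

def instantaneous_phase_difference(d, g):
    """Return p(t) = phi_g(t) - phi_d(t) in radians, where the phases are
    the arguments of the analytic signals of d(t) and g(t)."""
    phi_d = np.angle(hilbert(d))
    phi_g = np.angle(hilbert(g))
    # Wrap each sample of the difference into (-pi, pi].
    return np.angle(np.exp(1j * (phi_g - phi_d)))

# Example: the gaze lags the displacement by a quarter period (pi/2 rad).
t = np.linspace(0.0, 1.0, 500, endpoint=False)
d = np.sin(2 * np.pi * 5 * t)
g = np.sin(2 * np.pi * 5 * t - np.pi / 2)
p = instantaneous_phase_difference(d, g)
# For this periodic test signal, p is close to -pi/2 at every sample.
```

Wrapping the difference avoids the 2*pi ambiguity that would otherwise arise when the two unwrapped phase tracks start in different branches.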
The process of determining that the change in gaze 112 and the displacement of device 100 are out-of-sync based on the instantaneous phase difference between signals g(t) and d(t) is exemplified by means of simulated signals shown in Fig. 3.
The upper diagram in Fig. 3 illustrates a possible displacement of device 100 by means of a time-series of data d(t), which may reflect the time-dependent displacement of device 100 as a scalar value having a unit of length. The middle diagram in Fig. 3 illustrates a corresponding simulated change in gaze 112 by means of a time-series of data g(t), reflecting the change in direction of gaze 112 relative to a reference axis, or the change in point of gaze 113, as a scalar value having a unit of angle or length, respectively. By calculating the (discrete) Hilbert transforms and extracting the instantaneous phases for each of these signals, the instantaneous phase difference p(t) is obtained, which is shown in the lower diagram in Fig. 3.
Based on the derived phase difference p(t), it may be determined whether the measured change in gaze 112 and the measured displacement of device 100 are out-of-sync by comparing the phase difference p(t) to a threshold value pmax. More specifically, if the instantaneous phase difference p(t) exceeds pmax, if p(t) is positive, or is less than -pmax, if p(t) is negative, it is determined that the measured change in gaze 112 and the measured displacement of device 100 are out-of-sync, and the displayed content 121/122 is adjusted accordingly. With reference to the lower diagram in Fig. 3, this is the case during time periods when the derived phase difference p(t) is outside the range indicated by the dashed lines. The condition for the phase difference p(t) may alternatively be expressed in terms of its absolute value |p(t)|, as follows: if the absolute value of the instantaneous phase difference |p(t)| exceeds pmax, it is determined that the measured change in gaze 112 and the measured displacement of device 100 are out-of-sync, and the displayed content 121/122 is adjusted accordingly.
The threshold value pmax for the derived phase difference may either be configured by a user of device 100 or learned. For instance, if a user of device 100 repeatedly stops gazing at screen 101 when the derived phase difference p(t) has reached about the same value, it may be concluded that the user stops viewing content 121/122 which is displayed on screen 101 for the reason that the readability has deteriorated to an extent which makes reading impossible. By storing a history of values for the instantaneous phase difference when the viewer stops gazing at screen 101, a suitable threshold value pmax for the derived phase difference may be established as an average value, a lower bound, or by performing a statistical analysis of the stored values.
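The learning approach described above could be sketched as follows; the particular statistic (mean minus one standard deviation, as a lower bound) and the fallback default are illustrative assumptions, not values taken from the disclosure:

```python
import numpy as np

def learn_pmax(abort_phase_history, default=np.pi / 3):
    """Estimate the threshold p_max from stored values of |p(t)| recorded
    at the moments the viewer stopped gazing at the screen. Falls back to
    a default until enough samples have been collected."""
    if len(abort_phase_history) < 5:
        return float(default)
    h = np.asarray(abort_phase_history, dtype=float)
    # Lower bound: one standard deviation below the mean, floored at 0.
    return max(float(h.mean() - h.std()), 0.0)
```

With a history of abort values clustered around 1 rad, this yields a threshold just below the typical value at which the viewer gives up.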
As is known in the field of signal processing, in order to derive the phase difference p(t) as described hereinbefore, signals g(t) and d(t) may need to be filtered so as to remove noise and limit the range of frequency components present in the signals. In practice, this may be achieved by applying Finite Impulse Response (FIR) filters or Infinite Impulse Response (IIR) filters to signals g(t) and d(t).
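As an illustration, a zero-phase FIR band-pass could be applied with SciPy before deriving the phase difference; the 0.5-10 Hz band below is an assumed range for hand tremor and gait-induced vibration, not a value from the disclosure:

```python
import numpy as np
from scipy.signal import firwin, filtfilt

def band_limit(x, fs, low=0.5, high=10.0, numtaps=101):
    """Band-pass FIR filter applied forward and backward (filtfilt), so
    that the filtering itself adds no phase shift to the signal."""
    taps = firwin(numtaps, [low, high], pass_zero=False, fs=fs)
    return filtfilt(taps, [1.0], x)

# Example: keep a 5 Hz displacement component, suppress 30 Hz noise.
fs = 100
t = np.arange(1000) / fs
x = np.sin(2 * np.pi * 5 * t) + np.sin(2 * np.pi * 30 * t)
y = band_limit(x, fs)
```

Forward-backward filtering matters here: an ordinary causal filter would itself introduce a phase lag that could be mistaken for an out-of-sync condition.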
As an alternative to deriving the phase difference by means of Hilbert transforms, a phase detector or phase comparator may be employed, as is known in the art. These may be implemented as analog circuits which generate a voltage signal representing the phase difference between two signals. The so-called type-II phase detector is sensitive only to the relative timing of the edges of the input signals and produces a constant output proportional to the phase difference when both signals are at the same frequency.

As an alternative way of determining that the change in gaze 112 and the displacement of device 100 are out-of-sync, device 100 may be operative to measure at least one of an amplitude and a frequency of the displacement of device 100, and to determine that the change in gaze 112 and the displacement of device 100 are out-of-sync if at least one of the measured amplitude and the measured frequency of the displacement of the device exceeds a corresponding threshold value. In this case, rather than establishing that the change in gaze 112 and the displacement of device 100 are out-of-sync by deriving the phase difference between two signals, g(t) and d(t), parameters characterizing the displacement of device 100 are used as an indicator for an out-of-sync condition. In practice, the amplitude and/or frequency of the displacement of device 100 may be extracted from the measured displacement d(t). The threshold values for amplitude and/or frequency may either be configured by a user of device 100 or learned. For instance, similar to what is described above, by storing a history of values for the amplitude and/or frequency when the viewer stops gazing at screen 101, suitable threshold values may be established as average values, lower bounds, or by performing a statistical analysis of the stored values.
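The amplitude/frequency variant could be sketched by estimating both parameters from the FFT magnitude peak of the measured displacement; the threshold values amp_max and freq_max below are purely illustrative assumptions:

```python
import numpy as np

def dominant_amp_freq(d, fs):
    """Amplitude and frequency of the dominant spectral component of the
    measured displacement d(t), sampled at fs Hz."""
    n = len(d)
    spec = np.fft.rfft(np.asarray(d, dtype=float) - np.mean(d))
    k = int(np.argmax(np.abs(spec)))        # dominant bin
    amp = 2.0 * float(np.abs(spec[k])) / n  # single-sided amplitude
    freq = k * fs / n
    return amp, freq

def out_of_sync(d, fs, amp_max=0.02, freq_max=4.0):
    """Out-of-sync if either the displacement amplitude (metres) or its
    frequency (Hz) exceeds the corresponding threshold."""
    amp, freq = dominant_amp_freq(d, fs)
    return bool(amp > amp_max or freq > freq_max)

# Example: 1 cm displacement oscillating at 6 Hz.
fs = 100.0
d_ex = 0.01 * np.sin(2 * np.pi * 6.0 * np.arange(1000) / fs)
```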
Device 100 may be operative to adjust the displayed content 121/122 so as to improve readability by at least one of increasing a font size of displayed text 121, enlarging one or more displayed graphical objects, such as picture 122, and increasing a zoom level of the displayed content. The zoom level may either be increased for the entire displayed content or for parts of the content. The latter may be achieved by using an effect which resembles a magnifying glass, as is known in the art. If the entire displayed content is enlarged by applying a zoom level, point of gaze 113 is preferably used as a fixed point so as to avoid shifting the part of the displayed content which viewer 110 gazes at.
Optionally, if the phase difference p(t) between the measured change in gaze 112 and the measured displacement of device 100 is derived, the displayed content may be adjusted gradually with increasing derived phase difference. For instance, an increase in font size or a zoom factor which is applied to the displayed content may be dependent on the extent to which the derived phase difference p(t) exceeds the threshold value pmax. As an example, this may be achieved by selecting an increase in font size or a zoom factor which is proportional to (|p(t)| - pmax).
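The proportional adjustment could look like the following sketch; the gain (points of font size per radian of excess phase difference) and the cap are illustrative assumptions:

```python
def adjusted_font_size(base_pt, p_abs, p_max, gain=8.0, max_pt=28.0):
    """Increase the font size in proportion to (|p(t)| - p_max),
    clamped to max_pt; unchanged while |p(t)| <= p_max."""
    excess = max(p_abs - p_max, 0.0)
    return min(base_pt + gain * excess, max_pt)

# No excess leaves the size unchanged; 0.5 rad of excess adds 4 pt;
# a large excess is capped at max_pt.
```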
Further optionally, device 100 may be operative to adjust the displayed content 121/122 in response to determining that the change in gaze 112 and the displacement of device 100 have been out-of-sync for a specified period of time. Thereby, rapid changes in the displayed content are avoided in situations where the derived phase difference p(t) oscillates around the threshold value pmax.
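The "out-of-sync for a specified period of time" condition can be captured with a small dwell-time detector; the hold time of 0.5 s used below is an assumed parameter:

```python
class DwellDetector:
    """Trigger only after the out-of-sync condition has persisted for
    hold_s seconds, avoiding flicker when p(t) oscillates around pmax."""

    def __init__(self, hold_s=0.5):
        self.hold_s = hold_s
        self.since = None  # time at which the condition became true

    def update(self, t, is_out_of_sync):
        """Feed one sample (timestamp t in seconds plus condition flag);
        returns True once the condition has held for hold_s seconds."""
        if not is_out_of_sync:
            self.since = None
            return False
        if self.since is None:
            self.since = t
        return (t - self.since) >= self.hold_s
```

Any interruption of the condition resets the timer, so brief threshold crossings never trigger an adjustment.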
Device 100 may further be operative to adjust the displayed content 121/122 based on settings configured by a user of device 100. These settings may, e.g., define minimum values, maximum values, preferred values, or the like, for a zoom level, font sizes, and so forth. Further optionally, the content may be adjusted based on other criteria such as ambient light, contrast or brightness settings of screen 101, or the displayed content 121/122, such as fore- and background colors.
The above described behavior of device 100 may be implemented by means of processing means 104 comprised in device 100. Embodiments of processing means 104 are described in the following, with reference to Figs. 4 and 5.
In Fig. 4, a first embodiment 400 of processing means 104 is illustrated. Processing means 400 comprises a processing unit 402, such as a general purpose processor, and a computer-readable storage medium 403, such as a Random Access Memory (RAM), a Flash memory, or the like. In addition, processing means 400 comprises one or more interfaces 401 ('I/O' in Fig. 4) for controlling and/or receiving information from motion sensor 102, eye tracker 103, such as a front-facing camera, and screen 101. In particular, interface(s) 401 may be arranged for receiving time-series of data d(t) and g(t), reflecting the displacement of device 100 and the change in gaze 112, respectively, or information from which d(t) and g(t) can be derived. For instance, if the change in gaze 112 is measured by image processing a series of images captured by camera 103, these images may be received via interface(s) 401 and image processed by processing means 400 so as to derive the change in gaze g(t). Memory 403 contains computer-executable instructions 404, i.e., a computer program, for causing device 100 to perform in accordance with embodiments of the invention as described herein, when computer-executable instructions 404 are executed on processing unit 402.
In Fig. 5, an alternative embodiment 500 of processing means 104 is illustrated. In correspondence with processing means 400, processing means 500 comprises one or more interfaces 501 ('I/O' in Fig. 5) for controlling and/or receiving information from motion sensor 102, eye tracker 103, such as a front-facing camera, and screen 101. Processing means 500 further comprises a displacement module 502, a gaze module 503, and an adjustment module 504, which are configured to perform in accordance with embodiments of the invention as described herein. In particular, displacement module 502 is configured for measuring a displacement of device 100, gaze module 503 is configured for measuring a change in gaze 112 of viewer 110 when gazing at screen 101, and adjustment module 504 is configured for adjusting, in response to determining that the change in gaze 112 and the displacement of device 100 are out-of-sync, content displayed on screen 101 so as to improve its readability.
Interface(s) 401 and 501 , and modules 502-504, as well as any additional modules comprised in processing means 500, may be
implemented by any kind of electronic circuitry, e.g., any one, or a combination of, analogue electronic circuitry, digital electronic circuitry, and processing means executing a suitable computer program.
In the following, embodiments of the method of improving readability of content displayed to a viewer are described with reference to Fig. 6.
Method 600 comprises measuring 601 a displacement of the device, measuring 602 a change in gaze of the viewer when gazing at a screen comprised in the device, and, in response to determining that the change in gaze and the displacement of the device are out-of-sync, adjusting 605 the displayed content so as to improve the readability of the displayed content.
As is shown in Fig. 6, the determining that the change in gaze and the displacement of the device are out-of-sync may, e.g., comprise deriving 603 a phase difference between the measured change in gaze and the measured displacement of the device, and determining 604 that the change in gaze and the displacement of the device are out-of-sync if the derived phase difference exceeds a threshold value. As an alternative, the determining that the change in gaze and the displacement of the device are out-of-sync may comprise measuring at least one of an amplitude and a frequency of the displacement of the device, and determining that the change in gaze and the displacement of the device are out-of-sync if at least one of the measured amplitude and the measured frequency of the displacement of the device exceeds a corresponding threshold value.
It will be appreciated that method 600 may comprise additional, or modified, steps in accordance with what is described throughout this disclosure. Method 600 may be performed by a device for handheld operation, e.g., a mobile phone, a smartphone, a tablet, a gaming console, a media player, and a laptop, or by a device mounted in a vehicle, e.g., in or on the dashboard of a car. An embodiment of method 600 may be implemented as software, such as computer program 404, to be executed by a processing unit comprised in the device, whereby the device is operative to perform in accordance with embodiments of the invention described herein.

The person skilled in the art realizes that the invention by no means is limited to the embodiments described above. On the contrary, many modifications and variations are possible within the scope of the appended claims.

Claims

1. A device (100) for improving readability of content (121, 122) displayed to a viewer (110), the device comprising:
a screen (101) for displaying the content to the viewer,
a motion sensor (102) for measuring a displacement (d(t)) of the device, and
an eye tracker (103) for measuring a change in gaze (112, g(t)) of the viewer when gazing at the screen,
the device being operative to, in response to determining that the change in gaze and the displacement of the device are out-of-sync, adjust the displayed content so as to improve the readability of the displayed content.

2. The device according to claim 1, being operative to determine that the change in gaze and the displacement of the device are out-of-sync by:
deriving a phase difference (p(t)) between the measured change in gaze (g(t)) and the measured displacement (d(t)) of the device, and
determining that the change in gaze and the displacement of the device are out-of-sync if the derived phase difference exceeds a threshold value (pmax).
3. The device according to claim 2, being operative to adjust the displayed content gradually with increasing derived phase difference.
4. The device according to any one of claims 1 to 3, being operative to determine that the change in gaze and the displacement of the device are out-of-sync by:
measuring at least one of an amplitude and a frequency of the displacement of the device, and
determining that the change in gaze and the displacement of the device are out-of-sync if at least one of the measured amplitude and the measured frequency of the displacement of the device exceeds a corresponding threshold value.
5. The device according to any one of claims 1 to 4, being operative to adjust the displayed content in response to determining that the change in gaze and the displacement of the device have been out-of-sync for a specified period of time.
6. The device according to any one of claims 1 to 5, being operative to adjust the displayed content so as to improve readability by at least one of increasing a font size of displayed text (121), enlarging one or more displayed graphical objects (122), and increasing a zoom level of the displayed content.
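By way of illustration, a gradual adjustment of the kind recited in claims 3 and 6 admits, for instance, a font-size scaling that grows with the derived phase difference. The following is only a sketch: the linear mapping, the cap `max_scale`, and the threshold value are assumptions for illustration, not part of the claims.

```python
def adjusted_font_size(base_size, p, p_max=0.5, max_scale=2.0):
    """Scale the font size gradually with increasing derived phase
    difference p(t).  Below the threshold p_max the content is left
    as-is; above it the size grows linearly with the excess, capped at
    max_scale times the base size (all numeric values illustrative)."""
    if p <= p_max:
        return base_size
    scale = min(max_scale, 1.0 + (p - p_max))
    return base_size * scale
```

The same mapping could equally drive a zoom level or the size of displayed graphical objects.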
7. The device according to any one of claims 1 to 6, wherein the eye tracker comprises a camera (103) configured for imaging the eye or eyes of the viewer, the device being operative to measure the change in gaze of the viewer by image processing a series of images captured by the camera.
8. The device according to claim 7, wherein the camera (103) is mounted on the same face of the device as the screen.
9. The device according to any one of claims 1 to 8, wherein the eye tracker comprises an infra-red light source and an infra-red light detector, the device being operative to measure the change in gaze of the viewer based on infra-red light originating from the infra-red light source and being reflected by the eye or eyes of the viewer and detected by the infra-red light detector.
10. The device according to any one of claims 1 to 9, being a device (100) for handheld operation.
11. The device according to any one of claims 1 to 10, being one of a mobile phone (100), a smartphone (100), a tablet, a gaming console, a media player, and a laptop.
12. The device according to any one of claims 1 to 9, being mounted in a vehicle.
13. A method (600) of a device (100), of improving readability of content (121, 122) displayed to a viewer (110), the method comprising:
measuring (601) a displacement (d(t)) of the device,
measuring (602) a change in gaze (112, g(t)) of the viewer when gazing at a screen (101) comprised in the device, and
in response to determining that the change in gaze and the displacement of the device are out-of-sync, adjusting (605) the displayed content so as to improve the readability of the displayed content.
14. The method according to claim 13, wherein the determining that the change in gaze and the displacement of the device are out-of-sync comprises:
deriving (603) a phase difference (p(t)) between the measured change in gaze (g(t)) and the measured displacement (d(t)) of the device, and
determining (604) that the change in gaze and the displacement of the device are out-of-sync if the derived phase difference exceeds a threshold value (pmax).
15. The method according to claim 14, wherein the displayed content is adjusted gradually with increasing derived phase difference.
16. The method according to any one of claims 13 to 15, wherein the determining that the change in gaze and the displacement of the device are out-of-sync comprises:
measuring at least one of an amplitude and a frequency of the displacement of the device, and
determining that the change in gaze and the displacement of the device are out-of-sync if at least one of the measured amplitude and the measured frequency of the displacement of the device exceeds a corresponding threshold value.
17. The method according to any one of claims 13 to 16, wherein the displayed content is adjusted in response to determining that the change in gaze and the displacement of the device have been out-of-sync for a specified period of time.
18. The method according to any one of claims 13 to 17, wherein the adjusting the displayed content so as to improve readability comprises at least one of increasing a font size of displayed text (121), enlarging one or more displayed graphical objects (122), and increasing a zoom level of the displayed content.
19. The method according to any one of claims 13 to 18, wherein the measuring the change in gaze of the viewer comprises image processing a series of images captured by a camera (103) comprised in the device and being configured for imaging the eye or eyes (111) of the viewer.
20. The method according to claim 19, wherein the camera (103) is mounted on the same face of the device as the screen.
21. The method according to any one of claims 13 to 18, wherein the measuring the change in gaze of the viewer comprises detecting infra-red light by an infra-red light detector comprised in the device, which infra-red light originates from an infra-red light source comprised in the device and is reflected by the eye or eyes of the viewer.
22. The method according to any one of claims 13 to 21 , wherein the device is a device (100) for handheld operation.
23. The method according to any one of claims 13 to 22, wherein the device is one of a mobile phone (100), a smartphone (100), a tablet, a gaming console, a media player, and a laptop.
24. The method according to any one of claims 13 to 21 , wherein the device is mounted in a vehicle.
25. A computer program (404) comprising computer-executable instructions for causing a device to perform the method according to any one of claims 13 to 24, when the computer-executable instructions are executed on a processing unit (402) comprised in the device.
26. A computer program product comprising a computer-readable storage medium (403), the computer-readable storage medium having the computer program (404) according to claim 25 embodied therein.
PCT/EP2016/057384 2016-04-05 2016-04-05 Improving readability of content displayed on a screen WO2017174114A1 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
EP16715490.5A EP3440532B1 (en) 2016-04-05 2016-04-05 Improving readability of content displayed on a screen
US15/107,348 US9811161B2 (en) 2016-04-05 2016-04-05 Improving readability of content displayed on a screen
PCT/EP2016/057384 WO2017174114A1 (en) 2016-04-05 2016-04-05 Improving readability of content displayed on a screen


Publications (1)

Publication Number Publication Date
WO2017174114A1 true WO2017174114A1 (en) 2017-10-12

Family

ID=55701940


Country Status (3)

Country Link
US (1) US9811161B2 (en)
EP (1) EP3440532B1 (en)
WO (1) WO2017174114A1 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
IN2015CH01313A (en) 2015-03-17 2015-04-10 Wipro Ltd
US11269403B2 (en) * 2015-05-04 2022-03-08 Disney Enterprises, Inc. Adaptive multi-window configuration based upon gaze tracking

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0816983A2 (en) * 1996-06-25 1998-01-07 Sun Microsystems, Inc. Method and apparatus for eyetrack-driven text enlargement
US20040100419A1 (en) * 2002-11-25 2004-05-27 Nissan Motor Co., Ltd. Display device
US20130009962A1 (en) 2009-06-29 2013-01-10 Core Wireless Licensing S.A.R.L Automatic zoom for a display
US8736692B1 (en) * 2012-07-09 2014-05-27 Google Inc. Using involuntary orbital movements to stabilize a video
US20150002392A1 (en) 2012-01-26 2015-01-01 Umoove Services Ltd. Eye tracking
US20150235084A1 (en) * 2014-02-20 2015-08-20 Samsung Electronics Co., Ltd. Detecting user viewing difficulty from facial parameters
WO2015124098A1 (en) * 2014-02-24 2015-08-27 Tencent Technology (Shenzhen) Company Limited Screen content display method and system
US20150346818A1 (en) 2014-05-27 2015-12-03 Umoove Services Ltd. System and method for detecting micro eye movements in a two dimensional image captured with a mobile device

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9223134B2 (en) * 2010-02-28 2015-12-29 Microsoft Technology Licensing, Llc Optical imperfections in a light transmissive illumination system for see-through near-eye display glasses


Also Published As

Publication number Publication date
US9811161B2 (en) 2017-11-07
US20170285740A1 (en) 2017-10-05
EP3440532B1 (en) 2021-09-01
EP3440532A1 (en) 2019-02-13


Legal Events

WWE Wipo information: entry into national phase (Ref document number: 15107348; Country of ref document: US)
NENP Non-entry into the national phase (Ref country code: DE)
WWE Wipo information: entry into national phase (Ref document number: 2016715490; Country of ref document: EP)
ENP Entry into the national phase (Ref document number: 2016715490; Country of ref document: EP; Effective date: 20181105)
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 16715490; Country of ref document: EP; Kind code of ref document: A1)