US20170205977A1 - Methods for displaying an e-book using a combination of colors for text and background that have a reduced myopiagenic effect - Google Patents


Info

Publication number
US20170205977A1
Authority
US
United States
Prior art keywords
text
color
pixel
background
myopia
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US15/409,049
Other languages
English (en)
Inventor
Michael Benjamin Selkowe Fertik
Thomas W. Chalberg, JR.
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Waveshift LLC
Original Assignee
Waveshift LLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Waveshift LLC filed Critical Waveshift LLC
Priority to US15/409,049 priority Critical patent/US20170205977A1/en
Assigned to WAVESHIFT LLC reassignment WAVESHIFT LLC ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: CHALBERG, THOMAS W., FERTIK, MICHAEL BENJAMIN
Publication of US20170205977A1 publication Critical patent/US20170205977A1/en
Abandoned legal-status Critical Current

Classifications

    • G09G3/20: Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes, for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix, no fixed position being assigned to or needed to be assigned to the individual characters or partial characters
    • G09G5/02: Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators, characterised by the way in which colour is displayed
    • G16B5/00: ICT specially adapted for modelling or simulations in systems biology, e.g. gene-regulatory networks, protein interaction networks or metabolic networks
    • A61B3/11: Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions, for measuring interpupillary distance or diameter of pupils
    • G06F3/0483: Interaction with page-structured environments, e.g. book metaphor
    • G06F3/04842: Selection of displayed objects or displayed text elements
    • G06F3/04847: Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
    • G06T7/0014: Biomedical image inspection using an image reference approach
    • G09G5/006: Details of the interface to the display terminal
    • G16B25/00: ICT specially adapted for hybridisation; ICT specially adapted for gene or protein expression
    • G16B45/00: ICT specially adapted for bioinformatics-related data visualisation, e.g. displaying of maps or networks
    • G06T2207/30041: Eye; Retina; Ophthalmic
    • G09G2320/0242: Compensation of deficiencies in the appearance of colours
    • G09G2320/0271: Adjustment of the gradation levels within the range of the gradation scale, e.g. by redistribution or clipping
    • G09G2320/066: Adjustment of display parameters for control of contrast
    • G09G2320/0666: Adjustment of display parameters for control of colour parameters, e.g. colour temperature
    • G09G2320/0686: Adjustment of display parameters with two or more screen areas displaying information with different brightness or colours
    • G09G2320/0693: Calibration of display systems
    • G09G2320/08: Arrangements within a display terminal for setting, manually or automatically, display parameters of the display terminal
    • G09G2340/06: Colour space transformation
    • G09G2354/00: Aspects of interface with display user
    • G09G2380/08: Biomedical applications
    • G09G2380/14: Electronic books and readers
    • H04N9/646: Circuits for processing colour signals for image enhancement, e.g. vertical detail restoration, cross-colour elimination, contour correction, chrominance trapping filters

Definitions

  • LCDs and OLED displays are both examples of flat panel displays, and are also used in desktop monitors, TVs, and automotive and aircraft displays.
  • each pixel is composed of three sub-pixels that provide a different color.
  • each pixel can have red, green, and blue sub-pixels, or cyan, magenta, and yellow sub-pixels.
  • the color of the pixel, as perceived by a viewer, depends upon the relative proportion of light from each of the three sub-pixels.
  • Color information for a display is commonly encoded as an RGB signal, whereby the signal is composed of a value for each of the red, green, and blue components of each pixel's color in each frame.
  • a so-called gamma correction is used to convert the signal into an intensity or voltage to correct for inherent non-linearity in a display, such that the intended color is reproduced by the display.
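As an illustration of the gamma-correction step described above, the following sketch decodes an 8-bit signal value to linear intensity and back; the exponent 2.2 and the function names are illustrative assumptions (real displays such as sRGB use a piecewise curve rather than a pure power law).

```python
def decode_gamma(value_8bit, gamma=2.2):
    """Convert an 8-bit signal value (0..255) to linear relative intensity.

    A simple power-law model; gamma=2.2 is an illustrative assumption.
    """
    normalized = value_8bit / 255.0
    return normalized ** gamma


def encode_gamma(intensity, gamma=2.2):
    """Inverse: linear relative intensity (0..1) back to an 8-bit value."""
    return round((intensity ** (1.0 / gamma)) * 255)
```

Round-tripping a signal value through `decode_gamma` and `encode_gamma` recovers the original value, which is the sense in which the correction reproduces the intended color.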
  • chromaticity: In the field of color science when applied to information display, colors are often specified by their chromaticity, which is an objective specification of a color regardless of its luminance. Chromaticity consists of two independent parameters, often specified as hue (h) and saturation (s). Color spaces (e.g., the 1931 CIE XYZ color space or the CIELUV color space) are commonly used to quantify chromaticity. For instance, when expressed as a coordinate in a color space, a pixel's hue is the angular component of the coordinate relative to the display's white point, and its saturation is the radial component. Once color coordinates are specified in one color space, it is possible to transform them into other color spaces.
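The hue-angle and saturation-radius description above can be made concrete with a small sketch that maps linear RGB to CIE 1931 xy chromaticity (using the standard sRGB-to-XYZ matrix) and then expresses the color as an angle and radius relative to the D65 white point; the function names are illustrative.

```python
import math

# Linear-sRGB -> CIE 1931 XYZ matrix (standard sRGB primaries, D65 white).
SRGB_TO_XYZ = [
    (0.4124, 0.3576, 0.1805),
    (0.2126, 0.7152, 0.0722),
    (0.0193, 0.1192, 0.9505),
]
D65_WHITE = (0.3127, 0.3290)  # D65 white point in xy chromaticity


def rgb_to_xy(r, g, b):
    """Map linear RGB components (0..1) to CIE 1931 xy chromaticity."""
    X, Y, Z = (row[0] * r + row[1] * g + row[2] * b for row in SRGB_TO_XYZ)
    total = X + Y + Z
    if total == 0:
        return D65_WHITE  # black carries no chromaticity; use the white point
    return X / total, Y / total


def hue_and_saturation(x, y, white=D65_WHITE):
    """Hue as an angle (radians) and saturation as radial distance from white."""
    dx, dy = x - white[0], y - white[1]
    return math.atan2(dy, dx), math.hypot(dx, dy)
```

For example, full-intensity white maps onto the white point itself (saturation near zero), while a saturated red lands far from it.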
  • cone cells: Humans perceive color in response to signals from photoreceptor cells called cone cells, or simply cones. Cones are present throughout the central and peripheral retina, being most densely packed in the fovea centralis, a 0.3 mm diameter rod-free area in the central macula. Moving away from the fovea centralis, cones reduce in number towards the periphery of the retina. There are about six to seven million cones in a human eye.
  • FIG. 1A shows the response curves for each cone type.
  • the horizontal axis shows light wavelength (in nm) and the vertical axis shows responsivity.
  • the curves have been scaled so that the area under each curve is equal and sums to 10 on a linear scale.
  • the first type of cone responds the most to light of long wavelengths, peaking at about 560 nm, and is designated L for long.
  • the spectral response curve for L cones is shown as curve A.
  • the second type responds the most to light of medium-wavelength, peaking at 530 nm, and is abbreviated M for medium.
  • This response curve is curve B in FIG. 1A .
  • the third type responds the most to short-wavelength light, peaking at 420 nm, and is designated S for short, shown as curve C.
  • the three types have typical peak wavelengths near 564-580 nm, 534-545 nm, and 420-440 nm, respectively; the peak and absorption spectrum varies among individuals. The difference in the signals received from the three cone types allows the brain to perceive a continuous range of colors, through the opponent process of color vision.
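Cone stimulation of the kind described above can be estimated from tristimulus values; the sketch below uses the Hunt-Pointer-Estevez transform, one common (but not the only) choice for mapping CIE XYZ to L, M, and S cone excitations.

```python
def xyz_to_lms(X, Y, Z):
    """Estimate L, M, S cone excitations from CIE XYZ tristimulus values
    using the Hunt-Pointer-Estevez transform (one published choice among
    several; the specific matrix is an assumption for this sketch)."""
    L = 0.4002 * X + 0.7076 * Y - 0.0808 * Z
    M = -0.2263 * X + 1.1653 * Y + 0.0457 * Z
    S = 0.9182 * Z
    return L, M, S
```

With such a transform in hand, the L-versus-M comparisons discussed throughout this disclosure can be computed per pixel from display colors.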
  • the relative number of each cone type can vary. Whereas S-cones usually represent between 5-7% of total cones, the ratio of L and M cones can vary widely among individuals, from as low as 5% L/95% M to as high as 95% L/5% M. The ratio of L and M cones also can vary, on average, between members of different races, with Asians believed to average close to 50/50 L:M and Caucasians believed to average close to 63% L cones (see, for example, U.S. Pat. No. 8,951,729). Color vision disorders also impact the proportion of L and M cones; protanopes have 0% L cones and deuteranopes have 0% M cones.
  • Referring to the figure, cones are generally arranged in a mosaic on the retina.
  • L and M cones are distributed in approximately equal numbers, with fewer S cones. Accordingly, when viewing an image on an electronic display, the response of the human eye to a particular pixel will depend on the color of that pixel and where on the retina the pixel is imaged.
  • Sunlight is considered an equal energy (EE) illuminant because it does not trigger the opponent color visual system (i.e., sunlight is neither red nor green, and neither blue nor yellow).
  • a more recent discovery stipulates that overall contrast between neighboring cones stimulates asymmetric growth of the eye, leading to myopia. This could be, for example, excessive stimulation of L cones over M cones, but is not limited to that type of contrast alone.
  • the discovery stipulates that the difference in stimulation between neighboring cones is critical, as opposed to the overall ratio of L vs. M cones over the entire retina.
  • Each L cone, where surrounded by a number of M cones and/or S cones, acts as a highly stimulated “center,” whereas the M or S cones in the “surround” are stimulated to a much lesser degree.
  • saturated red colors can be said to provide high contrast among adjacent retinal neurons and can be said to activate a high degree of center-surround antagonism.
  • Because high contrast causes large signaling differences between adjacent cones and other neurons in the visual system, and causes strong center-surround antagonism in the visual system, these terms are used interchangeably to describe the degree of contrast within a receptive field on the retina.
  • the instant invention builds upon both recent biological discoveries to describe new methods, algorithms, and devices that can determine the level of myopiagenicity and reduce it, relative to current methods familiar to skilled artisans. Accordingly, among other aspects, the present disclosure features ways to characterize and/or reduce myopiagenic effects of displays while minimizing the viewer's perception of the correction on the image, and characterize and/or reduce contrast between neighboring cones in the retina.
  • the myopiagenia-reducing techniques described may be implemented in a variety of ways.
  • the techniques may be implemented in TV sets via a stand-alone set top box, or via hardware (e.g., as an image processing chip) and/or software integration with the TV set itself, the cable box, or other product that interfaces with a TV set.
  • the techniques may be implemented in computer monitors, mobile devices, automotive displays, aviation displays, wearable displays, and other applications using color displays.
  • the color scheme of content can be modified before delivery to an end user so that the end user gets the benefit of the myopiagenia reduction without the use of any additional hardware or software.
  • myopiagenia reduced content can be delivered to the end user via the internet or from a cable provider.
  • Techniques for quantifying the myopiagenic effect of a stimulus are also disclosed. Such techniques allow for comparison of different myopiagenia-reducing algorithms on a stimulus. Implementations also account for both chromatic (e.g., how much red is in an image) and spatial (e.g., how much high-contrast, high-spatial-frequency content exists in an image) contributions of a stimulus to myopiagenia. Implementations allow this to be calculated and described either as the amount of contrast between adjacent neurons in the retina or as the degree of center-surround antagonism in a receptive field.
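A minimal sketch of such a quantification, under the assumption that the score is the mean difference in a normalized L-vs-M opponent signal between horizontally adjacent pixels (an illustrative scoring rule, not the metric specified in the disclosure):

```python
def local_lm_contrast(lm_image):
    """Hypothetical myopiagenicity score: mean absolute difference of the
    opponent signal (L - M) / (L + M) between horizontally adjacent pixels.

    `lm_image` is a list of rows; each entry is an (L, M) pair of cone
    excitations already computed for that pixel.
    """
    diffs = []
    for row in lm_image:
        for (l1, m1), (l2, m2) in zip(row, row[1:]):
            s1 = (l1 - m1) / (l1 + m1)  # opponent signal, first pixel
            s2 = (l2 - m2) / (l2 + m2)  # opponent signal, neighbor
            diffs.append(abs(s1 - s2))
    return sum(diffs) / len(diffs) if diffs else 0.0
```

A uniform field scores zero, while red pixels adjacent to green pixels score high, capturing both the chromatic and the spatial (high-frequency) contributions described above.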
  • the invention features a method, including: receiving initial image data for a sequence of frames including a first frame, f1_i, and a second frame, f2_i, wherein data for each pixel in f1_i and f2_i include a value, r_i, for a first color, a value, g_i, for a second color, and a value, b_i, for a third color; for at least one pixel in f1_i, determining a relative level of stimulation of cones in a viewer's eye based, at least, on the value, r_i, for the first color and the value, g_i, for the second color; generating modified image data for the sequence of frames including a second frame, f2_m, corresponding to the second frame, f2_i, of the initial image data, where f2_m includes a value, r_m, for the first color and a
  • While a frame often refers to a frame in a video file, the term is intended to encompass images from non-video files as well.
  • a frame can include any changing or stationary image produced by a display, such as a page in a web browser, a page in an e-reader, a screen rendering in a video game, etc.
  • Implementations of the method can include one or more of the following features and/or features of other aspects.
  • Determining a relative level of stimulation of cones can include determining a relative level of stimulation of neighboring cones in the viewer's eye.
  • When viewed on an electronic display, f2_m may result in reduced contrast between neighboring cones in a viewer's eye compared to f2_i.
  • the second frame can occur after the first frame in the sequence.
  • determining the relative level of stimulation includes comparing the value, r_i, for the first color to the value, g_i, for the second color.
  • r_i can be compared to g_i for a plurality of pixels in the first frame of the initial image data.
  • r_m/g_m can be equal to r_i/g_i when g_i > r_i.
  • r_m/g_m can be equal to a·r_i/g_i, where 0 < a < 1 and the value of a can depend on the number of frames in the sequence preceding f2_i. a can increase as the number of frames in the sequence preceding f2_i increases.
  • g_i can be greater than r_i.
  • Determining the relative level of stimulation can include determining coordinates in a universal chromaticity space representative of the color of the first pixel.
  • the chromaticity space is the 1931 x, y CIE chromaticity space or the CIE XYZ chromaticity space, or the 1964 or 1976 CIE chromaticity space.
  • the relative level of stimulation can be based on a relative spectral sensitivity of L-cones and M-cones in the viewer's eye.
  • the relative level of stimulation can be further based on a relative spectral sensitivity of S-cones in the viewer's eye.
  • the relative level of stimulation can be further based on a relative proportion of L-cones to M-cones in the viewer's eye.
  • the relative level of stimulation can be further based on a pixel/cone ratio of the frame when viewed.
  • the first, second, and third colors can be red, green, and blue, respectively. In some cases, the first, second, and third colors are cyan, magenta, and yellow.
  • the relative level of stimulation can be determined based on L, M, and S values computed from at least some of the pixels in f1_i.
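The per-pixel rules above (leave pixels with g_i ≥ r_i unchanged; scale the r/g ratio of red-dominant pixels by a factor a that can depend on the number of preceding frames) can be sketched as follows; `attenuate_red` and `schedule_a` are hypothetical names, and the schedule constants are illustrative assumptions.

```python
def attenuate_red(frame, a):
    """Return a modified frame in which red-dominant pixels (r > g) have
    red reduced so that r_m/g_m = a * (r_i/g_i), with g unchanged.
    Pixels with g >= r are left as-is, mirroring the stated rule that
    r_m/g_m equals r_i/g_i when g_i > r_i.

    `frame` is a list of rows of (r, g, b) tuples in 0..255; 0 < a <= 1.
    """
    out = []
    for row in frame:
        new_row = []
        for r, g, b in row:
            if r > g:
                r = round(a * r)  # scales r/g by a (g is unchanged)
            new_row.append((r, g, b))
        out.append(new_row)
    return out


def schedule_a(frames_preceding, floor=0.7, ramp=0.05):
    """Hypothetical schedule for a: it increases with the number of frames
    preceding the corrected frame, capped at 1.0 (no correction)."""
    return min(1.0, floor + ramp * frames_preceding)
```

For instance, `attenuate_red([[(200, 100, 0)]], 0.5)` halves the r/g ratio of the red-dominant pixel, while a green-dominant pixel passes through unchanged.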
  • the invention features an apparatus that includes: an electronic processing module including an electronic processor, an input (e.g., electrical contacts such as electrodes for hardwiring or standard electrical connectors), and an output (e.g., electrical contacts such as electrodes for hardwiring or standard electrical connectors), wherein: the input is configured to receive initial image data for a sequence of frames including a first frame, f1_i, and a second frame, f2_i, wherein data for each pixel in f1_i and f2_i includes a value, r_i, for a first color, a value, g_i, for a second color, and a value, b_i, for a third color; the electronic processor is programmed to receive the initial image data from the input and, for at least one pixel in f1_i, configured to compare the value, r_i, for the first color to the value, g_i, for the second color and to generate modified image data for the sequence
  • Embodiments of the apparatus can include one or more of the following features and/or features of other aspects.
  • the electronic processor can be programmed to generate modified image data based on a relative level of stimulation of neighboring cones in the viewer's eye.
  • the electronic processing module can be programmed to determine the relative level of stimulation based, at least, on the corresponding values of r_i, g_i, and b_i for the at least one pixel in f1_i.
  • the apparatus can include an electronic display panel configured to receive the modified image data from the output and display the sequence of frames based on the modified image data.
  • the electronic display can be a display selected from the group including a liquid crystal display, a digital micromirror display, an organic light emitting diode display, a projection display, quantum dot display, and a cathode ray tube display.
  • the apparatus is a semiconductor chip or a circuit board including a semiconductor chip.
  • the invention features a set top box, a flat panel display, a television, a mobile device, a wearable computer, a projection display, and/or a video game console including the foregoing apparatus.
  • the set top box can be configured to receive the input from another set top box, a DVD player, a video game console, or an internet connection.
  • the invention features a method, including: assessing uncorrected image data corresponding to a sequence of frames by identifying pixels having a red hue in each of the sequence of frames; providing modified image data corresponding to the sequence of frames based on the uncorrected image data and the assessment; displaying the sequence of frames including at least one corrected frame based on the modified image data, where one or more red-hued pixels in the corrected frame has a reduced degree of red saturation compared to the corresponding pixel in the uncorrected frame, wherein the degree of red saturation in the one or more red-hued pixels in the corrected frame is reduced based on the degree of red saturation in red-hued pixels in one or more of the frames displayed prior to displaying the corrected frame.
  • Implementations of the method can include one or more features of other aspects.
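One way to sketch the red-hue identification and saturation reduction described above is in HSV space; the 30-degree red window and the 30% reduction below are illustrative assumptions, not values from the disclosure.

```python
import colorsys


def desaturate_red_pixel(r, g, b, reduction=0.3, red_width=30 / 360):
    """Reduce the saturation of red-hued pixels only.

    r, g, b are in 0..1. A pixel counts as "red-hued" when its HSV hue
    lies within `red_width` of 0 (pure red); both the hue window and the
    reduction fraction are illustrative assumptions.
    """
    h, s, v = colorsys.rgb_to_hsv(r, g, b)
    if min(h, 1.0 - h) <= red_width:   # hue near 0/360 degrees = red
        s *= (1.0 - reduction)         # pull the pixel toward gray
    return colorsys.hsv_to_rgb(h, s, v)
```

Applying this to every pixel of a frame yields a corrected frame in which red-hued pixels have reduced red saturation while all other pixels are untouched; making `reduction` depend on the red saturation of prior frames gives the temporal behavior described above.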
  • the invention features an apparatus that includes an input configured to receive uncorrected image data corresponding to a sequence of frames; an electronic processing module including an electronic processor, an input, and an output, the input being configured to receive uncorrected image data corresponding to a sequence of frames, the electronic processor being programmed to assess the uncorrected image data by identifying pixels having a red hue in each of the sequence of frames and configured to provide modified image data corresponding to the sequence of frames based on the uncorrected image data and the assessment, and the output being configured to transmit the modified image data from the electronic processing module to an electronic display.
  • the modified image data corresponds to the sequence of frames including at least one corrected frame, where one or more red-hued pixels in the corrected frame has a reduced degree of red saturation compared to the corresponding pixel in the uncorrected frame, the degree of red saturation in the one or more red-hued pixels in the corrected frame being reduced based on the degree of red saturation in red-hued pixels in one or more of the frames preceding the corrected frame.
  • Embodiments of the apparatus can include one or more features of other aspects.
  • the invention features a method, including: receiving initial image data including a first frame, f1_i, wherein data for each pixel in f1_i includes a value, r_i, for a first color, a value, g_i, for a second color, and a value, b_i, for a third color; for at least a first pixel in f1_i, comparing the value, r_i, for the first color to the value, g_i, for the second color; generating modified image data including a first frame, f1_m, including a value, r_m, for the first color at a second pixel and a value, g_m, for the second color at the second pixel, the second pixel being at a different location in the first frame from the first pixel, wherein a ratio r_m/g_m for the second pixel is different from a ratio r_i/g_i
  • Implementations of the method can include one or more of the following features and/or features of other aspects.
  • Determining a relative level of stimulation of cones can include determining a relative level of stimulation of neighboring cones in the viewer's eye.
  • When viewed on a display, f1_m can stimulate L cones in a viewer's eye less, relative to M cones in the viewer's eye, than f1_i does.
  • the difference between the ratios can also be based on r_i and g_i of the second pixel in f1_i.
  • the difference between the ratios can be based on r_i and g_i of one or more additional pixels in f1_i different from the first and second pixels.
  • the first pixel can be an n-th nearest neighbor to the second pixel.
  • the first pixel can be a nearest neighbor pixel to the second pixel.
  • r_m/g_m can be less than r_i/g_i when g_i ≤ r_i.
  • r_m/g_m can be equal to r_i/g_i when g_i > r_i.
  • r_m/g_m can be equal to a·r_i/g_i, where 0 < a < 1 and the value of a can depend on r_i and g_i of the first pixel. a can decrease as the ratio r_i/g_i for the first pixel increases.
  • r_m can be less than r_i for the second pixel.
  • g_m can be greater than g_i for the second pixel.
  • b_m can be unequal to b_i for at least some of the pixels.
  • the first, second, and third colors can be red, green, and blue, respectively. In some embodiments, the first, second, and third colors are cyan, magenta, and yellow.
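The spatial variant above, where the correction applied at one pixel depends on the r/g ratio of a neighboring pixel, can be sketched as follows; the attenuation mapping a = 1/(1 + k·max(r/g − 1, 0)) is an illustrative assumption chosen so that a decreases as the neighbor's red dominance grows.

```python
def spatial_attenuation(frame, k=0.5):
    """Sketch of the spatial variant: each pixel's red value is scaled by
    a factor a derived from its left-hand neighbor's r/g ratio, so pixels
    sitting next to strongly red neighbors are corrected more.

    `frame` is a list of rows of (r, g, b) tuples in 0..255; `k` controls
    how aggressively a falls off with the neighbor's red dominance.
    """
    out = []
    for row in frame:
        new_row = [row[0]]  # first pixel in a row has no left neighbor
        for (rn, gn, _), (r, g, b) in zip(row, row[1:]):
            ratio = rn / gn if gn else float(rn)
            a = 1.0 / (1.0 + k * max(ratio - 1.0, 0.0))
            new_row.append((round(a * r), g, b))
        out.append(new_row)
    return out
```

A pixel whose neighbor is green-dominant (ratio ≤ 1) gets a = 1 and passes through unchanged, matching the rule that r_m/g_m equals r_i/g_i when g_i > r_i.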
  • the invention features an apparatus, including: an input configured to receive initial image data including a first frame, f1_i, wherein data for each pixel in f1_i includes a value, r_i, for a first color, a value, g_i, for a second color, and a value, b_i, for a third color; an electronic processing module programmed to receive the initial image data from the input and, for at least a first pixel in f1_i, compare the value, r_i, for the first color to the value, g_i, for the second color and to generate modified image data including a first frame, f1_m, including a value, r_m, for the first color at a second pixel and a value, g_m, for the second color at the second pixel, the second pixel being at a different location in the first frame from the first pixel, wherein a ratio r_m/g_m for
  • Embodiments of the apparatus can include one or more of the following features and/or features of other aspects.
  • the invention features a method, including: assessing uncorrected image data corresponding to at least one uncorrected frame by identifying pixels having a red hue in the at least one uncorrected frame; providing modified image data based on the uncorrected image data and the assessment, the modified image data corresponding to at least one corrected frame corresponding to the at least one uncorrected frame; displaying the at least one corrected frame, where one or more red-hued pixels in the corrected frame has a reduced degree of red saturation compared to the corresponding pixel in the uncorrected frame, wherein the degree of red saturation in the one or more red-hued pixels in the corrected frame is reduced based on a comparison of a degree of red saturation in two or more different portions of the uncorrected frame.
  • Implementations of the method can include one or more of the following features and/or features of other aspects.
  • the two or more different portions can be red-hued portions.
  • the different portions can include one or more contiguous pixels.
  • the uncorrected image data can correspond to a plurality of uncorrected frames and the modified image data includes a corresponding plurality of corrected frames.
  • the invention features an apparatus, including: an electronic processing module including an electronic processor, an input, and an output, wherein: the input is configured to receive uncorrected image data corresponding to at least one uncorrected frame; the electronic processor is programmed to assess the uncorrected image data by identifying pixels having a red hue in the at least one uncorrected frame and to provide modified image data based on the uncorrected image data and the assessment; and the output is configured to transmit the modified image data from the electronic processing module to an electronic display, wherein the modified image data corresponds to at least one corrected frame, where one or more red-hued pixels in the corrected frame has a reduced degree of red saturation compared to the corresponding pixel in the uncorrected frame, and wherein the degree of red saturation in the one or more red-hued pixels in the corrected frame is reduced based on a comparison of a degree of red saturation in two or more different portions of the uncorrected frame.
  • the apparatus can include one or more of the following features and/or features of other aspects.
  • the apparatus can include an electronic display panel configured to receive the modified image data from the output and display the sequence of frames based on the modified image data.
  • the electronic display can be a display selected from the group including a liquid crystal display, a digital micromirror display, an organic light emitting diode display, a projection display, quantum dot display, and a cathode ray tube display.
  • the apparatus is a semiconductor chip or a circuit board including a semiconductor chip.
  • the invention features a set top box, a flat panel display, a television, a mobile device, a wearable computer, a projection display, and/or a video game console including the foregoing apparatus.
  • the set top box can be configured to receive the input from another set top box, a DVD player, a video game console, or an internet connection.
  • the invention features a method, including: receiving initial image data including a first frame, f 1 i , wherein data for each pixel in the first frame includes a value, r i , for a first color, a value, g i , for a second color, and a value, b i , for a third color; for at least a first pixel in f 1 i , comparing r i to g i ; generating modified image data including a modified first frame, f 1 m , the modified first frame including a value, r m , for the first color and a value, g m , for the second color at the first pixel, wherein r m is different from r i for the first pixel and/or g m is different from g i for the first pixel, the difference being based on a location of the first pixel in the first frame; and transmitting the modified image data to an electronic display.
  • Implementations of the method can include one or more of the following features and/or features of other aspects.
  • the difference between r m and r i can increase the closer the location of the first pixel is to a nearest border of the display.
  • the difference between g m and g i can decrease the closer the location of the first pixel is to a nearest border of the display.
  • the difference between r m and r i can increase the closer the location of the first pixel is to a center of the display.
  • the difference between g m and g i can decrease the closer the location of the first pixel is to a center of the display.
  • b m ≠ b i for at least one pixel.
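The location-dependent modification described in the preceding features (the difference between r m and r i growing as the pixel approaches the nearest border of the display) can be sketched as follows. This is a minimal illustration rather than the claimed implementation; the `max_reduction` parameter and the linear fall-off toward the border are assumptions made for the example.

```python
def modify_pixel(r_i, g_i, x, y, width, height, max_reduction=0.5):
    """Reduce the red value more strongly near the display border.

    max_reduction (assumed) is the fraction of red removed at the
    border itself; pixels at the display center are nearly unchanged.
    """
    # Distance from the pixel to the nearest border, normalized to [0, 1],
    # where 0 = on the border and 1 = at the display center.
    d = min(x, y, width - 1 - x, height - 1 - y) / (min(width, height) / 2)
    d = min(d, 1.0)
    # The red reduction grows as the pixel approaches the border (d -> 0).
    scale = 1.0 - max_reduction * (1.0 - d)
    r_m = r_i * scale
    g_m = g_i  # green left unchanged in this minimal sketch
    return r_m, g_m
```

A border pixel thus receives the full reduction, while a center pixel is left essentially untouched; any monotonic fall-off profile could be substituted for the linear one used here.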
  • the invention features an apparatus, including: an electronic processing module including an electronic processor, an input, and an output, wherein: the input is configured to receive initial image data for a sequence of frames including a first frame, f 1 i , wherein data for each pixel in f 1 i includes a value, r i , for a first color, a value, g i , for a second color, and a value, b i , for a third color; the electronic processor is programmed to receive the initial image data from the input and, for at least one pixel in f 1 i , configured to compare r i to g i and to generate modified image data including a modified first frame, f 1 m , the modified first frame including a value, r m , for the first color and a value, g m , for the second color at the first pixel, wherein r m is different from r i for the first pixel and/or g m
  • Embodiments of the apparatus can include one or more features of other aspects.
  • the invention features a method, including: assessing uncorrected image data corresponding to at least one uncorrected frame by identifying pixels having a red hue in the at least one uncorrected frame; providing modified image data based on the uncorrected image data and the assessment, the modified image data corresponding to at least one corrected frame corresponding to the at least one uncorrected frame; displaying the at least one corrected frame, where one or more red-hued pixels in the corrected frame has a reduced degree of red saturation compared to the corresponding pixel in the uncorrected frame, wherein the degree of red saturation in the one or more red-hued pixels in the corrected image frame is reduced based on a respective location of the one or more pixels in the corrected frame.
  • Implementations of the method can include one or more of the following features and/or features of other aspects.
  • the degree of red saturation in the one or more red-hued pixels in the corrected image frame can be reduced based on a proximity of the red-hued pixels to an edge of the corrected frame.
  • the degree of red saturation can be reduced more for pixels closer to the edge of the corrected frame than for pixels further from the edge of the corrected frame.
  • the invention features an apparatus, including: an electronic processing module including an electronic processor, an input, and an output, wherein: the input is configured to receive uncorrected image data corresponding to at least one uncorrected frame; the electronic processor is programmed to assess the uncorrected image data by identifying pixels having a red hue in the at least one uncorrected frame and to provide modified image data based on the uncorrected image data and the assessment; and the output is configured to transmit the modified image data from the electronic processing module to an electronic display, wherein the degree of red saturation in the one or more red-hued pixels in the corrected image frame is reduced based on a respective location of the one or more pixels in the corrected frame.
  • Embodiments of the apparatus can include one or more features of other aspects.
  • the invention features a method, including: receiving initial image data including a first frame, f 1 i , wherein data for each pixel in the first frame includes a value for a first color, r i , a value for a second color, g i , and a value for a third color, b i ; for at least a first pixel in f 1 i , calculating a degree of stimulation by the first pixel on a first set of one or more cones in a viewer's eye based, at least, on r i and g i and b i for the first pixel; for at least a second pixel in f 1 i , different from the first pixel, calculating a degree of stimulation by the second pixel on a second set of one or more cones in the viewer's eye based, at least, on r i and g i and b i for the second pixel; determining a difference in a degree of
  • Implementations of the method can include one or more of the following features and/or features of other aspects.
  • the cones of the first set can be from one cone type (L, M or S) and the cones of the second set are a different cone type (L, M, or S).
  • the first and second pixels can be neighboring pixels or groups of pixels.
  • the at least one second pixel can include each of the pixels neighboring the first pixel.
  • Calculating the degree of stimulation can include determining corresponding coordinates in a universal chromaticity space representative of the colors of the first and second pixels.
  • the chromaticity space can be the 1931 x, y CIE chromaticity space or the CIE XYZ chromaticity space, or the 1964 or 1976 CIE chromaticity space.
  • the degree of stimulation can be based on the relative spectral sensitivity of L-cones and M-cones in the viewer's eye.
  • the degree of stimulation can be further based on a relative proportion of L-cones to M-cones in the viewer's eye.
  • the degree of stimulation can be further based on a pixel/cone ratio of the image when viewed.
  • a red saturation of the first pixel can be reduced in the modified image data relative to the initial image data.
  • a contrast between the first pixel and the second pixel can be reduced in the modified image data relative to the initial image data.
  • r i can be greater than r m and/or g i can be less than g m . In some embodiments, b i ≠ b m for at least one pixel.
  • the invention features an apparatus, including: an electronic processing module including an electronic processor, an input, and an output, wherein: the input is configured to receive initial image data for a sequence of frames including a first frame, f 1 i , wherein data for each pixel in f 1 i includes a value for a first color, r i , a value for a second color, g i , and a value for a third color, b i ; the electronic processor is programmed to: (i) receive the initial image data from the input; (ii) for at least a first pixel in f 1 i , calculate a degree of stimulation by the first pixel on a first set of one or more cones in a viewer's eye based, at least, on r i and g i for the first pixel; (iii) for at least a second pixel in f 1 i , different from the first pixel, calculate a degree of stimulation by the second pixel on a second
  • Embodiments of the apparatus can include one or more of the following features and/or features of other aspects.
  • the cones of the first set are L-cones and the cones of the second set are M-cones.
  • the first and second pixels can be neighboring pixels.
  • the at least one second pixel can include each of the pixels neighboring the first pixel.
  • the electronic processing module can be programmed to determine the relative level of stimulation based, at least, on the corresponding values of r i and g i for the at least one pixel in f 1 i .
  • the apparatus can include an electronic display panel configured to receive the modified image data from the output port and display the sequence of frames based on the modified image data.
  • the electronic display is a display selected from the group including a liquid crystal display, a digital micromirror display, an organic light emitting diode display, a projection display, and a cathode ray tube display.
  • the apparatus is a semiconductor chip or a circuit board including a semiconductor chip.
  • the invention features a set top box, a flat panel display, a television, a mobile device, a wearable computer, a projection display, and/or a video game console including the foregoing apparatus.
  • the set top box can be configured to receive the input from another set top box, a DVD player, a video game console, or an internet connection.
  • the invention features a method of evaluating differential stimulation between neighboring sets of cones of a viewer's eye when viewing an image on an electronic display, the method including: calculating a degree of stimulation of a first pixel in the image on a first set of one or more cones based, at least, on a color of the first pixel; calculating a degree of stimulation of a second pixel in the image on a second set of one or more cones based, at least, on a color of the second pixel; and determining a difference in the degree of stimulation between the first and second sets of one or more cones.
  • Implementations of the method can include one or more of the following features and/or features of other aspects.
  • the cones of the first set can be L-cones and the cones of the second set can be M-cones.
  • the first and second pixels can be neighboring pixels.
  • the at least one second pixel can include each of the pixels neighboring the first pixel.
  • Calculating the degree of stimulation can include determining corresponding coordinates in a two-dimensional chromaticity space representative of the colors of the first and second pixels.
  • the chromaticity space can be the 1931 x, y CIE chromaticity space or the CIE XYZ chromaticity space, or the 1964 or 1976 CIE chromaticity space.
  • the degree of stimulation can be based on the relative spectral sensitivity of L-cones and M-cones in the viewer's eye.
  • the degree of stimulation can be further based on a relative proportion of L-cones to M-cones in the viewer's eye.
  • the degree of stimulation can be further based on a pixel/cone ratio of the image when viewed.
  • the method can include evaluating a myopiagenic effect of a digital video file including the image based on the difference in the degree of stimulation between the first and second sets of one or more cones.
  • the digital video file can include a sequence of frames, and at least one of the frames includes the image.
  • the method can include assigning the digital video file a score indicative of the myopiagenic effect of the digital video file based on the evaluation.
  • the method can include modifying the color of the first pixel and/or the second pixel to reduce the difference in the degree of stimulation between the first and second sets of one or more cones.
  • the color modification can reduce a red saturation of the first pixel and/or the second pixel.
  • the color modification can reduce a contrast between the first pixel and the second pixel.
  • the invention features a method for evaluating a myopiagenic effect of a digital video file, including: determining, for at least a first pixel in a first frame of the digital video file, a relative level of stimulation of L-cones and a level of stimulation of M-cones in a viewer's eye by the first pixel based on a color of the first pixel; and assigning a score to the digital video file indicative of the myopiagenic effect of the digital video file based on the relative level of L-cone and M-cone stimulation by the first pixel in the first frame.
  • determining the relative level of stimulation of the L-cones and M-cones can include translating color data for each pixel to a co-ordinate in a two-dimensional chromaticity space.
  • the chromaticity space can be the 1931 x, y CIE chromaticity space or the CIE XYZ chromaticity space, or the 1964 or 1976 CIE chromaticity space.
  • a value for the relative level of stimulation of the L-cones and M-cones can be assigned to each pixel based on the coordinate for that pixel.
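As one illustration of the translation step, a linear-RGB pixel can be mapped to 1931 CIE (x, y) chromaticity coordinates via the standard sRGB-to-XYZ matrix for the D65 white point. The specification does not prescribe this particular matrix; it is one conventional choice for the conversion.

```python
# Standard linear-sRGB -> CIE XYZ matrix (D65 white point).
M = [(0.4124, 0.3576, 0.1805),
     (0.2126, 0.7152, 0.0722),
     (0.0193, 0.1192, 0.9505)]

def rgb_to_xy(r, g, b):
    """Map a linear-RGB pixel (components in [0, 1]) to CIE 1931 (x, y)."""
    X, Y, Z = (m0 * r + m1 * g + m2 * b for (m0, m1, m2) in M)
    s = X + Y + Z
    if s == 0:  # a black pixel has no defined chromaticity
        return (0.0, 0.0)
    return (X / s, Y / s)
```

For example, a white pixel (1, 1, 1) lands at approximately (0.3127, 0.3290), the D65 white point; the resulting coordinate can then be used to assign a relative L/M stimulation value to the pixel.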
  • the method can include determining a level of stimulation of L-cones and a level of stimulation of M-cones in the viewer's eye by one or more additional pixels in the first frame based on a color of each of the respective additional pixels; and assigning the score based on a contrast between the relative levels of M-cone and L-cone stimulation between the first pixel and the additional pixels.
  • the one or more additional pixels can neighbor the first pixel in the frame. There can be six or eight additional pixels.
  • Determining the relative level of stimulation of the L-cones and M-cones can include translating color data for each pixel to a coordinate in a two-dimensional chromaticity space and assigning each pixel a value for the relative level of stimulation of the L-cones and M-cones based on the coordinate for that pixel.
  • Assigning the score includes calculating a neighbor sum of squares (NSS) based on the value for the relative level of stimulation.
  • NSS can be calculated for multiple pixels in the first frame.
  • the score can be assigned based on an average of the NSS of the multiple pixels in the first frame.
  • Assigning the score can include accounting for a relative density of L-cones to M-cones in the viewer's eye.
  • Assigning the score can include accounting for a pixel/cone ratio of the frame when viewed.
  • the determining can be repeated for multiple frames in the digital video file and the score can be assigned based on the determination for each of the multiple frames.
  • the method can include normalizing the score indicative of the myopiagenic effect of the digital video file and outputting the normalized score.
  • the method can include assigning the digital video file an alphanumeric grade based on the score indicative of the myopiagenic effect of the digital video file and outputting the alphanumeric grade.
  • the method can include displaying the alphanumeric grade with a medium containing the digital video file or a link to the digital video file.
  • the digital video file can have a format selected from the group consisting of MPEG, MP4, MOV, WMV, FLV, AVI, AVC, AVCHD, Divx, and MXF.
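The NSS-based scoring described above can be sketched as follows: given a 2-D array of per-pixel relative L/M stimulation values, the NSS for a pixel sums the squared differences against its (up to eight) neighbors, and a frame-level score averages the NSS over all pixels. The function names and the border handling are illustrative assumptions, not part of the specification.

```python
def neighbor_sum_of_squares(vals, x, y):
    """NSS for pixel (x, y): sum of squared differences between that
    pixel's relative L/M stimulation value and the values of its
    (up to eight) in-bounds neighbors. `vals` is a 2-D list."""
    h, w = len(vals), len(vals[0])
    v = vals[y][x]
    nss = 0.0
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            if dx == dy == 0:
                continue  # skip the pixel itself
            ny, nx = y + dy, x + dx
            if 0 <= ny < h and 0 <= nx < w:
                nss += (v - vals[ny][nx]) ** 2
    return nss

def frame_score(vals):
    """Frame-level score: average NSS over all pixels in the frame."""
    h, w = len(vals), len(vals[0])
    total = sum(neighbor_sum_of_squares(vals, x, y)
                for y in range(h) for x in range(w))
    return total / (h * w)
```

A uniform frame scores 0, while a frame with strong pixel-to-pixel contrast in L/M stimulation (e.g., a checkerboard of extreme values) scores highly; repeating this over multiple frames and averaging would give a file-level score as described.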
  • the invention features a method, including: assessing image data corresponding to pixels from one or more frames by identifying pixels having a red hue in at least one of the frames and determining a degree of red saturation for each of the red-hued pixels; and assigning a score to the image data based on the assessment, the score corresponding to a degree to which the image data, when viewed on an electronic display, differentially stimulates L-cones to M-cones in a viewer's eye.
  • Implementations of the method can include one or more of the following features and/or features of other aspects.
  • the data for each pixel in the image data can include a value, r, for a first color, a value, g, for a second color, and a value, b, for a third color, and the pixels having a red hue can be identified by comparing r, g, and b for each pixel.
  • the first color can be red
  • the second color can be green
  • the third color can be blue.
  • Red-hued pixels can be identified as pixels for which r>g and r>b.
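For RGB data, the red-hue test above can be expressed directly. The `red_saturation` measure below is an assumed illustration of "degree of red saturation" (how far red exceeds the larger of the other two channels), not a definition taken from the specification.

```python
def is_red_hued(r, g, b):
    """Identify a red-hued pixel: the red channel dominates both the
    green and blue channels (the r > g and r > b test above)."""
    return r > g and r > b

def red_saturation(r, g, b):
    """Assumed measure of red saturation: excess of red over the larger
    of the other two channels, normalized by the red value."""
    if r == 0:
        return 0.0
    return max(0.0, (r - max(g, b)) / r)
```

Under this sketch, a pixel such as (200, 100, 50) is red-hued with saturation 0.5, while a green-dominant pixel is not red-hued at all.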
  • the first color can be cyan
  • the second color can be magenta
  • the third color can be yellow.
  • the score can be an alphanumeric score.
  • the method can include displaying the score in association with the image data.
  • the image data can be stored on a storage medium and the score is displayed on the medium or packaging for the medium.
  • the image data can be provided via the internet and the score is displayed in association with a hyperlink to the image data.
  • the image data can be formatted as a digital video file.
  • the invention features a method, including: accessing an electronic file including text; displaying at least one letter of text on at least one area of background in a modified format on a color LCD display; wherein the average variance or average absolute difference in L/M cone stimulation is reduced by more than 60% compared to the unmodified format in the displayed area.
  • Implementations of the method can include one or more of the following features and/or features of other aspects.
  • the electronic file can be an e-book.
  • the electronic file can be a text file for reading or word processing.
  • the area of modified format can be chosen according to the area being read at that moment in time (e.g., based on eye-tracking or a touch sensor). Alternatively, or additionally, the area of modified format can be chosen according to the area not being read at that moment in time.
  • the scale can be based on a measure of difference or variance, for example.
  • As a measure of difference, one can calculate text stimulation on L cones, text stimulation on M cones, background stimulation on L cones, and background stimulation on M cones.
  • For each small area of the retina, calculate the average stimulation overall. Then calculate the absolute value of the difference between each cone's stimulation and the average for that area. Divide this result by the average stimulation, and average this value over the entire simulated retina.
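The steps above can be sketched as follows, with each simulated retinal area represented as a list of per-cone stimulation values (the function name and data layout are assumptions made for the example):

```python
def lm_difference_measure(areas):
    """Average normalized absolute difference in cone stimulation.

    `areas` is a list of retinal patches, each a list of per-cone
    stimulation values (L and M cones mixed). For each patch, compute
    the mean stimulation, then each cone's |stimulation - mean| / mean,
    and average that quantity over every cone in every patch.
    """
    diffs = []
    for cones in areas:
        avg = sum(cones) / len(cones)
        if avg == 0:
            continue  # an unstimulated patch contributes nothing
        diffs.extend(abs(c - avg) / avg for c in cones)
    return sum(diffs) / len(diffs) if diffs else 0.0
```

A uniformly stimulated retina yields 0, and the measure grows as neighboring cones within a patch are stimulated more unevenly, which is the quantity the modified text/background format is intended to reduce.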
  • the invention features a method, including: receiving an electronic file including a text, optionally on a mobile device including a display; selecting a display mode for displaying the text from the group consisting of a color display mode and a contrast display mode; and displaying a page of the text on the display using the selected display mode, wherein: for the color display mode, the text is displayed in a text color and a background is displayed in a background color, wherein the text and background colors have at least a 30% myopia reduction compared to black text on a white background based on the LMS Myopia Scale, and for the contrast display mode, a first area of the page of text is displayed with a first contrast level between the text and the background and a second area of the page of text is displayed with a second contrast level lower than the first level.
  • Implementations of the method can include one or more of the following features and/or features of other aspects.
  • the text and background colors can have at least a 35%, 40%, 45%, 50%, 55%, 60%, or 65% myopia reduction (e.g., 68% or more, 70% or more, 75% or more, 80% or more, 85% or more, 90% or more, such as about 95%) compared to black text on a white background based on the LMS Myopia Scale.
  • the method can include presenting a user with a selection of combinations of colors for the text and background colors, and allowing the user to select one of the combinations for the myopia-safe contrast display scheme.
  • the second contrast level can be provided by changing a luminance level of the background and/or the text. Alternatively, or additionally, the second contrast level can be provided by blurring edges of the text in the displayed page.
  • Displaying the page of text can include scanning the first area over the page of text.
  • the first area can be determined based on the words that are being viewed.
  • the mobile device can include a camera facing the viewer, and the mobile device can track the movement of the viewer's eyes using the camera to determine which words are being viewed.
  • the first area can be scanned at a speed corresponding to 100 to 500 words of the text per minute.
  • the display mode can be selected by accessing the electronic file using a mobile app on the mobile device.
  • the electronic file can be an e-book file.
  • the mobile device can be a smart phone, tablet computer, or dedicated e-reader. More generally, the device can be a personal computer (e.g., desktop or laptop) or other device that includes a monitor.
  • the invention features a mobile device, including: a display; an electronic processing module in communication with the display, the electronic processing module being programmed to: receive an electronic file including a text; receive a selection of a display mode for displaying the text, the display mode being selected from the group consisting of a color display mode and a contrast display mode; and display, on the display a page of the text using the selected display mode, wherein: for the color display mode, the text is displayed in a text color and a background is displayed in a background color, wherein the text and background colors have at least a 30%, 35%, 40%, 45%, 50%, 55%, or 60% myopia reduction compared to black text on a white background based on the LMS Myopia Scale, and for the contrast display mode, a first area of the page of text is displayed with a first contrast level between the text and the background and a second area of the page of text is displayed with a second contrast level lower than the first level.
  • Embodiments of the mobile device can include one or more features of other aspects.
  • the invention features a non-transitory computer-readable medium storing a program causing a mobile device to perform steps including: receiving an electronic file including a text on the mobile device; selecting a display mode for displaying the text from the group consisting of a color display mode and a myopia-safe contrast display mode; and displaying a page of the text on a flat panel display of the mobile device using the selected display mode, wherein: for the color display mode, the text is displayed in a text color and a background is displayed in a background color, wherein the text and background colors have at least a 60% myopia reduction compared to black text on a white background based on the LMS Myopia Scale, and for the contrast display mode, a first area of the page of text is displayed with a first contrast level between the text and the background and a second area of the page of text is displayed with a second contrast level lower than the first level.
  • the invention features a method for displaying an e-book using a combination of colors for text and background that have a reduced myopiagenic effect compared to black text on white background, the method including: presenting a user with one or more combinations of colors for the text and background identified as having a reduced myopiagenic effect, wherein none of the presented combinations include either black or white text or either black or white background, and, when viewed by the user's retina, an image composed of text and background rendered in any of the presented color combinations provides reduced center-surround contrast on the user's retina compared to the image viewed as black text on white background; receiving a selection of one of the color combinations from the user; and displaying the e-book file using the combination of colors for the text and background selected by the user.
  • Implementations of the method can include one or more of the following features and/or features of other aspects.
  • the reduced center-surround contrast due to the color combinations yields a myopiagenic effect reduced by at least 35% (e.g., 40% or more, 50% or more, 60% or more, 80% or more, such as up to 90%) as calculated using a myopia scale that calculates a center-surround contrast of a modeled visual receptive field and assigns a score to the color combinations based on the calculated center-surround contrast.
  • the center-surround contrast can be calculated based on a difference between an average stimulation of the visual receptive field center and a stimulation of its surround.
  • the visual receptive field center can correspond to a cone and the surround to its nearest neighbors.
  • the average stimulation can be determined based on LMS stimulus values of the cone and its nearest neighbors of the visual receptive field.
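One sketch of this center-surround computation takes the center as a single cone's stimulation value and the surround as its nearest-neighbor cones; the normalization by the overall mean of the receptive field is an assumption for the example, not a detail fixed by the specification.

```python
def center_surround_contrast(center, neighbors):
    """Center-surround contrast for one modeled receptive field:
    the absolute difference between the center cone's stimulation and
    the average stimulation of its nearest-neighbor cones, normalized
    by the mean stimulation of the whole field (assumed choice)."""
    all_vals = [center] + list(neighbors)
    overall = sum(all_vals) / len(all_vals)
    if overall == 0:
        return 0.0  # a completely unstimulated field has zero contrast
    surround = sum(neighbors) / len(neighbors)
    return abs(center - surround) / overall
```

Black text on a white background produces many fields where the center and surround differ maximally, whereas the reduced-myopiagenic color combinations are chosen so this quantity, aggregated over the retina, is substantially lower.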
  • the method can further include receiving information about a desired myopiagenic level from the user and presenting the one or more combinations of colors according to the received information, the presented combinations of colors having a myopiagenic effect corresponding to the desired level.
  • the information about the desired myopiagenic level can be a desired percentage reduction of myopia potential as calculated using a myopia scale that calculates an impact on the retina based on a differential stimulation between the center and surround of a modeled visual receptive field.
  • the presented combinations of colors can have a myopiagenic level within 10% (e.g., within 5%, 3%, 2%, 1%) of the desired percentage reduction of myopia potential as calculated using the myopia scale.
  • the myopia scale can be a LMS Myopia Scale.
  • the e-book can be a file in any of the following formats: Broadband eBooks (BBeB), Comic Book Archive, Compiled HTML, DAISY, DjVu, DOC, DOCX, EPUB, eReader, FictionBook, Founder Electronics, HTML, iBook, IEC62448, INF, KF8, KPF, Microsoft LIT, MOBI, Mobipocket, Multimedia eBooks, Newton eBook, Open Electronic Package, PDF, Plain text, Plucker, PostScript, RTF, SSReader, Text Encoding Initiative, TomeRaider, and Open XML Paper Specification.
  • the e-book can be displayed on a mobile device, such as a smartphone, a tablet computer, or a dedicated e-reader (e.g., a Kindle e-reader, a Nook e-reader).
  • the invention features a device for displaying an e-book, including: a display; an interface for receiving input from a user; and an electronic processing module programmed to cause the device to: (i) present the user with one or more combinations of colors for text and background identified as having a reduced myopiagenic effect, wherein none of the presented combinations include either black or white text or either black or white background, and, when viewed by the user's retina, an image composed of text and background rendered in any of the presented color combinations provides reduced center-surround contrast on the user's retina compared to the image viewed as black text on white background; (ii) receive a selection of one of the color combinations from the user via the interface; (iii) retrieve the e-book from memory; and (iv) display, using the display, the e-book using the combination of colors for the text and background selected by the user.
  • Embodiments of the device can include one or more of the following features and/or features of other aspects.
  • the reduced center-surround contrast due to the color combinations can yield a myopiagenic effect reduced by at least 35% (e.g., 40% or more, 50% or more, 60% or more, 70% or more, 80% or more, up to 90%) as calculated using a myopia scale that calculates a center-surround contrast of a modeled visual receptive field and assigns a score to the color combinations based on the calculated center-surround contrast.
  • the center-surround contrast can be calculated based on a difference between an average stimulation of the visual receptive field and a stimulation of the surround.
  • the visual receptive field can correspond to a cone and its nearest neighbors.
  • the electronic processing module can be further programmed to cause the device to receive information about a desired myopiagenic level from the user and present the one or more combinations of colors according to the received information, the presented combinations of colors having a myopiagenic effect corresponding to the desired level.
  • the information about the desired myopiagenic level can be a desired percentage reduction of myopia potential as calculated using a myopia scale that calculates an impact on the retina based on a differential stimulation between the center and surround of a modeled visual receptive field.
  • the interface can include a touch panel, mouse, or keyboard.
  • the display can be a flat panel display.
  • the device can be a smartphone, a tablet computer, or a dedicated e-reader.
  • the invention features a method for displaying an e-book using a combination of colors for text and background that have a reduced myopiagenic effect compared to black text on white background, the method including: displaying text using a text color other than black or white; and displaying a background to the text using a background color other than black or white, wherein an image displayed using the displayed text color on the displayed background color, when viewed by the user's retina, provides reduced center-surround contrast on the user's retina compared to the image when viewed in black and white.
  • Implementations of the method can include one or more of the following features and/or features of other aspects.
  • the text color and background color can yield a ratio of a Text Readability score to a myopia score on a LMS Myopia Scale greater than 0.60 (e.g., 0.65 or more, 0.7 or more, 0.75 or more).
  • the myopia potential can be reduced by more than 58% as calculated using an LMS myopia scale while the Text Readability score is decreased by no more than 65% (e.g., 60% or less, 50% or less, 40% or less) compared to the image when viewed as black text on white background.
  • the disclosed implementations can reduce the myopiagenic effect of electronic displays.
  • FIG. 1A is a plot showing normalized responsivity spectra of human cone cells, S, M, and L types.
  • FIG. 1B shows an example of cone mosaic on a retina.
  • FIG. 1C is a CIE 1931 chromaticity diagram showing equal energy illuminant points CIE-E, CIE-D65, and CIE-C.
  • FIG. 2 shows an embodiment of a system including a set top box for reducing the myopiagenic effect of a TV set.
  • FIG. 3 shows another embodiment of a system including a set top box for reducing the myopiagenic effect of a TV set.
  • FIG. 4A shows an embodiment of a local area network including a server for delivering content for which the myopiagenic effect has been reduced.
  • FIGS. 4B-4C show side cross-sections of a myopic eye and a normal eye, respectively.
  • FIG. 5A shows a stimulus composed of a black and white checkerboard array.
  • FIG. 5B shows a distribution of L, M, and S cones in a simulated retina.
  • FIG. 5C shows a level of stimulation of the cones in the simulated retina shown in FIG. 5B by the stimulus shown in FIG. 5A .
  • FIG. 6A shows a stimulus composed of an array of red pixels.
  • FIG. 6B shows a distribution of L, M, and S cones in a simulated retina.
  • FIG. 6C shows a level of stimulation of the cones in the simulated retina shown in FIG. 6B by the stimulus shown in FIG. 6A .
  • FIG. 7 shows a flowchart of an algorithm for producing a modified video signal for reducing the myopiagenic effect of a display.
  • FIG. 8A shows a stimulus for which the watercolor effect has been used to reduce the myopiagenic effect of the image.
  • FIG. 8B shows a stimulus for which the Cornsweet effect has been used to reduce the myopiagenic effect of the image.
  • FIG. 9 is a flowchart showing an algorithm for determining a cone stimulation level in a simulated retina.
  • FIG. 10 is a flowchart showing an algorithm for quantifying the myopiagenic effect of a stimulus.
  • FIGS. 11A and 11B show possible arrangements of cones in a simulated retina.
  • FIG. 12A is a schematic diagram showing the relationship between viewing distance and cone separation at maximal retinal resolution.
  • FIG. 12B is a schematic diagram illustrating a cone to pixel mapping for a 1080P 60′′ display.
  • FIG. 13 is a three-dimensional plot of calculated myopiagenic scale values as a function of different text and background colors.
  • FIG. 14A is a table listing calculated myopiagenic scale values and readability values for different text and background color combinations.
  • FIG. 14B is another table listing calculated myopiagenic scale values and readability values for different text and background color combinations.
  • FIG. 15A is a further table listing calculated myopiagenic scale values and readability values for two text and background color combinations.
  • FIG. 15B is a plot showing calculated cone stimulation from a strip of text between two strips of background for the color combination specified in the first row of the table in FIG. 15A .
  • FIG. 15C is a plot showing calculated cone stimulation from a strip of text between two strips of background for the color combination specified in the second row of the table in FIG. 15A .
  • FIG. 16A is another table listing calculated myopiagenic scale values and readability values for two additional text and background color combinations.
  • FIG. 16B is a plot showing calculated cone stimulation from a strip of text between two strips of background for the color combination specified in the first row of the table in FIG. 16A .
  • FIG. 16C is a plot showing calculated cone stimulation from a strip of text between two strips of background for the color combination specified in the second row of the table in FIG. 16A .
  • FIG. 17 is a flowchart showing an algorithm for displaying an e-book with a combination of colors for text and background that have a reduced myopiagenic effect compared to black text on white background.
  • FIG. 18 is a schematic diagram of an electronic processing module.
  • a set top box 100 for reducing the myopiagenic effect of a television (TV) set 130 is connected between a cable box 120 and TV set 130 .
  • a cable 125 connects an output port of cable box 120 to an input port of set top box 100
  • another cable 135 connects an output port of set top box 100 to an input port of TV set 130 .
  • Cables 125 and 135 are cables capable of carrying a video signal, including analogue video cables (e.g., composite video cables, S-video cables, component video cables, SCART cables, VGA cables) and digital video cables (e.g., serial digital interface (SDI) cables, digital visual interface (DVI) cables, HDMI cables, DisplayPort cables).
  • Set top box 100 includes an electronic processing module 110 and an internal power supply 140 .
  • Electronic processing module 110 includes one or more electronic processors (e.g., an application-specific integrated circuit (ASIC) or a field programmable gate array (FPGA)) programmed to receive an input video signal from the input port of set top box 100 and output a modified video signal to the output port.
  • Electronic processing module 110 may include other integrated circuit components (e.g., one or more memory blocks) and/or electronic components.
  • Internal power supply 140 is connected to a power port, to which a power supply cable 105 is connected.
  • the power supply cable 105 connects set top box 100 to an external power source, such as a standard plug socket.
  • Power supply 140 is configured to receive electrical power from the external power source and convert that power to power appropriate for powering electronic processing module 110 (e.g., AC-to-DC conversion at suitable current and voltage levels).
  • Internal wiring connects power supply 140 to electronic processing module 110 .
  • TV set 130 may include any appropriate color display including, for example, a light emitting diode (LED) display, a liquid crystal display (LCD), an LED-backlit LCD, an organic light emitting diode (OLED) display, a color projector display, a quantum dot display, a cathode ray tube (CRT), or a MEMS-based display, such as a digital micromirror device (DMD).
  • TV set 130 may be a direct view display or a projection display (e.g., a front or rear projection display).
  • cable box 120 receives an input signal, including a video signal, from a source via cable 122 .
  • cable 122 can be any of a variety of cables capable of carrying a video signal, such as an Ethernet cable, a co-axial cable, or a DSL line.
  • the input signal source can be a satellite dish, a cable TV and/or broadband internet provider, or a VHF or UHF antenna.
  • the input signal can include content in addition to video signals, such as audio signals, internet web pages, interactive video games, etc.
  • the input video signal includes a sequence of image frames. Each frame is composed of a series of rows and columns of pixels, possibly arranged as a pixel array, and the input video signal includes information about the color of each pixel in each frame.
  • the input RGB video signal includes, for each pixel in each frame, a value for red, r i , a value for green, g i , and a value for blue, b i .
  • the higher the value for each color the higher the intensity of the primary contributing to the pixel color.
  • the range of values for each color depends on the number of bits, or color depth, of the signal. For 24-bit color, for example, each component color has a value in a range from 0 to 255, yielding 256³ possible color combinations. Other color depths include 8-bit color, 12-bit color, 30-bit color, 36-bit color, and 48-bit color.
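The color-depth arithmetic above can be sketched briefly; this snippet is an illustration only (the function name is invented, not from the patent):

```python
# Illustrative: number of distinct colors available at a given color depth.
def color_combinations(bits_per_channel: int) -> int:
    levels = 2 ** bits_per_channel   # e.g., 256 levels per channel for 24-bit color
    return levels ** 3               # independent red, green, and blue channels

print(color_combinations(8))  # 16777216, i.e., 256^3 combinations for 24-bit color
```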
  • while the examples here describe RGB color coding in video signals, other color formats can be used; algorithms for transforming RGB signals to other color signal formats and back are known.
  • the electronic processing module 110 generates an output RGB video signal based on the input video signal so that the corresponding image displayed using TV 130 produces either (i) a reduced level of differential stimulation between L cones and M cones in a viewer's eye and/or (ii) a reduced level of differential stimulation between neighboring cones, compared with viewing an image produced using the input video signal.
  • the electronic processing module achieves this by outputting a video signal that includes, for each pixel in each frame, a value for red, r m , a value for green, g m , and a value for blue, b m , based on at least the respective values r i , g i , and b i for the corresponding pixel in the corresponding frame in the input video signal.
  • the video signal modification can vary depending on factors that include, e.g., settings on TV 130 , content being viewed, viewing time, the viewer's retinal composition, age, race or ethnicity, color vision status, etc. Exemplary algorithms for video signal modification are described below.
  • set top box 100 includes an internal power supply 140
  • other configurations are also possible.
  • an external power supply is used.
  • set top box 100 can draw power from batteries or from cable box 120 via cable 125 or a separate cable connecting the two components.
  • Set top box 100 can include additional components, such as memory buffers for buffering input signals before processing them, or modified signals after processing them before sending them to TV set 130 . Memory buffers may reduce latency during operation.
  • connections can be wireless connections (e.g., Wi-Fi connections or Bluetooth).
  • a TV set 200 includes an electronic processing module 210 in addition to a display panel 230 and display driver 220 .
  • a cable 205 connects cable box 120 to TV set 200 .
  • Electronic processing module 210 operates in a similar way as electronic processing module 110 described above in that it receives an input video signal from cable box 120 and outputs a modified video signal for reduced myopiagenia. Electronic processing module 210 directs the modified video signal to display driver 220 , which in turn directs drive signals to display panel 230 to display the modified images.
  • video signals can be from other sources.
  • video signals may be supplied from a video game console or television set top box instead of (or in addition to) a cable box.
  • video signals from a commercially-available set top box (such as Roku, Apple TV, Amazon Fire, etc.) or a digital video recording (DVR) device, such as TiVo, can also be modified.
  • video signals from video game consoles, such as Xbox consoles (from Microsoft Corp., Redmond, Wash.), PlayStation consoles (from Sony Corp., New York, N.Y.), or Wii consoles (from Nintendo, Redmond, Wash.), can be modified.
  • a modified video signal is provided by a networked server 320 via a WAN 310 (e.g., the internet) to one or more end users 340 - 344 and no additional hardware is required by the end user.
  • the original (unmodified) video signal may be received by networked server 320 from either a networked provider 330 or via a broadcast signal (e.g., VHF, UHF, or satellite signal) from a broadcaster 350 .
  • the concepts disclosed herein may be generally applied to other devices that contain a color display.
  • the concepts may be implemented in computer monitors, digital signage displays, mobile devices (e.g., smart phones, tablet computers, e-readers), and/or wearable displays (e.g., head-mounted displays such as virtual reality and augmented reality headsets, Google glass, and smart watches).
  • video signal modification can be applied via software solutions alone.
  • video signals can be modified using software solutions installed on existing hardware (e.g., using a display's video card or a computer's or mobile device's processor).
  • video signals are modified using an app downloaded, e.g., from the internet.
  • for a mobile device (e.g., running Google's Android operating system or Apple's iOS operating system), signal modification may be implemented using a downloaded app.
  • versions of the system can be implemented in software, in middleware, in firmware, in digital electronic circuitry, or in computer hardware, or in combinations of them.
  • the system can include a computer program product tangibly embodied in a machine-readable storage device for execution by a programmable processor, and method steps can be performed by a programmable processor executing a program of instructions to perform functions by operating on input data and generating output.
  • the system can be implemented in one or more computer programs that are executable on a programmable system including at least one programmable processor coupled to receive data and instructions from, and to transmit data and instructions to, a data storage system, at least one input device, and at least one output device.
  • Each computer program can be implemented in a high-level procedural or object-oriented programming language, or in assembly or machine language if desired; and in any case, the language can be a compiled or interpreted language.
  • Suitable processors include, by way of example, both general and special purpose microprocessors.
  • a processor will receive instructions and data from a read-only memory and/or a random access memory.
  • a computer will include one or more mass storage devices for storing data files; such devices include magnetic disks, such as internal hard disks and removable disks; magneto-optical disks; and optical disks.
  • Storage devices suitable for tangibly embodying computer program instructions and data include all forms of non-volatile memory, including by way of example semiconductor memory devices, such as EPROM, EEPROM, and flash memory devices; magnetic disks such as internal hard disks and removable disks; magneto-optical disks; and CD-ROM disks. Any of the foregoing can be supplemented by, or incorporated in, ASICs (application-specific integrated circuits).
  • Myopia, or nearsightedness, is a refractive effect of the eye in which light entering the eye produces image focus in front of the retina, as shown in FIG. 4B for a myopic eye, rather than on the retina itself, as shown in FIG. 4C for a normal eye.
  • the spatial factor refers to the degree to which an image contains high spatial frequency, high contrast features. Fine contrast or detail, such as black text on a white page, forms a high contrast stimulation pattern on the retinal cone mosaic.
  • the chromatic factor refers to how uniform blocks of highly saturated colors stimulate cone types asymmetrically, and therefore form a high contrast pattern on the retina. For example, red stimulates L cones more than M cones, whereas green light stimulates M cones more than L cones. Shorter wavelength light, such as blue, stimulates S cones more than either L or M cones.
  • the degree of color can refer to the number of pixels of that color, their saturation levels, or both.
  • red pixels may be identified as pixels for which r is greater than g and/or b by a threshold amount or a percentage amount.
  • red pixels may be identified as pixels that have a red hue in the 1931 or 1976 CIE color space.
  • green pixels could be identified as pixels for which g is greater than r and/or b by a threshold or percentage amount; or green pixels may be identified as pixels that have a green hue in the 1931 or 1976 CIE color space.
  • blue pixels could be identified as pixels for which b is greater than r or g by a threshold amount or a percentage amount; or blue pixels could be identified as pixels that have a blue hue in the 1931 or 1976 CIE color space.
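The threshold-based identification of red, green, and blue pixels described above might be sketched as follows; the specific threshold value and the function name are assumptions for illustration, not values from the patent:

```python
def classify_pixel(r, g, b, threshold=30):
    """Label a pixel red, green, or blue when one channel exceeds both
    others by at least `threshold` (one possible reading of the
    threshold-based identification described above)."""
    if r >= g + threshold and r >= b + threshold:
        return "red"
    if g >= r + threshold and g >= b + threshold:
        return "green"
    if b >= r + threshold and b >= g + threshold:
        return "blue"
    return "other"

print(classify_pixel(200, 40, 40))  # red
```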
  • Referring to FIGS. 5A-5C and 6A-6C , the spatial and chromatic effects can be explained as follows.
  • Each figure shows a hexagonal mosaic, corresponding to the spatial mosaic of cones on a retina.
  • the arrangement of cones is depicted in FIGS. 5B and 6B , where the L cones are colored red, the M cones are colored green, and the S cones are colored blue.
  • FIGS. 5A and 6A show two different types of stimuli at the retina and FIGS. 5C and 6C depict the cone responses due to the respective stimuli.
  • the stimulus in FIG. 5A corresponds to a high frequency, high contrast checkerboard pattern of white and black across the retina.
  • black refers to a pixel in its darkest state and white refers to a pixel in its brightest state.
  • black is typically represented by the values (0, 0, 0) and white by (255, 255, 255).
  • the spatial frequency of the checkerboard pattern is half the spatial frequency of the cones so, on a row by row basis, every alternate cone has a high response (due to stimulation by white light) and the adjacent cones see no response (because there is no incident light at all). This response is depicted in FIG. 5C .
  • the stimulus in FIG. 6A corresponds to homogeneous red light of uniform intensity across the retina.
  • in FIG. 6C , there is a low response by the M and S cones (depicted by black squares in the mosaic) and some response by the L cones (depicted as grey squares). Accordingly, the red stimulus results in a differential stimulation of cones within the retina, particularly L cones compared to M cones.
  • the present disclosure further recognizes that high spatial frequency, high contrast images can result in a similar myopiagenic response, and a more comprehensive solution should account for the effect of such images. For example, if one considers only the amount of red in an image when applying a correction, the myopiagenic effect of a red image (e.g., one that has L>M) is reduced, e.g., by introducing a green ring around the image and/or reducing saturation of the red image by decreasing the red level and/or increasing green. However, such an approach would not apply any improvement to an image on the basis of neighboring cone contrast.
  • a black and white checkerboard would not be improvable under the prior approach, because each black and each white pixel approximates an equal energy illuminant, and therefore would not be subject to an improved L/M ratio.
  • a black/white checkerboard would be subject to improvement in the present disclosure, because it creates high neighboring cone contrast; methods to improve such images are disclosed and described herein. Accordingly, algorithms that account for high spatial frequency effects are disclosed which can be used either alone or in combination with algorithms which reduce red saturation.
  • the color of each pixel in each frame can be modified based on one or more of the following parameters: (i) the color of the pixel in the frame itself; (ii) the location of the pixel in the frame, such as the proximity of the pixel to the edge of the frame; (iii) the color of another pixel in the frame, such as a neighboring pixel; (iv) the color of that same pixel in another frame, such as the preceding frame; and/or (v) the color of a different pixel in a different frame.
  • Implementations may reduce saturation of red pixels in an image, reduce contrast between adjacent pixels, or both.
  • an initial video signal 410 includes image information for a series of n initial frames, f 1 i , f 2 i , . . . , f n i .
  • Each frame is composed of k pixels, p 1 , p 2 , . . . , p k .
  • Each pixel is composed of three color component values, r i , g i , and b i , corresponding to values for red, green, and blue, respectively.
  • a relative level of stimulation of L cones, M cones, and/or S cones is determined for each pixel in each frame based on the values r i , g i , and b i .
  • this step may simply involve comparing the value of r i to the value of g i and/or b i for a pixel.
  • XYZ tristimulus values, LMS values, or other ways to measure cone stimulation may be calculated from the RGB values.
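As one possible realization of the LMS calculation mentioned above, linear RGB values can be mapped to XYZ tristimulus values and then to LMS cone responses using published matrices; the sRGB/D65 and Hunt-Pointer-Estevez matrices below are one common choice, and the patent does not specify which matrices are used:

```python
# Sketch: estimating LMS cone stimulation from linear RGB values.
# Inputs are linear (not gamma-encoded) RGB components in [0, 1].

RGB_TO_XYZ = [  # sRGB primaries, D65 white point
    [0.4124, 0.3576, 0.1805],
    [0.2126, 0.7152, 0.0722],
    [0.0193, 0.1192, 0.9505],
]
XYZ_TO_LMS = [  # Hunt-Pointer-Estevez transform
    [0.38971, 0.68898, -0.07868],
    [-0.22981, 1.18340, 0.04641],
    [0.0, 0.0, 1.0],
]

def mat_vec(m, v):
    """Multiply a 3x3 matrix by a 3-vector."""
    return [sum(m[i][j] * v[j] for j in range(3)) for i in range(3)]

def rgb_to_lms(r, g, b):
    """RGB -> XYZ -> LMS chain for a single pixel."""
    return mat_vec(XYZ_TO_LMS, mat_vec(RGB_TO_XYZ, [r, g, b]))

l, m, s = rgb_to_lms(1.0, 0.0, 0.0)  # pure red stimulates L cones more than M cones
```

This ordering (L > M for red, M > L for green) is what the relative-stimulation comparison in the step above relies on.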
  • one or more pixels are identified for color modification based on the relative level of L, M, and/or S cone stimulation by each pixel. For example, in some embodiments, red pixels are identified by comparing the RGB values or based on a hue of each pixel. In other embodiments, pixels are chosen because of high levels of color contrast with other neighboring pixels. In still other embodiments, pixels are chosen because of high differences in cone stimulation levels among neighboring cones.
  • pixels are identified based on the color of other pixels in the frame. For example, groups of adjacent red pixels (e.g., corresponding to red objects in an image) are identified for modification but lone red pixels are left unmodified. Alternatively, or additionally, pixels may be identified for color modification based on the color of the same pixel in other frames. For example, in some embodiments, red pixels that persist for more than one frame (e.g., for one or several seconds, or more) may be identified for color modification, but those red pixels that exist for only one or just a few frames (e.g., for less than about 1 second, 0.1 seconds, or 0.01 seconds) may be left unmodified.
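The persistence test above might be sketched as follows; the per-pixel counter bookkeeping and the three-frame threshold are illustrative assumptions, not the patent's exact method:

```python
def update_persistence(prev_counts, is_red_frame):
    """Track, per pixel, how many consecutive frames it has been red.
    Pixels red for at least a chosen number of consecutive frames become
    modification candidates; transient red pixels are left alone."""
    return [c + 1 if red else 0 for c, red in zip(prev_counts, is_red_frame)]

counts = [0, 0, 0]  # three example pixels
for frame in ([True, True, False], [True, False, False], [True, True, True]):
    counts = update_persistence(counts, frame)

candidates = [i for i, c in enumerate(counts) if c >= 3]
print(candidates)  # [0] — only pixel 0 was red in all three frames
```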
  • modified image data is generated based on the relative level of stimulation of L cones to M cones, or the level of adjacent cone contrast, and, in some cases, other factors (e.g., user preferences and/or aesthetic factors).
  • modification functions may be used. In general, the modification will reduce the level of red saturation in a pixel's color and/or reduce the contrast level between adjacent pixels or adjacent groups of pixels.
  • modified image data is generated by scaling r i , g i , and/or b i , e.g., by corresponding scale factors α, β, and γ.
  • the scale factors ⁇ , ⁇ , and/or ⁇ for each pixel may vary depending on a variety of factors, such as, for example r i , g i , and/or b i for that pixel, r i , g i , and/or b i of another pixel in the same frame, r i , g i , and/or b i of the same pixel in a different frame, r i , g i , and/or b i of a different pixel in a different frame, and/or other factors.
  • r i may be decreased for that pixel by some amount (i.e., 0 < α < 1) and/or g i may be increased for that pixel by some fractional amount (i.e., β > 1).
  • α and/or β are functions of the difference between r i and g i .
  • scale factors can be established so that the larger the difference between r i and g i , the more the red value in the modified signal is reduced relative to the initial signal and/or the more the green value in the modified signal is increased.
  • one simple mathematical formulation for this type of scale is: α = k α (r i − g i ) + c α and β = k β (r i − g i ) + c β , where k α and k β are proportionality constants and c α and c β are constant offsets.
  • k α is negative so that a larger difference between r i and g i results in a smaller value for α.
  • k β is positive so that β increases proportionally to the difference between r i and g i .
  • the proportionality constants and constant offsets may be determined empirically.
  • red pixels in the modified image will appear darker than in the initial image.
  • red pixels in the modified image will appear whiter (lighter) than in the initial image. In both cases, the degree of red saturation in the red pixels will decrease as the amount of red decreases relative to green.
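A minimal sketch of this scale-factor modification, assuming the linear form α = k α (r i − g i ) + c α described above; the constants and function name here are illustrative only (in practice they would be determined empirically):

```python
def desaturate_red(r, g, b, k_a=-0.3 / 255, c_a=1.0, k_b=0.3 / 255, c_b=1.0):
    """Scale red down and green up in proportion to (r - g), following the
    linear scale-factor form alpha = k_a*(r-g) + c_a, beta = k_b*(r-g) + c_b.
    Constants are illustrative placeholders, not values from the patent."""
    diff = max(r - g, 0)          # only modify pixels redder than green
    alpha = k_a * diff + c_a      # k_a < 0: larger difference, smaller alpha
    beta = k_b * diff + c_b       # k_b > 0: larger difference, larger beta
    clamp = lambda v: max(0, min(255, round(v)))
    return clamp(alpha * r), clamp(beta * g), clamp(b)

print(desaturate_red(220, 40, 30))  # red component drops, green rises
```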
  • alternatively, matrix multipliers may be used that create a linear transformation in which the values for r f , g f , and b f are derived from linear combinations of their corresponding initial values and the difference between r and g.
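One hypothetical way to realize such a linear transformation is a 3×4 matrix applied to the vector (r, g, b, r − g); the coefficients below are placeholders chosen for illustration, not values given in the patent:

```python
def linear_transform(r, g, b, M):
    """Apply a 3x4 matrix to (r, g, b, r - g), yielding the final
    (r_f, g_f, b_f) as linear combinations of the initial values and
    the red-green difference."""
    v = [r, g, b, r - g]
    return tuple(sum(M[i][j] * v[j] for j in range(4)) for i in range(3))

# Identity on RGB plus a small red-green correction driven by (r - g):
M = [
    [1.0, 0.0, 0.0, -0.2],  # reduce red in proportion to r - g
    [0.0, 1.0, 0.0, 0.2],   # boost green in proportion to r - g
    [0.0, 0.0, 1.0, 0.0],   # leave blue unchanged
]
print(linear_transform(200, 50, 50, M))  # (170.0, 80.0, 50.0)
```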
  • a modified video signal 440 is output, containing image information for a series of n modified frames, f 1 m , f 2 m , . . . , f n m , each containing the same number of pixels, k, as the initial frames.
  • the RGB values are modified from the input signal.
  • the other pixels may be unchanged from the input signal. For example, the color of all the red pixels may be modified, while the color of the pixels that are not red are left unchanged.
  • a pixel's color is modified based on the color of a different pixel in the same frame.
  • the algorithm can identify adjacent red pixels (e.g., corresponding to red objects in an image), and reduce r i − g i for those pixels by a certain amount, while leaving isolated red pixels unchanged or reducing r i − g i by a different (e.g., lesser) amount.
  • the effect of color modification perceived by a viewer's visual processing in the brain may be reduced, e.g., using perceptual illusions such as the so-called watercolor effect or so-called Cornsweet effect.
  • a red object may appear to be more saturated than it actually is when the edge of the object is more saturated than the interior.
  • the watercolor effect may be used when modifying the color of objects in a frame, particularly when they are bordered by pixels that have chromaticities in the opposite direction in color space or by much darker pixels. See, e.g., http://www.scholarpedia.org/article/Watercolor_illusion.
  • the watercolor effect is illustrated for a red circle against a black background.
  • the initial image features a highly saturated, uniformly red circle.
  • There is a radial gradient toward the center, where the gradient occurs on the outer ½ to ⅓ of the circle, avoiding the appearance of an annular discontinuity of the circle color.
  • the Cornsweet effect is an optical illusion in which a gradient within a central line or section creates the impression that one side of the image is darker than it actually is. This effect may be utilized to reduce the brightness of red objects that border other red objects, for example, to allow a reduction in myopiagenic contrast while preserving an impression to the viewer that the image is highly saturated.
  • FIG. 8B shows an example of the Cornsweet effect.
  • the leftmost side of the figure appears to be a brighter red than the right-hand side.
  • both sides have the same brightness.
  • the illusion is created by the dark to bright gradient between the two sides when viewed from left to right.
  • Using the Cornsweet effect, it may be possible to reduce the saturation of certain red objects adjacent to less saturated red objects, with minimal change perceived by the viewer, by introducing a light to dark gradient between the two objects.
  • Implementations that use illusions like the watercolor effect and Cornsweet effect may include additional image processing steps, such as identifying red objects in an image that may be candidates for the effect. Establishing candidacy of objects for these effects can be done based on factors such as the size and shape of the red object, uniformity of the red color of the object, and/or the nature of the bordering color.
  • the modification to a red pixel's color can vary based on the location of the pixel in a frame. For example, a pixel located closer to an edge of the frame may be modified, while a pixel of the same color located closer to the middle of the frame is unchanged or modified to a lesser degree.
  • the modification to a red pixel's color can also depend on the type of object that the pixels represent. Certain objects may be deemed important to preserve in their original colors. One example might be a company logo or branded product where the colors are very recognizable. Using image analysis, those objects could be identified by comparison to an image database, and flagged for differential treatment in the algorithm.
  • the color of a pixel in one frame may be modified based on the color of that pixel in another frame.
  • the color of colored objects that persist over a series of frames may be modified so that the degree of saturation of the reds in the object lessens over time.
  • the time scale and rate of color change may be sufficient so that the effect is not easily noticeable to a viewer, but effectively reduces color saturation or overall retinal contrast.
  • the degree to which red pixels are modified may increase over time. Accordingly, the longer the viewer views the display during a particular viewing session, the greater the degree of modification of the red pixels.
  • the algorithm may implement one or more techniques to improve computation efficiency and avoid, for example, issues with latency when delivering images to a display. For example, in some embodiments, only a subset of the pixels and/or frames are evaluated for modification. For example, for purposes of computational efficiency, not every frame is evaluated (e.g., only every other frame, or fewer, is evaluated). Such sampling may improve latency of the algorithm when executed in real time.
  • not every pixel is evaluated in every frame. For example, only those pixels proximate to the center of the frame (e.g., where the viewer is more likely to focus) are evaluated. Alternatively, only those pixels distant from the center of the frame, where the viewer is less likely to notice changes, are evaluated. Alternatively, or additionally, image analysis techniques can be applied to identify which portions of a frame are in focus (and therefore likely to be focused on by the viewer) and apply color modification only to those pixels in the focused portions.
  • the algorithm periodically samples pixels in each frame in order to decide whether to evaluate other pixels. For example, the algorithm can check the color of every 2 nd or fewer pixels (e.g., every 3 rd pixel or fewer, every 5 th pixel, every 10 th pixel or fewer, every 20 th pixel). In the event that this initial sampling detects a pixel that is a candidate for modification, the algorithm can apply color modification to the identified pixel. Pixels in between the sampled areas can either be left unmodified or further sampled to determine if they are candidates for modification. Alternatively, they could be modified by the same linear transformation as the initially sampled pixel, or interpolated values in between sampled pixels could be used to determine the final pixel values. Such sampling techniques may be useful to improve speed of the algorithm, so that it is not necessary to evaluate every pixel in every frame.
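The periodic sampling strategy above can be sketched as follows; the stride and the candidate test (a simple red-dominance check) are illustrative assumptions:

```python
def sample_candidates(pixels, stride=5, is_candidate=lambda p: p[0] > p[1] + 30):
    """Check only every `stride`-th pixel in a row and return the indices
    of sampled pixels that are modification candidates. Pixels between
    samples could then be left unmodified, further sampled, or modified
    by the same transformation as the nearest sampled pixel."""
    return [i for i in range(0, len(pixels), stride) if is_candidate(pixels[i])]

row = [(200, 40, 40)] * 10 + [(40, 200, 40)] * 10  # red half, then green half
print(sample_candidates(row))  # [0, 5] — only the sampled red pixels are flagged
```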
  • Compression techniques used for encoding images may also be used to improve efficiency.
  • chroma subsampling may be used. Examples of chroma subsampling include 4:2:2, 4:2:1, 4:1:1, and 4:2:0 subsampling. This subsampling may also be useful to improve speed of the algorithm, so that it is not necessary to evaluate every pixel in every frame.
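As a rough sketch of 4:2:0-style chroma subsampling (one chroma sample per 2×2 pixel block, with luma kept at full resolution), under the assumption that the chroma plane is a 2D list with even dimensions:

```python
def subsample_420(chroma):
    """Keep one chroma value per 2x2 block of pixels by averaging the
    block, halving chroma resolution in both dimensions. The luma plane
    would be kept at full resolution separately."""
    h, w = len(chroma), len(chroma[0])
    return [
        [sum(chroma[y + dy][x + dx] for dy in (0, 1) for dx in (0, 1)) / 4.0
         for x in range(0, w, 2)]
        for y in range(0, h, 2)
    ]

print(subsample_420([[10, 20], [30, 40]]))  # [[25.0]]
```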
  • the resolution of color pixels generally is reduced so that pixel rendering of color becomes easier without being readily noticeable to viewers.
  • the resolution could be kept the same as in the initial image, and in-between pixels would be derived from interpolated values or linear transformation based upon the sampled pixels.
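In the same spirit, a chroma plane can be averaged over 2×2 blocks, as in 4:2:0 subsampling, so that color need only be evaluated at a quarter of the original resolution. The following is an illustrative sketch of that averaging step only, not an encoder-grade implementation:

```python
import numpy as np

def subsample_420(chroma):
    """4:2:0-style chroma subsampling: average each 2x2 block of a chroma
    plane (height and width assumed even) to quarter resolution."""
    h, w = chroma.shape
    return chroma.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))
```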
  • Input from additional hardware components can also be used to modify the color modification algorithm.
  • the system can include an eye-tracking module in order to follow which location on the display a user is viewing. Subsequently, color modification is applied to only the location on the display being viewed. Alternatively, color modification is applied to only the locations on the display that are not being viewed.
  • Commercially-available eye tracking solutions may be used for this purpose.
  • An example of a commercially-available solution is the Tobii EyeX Controller, available from Tobii AB (Danderyd, Sweden).
  • the algorithm modifies those portions of an image that are not the focus of the viewer, but leaves the portion of the image that is focused on unchanged. In this way, the impact of the modification on the viewing experience is reduced because the modified pixels are in the viewer's periphery.
  • Text is often displayed in high-contrast black and white which, for reasons discussed previously, can elicit a particularly acute myopiagenic response even though these images typically contain no red pixels.
  • text can be rendered in high contrast only within a portion of the image (e.g., a viewing bubble) and text outside of this area can be display with reduced contrast and/or with a blurred effect.
  • the bubble can be moved over the text or the text can be moved through a stationary bubble.
  • the speed of relative movement may be selected according to a preferred reading speed of the user (e.g., 20 words per minute or more, 50 words per minute or more, 80 words per minute or more, 100 words per minute or more, 150 words per minute or more, 200 words per minute or more, 250 words per minute or more, 300 words per minute or more, 350 words per minute or more, 400 words per minute or more, 450 words per minute or more, 500 words per minute or more, up to about 800 words per minute).
  • the size and shape of the viewing bubble can also vary as desired.
  • the viewing bubble can correspond to an angle of about 20° or less in a user's field of view (e.g., 15° or less, 10° or less, 5° or less) in the horizontal and/or vertical viewing directions.
  • the viewing bubble can be elliptical, round, or some other shape.
  • the user can set the size and/or shape of the viewing bubble.
  • the viewing bubble can track the user's finger as it traces across lines of text.
  • Devices may utilize a touch screen for finger tracking.
  • the bubble can be moved by tracing with a stylus, mouse, or other indicator of attention.
  • eye-tracking technology can be used to follow the location on the display a user is viewing.
  • the algorithm can use information from an eye-tracking camera to identify pixels for modification in real time. Those pixels away from the viewed location are modified while the area of focus is unmodified (or modified to a lesser extent).
  • Eye-tracking may be particularly useful in mobile devices (e.g., using the front facing camera), computer monitors (e.g., using a video-conferencing camera), and/or with video game consoles, for example.
  • the algorithm calculates other quantifiable measures of cone stimulation by the image. For example, it is possible to model how much an image will differentially stimulate center-surround antagonism in the human visual system by directly quantifying the extent of spatial and chromatic contrast contained in the image. Relatively high center-surround antagonism is expected to result in a high degree of differential stimulation and therefore a larger myopia-causing effect than center-surround contrast that is relatively lower.
  • the algorithm's measures include only the contributions of L cones and M cones. In other embodiments, the contributions of S cones are also included.
  • calculating cone stimulation first involves translating RGB values for each pixel to a color space that quantitatively links the spectral content of the pixel to the physiologically perceived colors in human vision.
  • a color space is the CIE 1931 XYZ color space, discussed previously. This color space defines the XYZ tristimulus values analogously to the LMS cone responses of the human eye.
  • algorithms can compare X and Y (or X, Y, and Z, if desired). For example, in some cases, color modification is applied to those pixels for which X>Y and X>Z, but not for pixels where X≤Y and/or X≤Z.
  • cone stimulation values in LMS color space can be calculated from the XYZ tristimulus values (see, e.g., https://en.wikipedia.org/wiki/LMS_color_space). Algorithms for performing such calculations are known (see, e.g., the xyz2lms program, available at www.imageval.com/ISET-Manual-201506/iset/color/transforms/xyz2lms.html). With LMS values, color modification can be applied to candidate pixels, for example those whose L values are above a certain threshold and/or those pixels for which L>M (e.g., L>M and S).
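Under the assumption of linear sRGB primaries with a D65 white point and the CAT02 chromatic-adaptation matrix (one acceptable XYZ-to-LMS transformation, mentioned later in this description), the candidate-pixel test might be sketched as follows; display gamma is ignored for brevity:

```python
import numpy as np

# Linear sRGB -> XYZ (D65) matrix.
RGB_TO_XYZ = np.array([[0.4124, 0.3576, 0.1805],
                       [0.2126, 0.7152, 0.0722],
                       [0.0193, 0.1192, 0.9505]])
# CAT02 matrix (CIECAM02), XYZ -> LMS.
XYZ_TO_LMS = np.array([[0.7328, 0.4286, -0.1624],
                       [-0.7036, 1.6975, 0.0061],
                       [0.0030, 0.0136, 0.9834]])

def is_candidate(rgb):
    """Return True when a pixel's L cone response exceeds both M and S,
    i.e. the pixel is a candidate for color modification as described."""
    rgb = np.asarray(rgb, dtype=float) / 255.0
    l, m, s = XYZ_TO_LMS @ (RGB_TO_XYZ @ rgb)
    return bool(l > m and l > s)
```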
  • cone stimulation can be calculated directly using the physical properties of light.
  • Light intensity and wavelength from each of R, G, and B can be measured from a device such as a television, computer, or tablet. The intensity of each wavelength that passes through the eye and reaches the retina can be calculated.
  • These values can then be translated into stimulation of L, M, and S cones, for example by using the Smith-Pokorny cone fundamentals (1992) or the cone fundamentals as modified by Stockman and Sharpe (2000).
  • scales derived from calculations that determine cone stimulation based on LMS values are referred to as LMS myopia scales.
  • an exemplary algorithm 900 for determining cone stimulation levels by an RGB formatted stimulus on a simulated retina is as follows. Algorithm 900 starts ( 901 ) by establishing a simulated retina ( 920 ). Generally, this involves establishing a relative number of L, M, and S cones, and establishing their arrangement pattern. FIG. 6B shows an example of a simulated retina. Here, different numbers of L, M, and S cones are randomly arranged with hexagonal packing (i.e., on a brickwall-patterned grid).
  • Algorithm 900 receives the stimulus pattern in RGB format ( 910 ).
  • the RGB stimulus pattern corresponds to the colors of a pixel array, as discussed previously.
  • the pixel array can correspond to a single image frame or a portion of an image frame, for example.
  • each frame will correspond to a separate RGB stimulus pattern.
  • FIG. 6A shows an example of a stimulus pattern.
  • step 930 the RGB values for each element of the stimulus pattern are converted into a corresponding set of XYZ tristimulus values.
  • Such transformations are well-known. See, e.g., “Colour Space Conversions,” by Adrian Ford (ajoec1@wmin.ac.uk <defunct>) and Alan Roberts (Alan.Roberts@rd.bbc.co.uk), Aug. 11, 1998, available at http://www.poynton.com/PDFs/coloureq.pdf.
  • step 940 LMS values are calculated from each of the XYZ tristimulus values using, e.g., xyz2lms.
  • the stimulus pattern is then mapped onto the simulated retina.
  • the elements of the stimulus pattern are in a 1:1 correspondence with the cones of the simulated retina, and the mapping results in the selection of the L, M, or S value at each element of the stimulus pattern depending on whether the cone at the corresponding retina location is an L cone, an M cone, or an S cone, respectively.
  • a stimulation level at each cone is determined from the mapping (step 960 ). In some implementations, this determination simply involves assigning each cone the L, M, or S value based on the mapping. In certain cases, the LMS value is scaled to fall within a particular range or the LMS value is weighted to increase or decrease a contribution due to certain portions of the spectrum or other factors.
  • the algorithm ends ( 999 ) after outputting the cone stimulation levels.
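The retina-establishment and mapping steps of algorithm 900 can be sketched as follows. This is a simplified sketch: the 60:30:10 L:M:S ratio and the random square-grid mosaic are illustrative only, and the RGB-to-XYZ-to-LMS conversion of the stimulus (steps 930-940) is assumed to have already been performed.

```python
import numpy as np

def simulate_retina(shape, ratios=(0.6, 0.3, 0.1), seed=0):
    """Establish a simulated retina (step 920): a random mosaic of cone
    types, encoded 0 = L, 1 = M, 2 = S, with the given relative numbers."""
    rng = np.random.default_rng(seed)
    return rng.choice(3, size=shape, p=ratios)

def map_stimulus(lms_stimulus, retina):
    """Map an LMS stimulus (H x W x 3) onto the retina in 1:1
    correspondence (step 950), selecting the L, M, or S value at each
    element according to the cone type at that location (step 960)."""
    rows, cols = np.indices(retina.shape)
    return lms_stimulus[rows, cols, retina]
```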
  • Implementations may involve variations of algorithm 900 .
  • while algorithm 900 involves a 1:1 pixel-to-cone mapping, higher or lower mapping ratios may be used.
  • cone stimulation can be calculated for stimuli where more than one pixel is imaged to a single cone. This may occur, for example, in high resolution displays or where a display is viewed from relatively far away.
  • the algorithm can include an additional step of averaging the color of groups of pixels to provide a stimulus pattern having the same resolution and grid shape as the simulated retina. The number of pixels per cone may vary.
  • 2 or more pixels per cone may be used (e.g., 3 or more pixels per cone, 4 or more pixels/cone, 5 or more pixels per cone, 6 or more pixels per cone, 7 or more pixels per cone, 8 or more pixels per cone, 9 or more pixels per cone, or 10 pixels per cone).
  • the algorithm may account for fewer than one pixel being imaged to each cone (e.g., 2 or more cones per pixel, 3 or more cones per pixel, 4 or more cones per pixel, 5 or more cones per pixel, 6 or more cones per pixel, 7 or more cones per pixel, 8 or more cones per pixel, 9 or more cones per pixel, up to 10 cones per pixel).
  • This is the case with lower resolution displays, or when displays are viewed from a closer distance.
  • a pixel can be assigned to more than one grid point in a stimulus pattern having the same resolution and grid shape as the simulated retina.
  • Some implementations can include calculating (i.e., accounting for) the number of pixels per cone for a specific display and/or user.
  • the number of pixels per cone may be calculated from the pixel density for a display as follows. First, a typical maximum retinal resolution of 1 arc minute is assumed, as well as a viewing distance, d, that is typically ~2.5 times the display's diagonal dimension (i.e., a 60″ TV is viewed from 12.5′ away, and an iPhone 6 is viewed from a foot away). The calculation can be adjusted for other viewing distances, as desired.
  • a screen's size and resolution e.g., 1,920 ⁇ 1,080 for a 1080p 60′′ TV set, 1,334 ⁇ 750 for the Apple iPhone 6
  • the ratio of these numbers gives the number of pixels per cone (or the reciprocal). This is illustrated for a 60″ 1080p TV in FIG. 12B, for which the screen area per cone equals 0.24 mm².
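As a rough sketch of this calculation (assuming the ~2.5× diagonal default viewing distance and treating one arc minute of visual angle as the extent of one cone, both simplifications of the description above):

```python
import math

def pixels_per_cone(diag_in, res_x, res_y, viewing_distance_in=None,
                    cone_spacing_arcmin=1.0):
    """Estimate how many display pixels fall within one cone's visual
    angle. Viewing distance defaults to 2.5x the screen diagonal."""
    if viewing_distance_in is None:
        viewing_distance_in = 2.5 * diag_in
    # Physical pixel pitch from the diagonal size and resolution.
    diag_px = math.hypot(res_x, res_y)
    pixel_pitch_in = diag_in / diag_px
    # On-screen extent subtended by one cone's visual angle.
    cone_extent_in = viewing_distance_in * math.tan(
        math.radians(cone_spacing_arcmin / 60.0))
    return (cone_extent_in / pixel_pitch_in) ** 2
```

For a 60″ 1080p TV viewed from 150″, this gives on the order of a few pixels per cone; an iPhone 6 viewed from a foot away gives a different ratio, illustrating why the mapping ratio should be display-specific.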
  • the point spread function of light can be used to map the light coming from the pixels to cones in the retina.
  • the point spread function of light is due to imperfect optics of the human eye, and affects how incident light strikes the retinal cone mosaic.
  • the equal area cone fundamentals from FIG. 1 are used to calculate the relative excitation of L, M, and S cones.
  • Other implementations using other representations of the cone fundamentals are possible. These include cone fundamentals based on quanta, those corrected to energy terms, and those that have been normalized to peak values. Cone fundamentals for either a two-degree or ten-degree observer could be used, or any other observer for which cone fundamental data is available can be used. In addition, these calculations can be adjusted and made specific for a person's age, macular pigmentation, cone mosaic composition, and/or other factors.
  • the standard illuminant D65 is used for conversions between RGB, XYZ, and LMS.
  • other illuminants can be used, such as CIE-A (incandescent lamps), CIE-C, or CIE-E.
  • the CIECAM02 matrix is used to convert between XYZ values and LMS values. In other embodiments, other matrices are used to perform linear transformations. Any acceptable transformation matrix (or none at all, if XYZ values are used directly) can be used in this respect.
  • an algorithm 1000 for scoring a digital video file is as follows. This algorithm, or similar algorithms, may be applied to other media, such as image files.
  • the algorithm starts ( 1001 ) by receiving (or generating) cone stimulus values for a simulated retina stimulated by a frame of the digital video file (step 1010 ).
  • the cone stimulus values may be determined using algorithm 900 shown in FIG. 9 , for example.
  • the algorithm calculates an average x of the LMS stimulus values for that cone (c) and each of its neighbors (n i ).
  • cone c is considered the center of a visual receptive field and the nearest neighbors are the surround.
  • x is calculated as: x = (c + n1 + n2 + . . . + nN)/(N + 1), where N is the number of neighbors included for cone c.
  • the number of neighbors will depend on the cone pattern in the stimulated retina and how many neighbors are included for each cone. In one embodiment, only the nearest neighbors are considered. For example, in a grid pattern, a cone has eight nearest neighbors. Such a pattern is illustrated in FIG. 11A . With hexagonal packing, each cone has six nearest neighbors as shown in FIG. 11B .
  • in steps 1030 and 1040, the difference between each neighbor stimulus value, ni, and the average, x, is calculated, squared, and divided by x: (ni − x)²/x.
  • This provides a measure of the relative difference in stimulation between the cone, c, and each of its nearest neighbors.
  • Summing these values over the neighbors gives a neighboring stimulation score (NSS), a quantitative measure of the level of stimulation of cone c relative to its nearest neighbors. It is believed that a relatively high NSS value represents a large differential response and therefore corresponds to a larger myopiagenic response from cone c than a lower NSS value.
  • other measures of the difference between ni and x may be used instead. Other alternatives include calculating a variance or a standard deviation of the values.
  • NSS values are calculated for each cone in the stimulated retina ( 1060 ) and then the NSS values can be averaged over the entire frame ( 1070 ). This process is repeated for each frame ( 1080 ) and then the NSS values averaged over all frames ( 1090 ).
  • the frame-averaged NSS value is scaled to a desired range (e.g., a percentage) and/or the media file is scored based on the frame-averaged NSS value.
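The per-cone NSS computation and frame averaging described above can be sketched as follows, for a retina represented as a 2-D array of stimulation values on a square grid (eight nearest neighbors, per FIG. 11A); edge cones are skipped for simplicity, which is an assumption of this sketch rather than part of the description:

```python
import numpy as np

def frame_nss(stim):
    """Average NSS over a frame: for each interior cone c, take the mean
    x of c and its 8 neighbors, then sum (n_i - x)**2 / x over the
    neighbors, and finally average the per-cone scores."""
    h, w = stim.shape
    scores = []
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            block = stim[y - 1:y + 2, x - 1:x + 2]
            neighbors = np.delete(block.ravel(), 4)  # drop center cone
            mean = block.mean()                      # center + 8 neighbors
            scores.append(np.sum((neighbors - mean) ** 2 / mean))
    return float(np.mean(scores))
```

A uniform field scores zero (no differential stimulation), while a high-contrast checkerboard scores high, consistent with the scale's intent.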
  • Table 1 below, provides exemplary results of such a calculation for varying stimuli.
  • a 100×100 pixel array was used (“pixel count”), and a 1:1 cone-to-pixel mapping assumed.
  • the percentage of L-to-M-to-S cones varied as indicated in columns 2-4.
  • the results of each calculation are provided in column 6 (“Raw Scale”). The score is quoted raw, un-normalized to any particular value.
  • center-surround models are also possible.
  • such models can account for a variety of factors that are believed to influence center-surround interactions, such as relative center and surround contrasts, relative phase/collinearity, width of surround, relative orientations, spatial frequencies, and speeds, threshold vs. suprathreshold, and individual differences, which are not generally mutually exclusive.
  • Another model for center-surround interactions is described by J. Xing and D. J. Heeger in “Measurement and modeling of center-surround suppression and enhancement,” in Vision Research , Vol. 41, Issue 5 (March 2001), pp. 571-583.
  • the model is based on a non-linear interaction of four components: local excitation, local inhibition, surround excitation, and surround inhibition.
  • the myopiagenic value can be normalized to a scale or assigned some other identifier indicative of the contents myopiagenic effect.
  • the value can be presented as a value in a range (e.g., from 1 to 10), as a percentage, or by some other alphanumeric identifier (e.g., as a letter grade), color scale, or description.
  • Myopiagenic scales for content may be useful in many ways.
  • a scale allows one to rate content (e.g., movies or other video files) as to its myopiagenic effect on a viewer.
  • a scale also provides an objective way to measure algorithms that modify images, including changing the colors of images. Scales can be used to rate the efficacy of algorithms designed to increase or decrease neighboring cone contrast, and likewise the efficacy of algorithms designed to increase or decrease myopiagenicity. For example, one can compare algorithms by comparing the scores of a common video file after it is modified using each respective algorithm. In some embodiments, one can use the scale to compare the effect on myopiagenic reduction of algorithms having differing computational efficiencies. For instance, one can evaluate the tradeoff between an algorithm that modifies every frame in a video file versus one that modifies fewer frames (e.g., every other frame, every third frame, etc.). Similarly, one can evaluate the tradeoff between algorithms that evaluate every pixel versus those that sample pixels within frames.
  • Quantitative myopiagenic scales may be useful in the design of products in addition to evaluating media.
  • myopiagenic scales can be used to evaluate combinations of colors in certain types of displays and identify those color combinations rating favorably on the myopiagenic scale.
  • Such color combinations are useful when displaying text, in particular, which is commonly displayed using black text on a white background at the maximum contrast allowed by the display.
  • the high level of contrast between the text and background produces high levels of contrast at a viewer's retina, which in turn leads to myopia.
  • the myopiagenic effects of reading may be reduced by selecting a color combination offering relatively low overall cone contrast. This may be useful in displaying text in various settings, including but not limited to e-book hardware, e-book software, word processing software, and the like.
  • a myopiagenic scale such as the one described above, may be useful for selecting color combinations for displaying text. This can be accomplished by evaluating, using the scale, different combinations of colors for text and background.
  • an exemplary evaluation was performed for a series of color combinations modeled using a 100×100 checkerboard of candidate text and background colors, with varying contrast edges.
  • This pattern provides a stimulus with 50% text color and 50% background color.
  • Other patterns providing different ratios between text and background colors can be used, which may be more representative of certain fonts, spacing, and margins (for example, approximately 5% text color, approximately 10% text color, approximately 15% text color, approximately 20% text color, approximately 25% text color, approximately 30% text color, approximately 35% text color, approximately 40% text color, or approximately 45% text color).
  • a simulated retina was used having a 100×100 cone pattern in a linear row-and-column grid, and a 1:1 ratio of pixels to cones was used.
  • each color was selected with values from 0-255 for each RGB.
  • the available color space was sampled using every color in steps of 50 (6³ values for each of text and background), resulting in a total of 6⁶, or 46,656, combinations.
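The coarse sampling of the color space can be reproduced as follows; each returned pair could then be rendered as a checkerboard stimulus and scored on the myopia scale. This is a sketch of the enumeration step only.

```python
from itertools import product

def candidate_combinations(step=50):
    """Enumerate text/background RGB pairs on a coarse grid: with
    step=50 there are 6 levels per channel (0, 50, ..., 250), so
    6**3 colors each and 6**6 = 46,656 text/background pairs."""
    levels = range(0, 256, step)
    colors = list(product(levels, repeat=3))
    return [(text, bg) for text in colors for bg in colors]
```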
  • a three-dimensional plot shows the results of the experiment.
  • the vertical scale gives the unscaled myopiagenic score.
  • the horizontal axes give the respective Text Color and Background Color. Note that the values on the horizontal scales are expressed in hexadecimal, where the 0-255 RGB values are converted to hex and the colors reported as RRGGBB.
  • Results range from myopiagenic scores of 0 (white text on white background and black text on black background) to 419.34 (black text on white background). Accordingly, color combinations that provide a reduced myopiagenic score compared to black text on white background (e.g., light green on cyan, with a score of 155) may be selected for use when displaying text.
  • the lowest scores are impractical because they provide no contrast between text and background and cannot be read.
  • color combinations with low but non-zero scores can be selected.
  • additional criteria may be considered when selecting e-reader color combinations.
  • an objective index for readability may be considered. Highest readability is expected to occur when the color system can differentiate best between text and background colors (e.g., when L and M values are most different between text and background). This is different from the myopiagenic scale, which assumes that the highest myopiagenic effect occurs when adjacent cones have the highest differential stimulation. In other words, the myopiagenic effect comes both from differences between text and background (which improve readability but increase myopia) and from differences within the text and within the background (which do not improve readability but increase myopia).
  • readability may be scored by surveying respondents. Alternatively, it can be scored based on color contrast between text and background using the LMS system or another color system. Such differences may be quantified using a formula such as the following:
  • R = √[(L1 − L2)²/(½(L1 + L2))] + √[(M1 − M2)²/(½(M1 + M2))] + √[(S1 − S2)²/(½(S1 + S2))]
  • L, M, and S are the values described above for which the subscript 1 refers to the text color and 2 refers to the background color.
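Assuming the formula sums, over the L, M, and S channels, the square root of the squared text/background difference normalized by the mean of the two channel values (the reading of the garbled expression above), the readability score can be computed as:

```python
import math

def readability(lms_text, lms_bg):
    """Readability score R: per-channel text/background contrast,
    normalized by the channel mean, summed over L, M, and S."""
    return sum(
        math.sqrt((t - b) ** 2 / (0.5 * (t + b)))
        for t, b in zip(lms_text, lms_bg)
    )
```

Identical text and background colors score 0 (unreadable, no contrast); larger per-channel differences score higher.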
  • results of several color combinations from an experiment are tabulated.
  • columns 1, 2, and 3 are the RGB values for the background color (each from 0-255), columns 4-6 are the corresponding X, Y, Z tristimulus values, and columns 7-9 the corresponding LMS values.
  • Columns 10, 11, and 12 are the RGB values for the text color (each from 0-255), columns 13-15 are the corresponding X, Y, Z tristimulus values, and columns 16-18 the corresponding LMS values.
  • the calculated myopiagenic scale score based on a 100×100 checkerboard grid with 50% text/50% background is given in column 19, and the % reduction in score relative to black text on white background (row 1) is given in column 20.
  • An example of the color scheme is shown in column 21.
  • the next four columns (22-25) give values related to the readability score.
  • column 22 gives the values for
  • Column 26 provides a composite score that consists of the ratio of the readability score to the myopia score.
  • FIG. 15A shows a table in which columns 1, 2, and 3 are the RGB values for the background color, columns 4-6 are the corresponding X, Y, Z tristimulus values, and columns 7-9 the corresponding LMS values.
  • Columns 10, 11, and 12 are the RGB values for the text color, columns 13-15 are the corresponding X, Y, Z tristimulus values, and columns 16-18 the corresponding LMS values.
  • Column 19 shows the myopiagenic scale score and column 20 shows the percent reduction (as a decimal) from black text on white background;
  • column 21 shows an example of text rendered using the color combination.
  • Columns 22-24 give the same parameters as columns 22-24 in FIG. 14
  • column 25 gives the readability score. Accordingly, using the scale described above, the myopia scores for the first and second combinations are similar (both approximately 18). As is evident (at least anecdotally) from the example text in column 21, the first color combination is easier to read than the second. This is borne out by their relative readability scores, which are approximately 2.0 and 0.1, respectively.
  • FIGS. 15B and 15C show simulated cone stimulation for the first color combination.
  • the text and background have different levels of stimulation, with text stimulation levels varying approximately within a range from 32 to 40.
  • the background stimulation levels vary within a lower, largely non-overlapping range approximately from 22 to 30.
  • FIG. 15C shows cone stimulation levels for the second color combination.
  • variance within text and background is similar to variance between text and background.
  • Both text and background have larger variance compared to the first color combination (ranging from approximately 35 to 55, with the exception of a few cones having lower stimulation values due to background, in this example from simulated S cones).
  • Cone stimulation of text overlaps with cone stimulation of background.
  • FIGS. 16A-16C illustrate the same principle for two further color combination examples.
  • the first color combination has RGB values (150, 150, 150) for background and (150, 50, 50) for text.
  • the second color combination has RGB values (250, 100, 250) for background and (150, 150, 200) for text. Again, anecdotally, the first color combination is significantly more readable than the second color combination.
  • Columns 1-26 show the same parameters as columns 1-26 in FIG. 15A.
  • FIG. 16B shows a plot of cone stimulation for a stripe of text between two stripes of background for the first color combination.
  • the text and background have significantly different levels of stimulation, and variance within the text and within the background is low compared to variance between text and background levels.
  • FIG. 16C shows a plot of cone stimulation for a stripe of text between two stripes of background for the second color combination. Variance within text and background is similar to variance between text and background. Both text and background have larger variance compared to the first color combination, and cone stimulation of text overlaps with cone stimulation of background.
  • While commercially-available e-readers include modes of operation that display text in color combinations other than black and white that may have a reduced myopiagenic effect compared to black text on a white background, it is believed that the disclosed implementations provide color combinations offering substantially greater reductions.
  • the NookColor offers “color text modes” such as “Night,” “Gray,” “Butter,” “Mocha,” and “Sepia” in addition to “Day” (basic black text against a white background); see, e.g., http://www.dummies.com/how-to/content/nook-tablet-text-and-brightness-tools.html.
  • color combinations having a myopia score using the LMS myopia scale of less than about 130 are possible (e.g., about 120 or less, about 110 or less, about 100 or less, about 90 or less, about 80 or less, about 70 or less, about 60 or less, about 50 or less, about 40 or less, about 30 or less, such as from about 20 to about 30).
  • such colors can offer an improvement in myopia reduction of about 65% or more (e.g., about 70% or more, about 75% or more, about 80% or more, about 85% or more, about 90% or more, about 95% or more).
  • Color combinations having a composite readability/myopia score of 0.80 or more are possible (e.g., 0.85 or more, 0.90 or more, 0.95 or more, 1.00 or more, 1.05 or more, 1.10 or more, 1.15 or more, 1.20 or more, 1.25 or more, 1.30 or more, 1.35 or more, 1.40 or more, such as 1.45).
  • e-reader or word processing solutions based on the above may be implemented in a variety of ways.
  • color combinations with favorable myopiagenic scores and readability scores may be selected by the user as an option.
  • the e-reader can present the user with a variety of color combination options, from which the user can select a desirable choice. This is advantageous because preferred color combinations are expected to vary from user to user, and providing a selection of choices will allow each user to use the color combination most desirable to them.
  • word processing solutions could be determined in a similar fashion.
  • Monochrome e-readers may be used having color combinations with reduced myopiagenic scores and relatively good readability based on scales such as those described above.
  • each pixel is composed of one or more “microcapsules” containing two types of pigmented particles having opposite charge. When a charge is applied to a particular pixel, the particles having like charge are repelled from one side of the pixel to the other, and those having opposite charge are attracted. Accordingly, by reversing the charge on the pixel, the pixel can take on the color of one pigment or the other, or various combinations of the two depending on how long the charge is applied.
  • pigments can be selected (alone or in combination with black and/or white pigments) to correspond to color combinations that have reduced myopiagenic scores relative to black and white pigments.
  • such pigment combinations can reduce contrast between adjacent neurons of the retina and/or reduce center-surround antagonism.
  • a user can input a desired level of myopia reduction and the e-reader returns a selection of color combinations that correspond to the desired level.
  • FIG. 17 shows an algorithm 1700 in which a user can select text-background color combinations having a desired level of myopia reduction.
  • the e-reader presents the user with an interface, such as an input box, slider, dropdown box, radio buttons, or other input tool, in which the user can input a desired level of myopia reduction.
  • the desired level can be a minimum amount of myopia reduction, a range of myopia reduction values, or a single value indicative of the desired level.
  • Levels may be expressed as a percentage (e.g., where the most myopiagenic combination corresponds to 0% reduction and the most myopia reducing combination is 100%) or on some other scale (e.g., from 0 to 10 or some other alphanumeric scale).
  • Upon receiving the user's input (step 1710), algorithm 1700 retrieves color combinations corresponding to the level designated by the user and presents one or more combinations to the user (step 1720).
  • the color combinations can be calculated using a myopia scale such as by the algorithm, or can be calculated beforehand and stored in a database (e.g., locally or remote) that is accessed by the algorithm.
  • the number of color combinations presented to the user can vary.
  • the algorithm can present only a subset of combinations that most closely match the user's desired level (e.g., 10 or fewer, 8 or fewer, 5 or fewer).
  • the algorithm can present those color combinations that match the user's desired myopia reduction level within a certain range (e.g., within 10% of the desired level, within 5%, within 2%, within 1%).
  • Upon viewing the presented color combinations, the user selects the desired combination. Upon receiving the selection (step 1730), the algorithm displays text using the selected color combination (step 1740).
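A sketch of the retrieval step of algorithm 1700, assuming a hypothetical precomputed list of (text color, background color, reduction) entries such as might be stored in a local or remote database; the tolerance and limit parameters are illustrative assumptions:

```python
def present_combinations(desired_reduction, combos, tolerance=0.05, limit=5):
    """Return up to `limit` color combinations whose myopia reduction
    (expressed as a decimal fraction) is within `tolerance` of the
    user's desired level, closest matches first."""
    matches = [c for c in combos
               if abs(c[2] - desired_reduction) <= tolerance]
    matches.sort(key=lambda c: abs(c[2] - desired_reduction))
    return matches[:limit]
```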
  • the algorithm can present color combinations to the user based on one or more criteria in addition to the desired level of myopia reduction. For instance, the user can be presented color combinations based on a readability score (see above) in addition to level of myopia reduction. Alternatively, the user can be presented color combinations based on the preferences gathered from other users or the preferences previously expressed by a particular user and/or derived by previous behavior of a particular user or group of users.
  • the algorithm includes a recommendation engine that provides a selection of myopia-reducing color combinations based on the nature of content in the e-book.
  • the recommendation can vary depending on whether the e-book is primarily text (e.g., a novel or nonfiction book), contains both text and figures (e.g., a textbook, magazine, or newspaper), or is primarily figures (e.g., a graphic novel or comic).
  • Recommended color combinations for different e-book content can be based on a myopiagenic scale (e.g., the LMS scale described above) which is used to evaluate the myopiagenic effect of different types of content.
  • recommendations can be based on data collected and observed about user preferences (e.g., those of the individual user in front of the screen at the moment, broad sets of user data accumulated over time from many users, or both) indicating which combinations may be preferable or suitable for reading different types of content.
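The content-based recommendation described above can be sketched as a lookup keyed on content type, with any previously expressed user preference taking priority. The category names, colors, and the `user_history` structure are assumptions for illustration, not part of the specification.

```python
# Hedged sketch of a recommendation rule keyed on e-book content type
# (primarily text, mixed text and figures, or primarily figures), with
# per-user history overriding the defaults. All values are illustrative.

CONTENT_RECOMMENDATIONS = {
    "text":    {"text": "#1A1A6E", "background": "#FDF6E3"},  # novels, nonfiction
    "mixed":   {"text": "#222222", "background": "#F5EEDC"},  # textbooks, magazines
    "figures": {"text": "#000000", "background": "#FFFFFF"},  # graphic novels, comics
}

def recommend(content_type, user_history=None):
    """Return a color combination for the content type, preferring any
    combination this user previously chose for that type of content."""
    if user_history and content_type in user_history:
        return user_history[content_type]
    # Fall back to the default for primarily-text content if the type is unknown.
    return CONTENT_RECOMMENDATIONS.get(content_type, CONTENT_RECOMMENDATIONS["text"])
```

A fuller engine could blend this lookup with myopia scores and aggregated preferences from many users, as the text describes.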
  • an e-reader can include modes for users: a conventional mode that displays e-books using conventional color schemes, and a myopia-safe mode for displaying e-books using a color combination with a reduced myopiagenic effect compared to the conventional mode.
  • different color combinations can be associated with different accounts on a device.
  • an e-reader can feature a user experience that allows a parent to create settings with different myopia reduction levels for one or more children as well as for themselves. In other words, children may be unable to select color combinations when operating the e-reader under their own account (or at least have a reduced ability to change display colors), while an administrator (e.g., adult) account retains full control over the settings.
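The per-account restriction described above can be sketched as a permission check on color changes. The `Account` class and its fields are hypothetical names introduced for illustration.

```python
# Illustrative sketch (class and field names are assumptions): per-account
# display settings in which only administrator (e.g., adult) accounts may
# change a color combination, reflecting the parental controls described above.

class Account:
    def __init__(self, name, is_admin, combination):
        self.name = name
        self.is_admin = is_admin
        self.combination = combination  # (text_color, background_color)

def set_combination(account, requested_by, combination):
    """Apply a new color combination to `account`, but only if the
    requesting account has administrator privileges."""
    if not requested_by.is_admin:
        raise PermissionError(f"{requested_by.name} may not change display colors")
    account.combination = combination
```

Under this sketch a parent account can set a child's colors, while the child's own attempt to change them is refused.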
  • the color combinations used to present text and background can vary (automatically, or upon prompting) over time.
  • a myopia-reduced mode can begin a reading session using a color combination having a first level of myopia reduction and change the color combination as the reading session progresses.
  • colors with increasing myopia reduction can be used as a reading session progresses (e.g., as measured by time or progress in reading the content). The color changes can happen automatically. Alternatively, the user can be prompted to change the color combination as the reading session progresses.
  • the e-reader can change between color combinations that have similar myopia scores as a reading session progresses, e.g., simply to present a change for the user.
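The progression described in the preceding bullets can be sketched as a schedule keyed on reading progress. The thresholds, colors, and reduction values below are illustrative assumptions, not values from the specification.

```python
# Minimal sketch of the time-varying behavior described above: as a reading
# session progresses (measured here as a fraction in [0, 1] of time or content
# read), step through combinations of increasing myopia reduction.

SCHEDULE = [
    # (progress threshold, combination to apply once progress reaches it)
    (0.00, {"text": "#303030", "background": "#FFFFFF", "reduction": 0.10}),
    (0.33, {"text": "#26265E", "background": "#FDF2DC", "reduction": 0.35}),
    (0.66, {"text": "#1A1A6E", "background": "#F8E7C9", "reduction": 0.55}),
]

def combination_for_progress(progress):
    """Return the scheduled combination for reading progress in [0, 1]."""
    current = SCHEDULE[0][1]
    for threshold, combo in SCHEDULE:
        if progress >= threshold:
            current = combo
    return current
```

The same structure could instead cycle between combinations of similar myopia score, as the last bullet suggests, by making the schedule periodic rather than monotonically increasing.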
  • Myopia-reduced color combinations can be implemented in an e-reader in a variety of ways. For example, myopia-reduced color combinations can be included as part of the operating system of the e-reader as discussed above. Alternatively, the myopia-reduced color combinations can be implemented via software as an add-on to existing e-reader programs or as standalone e-reader applications that can be installed on an e-reader, other mobile device, or any other device used for reading e-books.
  • an e-book in any format can be displayed using a combination of colors that has a reduced myopia potential compared to black and white, including (without limitation) Broadband eBooks (BBeB) (e.g., e-book files using extensions .lrf, .lrx), Comic Book Archive files (e.g., e-book files using file extensions .cbr (RAR), .cbz (ZIP), .cb7 (7z), .cbt (TAR), .cba (ACE)), Compiled HTML (e.g., e-book files using extension .chm), DAISY ANSI/NISO Z39.86, DjVu (e.g., e-book files using extension .djvu), DOC (e.g., e-book files using extension .DOC), DOCX (e.g., e-book files using extension .DOCX), EPUB (e.g., e-book files using extension .epub), …
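Because the color scheme is applied regardless of format, a reader application first needs to identify the format, typically from the file extension. The sketch below covers only the extensions named in the text; the function name and the `"unknown"` fallback are illustrative assumptions.

```python
# Illustrative sketch: map a file extension to one of the e-book formats
# listed above, so a myopia-reduced color scheme can be applied uniformly.
import os

EXTENSION_FORMATS = {
    ".lrf": "BBeB", ".lrx": "BBeB",
    ".cbr": "Comic Book Archive", ".cbz": "Comic Book Archive",
    ".cb7": "Comic Book Archive", ".cbt": "Comic Book Archive",
    ".cba": "Comic Book Archive",
    ".chm": "Compiled HTML",
    ".djvu": "DjVu",
    ".doc": "DOC", ".docx": "DOCX",
    ".epub": "EPUB",
}

def detect_format(path):
    """Return the e-book format for a file path, case-insensitively."""
    ext = os.path.splitext(path)[1].lower()
    return EXTENSION_FORMATS.get(ext, "unknown")
```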
  • aspects of the systems and methods described here can be implemented in digital electronic circuitry, or in computer software, firmware, or hardware, including the structures disclosed in this specification and their structural equivalents, or in combinations of one or more of them.
  • the electronic processing modules disclosed above can be implemented using digital electronic circuitry, or in computer software, firmware, or hardware, or in combinations of one or more of them.
  • the term “electronic processing module” encompasses all kinds of apparatus, devices, and machines for processing data and/or control signal generation, including by way of example a programmable processor, a computer, a system on a chip, or multiple ones, or combinations, of the foregoing.
  • the module can include special purpose logic circuitry, e.g., an FPGA (field programmable gate array) or an ASIC (application specific integrated circuit).
  • the module can also include, in addition to hardware, code that creates an execution environment for the computer program in question, e.g., code that constitutes processor firmware, a protocol stack, a database management system, an operating system, a cross-platform runtime environment, a virtual machine, or a combination of one or more of them.
  • the module and execution environment can realize various different computing model infrastructures, such as web services, distributed computing and grid computing infrastructures.
  • a computer program (also known as a program, software, software application, script, or code) can be written in any form of programming language, including compiled or interpreted languages, declarative or procedural languages.
  • a computer program may, but need not, correspond to a file in a file system.
  • a program can be stored in a portion of a file that holds other programs or data (e.g., one or more scripts stored in a markup language document), in a single file dedicated to the program in question, or in multiple coordinated files (e.g., files that store one or more modules, sub programs, or portions of code).
  • a computer program can be deployed to be executed on one computer or on multiple computers that are located at one site or distributed across multiple sites and interconnected by a communication network.
  • Some of the processes described above can be performed by one or more programmable processors executing one or more computer programs to perform actions by operating on input data and generating output.
  • the processes and logic flows can also be performed by, and apparatus can also be implemented as, special purpose logic circuitry, e.g., an FPGA (field programmable gate array) or an ASIC (application specific integrated circuit).
  • processors suitable for the execution of a computer program include, by way of example, both general and special purpose microprocessors, and processors of any kind of digital computer.
  • a processor will receive instructions and data from a read only memory or a random access memory or both.
  • a computer includes a processor for performing actions in accordance with instructions and one or more memory devices for storing instructions and data.
  • a computer may also include, or be operatively coupled to receive data from or transfer data to, or both, one or more mass storage devices for storing data, e.g., magnetic, magneto optical disks, or optical disks.
  • a computer need not have such devices.
  • Devices suitable for storing computer program instructions and data include all forms of non-volatile memory, media and memory devices, including by way of example semiconductor memory devices (e.g., EPROM, EEPROM, flash memory devices, and others), magnetic disks (e.g., internal hard disks, removable disks, and others), magneto optical disks, and CD ROM and DVD-ROM disks.
  • the processor and the memory can be supplemented by, or incorporated in, special purpose logic circuitry.
  • interaction with a user can be provided on a computer having a display device (e.g., a flat panel display, or another type of display device) for displaying information to the user and a keyboard and a pointing device (e.g., a mouse, a trackball, a tablet, a touch sensitive screen, or another type of pointing device) by which the user can provide input to the computer.
  • Other kinds of devices can be used to provide for interaction with a user as well; for example, feedback provided to the user can be any form of sensory feedback, e.g., visual feedback, auditory feedback, or tactile feedback; and input from the user can be received in any form, including acoustic, speech, or tactile input.
  • a computer can interact with a user by sending documents to and receiving documents from a device that is used by the user.
  • a computing system may include a single computing device, or multiple computers that operate in proximity or generally remote from each other and typically interact through a communication network.
  • Examples of communication networks include a local area network (“LAN”) and a wide area network (“WAN”), an inter-network (e.g., the Internet), a network comprising a satellite link, and peer-to-peer networks (e.g., ad hoc peer-to-peer networks).
  • a relationship of client and server may arise by virtue of computer programs running on the respective computers and having a client-server relationship to each other.
  • FIG. 18 shows an example electronic processing module 800 that includes a processor 810 , a memory 820 , a storage device 830 and an input/output device 840 .
  • the processor 810 is capable of processing instructions for execution within the system 800 .
  • the processor 810 can be a single-threaded processor, a multi-threaded processor, or another type of processor.
  • the processor 810 is capable of processing instructions stored in the memory 820 or on the storage device 830 .
  • the memory 820 and the storage device 830 can store information within the module 800 .
  • the input/output device 840 provides input/output operations for the module 800 .
  • the input/output device 840 can include one or more network interface devices (e.g., an Ethernet card), serial communication devices (e.g., an RS-232 port), and/or wireless interface devices (e.g., an 802.11 card, a 3G wireless modem, or a 4G wireless modem).
  • the input/output device can include driver devices configured to receive input data and send output data to other input/output devices, e.g., keyboard, printer and display devices 860 .
  • mobile computing devices, mobile communication devices such as smart phones or tablet computers, and other devices can be used.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Biophysics (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Human Computer Interaction (AREA)
  • Molecular Biology (AREA)
  • Biotechnology (AREA)
  • Spectroscopy & Molecular Physics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Evolutionary Biology (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Quality & Reliability (AREA)
  • Radiology & Medical Imaging (AREA)
  • Ophthalmology & Optometry (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Animal Behavior & Ethology (AREA)
  • Biomedical Technology (AREA)
  • Surgery (AREA)
  • Data Mining & Analysis (AREA)
  • Genetics & Genomics (AREA)
  • Physiology (AREA)
  • Controls And Circuits For Display Device (AREA)
  • Control Of Indicators Other Than Cathode Ray Tubes (AREA)
  • Image Processing (AREA)
  • Processing Or Creating Images (AREA)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US15/409,049 US20170205977A1 (en) 2016-01-18 2017-01-18 Methods for displaying an e-book using a combination of colors for text and background that have a reduced myopiagenic effect

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201662279954P 2016-01-18 2016-01-18
US15/409,049 US20170205977A1 (en) 2016-01-18 2017-01-18 Methods for displaying an e-book using a combination of colors for text and background that have a reduced myopiagenic effect

Publications (1)

Publication Number Publication Date
US20170205977A1 true US20170205977A1 (en) 2017-07-20

Family

ID=59314567

Family Applications (4)

Application Number Title Priority Date Filing Date
US15/409,049 Abandoned US20170205977A1 (en) 2016-01-18 2017-01-18 Methods for displaying an e-book using a combination of colors for text and background that have a reduced myopiagenic effect
US16/070,771 Expired - Fee Related US10621948B2 (en) 2016-01-18 2017-01-18 Evaluating and reducing myopiagenic effects of electronic displays
US16/070,765 Abandoned US20190057673A1 (en) 2016-01-18 2017-01-18 Method and apparatus for reducing myopiagenic effect of electronic displays
US16/798,955 Active US11205398B2 (en) 2016-01-18 2020-02-24 Evaluating and reducing myopiagenic effects of electronic displays

Family Applications After (3)

Application Number Title Priority Date Filing Date
US16/070,771 Expired - Fee Related US10621948B2 (en) 2016-01-18 2017-01-18 Evaluating and reducing myopiagenic effects of electronic displays
US16/070,765 Abandoned US20190057673A1 (en) 2016-01-18 2017-01-18 Method and apparatus for reducing myopiagenic effect of electronic displays
US16/798,955 Active US11205398B2 (en) 2016-01-18 2020-02-24 Evaluating and reducing myopiagenic effects of electronic displays

Country Status (9)

Country Link
US (4) US20170205977A1 (ko)
EP (2) EP3405925A2 (ko)
JP (2) JP2019503517A (ko)
KR (2) KR20180122609A (ko)
CN (2) CN109196574A (ko)
AU (2) AU2017209051A1 (ko)
CA (2) CA3011794A1 (ko)
TW (2) TW201737238A (ko)
WO (2) WO2017127444A1 (ko)

Cited By (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20190068896A1 (en) * 2016-12-21 2019-02-28 Samsung Electronics Co., Ltd. Method for producing media file and electronic device thereof
CN109814977A (zh) * 2019-02-02 2019-05-28 珠海金山网络游戏科技有限公司 一种文字显示方法、装置、计算设备及存储介质
US20190340791A1 (en) * 2017-09-12 2019-11-07 Jamison HILL Intelligent systems and methods for dynamic color hierarchy & aesthetic design computation
US10621948B2 (en) 2016-01-18 2020-04-14 Waveshift Llc Evaluating and reducing myopiagenic effects of electronic displays
US10872582B2 (en) 2018-02-27 2020-12-22 Vid Scale, Inc. Method and apparatus for increased color accuracy of display by compensating for observer's color vision properties
US10902644B2 (en) * 2017-08-07 2021-01-26 Samsung Display Co., Ltd. Measures for image testing
US10958987B1 (en) * 2018-05-01 2021-03-23 Amazon Technologies, Inc. Matching based on video data
US11048532B1 (en) * 2019-11-27 2021-06-29 Amazon Technologies, Inc. Device agnostic user interface generation based on device input type
US11110292B2 (en) * 2018-03-19 2021-09-07 New England College Of Optometry Balanced cone excitation for controlling refractive error and ocular growth to inhibit development of myopia
US11164352B2 (en) * 2017-04-21 2021-11-02 Intel Corporation Low power foveated rendering to save power on GPU and/or display
US20210345469A1 (en) * 2019-01-18 2021-11-04 Opple Lighting Co., Ltd. Measurement method and device of light source parameters, illumination system and terminal apparatus
US20210386284A1 (en) * 2020-06-16 2021-12-16 Natific Ag Color vision variability test system
US20220309982A1 (en) * 2021-03-25 2022-09-29 Boe Technology Group Co., Ltd. Color setting method, display device, non-transitory computer readable storage medium
US11463667B2 (en) * 2017-06-12 2022-10-04 Hewlett-Packard Development Company, L.P. Image projection
US11470326B2 (en) 2018-05-01 2022-10-11 Amazon Technologies, Inc. Encoder output coordination
US11837140B2 (en) * 2020-04-17 2023-12-05 Dolby Laboratories Licensing Corporation Chromatic ambient light correction
US11904102B2 (en) 2019-12-17 2024-02-20 Tobii Ab Method and system for applying adapted colour rendering on a display

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109886210B (zh) * 2019-02-25 2022-07-19 百度在线网络技术(北京)有限公司 一种交通图像识别方法、装置、计算机设备和介质
CN112148241B (zh) * 2019-06-28 2023-09-01 百度在线网络技术(北京)有限公司 灯光处理方法、装置、计算设备和存储介质
IT202100001022A1 (it) * 2021-01-21 2022-07-21 Univ Degli Studi Padova Dispositivo atto ad essere connesso tra un computer ed uno schermo per modifica in tempo reale di un segnale video digitale in uscita da detto computer verso detto schermo

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140368525A1 (en) * 2013-06-12 2014-12-18 Google Inc. Systems and methods for changing contrast based on brightness of an output for presentation on a display

Family Cites Families (63)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3524976A (en) 1965-04-21 1970-08-18 Rca Corp Binary coded decimal to binary conversion
CA1089981A (en) 1977-06-29 1980-11-18 Takao Tsuchiya Clamping circuit for color television signals
JPS5558688A (en) 1978-10-24 1980-05-01 Tokyo Hoso:Kk Color correcting circuit
GB2202708B (en) 1987-03-16 1991-05-29 Mitsubishi Electric Corp Color converting device
JP3626514B2 (ja) 1994-01-21 2005-03-09 株式会社ルネサステクノロジ 画像処理回路
US6278434B1 (en) 1998-10-07 2001-08-21 Microsoft Corporation Non-square scaling of image data to be mapped to pixel sub-components
KR100614326B1 (ko) 2000-01-24 2006-08-18 엘지전자 주식회사 디지털 티브이의 영상 제어 장치
CN1165183C (zh) 2000-05-15 2004-09-01 北京北达华彩科技有限公司 自适应色度补偿法及其补偿装置
GB0102154D0 (en) 2001-01-27 2001-03-14 Ibm Decimal to binary coder/decoder
EP1380177A1 (en) 2001-04-11 2004-01-14 Koninklijke Philips Electronics N.V. Picture signal contrast control
US7956823B2 (en) 2001-05-30 2011-06-07 Sharp Kabushiki Kaisha Color display device, color compensation method, color compensation program, and storage medium readable by computer
US6724435B2 (en) * 2001-08-06 2004-04-20 Oplus Technologies Ltd. Method for independently controlling hue or saturation of individual colors in a real time digital video image
US7205973B2 (en) * 2003-02-12 2007-04-17 Nvidia Corporation Gradual dimming of backlit displays
SG153630A1 (en) 2003-04-18 2009-07-29 Sharp Kk Color display device, color compensation method, color compensation program, and storage medium readable by computer
JP2005134866A (ja) 2003-04-18 2005-05-26 Sharp Corp カラー表示装置、色補正方法および色補正プログラム
US20050132087A1 (en) 2003-12-12 2005-06-16 Lech Glinski Method and apparatus for video signal skew compensation
US7471843B2 (en) 2004-02-04 2008-12-30 Sharp Laboratories Of America, Inc. System for improving an image displayed on a display
US20060227809A1 (en) 2005-03-23 2006-10-12 Miller Rodney D System and method for base band-directional communication protocol for time-division multiplexing graphic data applications
JP2006295595A (ja) 2005-04-12 2006-10-26 Sharp Corp 画像表示装置及び画像処理方法
US8054314B2 (en) 2005-05-27 2011-11-08 Ati Technologies, Inc. Applying non-homogeneous properties to multiple video processing units (VPUs)
KR20080025593A (ko) 2006-09-18 2008-03-21 삼성전기주식회사 디스플레이 영상의 색 보정을 위한 장치 및 방법
US20080204471A1 (en) 2006-10-27 2008-08-28 Jaeger Brian J Systems and methods for improving image clarity and image content comprehension
US7477778B2 (en) 2006-12-26 2009-01-13 Texas Instruments Incorporated Sequential color reproduction method
KR20080064568A (ko) 2007-01-05 2008-07-09 삼성전자주식회사 다양한 커넥터를 통해 입력되는 영상을 디스플레이하는디스플레이 장치
JP4139433B1 (ja) 2007-05-15 2008-08-27 スクルド・エンタープライズ有限会社 画像信号補正方法
US8111935B2 (en) 2007-10-03 2012-02-07 Himax Technologies Limited Image processing methods and image processing apparatus utilizing the same
US8094933B2 (en) 2007-12-13 2012-01-10 Global Oled Technology Llc Method for converting an input color signal
JP4560741B2 (ja) 2007-12-13 2010-10-13 ソニー株式会社 情報処理装置および方法、プログラム、並びに情報処理システム
KR101623890B1 (ko) 2007-12-20 2016-06-07 에이티아이 테크놀로지스 유엘씨 비디오 소스 디바이스와 비디오 싱크 디바이스를 구비하는 시스템에서의 비디오 프로세싱 조정
US8078658B2 (en) 2008-02-01 2011-12-13 International Business Machines Corporation ASCII to binary decimal integer conversion in a vector processor
WO2010062647A2 (en) 2008-10-28 2010-06-03 Pixtronix, Inc. System and method for selecting display modes
EP2379028B1 (en) 2008-12-22 2017-09-13 Medical College Of Wisconsin, Inc. Apparatus for limiting growth of eye length
JP4528861B2 (ja) 2009-01-19 2010-08-25 シャープ株式会社 画像表示装置及び画像表示方法
CN101533635A (zh) 2009-04-09 2009-09-16 南京Lg新港显示有限公司 一种显示器图像处理方法
TWI424425B (zh) 2009-07-22 2014-01-21 Chunghwa Picture Tubes Ltd 三色色彩灰階值轉換四色色彩灰階值的裝置及其方法、液晶顯示器及其驅動方法
US8847972B2 (en) 2010-01-20 2014-09-30 Intellectual Ventures Fund 83 Llc Adapting display color for low luminance conditions
JP5335706B2 (ja) 2010-01-21 2013-11-06 シャープ株式会社 画像表示装置
US20110279472A1 (en) * 2010-05-14 2011-11-17 Sony Ericsson Mobile Communications Ab Display arrangement and method of displaying information
JP2011248060A (ja) 2010-05-26 2011-12-08 Fujitsu Ten Ltd 表示装置、及び、表示方法
EP2580709A4 (en) * 2010-06-11 2016-05-25 Back In Focus SYSTEMS AND METHODS FOR RENDERING ON A DISPLAY DEVICE TO COMPENSATE A VISUAL DISPLAY OF A SPECTATOR
US9437160B2 (en) 2010-07-15 2016-09-06 Mersive Technologies, Inc. System and method for automatic color matching in a multi display system using sensor feedback control
CN101908330B (zh) * 2010-07-26 2012-05-02 武汉大学 一种低动态范围显示设备再现高动态范围图像的方法
TWI631334B (zh) 2011-01-14 2018-08-01 美國華盛頓大學商業中心 診斷及治療眼球長度相關病症之方法
WO2012145672A1 (en) 2011-04-21 2012-10-26 University Of Washington Through Its Center For Commercialization Myopia-safe video displays
GB2495317A (en) * 2011-10-06 2013-04-10 Sharp Kk Image processing method for reduced colour shift in multi-primary LCDs
JP2013104912A (ja) 2011-11-10 2013-05-30 Sony Corp 表示装置および表示方法
JP5825681B2 (ja) 2012-05-31 2015-12-02 国立大学法人 鹿児島大学 画像処理装置、画像処理方法及びプログラム
CN103631548B (zh) 2012-08-22 2019-01-08 慧荣科技股份有限公司 影像外接装置及处理影像外接装置的方法
JP5814966B2 (ja) * 2013-03-19 2015-11-17 キヤノン株式会社 画像表示装置及びその制御方法
US9965062B2 (en) 2013-06-06 2018-05-08 Microsoft Technology Licensing, Llc Visual enhancements based on eye tracking
JP5811228B2 (ja) 2013-06-24 2015-11-11 大日本印刷株式会社 画像処理装置、表示装置並びに画像処理方法及び画像処理用プログラム
WO2015044830A1 (en) * 2013-09-27 2015-04-02 Visuality Imaging Ltd Methods and system for improving the readability of text displayed on an electronic device screen
US10134342B2 (en) * 2013-11-18 2018-11-20 Elwha Llc Systems and methods for producing narrowband images
US10372788B2 (en) 2014-01-31 2019-08-06 Rakuten Kobo Inc. E-reader to help users with dyslexia by providing enhancement features including moving sentences within a paragraph away from an edge of a page
US9773473B2 (en) 2014-06-03 2017-09-26 Nvidia Corporation Physiologically based adaptive image generation
TWI602111B (zh) * 2014-06-10 2017-10-11 和碩聯合科技股份有限公司 電子裝置及使用者介面的色彩設定方法
GB201410635D0 (en) 2014-06-13 2014-07-30 Univ Bangor Improvements in and relating to the display of images
KR102634148B1 (ko) 2015-03-16 2024-02-05 매직 립, 인코포레이티드 건강 질환 진단과 치료를 위한 방법 및 시스템
EP3742324A1 (en) 2015-09-15 2020-11-25 Gatekeeper Ltd. System and method for securely connecting to a peripheral device
US9490880B1 (en) 2016-01-07 2016-11-08 Freecsale Semiconductor, Inc. Hardware-based time alignment of wireless links
US10127706B2 (en) * 2016-01-12 2018-11-13 Esight Corp. Language element vision augmentation methods and devices
KR20180122609A (ko) 2016-01-18 2018-11-13 웨이브시프트 엘엘씨 전자 디스플레이의 근시 발생 효과 평가 및 감소
US20200143536A1 (en) 2017-07-14 2020-05-07 University Of Washington Methods and systems for evaluating and reducing myopic potential of displayed color images

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140368525A1 (en) * 2013-06-12 2014-12-18 Google Inc. Systems and methods for changing contrast based on brightness of an output for presentation on a display

Cited By (23)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10621948B2 (en) 2016-01-18 2020-04-14 Waveshift Llc Evaluating and reducing myopiagenic effects of electronic displays
US11205398B2 (en) 2016-01-18 2021-12-21 Waveshift Llc Evaluating and reducing myopiagenic effects of electronic displays
US10645306B2 (en) * 2016-12-21 2020-05-05 Samsung Electronics Co., Ltd. Method for producing media file and electronic device thereof
US20190068896A1 (en) * 2016-12-21 2019-02-28 Samsung Electronics Co., Ltd. Method for producing media file and electronic device thereof
US11164352B2 (en) * 2017-04-21 2021-11-02 Intel Corporation Low power foveated rendering to save power on GPU and/or display
US11587273B2 (en) 2017-04-21 2023-02-21 Intel Corporation Low power foveated rendering to save power on GPU and/or display
US11463667B2 (en) * 2017-06-12 2022-10-04 Hewlett-Packard Development Company, L.P. Image projection
US10902644B2 (en) * 2017-08-07 2021-01-26 Samsung Display Co., Ltd. Measures for image testing
US20190340791A1 (en) * 2017-09-12 2019-11-07 Jamison HILL Intelligent systems and methods for dynamic color hierarchy & aesthetic design computation
US10789735B2 (en) * 2017-09-12 2020-09-29 Jamison HILL Intelligent systems and methods for producing a dynamic color combination for a design output
US10872582B2 (en) 2018-02-27 2020-12-22 Vid Scale, Inc. Method and apparatus for increased color accuracy of display by compensating for observer's color vision properties
US11110292B2 (en) * 2018-03-19 2021-09-07 New England College Of Optometry Balanced cone excitation for controlling refractive error and ocular growth to inhibit development of myopia
US10958987B1 (en) * 2018-05-01 2021-03-23 Amazon Technologies, Inc. Matching based on video data
US11470326B2 (en) 2018-05-01 2022-10-11 Amazon Technologies, Inc. Encoder output coordination
US20210345469A1 (en) * 2019-01-18 2021-11-04 Opple Lighting Co., Ltd. Measurement method and device of light source parameters, illumination system and terminal apparatus
CN109814977A (zh) * 2019-02-02 2019-05-28 珠海金山网络游戏科技有限公司 一种文字显示方法、装置、计算设备及存储介质
US11048532B1 (en) * 2019-11-27 2021-06-29 Amazon Technologies, Inc. Device agnostic user interface generation based on device input type
US11904102B2 (en) 2019-12-17 2024-02-20 Tobii Ab Method and system for applying adapted colour rendering on a display
US11837140B2 (en) * 2020-04-17 2023-12-05 Dolby Laboratories Licensing Corporation Chromatic ambient light correction
US20210386284A1 (en) * 2020-06-16 2021-12-16 Natific Ag Color vision variability test system
US11813021B2 (en) * 2020-06-16 2023-11-14 Natific Ag Color vision variability test system
US20220309982A1 (en) * 2021-03-25 2022-09-29 Boe Technology Group Co., Ltd. Color setting method, display device, non-transitory computer readable storage medium
US11908367B2 (en) * 2021-03-25 2024-02-20 Boe Technology Group Co., Ltd. Display device and color setting method based on user selections

Also Published As

Publication number Publication date
WO2017127457A3 (en) 2017-09-14
EP3405943A1 (en) 2018-11-28
CN109196574A (zh) 2019-01-11
JP6801003B2 (ja) 2020-12-16
WO2017127457A2 (en) 2017-07-27
KR20180122609A (ko) 2018-11-13
JP2019505019A (ja) 2019-02-21
KR20180123477A (ko) 2018-11-16
US11205398B2 (en) 2021-12-21
EP3405943A4 (en) 2019-01-02
EP3405925A4 (en) 2018-11-28
CA3011808A1 (en) 2017-07-27
KR102162339B1 (ko) 2020-10-06
US20190057673A1 (en) 2019-02-21
AU2017210160A1 (en) 2018-08-02
JP2019503517A (ja) 2019-02-07
CN109313793A (zh) 2019-02-05
TW201737238A (zh) 2017-10-16
TW201738877A (zh) 2017-11-01
US20200265796A1 (en) 2020-08-20
WO2017127444A1 (en) 2017-07-27
EP3405925A2 (en) 2018-11-28
AU2017209051A1 (en) 2018-08-02
TWI736574B (zh) 2021-08-21
US10621948B2 (en) 2020-04-14
US20190035358A1 (en) 2019-01-31
CA3011794A1 (en) 2017-07-27

Similar Documents

Publication Publication Date Title
US11205398B2 (en) Evaluating and reducing myopiagenic effects of electronic displays
CN108604133B (zh) 图像形成的改进
JP5654065B2 (ja) 映像処理方法および装置
US10297186B2 (en) Display device and image processing method thereof
US11182934B2 (en) Method and apparatus for color-preserving spectrum reshape
US10366673B2 (en) Display device and image processing method thereof
KR20140033890A (ko) 이미지 구동 방법 및 이를 이용하는 이미지 구동 장치
KR102389196B1 (ko) 표시장치와 그 영상 렌더링 방법
TW201232522A (en) Image display device and method of driving the same
US20200143536A1 (en) Methods and systems for evaluating and reducing myopic potential of displayed color images
Gong et al. Impacts of appearance parameters on perceived image quality for mobile-phone displays
US20230016631A1 (en) Methods for color-blindness remediation through image color correction
Dolph Perception and Mitigation of Artifacts in a Flat Panel Tiled Display System
JP2011053702A (ja) 表示装置

Legal Events

Date Code Title Description
AS Assignment

Owner name: WAVESHIFT LLC, CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:FERTIK, MICHAEL BENJAMIN;CHALBERG, THOMAS W.;SIGNING DATES FROM 20170202 TO 20170208;REEL/FRAME:041276/0073

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION