WO2021087398A1 - Myopia-safe video displays - Google Patents

Myopia-safe video displays

Info

Publication number
WO2021087398A1
Authority
WO
WIPO (PCT)
Prior art keywords
component
input
magnitude
output
blue
Prior art date
Application number
PCT/US2020/058410
Other languages
English (en)
Inventor
David William OLSEN
Original Assignee
Visu, Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Visu, Inc. filed Critical Visu, Inc.
Publication of WO2021087398A1

Classifications

    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G3/00Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
    • G09G3/20Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix no fixed position being assigned to or needed to be assigned to the individual characters or partial characters
    • G09G3/2003Display of colours
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/64Circuits for processing colour signals
    • H04N9/73Colour balance circuits, e.g. white balance circuits or colour temperature control
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2354/00Aspects of interface with display user
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2380/00Specific applications
    • G09G2380/08Biomedical applications

Definitions

  • Cone cells are photoreceptor cells in the retina of the eye that are responsible for color vision.
  • a human eye typically comprises three types of cones, each of which has a response curve (roughly a normal distribution) over a range of wavelengths of light and a peak sensitivity over a particular, smaller range of wavelengths of light.
  • Long wavelength sensitive cones, also referred to as L cones (or red cones), respond most intensely to light having long wavelengths: the peak sensitivities of L cones are typically around wavelengths 564-580 nm (greenish-yellow light).
  • Medium wavelength sensitive cones, also referred to as M cones (or green cones), respond most intensely to light having medium wavelengths (green light).
  • Short wavelength sensitive cones are also referred to as S cones (or blue cones).
  • S cones respond most to light having short wavelengths: the peak sensitivities of S cones are typically around wavelengths 420-440 nm (blue light).
  • Myopia is a refractive defect of the eye in which light entering the eye produces image focus in front of the retina, rather than on the retina itself.
  • Myopia is often colloquially referred to as nearsightedness.
  • Myopia may be measured in diopters, which is a unit of measurement of the optical power of the eye's lens, equal to the reciprocal of the focal length of the lens.
  • Each display screen, whether for televisions, computers, tablets, mobile phones, virtual reality (VR) headsets, or other devices, has unique spectral output characteristics. That is, the primary colors chosen by the respective display’s manufacturer to reproduce the intended image to the viewer each have their own unique spectrum, centered around a wavelength that the manufacturer has selected to represent that particular color. It is by combining these primary colors in varying proportions that the display screen is able to reproduce the intended colors of the image for the viewer. The full suite of colors available for a given screen is referred to as the gamut of the display.
  • the gamut can be visualized easily using a two-dimensional diagram, like the CIE chromaticity diagram shown in FIGURE 2, where all the colors in the display’s gamut are constrained within the region whose vertices are anchored at the primary colors’ wavelengths.
  • in three-primary-color screens, which are the most common types of screens today, those colors form a triangle made up of red, green, and blue. Screens which use more than three primary color drivers enable a richer gamut of colors to be presented to the viewer because they cover more area in the xy-plane (per FIGURE 2) and thus carve out a larger gamut.
  • Some screens, like analog CRTs, tend to produce very strong red signals, which preferentially activate L cones six to eight times more effectively than M cones.
  • the myopiagenic load of such video signals can be reduced by decreasing the magnitude of the red component in each pixel, thereby reducing adjacent LM cone contrast on the retina.
  • the integrity of the picture can be maintained by correspondingly increasing the green and/or blue sub-pixel weight by some offset or offsets, with the goal of maintaining roughly equivalent overall weighting of the sum of the three original sub-pixels.
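The red-attenuation step just described can be sketched as a minimal, hedged example; the attenuation fraction and the even split of the removed weight between green and blue are illustrative assumptions, not values specified in this disclosure:

```python
def attenuate_red(r, g, b, fraction=0.2):
    """Reduce the red sub-pixel weight by `fraction` and redistribute the
    removed weight to green and blue, so the sum of the three sub-pixel
    weights stays roughly constant. Components are floats in [0.0, 1.0].
    `fraction` and the equal green/blue split are illustrative assumptions."""
    removed = r * fraction
    r_out = r - removed
    # Boost green and blue by equal offsets, clamped to the valid range.
    g_out = min(1.0, g + removed / 2)
    b_out = min(1.0, b + removed / 2)
    return r_out, g_out, b_out
```

For a fully saturated red pixel (1.0, 0.0, 0.0) with a 20% fraction, this yields (0.8, 0.1, 0.1), preserving the overall sub-pixel weight of 1.0.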
  • blue sub-pixel emissions are a driver of myopia in predominantly blue-green hued pixels, where the magnitude of the M cone stimulation from the incident light from this pixel onto the retina exceeds that of the adjacent L cones.
  • Figure 3 illustrates this using measured data.
  • the blue primary drives the M-cone hard, almost as strongly as the red primary drives the L-cone, with a ratio of approximately 1.28-to-1.
  • Adjacent LM cone contrast on the retina of a viewer can be reduced by attenuating the magnitude of the blue sub-pixel emission in a pixel with a predominantly blue-to-green hue.
  • This methodology can be applied in series with the red sub-pixel attenuation methodology, wherein the myopiagenic toxicity of the red sub-pixel emissions is reduced through a similar approach. Used in concert, these two related but distinct methodologies can act synergistically to produce an even safer screen-viewing experience. The order of the two effects may not be critical to operation of the combined system, as long as they are cascaded serially.
  • the present disclosure describes methods for creating video signals including (a) receiving an input video signal including an input red component, an input green component, and an input blue component; (b) determining (i) that a magnitude of the input blue component is greater than a magnitude of the input green component and (ii) a differential between the magnitude of the input blue component and the magnitude of the input green component; and (c) sending an output video signal including an output red component, an output green component, and an output blue component, where at least one of the following is true: (i) the output blue component is decreased by a fractional amount relative to the input blue component based on the differential; and/or (ii) the output green component is increased by a fractional amount relative to the input green component based on the differential, and the output red component is increased by a fractional amount relative to the input red component based on the differential.
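Steps (a)-(c) can be illustrated with a short per-pixel sketch; the scale factors `k_down` and `k_up` are assumed values, since the disclosure leaves the exact fractional amounts open:

```python
def desaturate_blue(r, g, b, k_down=0.5, k_up=0.25):
    """Sketch of the blue-desaturation method. When the blue component
    exceeds the green component, compute the B-G differential, then
    (i) decrease blue by k_down * differential and/or (ii) increase
    green and red each by k_up * differential. Components are floats
    in [0.0, 1.0]; k_down and k_up are illustrative assumptions."""
    if b <= g:
        return r, g, b                 # no blue dominance: pass through
    diff = b - g                       # the B-G differential
    b_out = b - k_down * diff          # (i) attenuate blue
    g_out = min(1.0, g + k_up * diff)  # (ii) boost green ...
    r_out = min(1.0, r + k_up * diff)  # ... and red by the same fraction
    return r_out, g_out, b_out
```

A predominantly blue pixel such as (0.0, 0.2, 1.0) becomes (0.2, 0.4, 0.6): the B-G differential shrinks, reducing adjacent LM cone contrast while the overall weighting is roughly maintained.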
  • the myopia-driving sub-pixel under consideration here (blue, rather than red) is compared against the green sub-pixel to calculate the differential quantity. That is, green is subtracted from blue.
  • the reason for using green is related to the goal of at least some implementations described herein, which is to ameliorate the problem of too much adjacent LM cone contrast - in this case, too much M cone stimulation - due to the superposition of the M cone-stimulating wavelengths of the blue sub-pixel with the M cone stimulating wavelengths of the green sub-pixel.
  • the goal of the methods and technology described in this disclosure is to reduce this superposition effect, with minimal aesthetic impact on the screen.
  • the methodologies described in this disclosure can be used in concert with other methodologies (see, e.g., U.S. 9,955,133) to create a serial cascade of video processing effects.
  • the steps can be performed in either order.
  • Combining the techniques reduces myopia-genesis thanks to two complementary processes: (i) lowering of myopia-causing red emissions which overstimulate L cones (as described in the prior art) and (ii) lowering of myopia-causing green emissions which overstimulate M cones, all with minimal impact on image integrity.
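The serial cascade of the two complementary processes might be wired together as follows; both stage functions are hypothetical illustrations of the red-attenuation and blue-desaturation effects, with an assumed scale factor:

```python
def red_stage(r, g, b, k=0.25):
    """Illustrative red-attenuation stage: when red dominates green, shift
    a fraction k of the R-G differential from red into green and blue."""
    if r <= g:
        return r, g, b
    shift = (r - g) * k
    return r - shift, min(1.0, g + shift / 2), min(1.0, b + shift / 2)

def blue_stage(r, g, b, k=0.25):
    """Illustrative blue-desaturation stage: when blue dominates green,
    shift a fraction k of the B-G differential from blue into red and green."""
    if b <= g:
        return r, g, b
    shift = (b - g) * k
    return min(1.0, r + shift / 2), min(1.0, g + shift / 2), b - shift

def cascade(pixel, stages):
    """Apply the processing stages serially; per the disclosure, the order
    of the two effects may not be critical as long as they are cascaded."""
    r, g, b = pixel
    for stage in stages:
        r, g, b = stage(r, g, b)
    return r, g, b
```

A saturated red pixel is softened by the first stage and passes the second unchanged; a saturated blue pixel passes the first unchanged and is desaturated by the second.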
  • a non-transitory computer-readable medium that has program instructions stored thereon that are executable by at least one processor or GPU (graphics processing unit), the program instructions including instructions that cause the processor or GPU to perform steps of any embodiment or combination of embodiments of the methods of the first aspect.
  • the program instructions may include: (a) instructions for receiving an input video signal including an input red component, an input green component, and an input blue component; (b) instructions for determining (i) that a magnitude of the input blue component is greater than a magnitude of the input green component and (ii) a differential between the magnitude of the input blue component and the magnitude of the input green component; and (c) instructions for sending an output video signal including an output red component, an output green component, and an output blue component, where at least one of the following is true: (i) the output blue component is decreased by a fractional amount relative to the input blue component based on the differential; and/or (ii) the output green component is increased by a fractional amount relative to the input green component based on the differential, and the output red component is increased by a fractional amount relative to the input red component based on the differential.
  • the circuits are provided for creating video signals, including: (a) a comparator configured to (i) receive a blue component input signal and a green component input signal and (ii) output a blue-green differential signal including a difference of the blue component input signal and the green component input signal; (b) an arithmetic block configured to (i) receive the blue-green differential signal and (ii) output a scaled blue-green differential signal when the blue component input signal is of a greater magnitude than the green component input signal; (c) a first adder configured to (i) receive the red component input signal and the scaled blue-green differential signal and (ii) output a red component output signal including a summation of the red component input signal and the scaled blue-green differential signal; and (d) a second adder configured to (i) receive a green component input signal and the scaled blue-green differential signal and (ii) output a green component output signal including a summation of the green component input signal and the scaled blue-green differential signal.
  • devices are provided for creating video signals, including: (a) a comparator unit configured to receive a blue component input signal and a green component input signal; compare a magnitude of the blue component input signal and a magnitude of the green component input signal; and output a blue-green differential signal when the magnitude of the blue component input signal is greater than at least the magnitude of the green component input signal; and (b) an adding unit configured to receive (i) the blue-green differential signal, (ii) the green component input signal, (iii) a red component input signal, and (iv) the blue component input signal; output a green component output signal including a summation of the green component input signal and a fraction of the blue-green differential signal; output a red component output signal including a summation of the red component input signal and a fraction of the blue-green differential signal; and output a blue component output signal including a subtraction of a fraction of the blue-green differential signal from the blue component input signal.
  • FIGURE la shows the simultaneous excitation of all three types of cones when exposed to red phosphor stimuli, in accordance with some implementations of the current subject matter.
  • FIGURE lb shows the simultaneous excitation of all three types of cones when exposed to green phosphor stimuli, in accordance with some implementations of the current subject matter.
  • FIGURE 2 shows an example RGB (three-color) display gamut within a CIE Chromacity diagram, in accordance with some implementations of the current subject matter.
  • FIGURE 3 shows Oculus Rift CV1 primary color spectra as measured by the Neitz lab superimposed onto corneal sensitivity profiles for the S, M, and L cones of a typical 18-year-old, in accordance with some implementations of the current subject matter.
  • FIGURE 4 is a schematic of an example circuit, in accordance with some implementations of the current subject matter.
  • FIGURE 5 is a schematic of an example circuit, in accordance with some implementations of the current subject matter.
  • FIGURE 6 is a block diagram of an example computing device capable of implementing some implementations.
  • FIGURE 7 depicts an example computer-readable medium, in accordance with some implementations of the current subject matter.
  • FIGURE 8 is a flow chart illustrating an example method, in accordance with some implementations of the current subject matter.
  • FIGURE 9a shows a model unit cube representing a standard RGB display, in accordance with some implementations of the current subject matter.
  • FIGURE 9b shows a model unit cube representing an exemplary myopia-safe display for which the blue primaries would otherwise cause strong differential stimulation of adjacent LM cones, in accordance with some implementations of the current subject matter.
  • FIGURE 9c shows a model unit cube representing an exemplary myopia-safe display for which the red primaries would otherwise cause strong differential stimulation of adjacent LM cones, in accordance with some implementations of the current subject matter.
  • FIGURE 9d shows a model unit cube representing an exemplary myopia-safe display for which the red and blue primaries would otherwise cause strong differential stimulation of adjacent LM cones, in accordance with some implementations of the current subject matter.
  • the output blue component is decreased by a fractional amount relative to the input blue component based on the differential; and/or (ii) the output green component is increased by a fractional amount relative to the input green component based on the differential, and the output red component is increased by a fractional amount relative to the input red component based on the differential.
  • the methods can be used with any electronic displays that utilize the signals of standard video (i.e.: Red, Green, Blue (RGB)).
  • video displays include, but are not limited to, cathode ray tube (CRT) displays, liquid crystal displays (LCD), light-emitting diodes (LED), displays that combine LCDs with LEDs (LCD/LED), organic LED displays, plasma displays, digital light processing displays (DLP), and virtual reality (VR) headsets, among other examples.
  • CRT: cathode ray tube
  • LCD: liquid crystal display
  • LED: light-emitting diode
  • LCD/LED: display combining an LCD with LEDs
  • DLP: digital light processing
  • VR: virtual reality
  • M: medium wavelength (green) cones
  • L: long wavelength (red) cones
  • FIGURE la shows the simultaneous excitation of all three types of cones when exposed to red phosphor stimuli. As can be seen, the red cones are excited to a relatively large degree in FIGURE la. Similarly, FIGURE lb shows the simultaneous excitation of all three types of cones when exposed to green phosphor stimuli.
  • FIGURE 3 shows that the blue primary output spectrum of a popular VR headset overlaps significantly with the spectral sensitivity profile of the M (green) cones of a typical 18-year-old human, which implies that the blue sub-pixel output can also be a strong driver of LM cone contrast and thus myopia.
  • Color displays cause myopia by producing a difference in activity between L cones and M cones. In children and young adults whose eyes are still developing, differences in activity between neighboring cones signal the eye to grow in length. Overstimulation of the eye with images that produce activity differences between neighboring cones causes the eye to grow too long, resulting in nearsightedness. The amount of eye growth is proportional to the magnitude of the differences in activity between cones.
  • modern display types including but not limited to LCD, LCD/LED, organic LED displays, plasma displays, and DLP
  • the blue primaries in other video display devices such as VR headsets also drive myopia due to their strong activation of M-cones.
  • using the methods to reduce the difference in excitation levels of L and M cones can cause video displays to be more myopia-safe. Since viewers still perceive red, green, and blue even when the difference in L and M cone excitation is greatly reduced, a display can be altered such that the blues are desaturated without losing the basic coloring and image integrity of the original display.
  • the methods and myopia safe displays described herein produce a video output that reduces the activity differences between L and M cones produced by the display compared to a standard display, while having minimal impact on the viewing experience. This is achieved by desaturating the blue colors in the display with the goal of minimizing the superposition of M cone stimulating green emissions from the blue and green sub-pixels in the display.
  • the technique can be used in concert with other myopia-safe display techniques (see, e.g., U.S. 9,955,133) as a secondary process to make displays even more myopia-safe.
  • Blue primary colors which demonstrate spectral overlap with the M cone sensitivity profile, as illustrated in FIGURE 3, produce large activities in M cones but little activity in L cones.
  • Saturation, or more formally excitation purity, is defined here as the difference between a color and white along a line in the International Commission on Illumination 1931 color space (CIE 1931) chromaticity diagram in which all colors have the same hue.
  • White is, by definition, the color that produces equal activity in all three cone types.
  • Saturation is defined in terms of additive color mixing and has the property of being proportional to any scaling along a line in color space centered at white. The closer to white a color is, the more desaturated it is, the smaller the differences in L and M cone activity, and the less myopia-inducing it is.
  • color space is nonlinear in terms of psychophysically perceived color differences.
  • the blue color of video displays can be desaturated to drastically reduce the myopia-genic properties of the display with only modest changes in the perceptual experience. Combining any color with its complementary color produces white. If an amount of the complementary color that is less than the amount needed to make white is added, a desaturated version of the color results.
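The complementary-color relation can be made concrete with a small sketch in additive RGB, where the complement of a color is (1 - c) per channel; the mixing fraction is illustrative:

```python
def desaturate_toward_white(rgb, amount):
    """Add `amount` (0.0-1.0) of the complementary color to each channel,
    moving the color along the line toward white in additive RGB.
    amount = 1.0 yields white; amount = 0.0 leaves the color unchanged."""
    return tuple(c + amount * (1.0 - c) for c in rgb)
```

Adding the full complement to saturated blue (0.0, 0.0, 1.0) produces white (1.0, 1.0, 1.0); adding half the complement produces the desaturated blue (0.5, 0.5, 1.0).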
  • the “input video signal” for a particular pixel (or, for analog video signals, the pixel equivalent) in a particular video frame comprises red, green, and blue components.
  • the methods comprise determining (i) that an intensity of the blue component is greater than an intensity of the green component; and (ii) a differential between the intensity of the blue component and the intensity of the green component. Once the differential is determined, a modified output video signal is generated in which some of the blue color is desaturated to replace the original color with one less likely to induce myopia. This is done by at least one of the following processes:
  • the output green (G) component is increased by a fractional amount relative to the input green component based on the differential
  • the output red (R) component is increased by a fractional amount relative to the input red component based on the differential.
  • the method includes desaturating the blue component by increasing the output green component by a fractional amount relative to the input green component based on the differential and increasing the output red component by a fractional amount relative to the input red component based on the differential. In one further embodiment, the output green component and the output red component are increased by the same fractional amount.
  • the goal is to reduce the saturation of the blue colors in the image without changing their hue or brightness. This is mainly achieved by increasing G and R equally.
  • the output green component and the output red component are increased by different, respective, fractional amounts. Increasing R more than G or G more than R might be done to optimize the reduction of the potential for inducing myopia and to maximize the viewing experience.
  • the output blue component may be decreased by a fractional amount relative to the input blue component based on the differential.
  • the output blue (B) component is decreased by a fractional amount relative to the input blue component based on the differential, where the R and G components are not modified. Decreasing the B-G differential by decreasing B will tend to make the color darker (decreasing its brightness) and may be done in some circumstances to optimize the reduction of the potential for inducing myopia and to maximize the viewing experience.
  • the methods comprise changing an output component intensity by some fractional amount of the B-G differential, when B is greater than G.
  • if B-G is less than or equal to zero, R, G, and B are unchanged.
  • R and G are increased by a percentage of B-G and/or B is decreased by a percentage of the differential. Any suitable percentage can be used as is determined appropriate for a given system and a desired level of protection against myopia.
  • the method of any one of the claims is carried out only if the B-G differential is greater than a predetermined threshold.
  • a predetermined threshold may be set equal to 30% of B.
  • G must be less than or equal to 70% of B for the method to be carried out.
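The 30%-of-B threshold example reduces to the gate G ≤ 0.7·B, which can be expressed directly (a minimal sketch of the gating test only; the threshold fraction is the example value from above):

```python
def exceeds_threshold(g, b, threshold_fraction=0.3):
    """Gate for the thresholded variant: apply the method only when the
    B-G differential reaches the predetermined threshold, here set to
    threshold_fraction * B (30% of B in the example above). Equivalently,
    G must be less than or equal to 70% of B."""
    return (b - g) >= threshold_fraction * b
```

For B = 1.0, the method would run when G = 0.5 or G = 0.7 but not when G = 0.8.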
  • the present disclosure describes a non-transitory and/or physical computer-readable medium having program instructions stored thereon that are executable by at least one processor, the program instructions comprising instructions that cause the processor to perform steps of any embodiment or combination of embodiments of the methods of the first aspect of the implementation.
  • the program instructions may comprise:
  • the output blue component is decreased by a fractional amount relative to the input blue component based on the differential; and/or (ii) the output green component is increased by a fractional amount relative to the input green component based on the differential, and the output red component is increased by a fractional amount relative to the input red component based on the differential.
  • the computer-readable media can be used with any electronic display that utilizes the signals of standard video (i.e.: Red, Green, Blue (RGB)).
  • video displays include, but are not limited to, cathode ray tube (CRT) displays, liquid crystal displays (LCD), light-emitting diodes (LED), displays that combine LCDs with LEDs (LCD/LED), organic LED displays, plasma displays, digital light processing displays (DLP), and virtual reality (VR) headsets.
  • non-transitory and/or physical computer readable medium includes magnetic disks, optical disks, organic memory, and any other volatile (e.g., Random Access Memory (“RAM”)) or non-volatile (e.g., Read-Only Memory (“ROM”)) mass storage system readable by the CPU or GPU (graphics processing unit).
  • RAM Random Access Memory
  • ROM Read-Only Memory
  • the computer readable medium includes cooperating or interconnected computer readable media, which exist exclusively on the processing system or which may be distributed among multiple interconnected processing systems that may be local or remote to the processing system.
  • FIGURE 7 is a schematic illustrating a conceptual partial view of an example computer program product that includes a computer program for executing a computer process on a computing device, arranged according to at least some embodiments presented herein.
  • the example computer program product 700 is provided using a signal bearing medium 702.
  • the signal bearing medium 702 may include one or more programming instructions 704 that, when executed by one or more processors may provide functionality or portions of the functionality described herein with respect to FIGURES 1-6.
  • the signal bearing medium 702 may encompass a computer-readable medium 706, such as, but not limited to, a hard disk drive, a Compact Disc (CD), a Digital Video Disk (DVD), a digital tape, memory, etc.
  • the signal bearing medium 702 may encompass a computer recordable medium 708, such as, but not limited to, memory, read/write (R/W) CDs, R/W DVDs, etc.
  • the signal bearing medium 702 may encompass a communications medium 710, such as, but not limited to, a digital and/or an analog communication medium (e.g., a fiber optic cable, a waveguide, a wired communications link, a wireless communication link, etc.).
  • a communications medium 710 such as, but not limited to, a digital and/or an analog communication medium (e.g., a fiber optic cable, a waveguide, a wired communications link, a wireless communication link, etc.).
  • the signal bearing medium 702 may be conveyed by a wireless form of the communications medium 710.
  • the one or more programming instructions 704 may be, for example, computer executable and/or logic implemented instructions.
  • a computing device such as the computer system 600 of FIGURE 6 (discussed further below) may be configured to provide various operations, functions, or actions in response to the programming instructions 704 conveyed to the computer system 600 by one or more of the computer readable medium 706, the computer recordable medium 708, and/or the communications medium 710.
  • the non-transitory and/or physical computer readable medium could also be distributed among multiple data storage elements, which could be remotely located from each other.
  • the computing device that executes some or all of the stored instructions could be a computer system, such as computer system 600 illustrated in FIGURE 6.
  • the computing device that executes some or all of the stored instructions could be another computing device, such as a server.
  • circuits for creating video signals comprising:
  • a comparator configured to (i) receive a blue component input signal and a green component input signal and (ii) output a blue-green differential signal comprising a difference of the blue component input signal and the green component input signal;
  • a module to (i) receive the blue-green differential signal and (ii) output a blue-green scaled-differential signal when the blue component input signal is of a greater magnitude than the green component input signal;
  • a first adder configured to (i) receive the green component input signal and the blue-green scaled-differential signal and (ii) output a green component output signal comprising a summation of the green component input signal and the blue-green scaled-differential signal;
  • a second adder configured to (i) receive a red component input signal and the blue-green scaled-differential signal and (ii) output a red component output signal comprising a summation of the red component input signal and the blue-green scaled-differential signal.
  • the blue-green differential signal received by the first adder and the second adder may be a varied blue-green differential signal
  • an analog version of the circuit may further include a resistive element configured to receive the blue-green differential signal and output the varied blue-green differential signal.
  • the resistive element may be one of a static resistor, a variable resistor, a rheostat, a variable potentiometer, and a digitally programmable resistor.
  • each, or any of the comparator or adders may be a respective differential amplifier.
  • the second comparator may be a feedback differential amplifier.
  • the circuit may be implemented entirely using digital logic.
  • the circuit may further include a subtractor configured to (i) receive the blue component input signal and the blue-green scaled-differential signal and (ii) output a blue component output signal comprising a difference of the blue component input signal and the blue-green scaled-differential signal.
  • the subtractor may be implemented in digital logic or using a differential amplifier in the case of an analog circuit.
  • the circuit may include additional and/or alternative circuit elements necessary to achieve any particular desired functionality, including any of the functionality described above with respect to the first aspect.
  • the circuit may include additional static resistive elements, variable resistive elements, direct-current power sources, alternating-current power sources, and/or differential amplifiers.
  • the devices are described for creating video signals, comprising:
  • a comparator unit configured to: receive a blue component input signal and a green component input signal; compare a magnitude of the blue component input signal and a magnitude of the green component input signal; and output a blue-green differential signal when the magnitude of the blue component input signal is greater than at least the magnitude of the green component input signal;
  • an adding unit configured to: receive (i) the blue-green differential signal, (ii) the green component input signal, and (iii) a red component input signal; output a green component output signal comprising a summation of the green component input signal and the blue-green differential signal; and output a red component output signal comprising a summation of the red component input signal and the blue-green differential signal.
  • the comparator unit may operate according to the comparator functionality described above with respect to the first aspect of the implementation described herein, and/or may be implemented using any suitable aspects of the circuitry, computing device, or computer readable medium described above with respect to the second and third aspects of the implementation discussed above.
  • the comparator unit may include the functionality described above with respect to the third aspect of the implementation described herein.
  • the adding unit may operate according to the adding functionality described above with respect to the first aspect of the implementation described herein, and/or may be implemented using any suitable aspects of the circuitry, computing device, or computer readable medium described above with respect to the second and third aspects discussed above, respectively.
  • the adding unit may include the adders described above with respect to the third aspect of the implementation described herein.
  • the comparator may output the blue-green differential signal when the magnitude of the blue component input signal is greater than the sum of the magnitude of the green component input signal and a threshold value.
  • the blue-green differential signal received by the adding unit may be a varied blue-green differential signal and the device may also include a scaling unit configured to receive the blue-green differential signal and output the varied blue-green differential signal.
  • the scaling unit may include at least one of a static resistor, a variable resistor, a rheostat, a variable potentiometer, and a digitally programmable resistor.
  • the device may further include a display unit, such as at least one of a cathode ray tube (CRT) display, a liquid crystal display (LCD), a light-emitting diode (LED) display, an organic LED display, a plasma display, a digital light processing (DLP) display, or a virtual reality (VR) headset, for example.
  • the display device may be configured to receive at least one of (i) the blue component input signal, (ii) the red component output signal, and (iii) the green component output signal and to display the received at least one of (i) the blue component input signal, (ii) the red component output signal, and (iii) the green component output signal.
  • the device may further include a subtracting unit configured to receive (i) the blue-green differential signal and (ii) the blue component input signal and to output a blue component output signal comprising a difference of the blue component input signal and the blue-green differential signal.
  • the display device may be configured to receive at least one of (i) the red component output signal, (ii) the green component output signal, and (iii) the blue component output signal and to display the received at least one of (i) the red component output signal, (ii) the green component output signal, and (iii) the blue component output signal.
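The comparator, scaling, and adding units described above can be summarized in a short software sketch. This is a minimal illustrative model, not the patented circuit: the function name, the default gain of 0.5, the threshold default, the [0, 1] channel range, and the clamping to the unit interval are all assumptions made for the example.

```python
def myopia_safe_transform(r, g, b, gain=0.5, threshold=0.0, reduce_blue=False):
    """Sketch of the comparator, scaling, and adding units described above.

    Channel magnitudes are assumed to lie in [0, 1]. The function name,
    default gain, and clamping are illustrative assumptions, not details
    taken from the patent text.
    """
    # Comparator unit: emit the blue-green differential only when the blue
    # magnitude exceeds the green magnitude by more than the threshold
    # value t; otherwise emit a null (zero) signal.
    diff = b - g if (b - g) > threshold else 0.0

    # Scaling unit: a variable gain element produces the varied (scaled)
    # blue-green differential.
    scaled = gain * diff

    # Adding unit: increment the red and green outputs by the scaled
    # differential, clamped to the displayable range.
    r_out = min(1.0, r + scaled)
    g_out = min(1.0, g + scaled)

    # Optional subtracting unit: decrease blue by the scaled differential;
    # by default blue passes through unchanged.
    b_out = max(0.0, b - scaled) if reduce_blue else b
    return r_out, g_out, b_out
```

For a pure blue input (0, 0, 1) with a gain of 0.5, the sketch yields (0.5, 0.5, 1.0): the red and green outputs are incremented by half the blue-green differential while blue passes through unchanged; when green already exceeds blue, the null signal leaves all three channels untouched.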
  • FIGURE 8 is a flow chart illustrating an example method 800 for creating myopia-safe video signals.
  • at step 802, method 800 involves receiving an input video signal comprising an input red component, an input green component, and an input blue component.
  • at step 804, method 800 involves determining (i) that a magnitude of the input blue component is greater than a magnitude of the input green component and (ii) a differential between the magnitude of the input blue component and the magnitude of the input green component.
  • at step 806, method 800 involves sending an output video signal comprising an output red component, an output green component, and an output blue component. In accordance with step 806, at least one of the following is true: (i) the output blue component is decreased by a fractional amount relative to the input blue component based on the differential; and/or (ii) the output green component is increased by a fractional amount relative to the input green component based on the differential, and the output red component is increased by a fractional amount relative to the input red component based on the differential.
  • FIGURE 4 and FIGURE 5 are two example schematics of the circuit logic needed to carry out example method 800.
  • Method 800 may be carried out by other circuits, devices, and/or components thereof as well.
  • the example schematics of FIGURE 4 and FIGURE 5 have three components, segregated by color: red, green, and blue.
  • method 800 involves receiving an input video signal comprising an input red component, an input green component, and an input blue component. The method analyzes those three input components and adjusts them to create a more myopia-safe display.
  • myopia-safe refers to any appreciable reduction in the risk of myopia that a stimulus poses — it does not necessarily correspond to a complete lack of risk of myopia.
  • Input red component 402 (or 502), input green component 404 (or 504), and input blue component 406 (or 506) are shown in example schematic 400 (or 500, respectively).
  • method 800 involves determining (i) that a magnitude of the input blue component is greater than a magnitude of the input green component and (ii) a differential between the magnitude of the input blue component and the magnitude of the input green component.
  • the method first differences the input blue and green channels (B-G), or differences input blue component 406 (or 506) and input green component 404 (or 504). Such a difference may be accomplished via the differencing element 408 (or 508).
  • if the input green component 404 (or 504) is of greater magnitude than the input blue component 406 (or 506) (i.e., G > B or B-G < 0), no change in the output is needed.
  • the method then comprises determining an appropriately scaled differential based on the B-G difference. In one embodiment, this is accomplished by sending the determined differential through a variable gain element 510.
  • step 806 involves sending an output video signal comprising an output red component, an output green component, and an output blue component.
  • the output blue component is decreased by a fractional amount relative to the input blue component based on the differential; and/or (ii) the output green component is increased by a fractional amount relative to the input green component based on the differential, and the output red component is increased by a fractional amount relative to the input red component based on the differential.
  • the system sends the blue-green differential signal (including threshold value, t) 412 through to the output red component 418, the output green component 414, and the output blue component 416.
  • the output green component 414 and the output red component 418 are then incremented by the net blue-green differential or a null signal 412, in an amount proportional to the B-G difference.
  • the output blue component 416 remains the same as the input blue component. That is, the output blue component 416 (or 516) is not decreased by the fractional amount relative to the input blue component 406 (or 506).
  • the output blue component may be decreased by a fractional amount relative to the input blue component based on the differential.
  • the blue component may include a subtractor configured to subtract the differential 412 (or 512) from the input blue component 406 (or 506).
  • the output green component 414 may be increased by a fractional amount relative to the input green component 404 (or 504) based on the differential 412 (or 512) and the output red component 418 (or 518) may be increased by a fractional amount relative to the input red component 402 (or 502) based on the differential 412 (or 512).
  • the output green component 414 (or 514) and the output red component 418 (or 518) are increased by the same fractional amount.
  • adder 420 (or 520) and adder 422 (or 522) may be configured similarly.
  • the output green component 414 (or 514) and the output red component 418 (or 518) are increased by different, respective, fractional amounts.
  • adder 420 (or 520) and adder 422 (or 522) may be configured differently so as to cause the desired different increases.
  • the incrementing of the green and red channels (or components) corresponds to an increase of the green and red components that are eventually displayed, which “desaturates” the blue color (i.e., brings it closer to white) and reduces the differential activity of the L (red) and M (green) cones.
  • the difference in activity between L and M cones is much greater than is needed to produce a red percept. That is, though there is an 80% differential between the excitation of green cones and red cones in the presence of light emitted by the red phosphor, a person would still perceive the stimulus as “red” if the excitation of the green cones were increased to be much closer than, but still short of, that of the red cones.
  • the myopia-producing difference between L and M cones is reduced by addition of the scaled differential to the green and red channels, without much perceived loss in the quality of the color display.
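The desaturation described above can be checked numerically with the standard library's `colorsys` module. The pure-blue input and the gain of 0.5 are illustrative assumptions; the point is that diverting part of the blue-green differential into the red and green channels leaves the hue unchanged while reducing saturation, i.e. it moves the color toward white.

```python
import colorsys

# Pure blue before, and after diverting half of the blue-green
# differential into the red and green channels (a gain of 0.5 is an
# illustrative assumption, not a value from the patent).
before = (0.0, 0.0, 1.0)
after = (0.5, 0.5, 1.0)

h0, s0, v0 = colorsys.rgb_to_hsv(*before)
h1, s1, v1 = colorsys.rgb_to_hsv(*after)

# The hue is unchanged (both equal 2/3, the HSV hue of blue) and the
# value (brightness) is unchanged, but saturation drops from 1.0 to
# 0.5: the blue is desaturated, i.e. moved toward white.
```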
  • FIGURE 9a shows a model unit cube representing a standard red, green, and blue (RGB) color model display.
  • the standard RGB model of FIGURE 9a is a unit cube situated on three axes: red, green, and blue.
  • the RGB model or display is capable of rendering all colors within the unit cube, and each point within the cube represents a renderable color.
  • the coordinates of a point represent the intensities of each primary axis color in rendering the color of the point.
  • the colors red, green, and blue are points at the corners (1, 0, 0), (0, 1, 0), and (0, 0, 1) respectively. Black is at (0, 0, 0), white is at (1, 1, 1), and grays are any equal combination of the three primaries (e.g., (0.5, 0.5, 0.5)).
  • grays are represented by a straight line, labeled gray scale, which passes through both the black (origin) and the white (1, 1, 1) points. All achromatic lights — black, white, and all levels of gray — lie along this line, and it can therefore be referred to as the “achromatic line.” Cyan (0, 1, 1) is the combination of green and blue with no red; magenta (1, 0, 1) is the combination of blue and red with no green; and yellow (1, 1, 0) is the combination of red and green with no blue.
  • FIGURE 9b shows the model unit cube of FIGURE 9a, as it would be altered in a system of the implementations described herein, for example after the implementation of the method described in reference to FIGURES 4, 5 and 8.
  • the system modifies a typical RGB system so that when blues are presented alone, a portion of their energy is diverted into both the green and red channels, moving the color closer to the achromatic line.
  • the respective red, green, and blue coordinates may be thought of as relative intensities, such that where the blue channel intensity remains constant and the red and green intensities are increased, a corresponding point in the cube would move positively along the green and red axes.
  • the amount of energy diverted to the other channels is a fraction (between zero and one) multiple of the blue channel.
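The geometric effect described above (diverting blue energy into the red and green channels moves a color toward the achromatic line) can be illustrated with a short distance computation on the unit cube. The specific input color and the gain of 0.5 are assumptions made for the example.

```python
import math

def distance_to_achromatic_line(r, g, b):
    """Euclidean distance from the point (r, g, b) to the achromatic
    line r = g = b of the RGB unit cube (an illustrative helper)."""
    # The nearest point on the line to (r, g, b) is (m, m, m), where m
    # is the mean of the three coordinates.
    m = (r + g + b) / 3.0
    return math.sqrt((r - m) ** 2 + (g - m) ** 2 + (b - m) ** 2)

# Pure blue, and blue after half of its blue-green differential has
# been diverted into the red and green channels (gain 0.5 assumed):
d_before = distance_to_achromatic_line(0.0, 0.0, 1.0)  # sqrt(6)/3
d_after = distance_to_achromatic_line(0.5, 0.5, 1.0)   # sqrt(6)/6
```

The distance to the achromatic line is halved, matching the intuition of FIGURE 9b that the transformed blue corner of the cube sits closer to the gray-scale diagonal.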
  • FIGURE 9c shows the model unit cube of FIGURE 9a, as it may be altered in a system designed based on other methodologies (see, e.g., U.S. 9,955,133).
  • FIGURE 9d shows a combination of effects of both methods, as might be realized in a real application in which both techniques were implemented sequentially.
  • a circuit could be implemented using any suitable components, digital or analog, capable of performing these basic operations (e.g. digital signal processors, microcontrollers, microprocessors, graphics processors, FPGAs, ASICs, etc.), or could be implemented in software.
  • the outputs may be provided, generated, or otherwise created by any suitable combination of circuit elements, hardware, and/or software.
  • FIGURE 6 is a block diagram of an example computing device 600 capable of implementing the embodiments described above and other embodiments.
  • Example computing device 600 includes a processor 602, data storage 604, and a communication interface 606, all of which may be communicatively linked together by a system bus, network, or other mechanism 608.
  • Processor 602 may comprise one or more general purpose processors (e.g., INTEL microprocessors) or one or more special purpose processors (e.g., digital signal processors, FPGAs, ASICs, etc.).
  • Communication interface 606 may allow data to be transferred between processor 602 and input or output devices or other computing devices, perhaps over an internal network or the Internet.
  • Instructions and/or data structures may be transmitted over the communication interface 606 via a propagated signal on a propagation medium (e.g., electromagnetic wave(s), sound wave(s), etc.).
  • Data storage 604 may comprise one or more storage components or physical and/or non-transitory computer-readable media, such as magnetic, optical, or organic storage mechanisms, and may be integrated in whole or in part with processor 602.
  • Data storage 604 may contain program logic 610.
  • Program logic 610 may comprise machine language instructions or other sorts of program instructions executable by processor 602 to carry out the various functions described herein.
  • program logic 610 may define logic executable by processor 602, to receive video display channel inputs, to adjust those inputs according to the methods of the implementations described herein, and to output the adjusted video display channels.
  • these logical functions can be implemented by firmware or hardware, or by any combination of software, firmware, and hardware.
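As a sketch of how program logic 610 might apply the adjustment to a whole frame in software, the following assumes a frame represented as nested lists of (r, g, b) tuples in [0, 1]. The function name, parameters, and data layout are illustrative assumptions, not details from the patent.

```python
def adjust_frame(frame, gain=0.5, threshold=0.0):
    """Apply the per-pixel blue-green adjustment to a frame given as
    nested lists (rows) of (r, g, b) tuples in [0, 1]. The name,
    parameters, and data layout are illustrative assumptions."""
    out = []
    for row in frame:
        new_row = []
        for r, g, b in row:
            # Null signal when blue does not exceed green by more than t.
            diff = b - g if (b - g) > threshold else 0.0
            scaled = gain * diff
            # Increment red and green; leave blue unchanged, clamped to [0, 1].
            new_row.append((min(1.0, r + scaled), min(1.0, g + scaled), b))
        out.append(new_row)
    return out
```

In a real implementation the same per-pixel loop would more likely run vectorized on a GPU, DSP, or FPGA, as noted above, but the channel arithmetic is the same.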
  • Various implementations of the subject matter described herein can be realized in digital electronic circuitry, integrated circuitry, specially designed application-specific integrated circuits (ASICs), computer hardware, firmware, software, and/or combinations thereof. These various implementations can be implemented in one or more computer programs, which can be executed and/or interpreted on a programmable system.
  • the programmable system can include at least one programmable processor, which can have a special purpose or a general purpose.
  • the at least one programmable processor can be coupled to a storage system, at least one input device, and at least one output device.
  • the at least one programmable processor can receive data and instructions from, and can transmit data and instructions to, the storage system, the at least one input device, and the at least one output device.
  • a computer can display data to one or more users on a display device, such as a cathode ray tube (CRT) device, a liquid crystal display (LCD) monitor, a light emitting diode (LED) monitor, or any other display device.
  • the computer can receive data from the one or more users via a keyboard, a mouse, a trackball, a joystick, or any other input device.
  • other devices can also be provided, such as devices operating based on user feedback, which can include sensory feedback, such as visual feedback, auditory feedback, tactile feedback, and any other feedback.
  • the input from the user can be received in any form, such as acoustic input, speech input, tactile input, or any other input.
  • the subject matter described herein can be implemented in a computing system that can include at least one of a back-end component, a middleware component, a front-end component, and one or more combinations thereof.
  • the back-end component can be a data server.
  • the middleware component can be an application server.
  • the front-end component can be a client computer having a graphical user interface or a web browser, through which a user can interact with an implementation of the subject matter described herein.
  • the components of the system can be interconnected by any form or medium of digital data communication, such as a communication network. Examples of communication networks can include a local area network, a wide area network, the Internet, an intranet, a Bluetooth network, an infrared network, or other networks.
  • the computing system can include clients and servers.
  • a client and server can be generally remote from each other and can interact through a communication network.
  • the relationship of client and server can arise by virtue of computer programs running on the respective computers and having a client-server relationship with each other.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

According to the invention, creating a video signal comprises the steps of receiving an input video signal having RGB (red, green, blue) components; determining that the blue component exceeds the green component by at least a threshold value, and a differential between the blue and green components; and sending an output video signal having RGB components, at least one of the following being true: (i) the output blue component is decreased by a fractional amount relative to the input blue component based on the differential; and/or (ii) the output red component is increased by a fractional amount relative to the input red component based on the differential, and the output green component is increased by a fractional amount relative to the input green component based on the differential.
PCT/US2020/058410 2019-10-30 2020-10-30 Myopia-safe video displays WO2021087398A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201962928270P 2019-10-30 2019-10-30
US62/928,270 2019-10-30

Publications (1)

Publication Number Publication Date
WO2021087398A1 true WO2021087398A1 (fr) 2021-05-06

Family

ID=75715353

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2020/058410 WO2021087398A1 (fr) 2019-10-30 2020-10-30 Myopia-safe video displays

Country Status (2)

Country Link
TW (1) TW202123676A (fr)
WO (1) WO2021087398A1 (fr)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2012145672A1 (fr) * 2011-04-21 2012-10-26 University Of Washington Through Its Center For Commercialization Myopia-safe video display devices
US9046752B2 (en) * 2012-04-05 2015-06-02 Mitsubishi Electric Corporation Projector having blue light alleviating part
WO2017051768A1 (fr) * 2015-09-24 2017-03-30 シャープ株式会社 Display device and color space expansion method
US9773473B2 (en) * 2014-06-03 2017-09-26 Nvidia Corporation Physiologically based adaptive image generation


Also Published As

Publication number Publication date
TW202123676A (zh) 2021-06-16

Similar Documents

Publication Publication Date Title
US10587853B2 (en) Myopia-safe video displays
US11205398B2 (en) Evaluating and reducing myopiagenic effects of electronic displays
US11100619B2 (en) Image formation
US10446092B2 (en) Image processing apparatus, display apparatus, image processing method, and image processing program
JP4833203B2 (ja) Laser image display device
CN104509105A (zh) Display system providing reduced observer metamerism failure
JP2009500654A (ja) Method and device for converting a signal for driving a display, and display using said method and device
CN1853419B (zh) Method for processing and displaying images, and display using said method
KR102662177B1 (ko) Method and apparatus for color-preserving spectrum reshaping
Hassani et al. Investigating color appearance in optical see‐through augmented reality
WO2014088975A1 (fr) Method for producing a color image and imaging device using same
WO2011083603A1 (fr) Electronic device, color saturation adjustment method, associated program, and recording medium
JP2008129162A (ja) Video conversion processing method and video conversion system
WO2011039811A1 (fr) Image signal processing device and image signal processing method
WO2021087398A1 (fr) Myopia-safe video displays
KR100716976B1 (ko) Image display method for a sequential-drive-type image display device
US20230016631A1 (en) Methods for color-blindness remediation through image color correction
Thomas Colorimetric characterization of displays and multi-display systems
Tabakov Introduction to vision, colour models, and image compression
Pendhari et al. Video processing in visual system for color detection for people with tritanomaly
JP4202817B2 (ja) Visual load measuring device and method, visual load measuring program, and recording medium storing the program
Sanchez et al. Quantification of the Helmholtz-Kohlrausch effect for CRT color monitors
Harrar et al. Regulation of chromatic induction by neighboring images
JP3604412B2 (ja) Simple white point evaluation method and simple white point evaluation chart for self-luminous color monitors
Hassan Improved Methods to Enhance the Color Perception for People with Color Vision Deficiency

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20880838

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20880838

Country of ref document: EP

Kind code of ref document: A1