EP3338273A1 - Electronic display with environmental adaptation of display characteristics based on location - Google Patents
Electronic display with environmental adaptation of display characteristics based on location
- Publication number
- EP3338273A1 (application EP16837775.2A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- sunrise
- sunset
- display
- gamma
- electronic display
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Classifications
- G—PHYSICS; G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS; G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G5/00—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
- G09G5/10—Intensity circuits
- G09G3/20—Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes, for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix, no fixed position being assigned to or needed to be assigned to the individual characters or partial characters
- G09G2320/0673—Adjustment of display parameters for control of gamma adjustment, e.g. selecting another gamma curve
- G09G2360/144—Detecting light within display terminals, e.g. using a single or a plurality of photosensors, the light being ambient light
- G09G2360/16—Calculation or use of calculated indices related to luminance levels in display data
- H—ELECTRICITY; H05B—ELECTRIC HEATING; ELECTRIC LIGHT SOURCES NOT OTHERWISE PROVIDED FOR; CIRCUIT ARRANGEMENTS FOR ELECTRIC LIGHT SOURCES, IN GENERAL
- H05B47/16—Controlling the light source by timing means
- Y02B—CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO BUILDINGS, e.g. HOUSING, HOUSE APPLIANCES OR RELATED END-USER APPLICATIONS
- Y02B20/40—Control techniques providing energy savings, e.g. smart controller or presence detection
Definitions
- the exemplary embodiments herein pertain to a display and a method that utilizes measured or calculated properties of the viewing environment in order to automatically vary the visual characteristics of a display according to a set of predefined rules. Some embodiments provide an autonomous display that exhibits optimal visual perception for image reproduction at all environmental viewing conditions.
- Displays are used in a very wide range of applications, including entertainment (e.g., television, e-books), advertisement (e.g., shopping malls, airports, billboards), information (e.g., automotive, avionics, system monitoring, security), and cross-applications (e.g., computers, smart phones).
- the exemplary embodiments herein utilize location-based and/or time-based determinations of ambient conditions in conjunction with stored characteristic display data to dynamically (in real-time) process and alter an image and/or video signal so that key display performance parameters such as brightness, black level, gamma, saturation, hue, and sharpness would be perceived as optimal, meaning they are tuned to their best intended rendering for the given viewing conditions.
- Other embodiments also provide the method by which a display is calibrated to perform as described as well as the method for performing the dynamic performance process.
- FIGURE 1 is a graphical representation of a typical image reproduction process.
- FIGURE 2 is a block diagram for a signal encoding and decoding process.
- FIGURE 3 is a graphical representation of image signal transformations per ITU-R BT.709/1886.
- FIGURE 4 is a graphical representation of end-to-end power vs. ambient illumination.
- FIGURE 5 is a graphical representation of end-to-end power vs. ambient illumination in a discrete implementation.
- FIGURE 6 is a block diagram for the basic elements in an embodiment of the invention.
- FIGURE 7 is an illustration of reflected ambient light and its relationship with displayed light.
- FIGURE 8 is a logical flow chart for performing an exemplary embodiment of location-based determinations of ambient conditions.
- FIGURE 9 is a block diagram for an embodiment using post-decoding adjustment of black level and/or linearity.
- FIGURE 10 is a graphical representation of example signal transforms of Figure 9 using Equation (2).
- FIGURE 11 is a block diagram of an alternate embodiment for post-decoding adjustment.
- FIGURE 12 is a block diagram of an embodiment for pre-decoding adjustment of black level and/or linearity.
- FIGURE 13 is a graphical representation of example signal transforms of Figure 11 using Equation (4).
- FIGURE 14 is a graphical representation of example signal transforms of Figure 11 using Equation (5).
- FIGURE 15 is a graphical representation of example signal transforms of Figure 11 using Equation (6).
- FIGURE 16 is a graphical representation of example signal transforms of Figure 11 using Equation (7).
- FIGURE 17 is a detailed view of the lower left-hand corner of Figure 15.
- FIGURE 18 provides a logical flowchart for performing an embodiment that uses the AAS technique during sunset/sunrise transition times while using a nighttime/daytime level for the remaining times.
- FIGURE 19 provides a logical flowchart for performing an embodiment that uses the AAS technique with only a single transition period while using nighttime/daytime instructions for the remaining times.
- FIGURE 20 provides a logical flowchart for performing the advanced embodiment that uses the AAS technique during sunset/sunrise transition times as well as the daytime while factoring in the local weather information.
- Embodiments of the invention are described herein with reference to illustrations that are schematic illustrations of idealized embodiments (and intermediate structures) of the invention. As such, variations from the shapes of the illustrations as a result, for example, of manufacturing techniques and/or tolerances, are to be expected. Thus, embodiments of the invention should not be construed as limited to the particular shapes of regions illustrated herein but are to include deviations in shapes that result, for example, from manufacturing.
- A very high-level diagram of an exemplary image capture and reproduction process is shown in Figure 1.
- Images typically originate from either real-world scenes captured by video/still cameras or from computer-generated scenery.
- the lofty goal of most reproduction systems is to display the most life-like image that is possible to the final human observer. There are many impediments to doing this perfectly; in fact, some "enhancements" are often purposely added to the displayed image to improve the viewing experience.
- One of the major impediments to high-fidelity reproduction is that the local viewing environment of the final observer cannot be definitively predicted, yet the viewing environment can have a profound impact on the visual quality of the reproduction. Also, the viewing environment can change almost continuously except in a few special cases such as the tightly controlled environment of a theater.
- a subtle but very germane aspect of Figure 1 is that the total light that is reflected from a physical object is essentially the linear summation of the reflected light from all light sources that impinge upon the object.
- an object may also emit its own light, and this light is also linearly added to the reflected contributions from the object in order to arrive at the total observed light.
- the absolute brightness or luminance of any point in a scene is proportional to all constituent components of light that are traceable to that point. This is the reality that is presented to the human observer of a real scene, and is also the manner in which computer-generated scenery is typically created.
- a display device should also adhere to the principle of luminance linearity for the purest form of reproduction. Or more generally, the entire end-to-end chain of processes, from the light that enters a camera to the light that exits the display, should adhere to the principle of luminance linearity. This principle will be relevant to various aspects of the subject invention.
- the goal of a display should be to reproduce a life-like replica of the original scene.
- One such limitation is the difficulty for a display to match the dynamic range of luminance that exists in the real world, especially at the upper end of the scale (e.g., the sun and reflections thereof).
- Another limitation is that a display is a predominantly "flat" version of the original scene; hence true three-dimensional (3D) depth reproduction is not possible, although various "3D" technologies exist to produce the illusion of depth, at least from one or more specific perspectives.
- common displays cannot begin to simulate the nearly hemispherical field-of-view of the human eye, although special venues such as IMAX® theaters attempt to overcome this.
- the display itself is a physical object that exists in some environment, and the environment itself can have a very significant impact on the visual quality of the reproduction.
- each pixel is typically comprised of 3 sub-pixels, one for each of the primary colors - typically red, green, and blue. While there are displays that may use 4 or more sub-pixels, the embodiments herein do not depend on the precise number of sub-pixels or colors that they represent.
- the information content of a displayed image is the result of uniquely commanding, or driving, each sub-pixel, with the specifics of the driving process being technology-dependent (e.g., CRT, plasma, LCD, OLED, etc.). It should be noted that the exemplary embodiments herein can be utilized on any type of electronic display and are not specific to one display type.
- the drive level of each sub-pixel can range from full off to full on - this is the fundamental process by which images are formed by a display.
- the total range of displayable colors (i.e., the color gamut) is produced via the controlled mixing of the primary colors. Non-primary colors are perceived when the human eye integrates the 3 sub-pixels to produce an effective blended color.
- a gray level is a special case where all sub-pixels are being driven at the same level (as defined by VESA FPDM 2.0).
- Gamma (symbolized by γ) refers to the mathematical exponent in a power function S^γ that transforms the scaling of gray levels (on a sub-pixel basis) in an image.
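As a concrete illustration, the gray-level scaling can be sketched as a power-function transform on normalized signals (a minimal sketch; the function name and the example gamma of 2.2 are illustrative, not values fixed by the text):

```python
def apply_gamma(s, gamma):
    """Transform a normalized gray level (0..1) by the power function S**gamma."""
    if not 0.0 <= s <= 1.0:
        raise ValueError("signal must be normalized to [0, 1]")
    return s ** gamma

# A gamma > 1 darkens mid-tones while leaving the endpoints untouched:
# apply_gamma(0.0, 2.2) == 0.0 and apply_gamma(1.0, 2.2) == 1.0,
# while apply_gamma(0.5, 2.2) falls well below 0.5.
```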
- The conceptually simplest image reproduction stream is illustrated in Figure 2.
- a detector (commonly a solid-state pixelated detector using CCD or CMOS technology) performs the optical-to-electrical (O/E) conversion that produces the source image signal Ss. This image signal is typically a voltage signal that is approximately proportional to the amount of light falling on each pixelated detector element, but Ss may be immediately converted into a digital signal.
- the source image signal S s may originate from computer-generated graphics that are typically developed in the linear domain in much the same way as light behaves in the real world.
- the α exponent is referred to as a gamma-correction exponent, but for the purposes of this discussion it will be referred to more generally as a signal encoding exponent.
- the decoded image signal Sd is then used to drive the components in the display that convert the electrical image data into light that is emitted by the display (L 0 ) via an electrical-to-optical (E/O) conversion process.
- E/O electrical-to-optical
- the details of the E/O process are unique to the display technology; e.g., LCD, plasma, OLED, etc.
- the decoding function f_d was an integral part of the E/O conversion process.
- the signals 'S' represent normalized values typically ranging from 0 to 1.
- For the case of voltage signals, the actual signals would be normalized by VMAX such that S = V / VMAX.
- For the case of digital signals, the signals would be normalized by DMAX such that S = D / DMAX (e.g., DMAX = 255 for an 8-bit signal with 256 levels).
- the signal normalization process generally requires processing steps that are not explicitly shown in Figure 2, but are implied herein. As long as normalized signals are consistently used it does not matter whether the signals represent voltage levels or bit levels, and both would be covered by the exemplary embodiments herein.
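The normalization convention can be sketched as follows (a minimal sketch; the function names are illustrative):

```python
def normalize_voltage(v, v_max):
    """Normalize a voltage signal so that S = V / V_MAX lies in [0, 1]."""
    return v / v_max

def normalize_digital(d, bit_depth=8):
    """Normalize a digital drive level so that S = D / D_MAX lies in [0, 1];
    an 8-bit signal has 256 levels, so D_MAX = 255."""
    d_max = (1 << bit_depth) - 1
    return d / d_max
```

Once normalized this way, the same power-law processing applies whether the underlying signal was a voltage or a bit level.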
- Eq(1) can be implemented in a discrete fashion, as illustrated in Figure 5.
- the number of discretized levels illustrated in Figure 5 is representative; various embodiments of the invention may implement an arbitrary number of discrete levels, depending on the processing abilities and application.
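One way to realize such a discrete implementation is a step function mapping ambient-illuminance bands to a finite set of end-to-end exponents (a sketch only; the breakpoints and exponent values below are hypothetical design choices, not values from the text):

```python
import bisect

# Hypothetical illuminance breakpoints (lux) separating the bands, and the
# end-to-end power exponent applied within each band (one more exponent
# than breakpoints).
BREAKPOINTS_LUX = [10, 100, 1_000, 10_000]
EXPONENTS = [1.00, 1.05, 1.10, 1.15, 1.20]

def discrete_exponent(ambient_lux):
    """Return the end-to-end power exponent for the ambient band (step function)."""
    return EXPONENTS[bisect.bisect_right(BREAKPOINTS_LUX, ambient_lux)]
```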
- FIG. 6 provides a schematic view of the basic components of an exemplary embodiment.
- an environmental processor 200 may obtain video data from a video source 150.
- the display controller 110 may contain several components, including but not limited to a microprocessor and electronic storage (which can store the calibration data 120).
- the environmental processor 200 is preferably then in electrical communication with the display 300. In some embodiments, it would be the display controller 110 that is in electrical communication with the video source 150 and the display 300.
- One or more microprocessors found on the display controller 110 can perform any of the functionality described herein.
- the video source 150 can be any number of devices which generate and/or transmit video data, including but not limited to television/cable/satellite transmitters, DVD/Blu-ray players, computers, video recorders, or video gaming systems.
- the display controller 110 may be any combination of hardware and software that utilizes the location-based ambient environment data and modifies the video signal based on the calibration data.
- the calibration data 120 is preferably a nonvolatile data storage which is accessible to the display controller that contains calibration data for the location-based ambient environment data and optionally including reflectance information for the display assembly.
- the display 300 can be any electronic device which presents an image to the viewer.
- the desired brightness (i.e., maximum luminance) of a display may change, but perhaps the most obvious case is when displays are used outdoors.
- the ambient light illumination that surrounds the display may vary anywhere from the dark of night to the full sun of midday - roughly a factor of ten million, or 7 orders of magnitude.
- the operation of the human visual system (comprising the eye, optic nerve, and brain) is a very complex subject; indeed, there is not full consensus on its parametric performance by most of the leading experts in the field. The issue is exacerbated by the highly adaptive and non-linear nature of the human visual system. Hence, there is no utility in attempting to define specific visual capabilities in this disclosure. However, there are a few generalities on which everyone would agree. For one, the human visual system can adapt over a very wide range of light levels given some time to adapt, by perhaps as much as 12 orders of magnitude. However, there is a limit to the instantaneous dynamic range of human vision at any particular level of adaptation, perhaps 2-3 orders of magnitude (this varies with the absolute level of adaptation).
- a specific adaptation level depends on the integrated field-of-view of the eye (nearly hemispherical) taking into account all viewable objects and sources of light in this range. Since a display will only occupy some fraction of the total field-of-view then the maximum brightness of the display should be varied to accommodate the overall adaptation of human vision to various light levels, which of course would include the light from the display itself.
- a display that produces 500 candela per square meter (nits) might be painfully bright when viewed at nighttime or in other dark environments (unless one walks up close enough to the display so that it mostly fills the field-of-view and then allows some time for proper adaptation to occur), but the same display would appear somewhat dim and unimpressive on a bright sunlit day, and in fact may have lower gray levels that are indiscernible.
- any display will reflect ambient environmental light to a certain degree.
- the reflected light level may be high enough to substantially dominate the darker regions of the displayed image or video content (hereafter simply 'image').
- the visual details in the darker regions of the image are essentially "washed out”.
- the display cannot produce visually discernable brightness levels in an image that fall below the equivalent brightness level of the reflected ambient light.
- This is illustrated in Figure 7, where RAL is the effective brightness of the reflected ambient light and DBL is the displayed brightness level of any portion of an image. Wherever DBL < RAL in the image, there will be a discernable loss of image content in those regions.
- An analogy is not being able to hear quieter passages within music while listening in an environment that has excessive background noise. For this very reason most radio broadcasts transmit signals that have a compressed dynamic range for improved "perceptual" listening in the noisy environment of a car.
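The RAL/DBL comparison can be sketched numerically. For a diffusely reflecting screen, a standard Lambertian approximation gives the reflected luminance as L = E·ρ/π (nits), with E the ambient illuminance in lux and ρ the display reflectance; the 5% reflectance in the example is an assumed value:

```python
import math

def reflected_ambient_luminance(ambient_lux, reflectance):
    """Effective luminance (nits) of reflected ambient light, assuming a
    diffuse (Lambertian) display surface: L = E * rho / pi."""
    return ambient_lux * reflectance / math.pi

def is_washed_out(dbl_nits, ambient_lux, reflectance):
    """True when a displayed brightness level falls below the reflected
    ambient level (DBL < RAL), i.e., its detail is lost to reflections."""
    return dbl_nits < reflected_ambient_luminance(ambient_lux, reflectance)

# Example: 10,000 lux of ambient light on a 5%-reflectance screen yields
# roughly 159 nits of reflected luminance, swamping a 100-nit gray level.
```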
- an exemplary embodiment of the invention provides a means of automatically adjusting the black level and/or the gamma of a display according to pre-defined rules, such as but not limited to those previously discussed.
- The conceptually and functionally easiest location to perform autonomous black level and linearity adjustments is after the normal image signal decoding process, as generally illustrated in Figure 9.
- the signal flow is similar to that described previously in Figure 2, except that now a new signal processing block labeled f_v has been inserted into the signal flow for the purposes of providing automatic, real-time image signal adjustment in response to varying environmental conditions (i.e., an environmentally-reactive adjustment).
- the processing block represented by f_v can be viewed as a post-decoding processor since it operates after the normal signal decoding processor that is represented by f_d.
- the ambient environmental conditions can be determined based on the geographical location of the display: given the calendar date, the approximate times of sunrise and sunset can be calculated and compared to the current time in order to determine what the ambient environmental conditions currently are.
- Figure 8 provides a logical flow chart for performing a first embodiment of the method which is controlled only based on the display location data.
- the system preferably determines the geographical location data for the display. This can be performed in a number of ways.
- the physical address of the display may be used to determine the city/state in which the display is located.
- the physical address of the display can be exchanged for the latitude and longitude coordinates. This technique can be performed by accessing a number of online tools, including but not limited to www.latlong.net.
- the location of the display can be determined by reading coordinates from a GPS capable device 400 within the electronic display, and may form part of the display controller 110 or could be a separate device in electrical connection with the display controller 110.
- the sunset and sunrise times for this location are preferably determined.
- the timing for performing this step can vary. In some embodiments, this step could be performed only once, with 365 days of data being used for the display throughout the remainder of the display's lifetime. Alternatively, this step could be performed annually, monthly, weekly, or even daily. This step can also be performed in a number of ways. First, when given a physical address, the system can determine the sunrise/sunset times based on this address and store them within the electronic storage on the display controller 110.
- the system can determine the sunrise/sunset times based on these coordinates and store them within the electronic storage on the display controller 110.
- the location data can be converted to sunrise/sunset times by accessing any number of online databases, including but not limited to: www.sunrisesunset.com, www.suncalc.net, and various NOAA online tools. Additionally, the latitude and longitude data can be used to calculate sunrise/sunset times based on the sunrise equation: cos ω₀ = −tan φ · tan δ, where:
- ω₀ is the hour angle at either sunrise (when the negative value is taken) or sunset (when the positive value is taken),
- φ is the latitude of the observer on the Earth, and
- δ is the sun declination.
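The sunrise equation can be evaluated directly (a minimal sketch; it returns the hour angle in degrees, where 15° corresponds to one hour from solar noon):

```python
import math

def sunrise_hour_angle_deg(latitude_deg, declination_deg):
    """Solve cos(omega_0) = -tan(phi) * tan(delta) for the hour angle (degrees).
    Take the negative of the result for sunrise and the positive for sunset.
    Raises ValueError during polar day/night, when no solution exists."""
    c = -math.tan(math.radians(latitude_deg)) * math.tan(math.radians(declination_deg))
    if not -1.0 <= c <= 1.0:
        raise ValueError("sun neither rises nor sets at this latitude/date")
    return math.degrees(math.acos(c))

# At an equinox (declination 0) the hour angle is 90 degrees at any latitude:
# sunrise and sunset fall 6 hours either side of solar noon (a 12-hour day).
```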
- the steps of determining geographical location data for the display and determining approximate sunrise/sunset times based on the geographical location data may be performed either electronically or manually and may also be performed before the display is shipped to its actual location. In other embodiments, the display may be installed within its actual location prior to performing these two steps.
- the system would then check to see what the current time is and determine whether it is currently night or day. While the figure reads the logic as "does the current time fall after sunset and before sunrise," it seems clear that this could also be performed by determining "does the current time fall after sunrise and before sunset," and it makes no difference in any of the subject embodiments.
- if the system determines that it is currently nighttime, the system provides nighttime instructions to the display controller 110.
- if the system determines that it is daytime, the system provides daytime instructions to the display controller 110.
- the nighttime/daytime instructions may simply be an indication sent to the display controller 110 (possibly from another component within the display controller 110) that simply indicates that it's currently daytime/nighttime.
- the relative daytime and nighttime settings and variables for the display can be selected for this embodiment through a simple binary operation where a first set of settings and variables for the display is desired during nighttime and a second set of settings and variables for the display is desired during daytime.
- an appropriate gamma or black level may be selected for the "nighttime instructions" with another for the "daytime instructions" and this could be selected from a look-up table based on a desired relationship between ambient light and gamma (similar to what is shown in Figure 4 or 5) or any other desired relationship between ambient light and gamma or black level (or other display settings).
- the dashed lines on the figure indicate the option of the system returning to determine the approximate sunrise/sunset times, if practicing an embodiment where this data is updated annually, monthly, weekly, or daily.
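The binary day/night branch described above can be sketched as follows (the settings dictionaries are hypothetical; actual gamma and black-level values would come from the display's calibration data or lookup table):

```python
from datetime import time

def select_instructions(now, sunrise, sunset, day_settings, night_settings):
    """Binary selection: daytime settings between sunrise and sunset,
    nighttime settings otherwise."""
    return day_settings if sunrise <= now < sunset else night_settings

# Hypothetical first/second sets of display settings and variables.
DAYTIME = {"gamma": 1.2, "black_level": 0.10}
NIGHTTIME = {"gamma": 1.0, "black_level": 0.00}

settings = select_instructions(time(13, 0), time(6, 42), time(19, 55),
                               DAYTIME, NIGHTTIME)
```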
- the daytime/nighttime indications are preferably sent to an environmental processing unit labeled 'Proc' that contains at minimum a lookup table and/or computational algorithms that determine the desired display black level (and other settings discussed below) relative to the daytime/nighttime indications.
- This environmental processing unit can be a part of the display controller 110 and may use the microprocessor on the display controller 110 or a separate processor. Additionally the environmental processor may communicate with electronic storage having a lookup table and/or computational algorithms for image signal linearity modification (e.g., a power function) relative to the daytime/nighttime instructions. Also, a provision to add real-time programmable instructions to the environmental processor is shown.
- the environmental processor output signal S_a contains the instantaneous value of the desired display black level S_b and optionally a signal linearity modification value that, for the case of a simple power function, takes the form of an exponent herein called β.
- the decoded image signal S_d and environmentally-reactive control signal S_a are fed into the image signal processing block labeled f_v which, in an exemplary embodiment, produces a final display driving signal S_p according to Eq(2) below.
- This equation assumes that all three image signal transformations (encoding, decoding, and post-decoding) are performed with power law functions. Signal encoding and decoding with power law functions are typical in the industry, although this is not a necessity for the invention as other functions may be used with various embodiments herein.
- the right-hand side of Eq(2) represents the processing functionality of block f_v, accepting the input signals S_a and S_d and outputting the signal S_p.
- Eq(2): S_p = S_b + (1 − S_b) · S_d^(β/(α·ε))
- β — linearity modifier power exponent (assuming power law modification)
- if the encoding exponent α and the decoding exponent ε are known quantities, as assumed in Eq(2), then the final end-to-end signal linearity is determined solely by the value of the linearity modifier exponent β; i.e., β is equivalent to the previously defined end-to-end linearity power exponent.
- the encoding exponent α is typically known based on the source of the image data, and the decoding exponent ε is either given by the manufacturer of the display and/or can be determined by testing.
- Eq(2) offers a specific example of the processes described in this section based on a specific method of signal encoding/decoding, but the general process is the same for any other method of encoding/decoding.
- Eq(2) The functionality of Eq(2) is illustrated in Figure 10.
- the requested black level has been set at 0.1 and the linearity modifier exponent β has been set to 1.20, but generally these values will be (preferably automatically) chosen by the system based on the daytime/nighttime indications.
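Assuming the post-decoding transform takes the power-law form S_p = S_b + (1 − S_b)·S_d^(β/(α·ε)), with α and ε the known encoding and decoding exponents, the example can be sketched numerically. The BT.709-style α = 1/2.2 and BT.1886-style ε = 2.4 defaults here are assumptions for illustration, not values fixed by the text:

```python
def eq2_transform(s_d, s_b=0.1, beta=1.20, alpha=1/2.2, epsilon=2.4):
    """Post-decoding transform sketch:
    S_p = S_b + (1 - S_b) * S_d**(beta / (alpha * epsilon)).
    Defaults mirror the example in the text (S_b = 0.1, beta = 1.20)."""
    return s_b + (1.0 - s_b) * s_d ** (beta / (alpha * epsilon))

# The curve is pinned at the requested black level and at full scale,
# rising monotonically in between.
```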
- the functionality of the image signal decoding block f& could be absorbed into the environmental processor block f ? as a new processing block labeled dp, as shown in Figure 11.
- the encoded image signal Se and environmentally-reactive control signal Sa are fed into the image signal processing block labeled fp which, in an exemplary embodiment, produces a final display driving signal Sp according to Eq(4) below.
- This equation assumes that all three image signal transformations (encoding, pre-decoder processing, and decoding) are performed with power law functions. Signal encoding and decoding with power law functions are typical in the industry, although this is not a necessity for any of the embodiments herein.
- the right-hand side of Eq(4) represents the processing functionality of block fp, accepting the input signals Sa and Se and outputting the signal Sp.
- Sb desired black level offset as a fraction of the full-scale signal
- γ linearity modifier power exponent (assuming power law modification)
- if the encoding exponent α and the decoding exponent β are known quantities, as assumed in Eq(4), then the final signal linearity is determined solely by the value of the linearity modifier exponent γ; i.e., γ is equivalent to the previously defined end-to-end linearity power exponent.
- the encoding exponent α is typically known based on the source of the image data, and the decoding exponent β is either given by the manufacturer of the display and/or can be determined by testing.
- Eq(4) offers a specific example of the processes described in this section based on a specific method of signal encoding/decoding, but the general process is the same for any other method of encoding/decoding.
- An example of the functionality of Eq(4) is illustrated in Figure 13.
- the requested black level has been set at 0.1 and the linearity modifier exponent γ has been set to 1.20.
- Figure 13 appears identical to Figure 10 because the same black level (Sb) and linearity modifier (γ) have been requested in both cases.
- the encoded image signal Se and environmentally-reactive control signal Sa are fed into the image signal processing block labeled fp which, in an exemplary embodiment, produces a pre-decoding image signal Sp according to Eq(5) below.
- Eq(5) represents the processing functionality of block fp, accepting the input signals Sa and Se and outputting the signal Sp.
- a new feature of this embodiment is the introduction of a gray level threshold labeled St, leading to the two conditional cases expressed in Eq(5).
- the 1st condition is applicable when encoded signal levels fall below a level that is derived from St, in which case those signal levels will be set to 0 (i.e., full black). Otherwise, the 2nd condition in Eq(5) is applicable for encoded signal levels that fall above the level derived from St.
- St desired gray level threshold as a fraction of the full-scale input signal
- the gray level threshold (St) may be: 1) an environmentally-reactive variable determined via a lookup table or computational algorithms within the processing block labeled 'Proc', or 2) provided by the 'programmable instructions' port on 'Proc', or 3) a fixed value pre-programmed within 'Proc', or 4) any combination of the above.
- St may be a fixed value within the fp processing block.
- if the encoding exponent α and the decoding exponent β are known quantities, as assumed in Eq(5), then the final signal linearity beyond the gray level threshold St is determined solely by the value of the linearity modifier exponent γ; i.e., γ is equivalent to the previously defined end-to-end linearity power exponent.
- the encoding exponent α is typically known based on the source of the image data, and the decoding exponent β is either given by the manufacturer of the display and/or can be determined by testing.
- Eq(5) offers a specific example of the processes described in this section based on a specific method of signal encoding/decoding, but the general process is the same for any other method of encoding/decoding.
- An example of the functionality of Eq(5) is illustrated in Figure 14.
- the requested black level has been set at 0.1
- the requested gray level threshold has been set to 0.05
- the linearity modifier exponent γ has been set to 1.20.
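A minimal sketch of this hard-threshold behavior, using the Figure 14 example values; the curve above the threshold is an assumed power-law-with-offset form, not the exact expression of Eq(5):

```python
# Eq(5)-style hard threshold: encoded levels below the gray level threshold
# St are forced to full black. Sb, St, and gamma mirror the Figure 14 example
# values; the curve shape above St is an assumed reconstruction.

def hard_threshold(s, sb=0.1, st=0.05, gamma=1.20):
    if s < st:                              # 1st condition: force full black
        return 0.0
    return sb + (1.0 - sb) * s ** gamma     # 2nd condition: offset gamma curve

assert hard_threshold(0.04) == 0.0          # below St: cut to black
assert hard_threshold(0.05) > 0.1           # at/above St: jumps past Sb
```

The discontinuous jump at St (from 0 directly to roughly 0.12 here) is the "cliff" that the softened embodiments described next are designed to address.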
- the "cliff" type of threshold cutoff produced by Eq(5) and illustrated in Figure 14 may produce objectionable visual artifacts in the image, especially for higher levels of thresholds and/or black level offsets. This would manifest as darker regions in an image that suddenly and unnaturally become black, a phenomenon that is sometimes referred to as banding. This effect can be reduced by softening the edge of the threshold.
- Eq(6) represents the processing functionality of block fp, accepting the input signals Sa and Se and outputting the signal Sp.
- a new feature of this embodiment is the introduction of a gray level turn-off point labeled So, leading to the three conditional cases expressed in Eq(6).
- the 1st condition is applicable when the encoded signal levels fall below a level that is derived from So, in which case those signal levels are set to 0 (i.e., full black).
- the 2nd condition in Eq(6) is applicable for encoded signal levels that fall above the level derived from So but below the threshold level derived from St.
- the 3rd condition in Eq(6) is applicable for encoded signal levels that fall above the level derived from St.
- St desired gray level threshold as a fraction of the full-scale input signal
- Sb desired black level offset as a fraction of the full-scale output signal
- γ linearity modifier power exponent (assuming power law modification)
- the gray level turn-off point (So) and gray level threshold (St) may be: 1) environmentally-reactive variables determined via a lookup table or computational algorithms within the processing block labeled 'Proc', or 2) provided by the 'programmable instructions' port on 'Proc', or 3) fixed values pre-programmed within 'Proc', or 4) any combination of the above.
- So and St may be fixed values within the fp processing block.
- if the encoding exponent α and the decoding exponent β are known quantities, as assumed in Eq(6), then the final signal linearity beyond the gray level threshold St is determined solely by the value of the linearity modifier exponent γ; i.e., γ is equivalent to the previously defined end-to-end linearity power exponent.
- the encoding exponent α is typically known based on the source of the image data, and the decoding exponent β is either given by the manufacturer of the display and/or can be determined by testing.
- Eq(6) offers a specific example of the processes described in this section based on a specific method of signal encoding/decoding, but the general process is the same for any other method of encoding/decoding.
- An example of the functionality of Eq(6) is illustrated in Figure 15.
- the requested black level offset has been set at 0.1
- the requested gray level turn-off has been set to 0.02
- the gray level threshold has been set to 0.05
- the linearity modifier exponent γ has been set to 1.20.
- the linear ramp between (So, 0) and the point where the offset curve resumes at St serves to reduce the aforementioned banding effect.
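The three conditional cases can be sketched as follows, using the Figure 15 example values (Sb = 0.1, So = 0.02, St = 0.05, γ = 1.20); the curve above St is an assumed power-law-with-offset form, and the ramp is built to join it continuously at St:

```python
# Eq(6)-style softened threshold: full black below So, a linear ramp from
# (So, 0) up to the curve value at St, and the offset gamma curve above St.
# Parameter values mirror the Figure 15 example; the curve shape is an
# assumed reconstruction, not the patent's exact expression.

def soft_threshold(s, sb=0.1, so=0.02, st=0.05, gamma=1.20):
    curve_at_st = sb + (1.0 - sb) * st ** gamma
    if s < so:                                   # 1st condition: full black
        return 0.0
    if s < st:                                   # 2nd condition: linear ramp
        return curve_at_st * (s - so) / (st - so)
    return sb + (1.0 - sb) * s ** gamma          # 3rd condition: offset curve

assert soft_threshold(0.01) == 0.0               # below So: black
assert soft_threshold(0.02) == 0.0               # ramp starts at (So, 0)
assert abs(soft_threshold(0.049999) - soft_threshold(0.05)) < 1e-3
```

Because the ramp meets the curve at St with no jump, the abrupt black cutoff of the previous embodiment is replaced by a continuous transition, which is what reduces banding.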
- the encoded image signal Se and environmentally-reactive control signal Sa are preferably fed into the image signal processing block labeled fp which, in an exemplary embodiment, produces a pre-decoding image signal Sp according to Eq(7) below.
- Eq(7) represents the processing functionality of block fp, accepting the input signals Sa and Se and outputting the signal Sp.
- a new feature of this embodiment is the introduction of a gray level turn-off point labeled So, leading to the three conditional cases expressed in Eq(7).
- the 1st condition is applicable when the encoded signal levels fall below a level that is derived from So, in which case those signal levels are set to 0 (i.e., full black).
- the 2nd condition in Eq(7) is applicable for encoded signal levels that fall above the level derived from St.
- St desired gray level threshold as a fraction of the full-scale input signal
- Sb desired black level offset as a fraction of the full-scale output signal
- γ linearity modifier power exponent (assuming power law modification)
- the gray level turn-off point (So) and gray level threshold (St) may be: 1) environmentally-reactive variables determined via a lookup table or computational algorithms within the processing block labeled 'Proc', or 2) provided by the 'programmable instructions' port on 'Proc', or 3) fixed values pre-programmed within 'Proc', or 4) any combination of the above.
- So and St may be fixed values within the fp processing block.
- if the encoding exponent α and the decoding exponent β are known quantities, as assumed in Eq(7), then the final signal linearity beyond the gray level threshold St is determined solely by the value of the linearity modifier exponent γ; i.e., γ is equivalent to the previously defined end-to-end linearity power exponent.
- the encoding exponent α is typically known based on the source of the image data, and the decoding exponent β is either given by the manufacturer of the display and/or can be determined by testing.
- Eq( 7 ) offers a specific example of the processes described in this section based on a specific method of signal encoding/decoding, but the general process is the same for any other method of encoding/decoding.
- An example of the functionality of Eq(7) is illustrated in Figure 16.
- the requested black level offset has been set at 0.1
- the requested gray level turn-off has been set to 0.02
- the gray level threshold has been set to 0.05
- the linearity modifier exponent γ has been set to 1.20.
- The BT.709 encoding process is described by Eq(8).
- the 1st condition in Eq(8) is intended to prevent a nearly infinite slope in the transform function for small signals (i.e., darkest gray levels), as would be the case for a purely power-law function, which would be problematic for noise at such low levels.
- the BT.1886 decoding process is simply a power-law transformation as described by Eq(9).
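Both transforms can be written out directly in their standard published forms: the BT.709 opto-electronic transfer function (Eq(8)) with its linear segment below L = 0.018, and a BT.1886-style power-law decode (Eq(9)) with exponent 2.4:

```python
# Standard BT.709 encode (Eq(8)) and BT.1886-style power-law decode (Eq(9)).
# The linear segment below L = 0.018 avoids the near-infinite slope a pure
# power law would have at the darkest gray levels.

def bt709_encode(l):
    if l < 0.018:
        return 4.5 * l                     # 1st condition: linear toe region
    return 1.099 * l ** 0.45 - 0.099       # 2nd condition: power-law segment

def bt1886_decode(v, gamma=2.4):
    return v ** gamma                      # simple power-law transformation

# The break point L = 0.018 encodes to approximately 0.081, matching the
# Se > 0.081 condition used in the conditional cases of Eq(10) below.
assert abs(bt709_encode(0.018) - 0.081) < 1e-3
assert abs(bt709_encode(0.01) - 0.045) < 1e-12
assert abs(bt1886_decode(1.0) - 1.0) < 1e-12
```

Note that the two branches of the encode meet almost exactly at L = 0.018, so the transfer function is effectively continuous there.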
- the encoded image signal Se and environmentally-reactive control signal Sa are fed into the image signal processing block labeled fp which, in an exemplary embodiment, produces a pre-decoding image signal Sp according to Eq(10) below, which represents the processing functionality of block fp, accepting the input signals Sa and Se and outputting the signal Sp.
- the break point at S = 0.018 in the encoding process described by Eq(8) leads to the two sets of conditional cases as expressed in Eq(10).
- the 2nd set of conditions in Eq(10) is applicable when the encoded signal level Se is greater than 0.081, leading to three more sub-conditions 2a-2c that are dependent on the encoded signal level Se relative to the black level transition parameters So and St.
- in Eq(10), a sine function has been implemented for the black level transition in conditions 1b and 2b, although there are many functions that could be used for this purpose.
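One plausible shape for such a sine-based transition is sketched below; it rises smoothly from 0 at the turn-off point So to 1 at the threshold St. This is an illustrative assumption only, not the patent's exact Eq(10) expression, and the So/St values reuse the earlier example values:

```python
# A sine-based blend as one possible black level transition shape: a smooth
# 0 -> 1 weight as the encoded signal moves from the turn-off point So to the
# threshold St. Illustrative assumption, not the exact Eq(10).
import math

def sine_blend(s_e, s_o=0.02, s_t=0.05):
    """Smooth 0 -> 1 transition weight for s_e between s_o and s_t."""
    x = (s_e - s_o) / (s_t - s_o)      # normalized position in the transition
    return math.sin(x * math.pi / 2.0) ** 2

assert sine_blend(0.02) == 0.0                 # at So: fully off
assert abs(sine_blend(0.05) - 1.0) < 1e-12     # at St: fully on
assert 0.49 < sine_blend(0.035) < 0.51         # eases through the midpoint
```

Unlike the linear ramp, this blend has zero slope at both endpoints, which further reduces visible banding at the edges of the transition region.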
- Gamma, symbolized by γ, generally refers to the mathematical exponent in a power function S^γ that transforms the scaling of gray levels (on a sub-pixel basis) in an image.
- the exemplary embodiments of the system can select a desired γ for the display, depending on the data from an ambient light sensor or, as shown in Figure 8, data related to the sunset and sunrise times for the location of the display, without requiring the use of actual data from an ambient light sensor. Taking this concept even further, the embodiment below allows various display settings, specifically gamma (γ) or the black level, to be selected based on artificial ambient light sensor data (AAS data), without requiring the use of actual data from an ambient light sensor.
- AAS data artificial ambient light sensor data
- anomalies in the display environment can sometimes create variations in the ambient light sensor data that can cause the display to change brightness levels drastically, even though the surrounding environment has not changed quite as drastically.
- the ambient light sensor may be positioned within a shadow while the rest of the display is not. This selective shadowing can be caused by a number of obstructions, including but not limited to light posts, trees, passing vehicles, and/or construction equipment.
- Other anomalies can create variability in the ambient light sensor data, including variations in: the response of each different sensor, the response of the sensor over temperature changes, variations in the positioning of the light sensor in each display, and variations in the typical ambient environment of the display over time.
- the system can function without the use of data from the ambient light sensor. This, however, typically limits some of the functionality of the system and its benefits, specifically power-saving benefits, and can sometimes produce drastic changes in the display luminance.
- the following embodiments provide a system and method for controlling the luminance of an electronic display by producing artificial ambient light sensor data (AAS).
- AAS artificial ambient light sensor data
- generating artificial ambient sensor data involves defining the following parameters:
- the artificial ambient sensor (AAS) data can be calculated in the following manner, where t1 provides the time in transition (i.e., t1 varies between zero and tsr):
- AAS for sunrise = (t1 * HA)/tsr
- the AAS for sunset can be calculated in the following manner, where t1 provides the time in transition (i.e., t1 varies between zero and tss):
- AAS for sunset = HA - (t1 * HA)/tss
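The two ramps above can be written out directly; here HA is the full-daylight AAS value (a parameter defined earlier in the patent), tsr and tss are the sunrise and sunset transition durations, and the numeric values used below are purely illustrative:

```python
# Clear-sky sunrise/sunset AAS ramps, mirroring the formulas above. t1 is the
# elapsed time within the current transition; HA, t_sr, t_ss are assumed to
# carry the meanings given in the surrounding text.

def aas_sunrise(t1, ha, t_sr):
    return (t1 * ha) / t_sr           # ramps 0 -> HA over the sunrise period

def aas_sunset(t1, ha, t_ss):
    return ha - (t1 * ha) / t_ss      # ramps HA -> 0 over the sunset period

assert aas_sunrise(0.0, 1000.0, 30.0) == 0.0      # start of sunrise: dark
assert aas_sunrise(30.0, 1000.0, 30.0) == 1000.0  # end of sunrise: full HA
assert aas_sunset(15.0, 1000.0, 30.0) == 500.0    # mid-sunset: half of HA
```

The resulting artificial sensor values can then be fed into the same ambient-light-to-display-settings mappings that a real sensor would drive.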
- the desired backlight level can be determined from any of the ambient light vs. display settings described above.
- FIGURE 18 provides a logical flowchart for performing an embodiment that uses the AAS technique during sunset/sunrise transition times while using a nighttime/daytime level for the remaining times.
- the sunset transition period and the sunrise transition period may be similar or substantially the same. In this case, it may not be necessary to have two transition periods. Instead, one transition period may be used.
- FIGURE 19 provides a logical flowchart for performing an embodiment that uses the AAS technique with only a single transition period while using nighttime/daytime instructions for the remaining times.
- the system and method can also utilize local weather information to further tailor the display settings, without requiring actual data from the ambient light sensor.
- the local weather information can be obtained from available web APIs or other online weather information, which may be accessed at a predetermined time interval (e.g., every 15 minutes).
- a weather factor WF is used where:
- C1 clearness percentage, with a higher percentage representing a clear sky and a lower percentage representing a large amount of cloud cover.
- the inversion could be used, where a higher percentage represents more cloud cover and a lower percentage represents less cloud cover. Either technique can be used by a person of ordinary skill in the art.
- the artificial ambient sensor (AAS) data can be calculated in the following manner:
- AAS for sunrise = (t1 * (HA*WF))/tsr
- AAS for sunset = (HA*WF) - (t1 * (HA*WF))/tss
- AAS = HA*WF
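A sketch combining the weather factor with the formulas above; WF is taken here as the clearness percentage C1 divided by 100, which is one of the two conventions the text permits:

```python
# Weather-adjusted AAS, mirroring the three formulas above. WF is assumed to
# be the clearness percentage C1 scaled to [0, 1]; HA, t1, t_sr, t_ss keep
# their earlier meanings, and the numeric values are illustrative.

def weather_factor(c1_percent):
    return c1_percent / 100.0             # higher percentage = clearer sky

def aas_sunrise_wf(t1, ha, wf, t_sr):
    return (t1 * (ha * wf)) / t_sr

def aas_sunset_wf(t1, ha, wf, t_ss):
    return (ha * wf) - (t1 * (ha * wf)) / t_ss

def aas_daytime_wf(ha, wf):
    return ha * wf

wf = weather_factor(50.0)                 # 50% clear: half cloud cover
assert aas_daytime_wf(1000.0, wf) == 500.0
assert aas_sunrise_wf(15.0, 1000.0, wf, t_sr=30.0) == 250.0
assert aas_sunset_wf(30.0, 1000.0, wf, t_ss=30.0) == 0.0
```

The effect is simply that the clear-sky peak HA is scaled down by cloud cover before the same sunrise/sunset ramps are applied.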
- the desired display settings can be determined from any of the ambient light levels vs. display settings described above.
- FIGURE 20 provides a logical flowchart for performing the advanced embodiment that uses the AAS technique during sunset/sunrise transition times as well as the daytime while factoring in the local weather information.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Computer Hardware Design (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Controls And Circuits For Display Device (AREA)
- Control Of Indicators Other Than Cathode Ray Tubes (AREA)
- Transforming Electric Information Into Light Information (AREA)
- Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
Applications Claiming Priority (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201562206050P | 2015-08-17 | 2015-08-17 | |
US15/043,135 US10321549B2 (en) | 2015-05-14 | 2016-02-12 | Display brightness control based on location data |
US15/043,100 US9924583B2 (en) | 2015-05-14 | 2016-02-12 | Display brightness control based on location data |
US201662314073P | 2016-03-28 | 2016-03-28 | |
PCT/US2016/047412 WO2017031237A1 (en) | 2015-08-17 | 2016-08-17 | Electronic display with environmental adaptation of display characteristics based on location |
Publications (2)
Publication Number | Publication Date |
---|---|
EP3338273A1 true EP3338273A1 (en) | 2018-06-27 |
EP3338273A4 EP3338273A4 (en) | 2019-09-18 |
Family
ID=58051720
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP16837775.2A Withdrawn EP3338273A4 (en) | 2015-08-17 | 2016-08-17 | Electronic display with environmental adaptation of display characteristics based on location |
Country Status (6)
Country | Link |
---|---|
EP (1) | EP3338273A4 (en) |
JP (1) | JP2018525650A (en) |
KR (1) | KR102130667B1 (en) |
AU (1) | AU2016308187B2 (en) |
CA (1) | CA2985673C (en) |
WO (1) | WO2017031237A1 (en) |
Cited By (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10255884B2 (en) | 2011-09-23 | 2019-04-09 | Manufacturing Resources International, Inc. | System and method for environmental adaptation of display characteristics |
US10321549B2 (en) | 2015-05-14 | 2019-06-11 | Manufacturing Resources International, Inc. | Display brightness control based on location data |
US10593255B2 (en) | 2015-05-14 | 2020-03-17 | Manufacturing Resources International, Inc. | Electronic display with environmental adaptation of display characteristics based on location |
US10607520B2 (en) | 2015-05-14 | 2020-03-31 | Manufacturing Resources International, Inc. | Method for environmental adaptation of display characteristics based on location |
US10782276B2 (en) | 2018-06-14 | 2020-09-22 | Manufacturing Resources International, Inc. | System and method for detecting gas recirculation or airway occlusion |
US11022635B2 (en) | 2018-05-07 | 2021-06-01 | Manufacturing Resources International, Inc. | Measuring power consumption of an electronic display assembly |
US11526044B2 (en) | 2020-03-27 | 2022-12-13 | Manufacturing Resources International, Inc. | Display unit with orientation based operation |
US12022635B2 (en) | 2021-03-15 | 2024-06-25 | Manufacturing Resources International, Inc. | Fan control for electronic display assemblies |
US12027132B1 (en) | 2023-06-27 | 2024-07-02 | Manufacturing Resources International, Inc. | Display units with automated power governing |
US12105370B2 (en) | 2023-06-23 | 2024-10-01 | Manufacturing Resources International, Inc. | Fan control for electronic display assemblies |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8125163B2 (en) | 2008-05-21 | 2012-02-28 | Manufacturing Resources International, Inc. | Backlight adjustment system |
US10586508B2 (en) | 2016-07-08 | 2020-03-10 | Manufacturing Resources International, Inc. | Controlling display brightness based on image capture device data |
CN113870810A (en) * | 2021-11-08 | 2021-12-31 | 合肥杰发科技有限公司 | Color adjusting method and device and display equipment |
Family Cites Families (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2801798B2 (en) * | 1991-07-10 | 1998-09-21 | パイオニア株式会社 | Navigation system |
US7795574B2 (en) * | 2004-02-23 | 2010-09-14 | Xenonics, Inc. | Low-light viewing device for displaying image based on visible and near infrared light |
JP4475008B2 (en) * | 2004-05-25 | 2010-06-09 | 株式会社デンソー | Brightness adjustment device, display device, and program |
JP2006106345A (en) * | 2004-10-05 | 2006-04-20 | Seiko Epson Corp | Video display device |
TWI326443B (en) * | 2004-10-27 | 2010-06-21 | Chunghwa Picture Tubes Ltd | Dynamic gamma correction circuit, method thereof and plane display device |
KR100679689B1 (en) * | 2005-01-26 | 2007-02-06 | 주식회사 에스티월 | System for lighting using GPS reciever |
EP1686777A1 (en) * | 2005-01-31 | 2006-08-02 | Research In Motion Limited | Method for and mobile device having a geographical postion and ambient dependent backlight of a display |
EP1720149A3 (en) * | 2005-05-02 | 2007-06-27 | Semiconductor Energy Laboratory Co., Ltd. | Display device |
JP5119636B2 (en) * | 2006-09-27 | 2013-01-16 | ソニー株式会社 | Display device and display method |
KR101330353B1 (en) * | 2008-08-08 | 2013-11-20 | 엘지디스플레이 주식회사 | Liquid Crystal Display and Driving Method thereof |
JP2010181487A (en) * | 2009-02-03 | 2010-08-19 | Sanyo Electric Co Ltd | Display device |
JP4585601B1 (en) * | 2009-09-14 | 2010-11-24 | 株式会社東芝 | Video display device and video display method |
US20110102630A1 (en) * | 2009-10-30 | 2011-05-05 | Jason Rukes | Image capturing devices using device location information to adjust image data during image signal processing |
US9286020B2 (en) * | 2011-02-03 | 2016-03-15 | Manufacturing Resources International, Inc. | System and method for dynamic load sharing between electronic displays |
US8885443B2 (en) * | 2011-09-20 | 2014-11-11 | Rashed Farhan Sultan Marzouq | Apparatus for making astronomical calculations |
US9210759B2 (en) * | 2012-11-19 | 2015-12-08 | Express Imaging Systems, Llc | Luminaire with ambient sensing and autonomous control capabilities |
US9536325B2 (en) * | 2013-06-09 | 2017-01-03 | Apple Inc. | Night mode |
US9530342B2 (en) * | 2013-09-10 | 2016-12-27 | Microsoft Technology Licensing, Llc | Ambient light context-aware display |
2016
- 2016-08-17 AU AU2016308187A patent/AU2016308187B2/en active Active
- 2016-08-17 KR KR1020177036455A patent/KR102130667B1/en active IP Right Grant
- 2016-08-17 JP JP2017558690A patent/JP2018525650A/en active Pending
- 2016-08-17 WO PCT/US2016/047412 patent/WO2017031237A1/en unknown
- 2016-08-17 EP EP16837775.2A patent/EP3338273A4/en not_active Withdrawn
- 2016-08-17 CA CA2985673A patent/CA2985673C/en active Active
Cited By (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10255884B2 (en) | 2011-09-23 | 2019-04-09 | Manufacturing Resources International, Inc. | System and method for environmental adaptation of display characteristics |
US10321549B2 (en) | 2015-05-14 | 2019-06-11 | Manufacturing Resources International, Inc. | Display brightness control based on location data |
US10412816B2 (en) | 2015-05-14 | 2019-09-10 | Manufacturing Resources International, Inc. | Display brightness control based on location data |
US10593255B2 (en) | 2015-05-14 | 2020-03-17 | Manufacturing Resources International, Inc. | Electronic display with environmental adaptation of display characteristics based on location |
US10607520B2 (en) | 2015-05-14 | 2020-03-31 | Manufacturing Resources International, Inc. | Method for environmental adaptation of display characteristics based on location |
US11022635B2 (en) | 2018-05-07 | 2021-06-01 | Manufacturing Resources International, Inc. | Measuring power consumption of an electronic display assembly |
US11656255B2 (en) | 2018-05-07 | 2023-05-23 | Manufacturing Resources International, Inc. | Measuring power consumption of a display assembly |
US11774428B2 (en) | 2018-06-14 | 2023-10-03 | Manufacturing Resources International, Inc. | System and method for detecting gas recirculation or airway occlusion |
US10782276B2 (en) | 2018-06-14 | 2020-09-22 | Manufacturing Resources International, Inc. | System and method for detecting gas recirculation or airway occlusion |
US11293908B2 (en) | 2018-06-14 | 2022-04-05 | Manufacturing Resources International, Inc. | System and method for detecting gas recirculation or airway occlusion |
US11526044B2 (en) | 2020-03-27 | 2022-12-13 | Manufacturing Resources International, Inc. | Display unit with orientation based operation |
US11815755B2 (en) | 2020-03-27 | 2023-11-14 | Manufacturing Resources International, Inc. | Display unit with orientation based operation |
US12007637B2 (en) | 2020-03-27 | 2024-06-11 | Manufacturing Resources International, Inc. | Display unit with orientation based operation |
US12022635B2 (en) | 2021-03-15 | 2024-06-25 | Manufacturing Resources International, Inc. | Fan control for electronic display assemblies |
US12105370B2 (en) | 2023-06-23 | 2024-10-01 | Manufacturing Resources International, Inc. | Fan control for electronic display assemblies |
US12027132B1 (en) | 2023-06-27 | 2024-07-02 | Manufacturing Resources International, Inc. | Display units with automated power governing |
Also Published As
Publication number | Publication date |
---|---|
CA2985673A1 (en) | 2017-02-23 |
AU2016308187A1 (en) | 2017-11-30 |
EP3338273A4 (en) | 2019-09-18 |
WO2017031237A1 (en) | 2017-02-23 |
KR20180008757A (en) | 2018-01-24 |
CA2985673C (en) | 2021-03-23 |
KR102130667B1 (en) | 2020-07-06 |
AU2016308187B2 (en) | 2019-10-31 |
JP2018525650A (en) | 2018-09-06 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10593255B2 (en) | Electronic display with environmental adaptation of display characteristics based on location | |
US10607520B2 (en) | Method for environmental adaptation of display characteristics based on location | |
AU2016308187B2 (en) | Electronic display with environmental adaptation of display characteristics based on location | |
US10255884B2 (en) | System and method for environmental adaptation of display characteristics | |
ES2808177T3 (en) | High dynamic range image optimization for particular displays | |
JP6495552B2 (en) | Dynamic range coding for images and video | |
CN109219961B (en) | Method and apparatus for encoding and decoding HDR video | |
KR102337438B1 (en) | Encoding and decoding of HDR video | |
ES2825699T3 (en) | High dynamic range imaging and optimization for home screens | |
US10217198B2 (en) | Simple but versatile dynamic range coding | |
CN117321626A (en) | Content optimized ambient light HDR video adaptation | |
CN117321625A (en) | Display optimized HDR video contrast adaptation | |
CN117296076A (en) | Display optimized HDR video contrast adaptation | |
CN117280381A (en) | Display optimized ambient light HDR video adaptation |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE |
|
PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase |
Free format text: ORIGINAL CODE: 0009012 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE |
|
17P | Request for examination filed |
Effective date: 20171108 |
|
AK | Designated contracting states |
Kind code of ref document: A1 Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
AX | Request for extension of the european patent |
Extension state: BA ME |
|
DAV | Request for validation of the european patent (deleted) | ||
DAX | Request for extension of the european patent (deleted) | ||
RIC1 | Information provided on ipc code assigned before grant |
Ipc: H05B 37/00 20060101ALI20190520BHEP Ipc: H05B 37/02 20060101ALI20190520BHEP Ipc: G09G 5/10 20060101ALI20190520BHEP Ipc: G09G 5/00 20060101AFI20190520BHEP |
|
A4 | Supplementary search report drawn up and despatched |
Effective date: 20190821 |
|
RIC1 | Information provided on ipc code assigned before grant |
Ipc: G09G 5/00 20060101AFI20190814BHEP Ipc: H05B 37/00 20060101ALI20190814BHEP Ipc: G09G 5/10 20060101ALI20190814BHEP Ipc: H05B 37/02 20060101ALI20190814BHEP |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE APPLICATION HAS BEEN WITHDRAWN |
|
18W | Application withdrawn |
Effective date: 20200206 |