US20120256943A1 - Image Range Expansion Control Methods and Apparatus - Google Patents

Image Range Expansion Control Methods and Apparatus

Info

Publication number
US20120256943A1
Authority
US
United States
Prior art keywords
expansion
image data
metadata
target display
display
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US13/442,708
Inventor
Robin Atkins
Steve Margerm
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Dolby Laboratories Licensing Corp
Original Assignee
Dolby Laboratories Licensing Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Dolby Laboratories Licensing Corp filed Critical Dolby Laboratories Licensing Corp
Priority to US13/442,708
Assigned to DOLBY LABORATORIES LICENSING CORPORATION (assignors: ATKINS, ROBIN; MARGERM, STEVE)
Publication of US20120256943A1
Priority to US14/721,345 (US9501817B2)
Priority to US15/332,454 (US10395351B2)
Status: Abandoned

Classifications

    • G06T 5/92 Dynamic range modification of images or parts thereof based on global image properties
    • G06V 40/162 Human faces: detection, localisation or normalisation using pixel segmentation or colour matching
    • G09G 5/04 Control arrangements or circuits for visual indicators characterised by the way in which colour is displayed, using circuits for interfacing with colour displays
    • G09G 5/10 Intensity circuits
    • H04N 21/4854 End-user interface for client configuration for modifying image parameters, e.g. image brightness, contrast
    • H04N 21/84 Generation or processing of descriptive data, e.g. content descriptors
    • G06T 2207/10024 Color image
    • G06T 2207/20008 Globally adaptive image processing
    • G06T 2207/20208 High dynamic range [HDR] image processing
    • G09G 2320/0271 Adjustment of the gradation levels within the range of the gradation scale, e.g. by redistribution or clipping
    • G09G 2320/066 Adjustment of display parameters for control of contrast
    • G09G 2320/0673 Adjustment of display parameters for control of gamma adjustment, e.g. selecting another gamma curve
    • G09G 2320/08 Arrangements within a display terminal for setting, manually or automatically, display parameters of the display terminal
    • G09G 2340/04 Changes in size, position or resolution of an image
    • G09G 2340/0428 Gradation resolution change
    • G09G 2340/06 Colour space transformation
    • G09G 2370/04 Exchange of auxiliary data, i.e. other than image data, between monitor and graphics controller

Abstract

Image data is adjusted for display on a target display. Maximum safe expansions for one or more attributes of the image data are compared to maximum available expansions for the attributes. An amount of expansion is selected that exceeds neither the maximum safe expansion nor the maximum available expansion. Artifacts caused by over-expansion may be reduced or avoided.

Description

    CROSS-REFERENCE TO RELATED APPLICATIONS
  • This application claims priority to U.S. Provisional Patent Application No. 61/473,691 filed on Apr. 8, 2011, which is incorporated herein by reference in its entirety.
  • TECHNICAL FIELD
  • The invention relates to displaying images and relates specifically to methods and apparatus involving the adjustment of image data for display on target displays.
  • BACKGROUND OF THE INVENTION
  • Display apparatus may employ any of a wide range of technologies. Some non-limiting examples are plasma displays, liquid crystal displays (LCDs), cathode ray tube (CRT) displays, organic light emitting diode (OLED) displays, projection displays that use any of various light sources in combination with various spatial light modulation technologies, and so on. Displays may be of a wide variety of types used in any of a wide variety of applications. For example, the term display encompasses without limitation apparatus such as: televisions, computer displays, media player displays, displays in hand-held devices, displays used on control panels for equipment of different kinds, electronic game displays, digital cinema displays, special purpose displays such as virtual reality displays, advertising displays, stadium displays, medical imaging displays, and so on.
  • Different displays may have different capabilities in areas such as: black level, maximum brightness (display peak luminance), color gamut, and so on. The appearance of displayed images is also affected by the environment in which a display is being viewed. For example, the luminance of ambient lighting, the color of ambient lighting and screen reflections can all affect the appearance of displayed images.
  • With the increasing availability of high-performance displays (e.g. displays that have high peak luminance and/or broad color gamuts) comes the problem of how to adjust images for optimum viewing on a particular display or type of displays. Addressing this problem in simplistic ways can result in noticeable artifacts in displayed images. For example, consider the case where an image that appears properly on a display having a moderate peak luminance is displayed on a target display having a very high peak luminance. If one expands the luminance range of the image data to take advantage of the high peak luminance of the target display, the result may be poor due to objectionable artifacts that are rendered apparent by the range expansion. Artifacts may include, for example, one or more of banding, quantization artifacts, visible macroblock edges, objectionable film grain and the like. On the other hand, if the image is displayed on the target display without range expansion, no benefit is gained from the high peak luminance that the target display can achieve.
  • There is a need for apparatus and methods for processing and/or displaying images that can exploit the capabilities of target displays to provide enhanced viewing while reducing or avoiding undesirable artifacts.
  • SUMMARY OF THE INVENTION
  • This invention may be embodied in a wide variety of ways. These include, without limitation: displays (such as televisions, digital cinema displays, specialty displays such as advertising displays, gaming displays, virtual reality displays, vehicle simulator displays and the like, and displays on portable devices); image processing apparatus, which may be integrated with a display, stand alone, or integrated with other apparatus such as a media player or the like; and media carrying non-transitory instructions which, when executed by a data processor, cause the data processor to execute a method according to the invention.
  • One non-limiting example aspect of the invention provides a method for image processing. The method comprises obtaining image data and metadata associated with the image data. The method processes the metadata with information characterizing a target display to determine a maximum safe expansion for an attribute of the image data and a maximum available expansion for the attribute of the image data. The method processes the image data to expand the attribute of the image data by the lesser of the maximum safe expansion and the maximum available expansion.
  • In some embodiments the attribute is dynamic range and processing the image data comprises applying a tone mapping curve to the image data. In some embodiments the attribute comprises a color gamut.
  • Another non-limiting example aspect of the invention provides image processing apparatus. The image processing apparatus comprises a decoder configured to extract, from a signal, image data and metadata associated with the image data, and a controller configured to process the metadata with information characterizing a target display to determine a maximum safe expansion for an attribute of the image data and a maximum available expansion for the attribute of the image data. The controller is further configured to set an expansion amount equal to the smaller of the maximum safe expansion and the maximum available expansion. The apparatus comprises an expander configured to expand the attribute of the image data by the expansion amount to yield modified image data.
  • Further aspects of the invention and features of specific embodiments of the invention are described below.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The accompanying drawings illustrate non-limiting embodiments of the invention.
  • FIG. 1 is a flow chart that illustrates a method according to an example embodiment of the invention.
  • FIG. 2 is a block diagram illustrating apparatus according to an example embodiment.
  • FIG. 2A is a flow chart illustrating a method that may be performed in a controller in the apparatus of FIG. 2 or similar apparatus.
  • FIG. 3 is a block diagram illustrating apparatus according to an alternative embodiment.
  • DESCRIPTION OF THE INVENTION
  • Throughout the following description, specific details are set forth in order to provide a more thorough understanding of the invention. However, the invention may be practiced without these particulars. In other instances, well known elements have not been shown or described in detail to avoid unnecessarily obscuring the invention. Accordingly, the specification and drawings are to be regarded in an illustrative, rather than a restrictive, sense.
  • FIG. 1 is a flow chart that illustrates a method 10 according to an example embodiment of the invention. Method 10 may, for example, be practiced at a target display or in an image processing device upstream from a target display. Method 10 receives image data and adjusts the image data for display on the target display. The adjustment comprises expanding one or more of dynamic range and color gamut.
  • Block 12 receives image data 14 comprising image content to be displayed on a particular target display. Block 12 may comprise, for example, receiving a data stream comprising the image data 14, accessing a file or other data structure in a data store containing the image data 14, or the like.
  • Block 15 receives metadata 16 associated with image data 14. Block 15 may comprise, for example, extracting metadata 16 from a side stream, decoding metadata 16 that has been encoded in image data 14, obtaining metadata 16 from a server or other repository by way of a data communication network, accessing a data structure containing metadata 16 or the like. In some embodiments, metadata 16 and image data 14 are both contained in a common data container and blocks 12 and 15 respectively comprise extracting the image data 14 and associated metadata 16 from the data container.
  • Metadata 16 contains information that is indicative of a degree to which image data 14 can be expanded ‘safely’ in one or more respects. Here ‘safely’ means without introducing visible artifacts that unacceptably degrade image quality. One example of information that may be present as metadata 16 is a direct indication of an acceptable degree of expansion (e.g. a number that directly indicates the degree to which the dynamic range (e.g. luminance range) may be safely expanded). Another example is information specifying characteristics of a gamut of a source display on which the content of image data 14 was approved. The gamut characteristics may include one or more of: the colors of primaries of the source display, luminance range of the source display, black and white levels of the source display, points on a three-dimensional gamut boundary of the source display, or the like. Another example is information specifying encoding parameters for image data 14 such as bit depth, encoding quality, color sub-sampling, and the like. Another example of the makeup of metadata 16 is a combination of various ones of the information described above.
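  • By way of illustration only, metadata of the kind described above might be represented as in the following Python sketch. The structure and field names are assumptions made for this example; the patent does not prescribe any particular format.

        from dataclasses import dataclass
        from typing import Optional, Tuple

        @dataclass
        class SourceDisplayInfo:
            # Characteristics of the source display on which the content was approved.
            peak_luminance_nits: float                      # white level
            black_level_nits: float                         # black level
            primaries_xy: Tuple[Tuple[float, float], ...]   # chromaticities of the display primaries

        @dataclass
        class ImageMetadata:
            # A direct indication of an acceptable degree of expansion, if one was supplied.
            max_safe_luminance_expansion: Optional[float] = None
            max_safe_gamut_expansion: Optional[float] = None
            # Characteristics of the source display (another of the examples above).
            source_display: Optional[SourceDisplayInfo] = None
            # Encoding parameters from which a safe expansion could be estimated.
            bit_depth: Optional[int] = None
            encoding_quality: Optional[str] = None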
  • Block 18 determines from metadata 16 (or metadata 16 and image data 14 in some embodiments) one or more maximum safe expansions that may be applied to image data 14. For example, block 18 may determine a safe dynamic range expansion and/or a safe chromaticity expansion.
  • Block 19 determines a maximum available expansion that may be applied to image data 14 to fully exploit the capabilities of target display 30. In cases where method 10 is performed with knowledge of the capabilities of target display 30 (e.g. where method 10 is performed by a system integrated with target display 30), block 19 may determine the maximum available expansion based on metadata 16 (where the metadata 16 specifies capabilities of the source display). For example, if the source display has a peak luminance of 600 nits and target display 30 has a peak luminance of 1800 nits, then block 19 may determine that the maximum available expansion is expansion by a factor of 3 (by dividing 1800 by 600). In some cases the maximum safe expansion may be less than the maximum available expansion; for example, where the maximum safe expansion is 2, an expansion of 2 or less will be applied. If an expansion of 2 is applied in this example, the resulting peak luminance would be 1200 nits (2×600 nits).
  • As another example, block 19 may obtain information regarding one or more capabilities of target display 30 by, for example, interrogating target display 30 to determine its capabilities by way of an EDID, E-EDID, DisplayID or similar functionality, retrieving stored information regarding the characteristics of target display 30 from a local data store or an accessible server or the like.
  • Block 20 compares the maximum safe expansion(s) from block 18 to the maximum available expansion(s) from block 19. If the maximum safe expansion equals or exceeds the maximum available expansion then image data 14 may be expanded by the maximum available expansion for display on the target display. If the maximum safe expansion is less than the maximum available expansion then the expansion of image data 14 should be limited to the maximum safe expansion for display on the target display. Block 20 provides an expansion value equal to the lesser of the maximum safe expansion and the maximum available expansion for one or more image characteristics that may be subjected to expansion.
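  • As a concrete sketch of blocks 19 and 20, using the 600 nit / 1800 nit example above (illustrative code; the function names are not from the patent):

        def max_available_expansion(source_peak_nits: float, target_peak_nits: float) -> float:
            # Block 19: the expansion that would fully exploit the target display.
            return target_peak_nits / source_peak_nits

        def select_expansion(max_safe: float, max_available: float) -> float:
            # Block 20: use the lesser of the maximum safe and maximum available expansions.
            return min(max_safe, max_available)

        available = max_available_expansion(600.0, 1800.0)    # 3.0
        applied = select_expansion(2.0, available)             # 2.0: the safe limit governs here
        resulting_peak_nits = 600.0 * applied                   # 1200 nits, as in the example above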
  • Block 22 adjusts image data 14 for display on target display 30 by expanding image data 14 according to the expansion value from block 20. In cases where the maximum safe expansion is less than the maximum available expansion, block 22 expands to an intermediate range that is below the maximum capabilities of target display 30. This avoids or keeps to an acceptable level distortions and other artifacts particular to image data 14 that would have been introduced and/or raised to unacceptable levels by expansion exceeding the maximum safe expansion but still within the capabilities of target display 30.
  • Some embodiments may provide an optional override control that a user may operate to select an expansion exceeding a maximum safe expansion. For example, the override control may have the effect of causing a maximum safe expansion to be set to a very high value or the effect of causing the apparatus to ignore safe expansion limits. Operating the control may have the effect of causing the maximum available expansion to be used. In another example, a user control may permit a user to manually select an expansion. The selection may allow the user to cause the expansion to vary essentially continuously or may allow selection from a set of discrete expansions. An indicator, display or other feedback device may warn the user when the selected expansion exceeds the maximum safe expansion.
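  • A minimal sketch of how such an override or manual control might interact with the safe limit (the function and parameter names are hypothetical):

        import math

        def effective_safe_limit(max_safe_expansion: float, override_enabled: bool) -> float:
            # With the override engaged the safe limit is effectively removed, so the
            # maximum available expansion ends up being used.
            return math.inf if override_enabled else max_safe_expansion

        def validate_manual_selection(selected: float, max_safe_expansion: float, warn) -> float:
            # A manually selected expansion is honoured, but the user is warned when it
            # exceeds the maximum safe expansion.
            if selected > max_safe_expansion:
                warn("selected expansion %.2f exceeds the maximum safe expansion %.2f"
                     % (selected, max_safe_expansion))
            return selected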
  • Advantageously, block 22 may expand some aspects of image data 14 by the maximum available expansion and other aspects by less than the maximum available expansion. For example, for a particular choice of image data 14 and target display 30, chromaticity may be expanded by a maximum available expansion to take full advantage of an expanded color gamut of target display 30 while luminance is expanded by a maximum safe luminance expansion which is less than a maximum available luminance expansion.
  • In some example embodiments, block 22 performs expansion(s) according to sigmoidal expansion functions as described, for example, in U.S. application No. 61/453107 filed on 15 Mar. 2011 and entitled METHODS AND APPARATUS FOR IMAGE DATA TRANSFORMATION which is hereby incorporated herein by reference. That application describes image data expansion based on parameterized sigmoidal tone curve functions. Block 22 may perform expansion of image data 14 according to other suitable mappings that are constrained to limit the amount of expansion to not exceed the maximum safe expansion. For example, the target peak luminance used to calculate the tone curve may be altered from a default value of the maximum capability of the target display, to a peak luminance corresponding to the maximum safe range of expansion indicated by metadata.
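  • The following sketch illustrates only the last point: the target peak used to build the tone curve is capped by the safe expansion. The curve itself is a generic smooth mapping, standing in for the parameterized sigmoidal tone curves of the referenced application.

        import numpy as np

        def capped_target_peak(source_peak: float, display_peak: float,
                               max_safe_expansion: float) -> float:
            # The default target would be the display's maximum capability; the
            # metadata-indicated safe expansion may pull it down to an intermediate value.
            return min(display_peak, source_peak * max_safe_expansion)

        def expand_luminance(lum: np.ndarray, source_peak: float, target_peak: float) -> np.ndarray:
            # Generic S-shaped remapping of [0, source_peak] onto [0, target_peak].
            # Stand-in only; not the tone curve defined in the referenced application.
            x = np.clip(lum / source_peak, 0.0, 1.0)
            shaped = x ** 2 / (x ** 2 + (1.0 - x) ** 2)
            return shaped * target_peak

        target = capped_target_peak(source_peak=600.0, display_peak=1800.0,
                                    max_safe_expansion=2.0)      # 1200 nits rather than 1800
        expanded = expand_luminance(np.linspace(0.0, 600.0, 5), 600.0, target)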
  • Block 24 stores and/or forwards the adjusted image data for display on the target display. Block 26 displays the adjusted image data on the target display.
  • In some embodiments, image data 14 comprises video data. In such embodiments, the maximum safe expansion may be different for different portions of the video (e.g. for different scenes). Method 10 may be applied separately for such different portions of the video.
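  • For video, the per-portion behaviour might look like the following sketch, in which each scene carries its own safe limit (all names are illustrative):

        from dataclasses import dataclass

        @dataclass
        class SceneMetadata:
            max_safe_luminance_expansion: float
            source_peak_nits: float

        def expansion_for_scene(meta: SceneMetadata, target_peak_nits: float) -> float:
            # Blocks 18 to 20 evaluated separately per scene, since the safe limit can change.
            max_available = target_peak_nits / meta.source_peak_nits
            return min(meta.max_safe_luminance_expansion, max_available)

        # Two scenes with different safe limits, shown on an 1800 nit target display.
        scenes = [SceneMetadata(2.0, 600.0), SceneMetadata(1.2, 600.0)]
        per_scene_expansion = [expansion_for_scene(m, 1800.0) for m in scenes]   # [2.0, 1.2]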
  • Steps in the method of FIG. 1 may be executed by one or more programmed data processors, by hardwired logic circuits, by configurable logic circuits such as field programmable gate arrays (FPGAs), by combinations thereof, and the like.
  • Application Example
  • One application example is illustrated in FIG. 2. A television 50 has a signal input 52 connected to receive a signal containing video content for display from any of a tuner 53A, an external video signal input 53B and a data input 53C. Data input 53C may, for example, be connected to the internet. The signal is delivered to a decoder 54 which includes a metadata reader 54A. Decoder 54 is connected to supply decoded video data to an image processing system 55 comprising a dynamic range (luminance) expander 55A, a color gamut expander 55B and a controller 55C configured to control dynamic range expander 55A and color gamut expander 55B as described below. Image processing system 55 may optionally perform any of a wide variety of additional image processing functions as are known in the art.
  • In operation, a signal is received at input 52. The signal is decoded to extract video data 57 and metadata 58. In one example, the metadata includes the black point 58A, white point 58B and color gamut 58C of a source display (not shown) on which the video data was viewed for approval and also a maximum dynamic range expansion 58D and a maximum color gamut expansion 58E.
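  • A sketch of the decoded quantities, with hypothetical field names keyed to the reference numerals above:

        from dataclasses import dataclass
        from typing import Any, List, Tuple

        @dataclass
        class StreamMetadata:                        # metadata 58
            source_black_nits: float                 # 58A: black point of the source display
            source_white_nits: float                 # 58B: white point of the source display
            source_gamut: List[Tuple[float, float]]  # 58C: e.g. xy primaries of the source display
            max_dr_expansion: float                  # 58D: maximum safe dynamic range expansion
            max_gamut_expansion: float               # 58E: maximum safe color gamut expansion

        @dataclass
        class DecodedSignal:
            video_data: Any                          # video data 57
            metadata: StreamMetadata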
  • Image processing controller 55C has access to, or has built into itself, information specifying the black point 59A, white point 59B and color gamut 59C of display 50. Controller 55C may perform a method 60 as shown in FIG. 2A. Block 62 compares black point 59A and white point 59B of target display 50 to the black point 58A and white point 58B of the source display. From this comparison, controller 55C determines whether it should cause dynamic range expander 55A to compress the dynamic range of the video data (e.g. by applying a tone mapping curve to produce altered video data having a lower dynamic range than video data 57), expand the dynamic range, or leave the dynamic range unaltered. Dynamic range expansion may be beneficial in cases where black point 59A and white point 59B of target display 50 are more widely separated than black point 58A and white point 58B of the source display. Block 62 may comprise taking a ratio (R_TRG/R_SRC) of the dynamic range of the target display R_TRG to the dynamic range of the source display R_SRC and branching depending upon whether that ratio is greater than, equal to or less than one.
  • In the case that compression is called for (R_TRG/R_SRC < 1), block 64 configures dynamic range expander 55A to compress the video data for display on television 50.
  • In the case that no compression or expansion is called for (R_TRG/R_SRC = 1), block 65 configures dynamic range expander 55A to do nothing or causes the video data to bypass dynamic range expander 55A.
  • In the case that expansion is called for (R_TRG/R_SRC > 1), block 66 picks the smaller of the ratio R_TRG/R_SRC and the maximum dynamic range expansion 58D. Block 67 then configures dynamic range expander 55A to expand the video data for display on television 50 by the amount determined by block 66 (or a smaller amount).
  • Consider the specific example where the comparison of the black and white points of the source and target displays indicates that the target display is capable of a dynamic range 1.5 times that of the source display (e.g. R_TRG/R_SRC = 1.5, so a dynamic range expansion up to a maximum of 1.5 is possible). Suppose that metadata 58D indicates a maximum safe dynamic range expansion of 1.2. Then block 66 outputs 1.2 because 1.2 < 1.5, and block 67 configures dynamic range expander 55A to expand the dynamic range of the video data by a factor of 1.2.
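  • A sketch of the decision made in blocks 62 to 67. How a single dynamic range figure is derived from the black and white points is not pinned down in the text, so the white-to-black ratio used here is just one plausible choice, and the nit values in the example call are invented so that the ratio comes out to 1.5:

        def dynamic_range_decision(target_black: float, target_white: float,
                                   source_black: float, source_white: float,
                                   max_dr_expansion: float):
            # Block 62: compare dynamic ranges via the ratio R_TRG / R_SRC.
            r_trg = target_white / target_black
            r_src = source_white / source_black
            ratio = r_trg / r_src
            if ratio < 1.0:
                return ("compress", ratio)                      # block 64
            if abs(ratio - 1.0) < 1e-6:
                return ("bypass", 1.0)                          # block 65
            # Blocks 66 and 67: expand by the smaller of the ratio and metadata 58D.
            return ("expand", min(ratio, max_dr_expansion))

        # Worked example from the text: target range 1.5 times the source range, safe limit 1.2.
        decision = dynamic_range_decision(0.05, 750.0, 0.05, 500.0, 1.2)   # ("expand", 1.2)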
  • In the illustrated embodiment, television 50 may perform either, both or neither of dynamic range expansion/compression and color gamut expansion/compression. This is not mandatory. Television 50 could be made to address dynamic range expansion/compression and not color gamut expansion/compression or vice versa. Color gamut expansion or compression may be handled in a similar manner to dynamic range expansion or compression.
  • In the illustrated method 60, block 70 compares the color gamut 59C of television 50 to the color gamut 58C of the source display. If the color gamut 58C of the source display is larger than color gamut 59C of television 50 then block 70 branches to block 72 which configures color gamut expander 55B to compress (compression may include clipping, remapping etc.) the gamut of video data 57 to fit within color gamut 59C.
  • If the color gamut 58C of the source display and the color gamut 59C of television 50 are equal (to within a desired precision) then block 70 branches to block 74 which leaves the color gamut of video data 57 unaltered or causes color gamut expander 55B to be bypassed.
  • If the color gamut 58C of the source display lies within the color gamut 59C of television 50 such that color gamut expansion is possible then block 70 branches to block 76 which determines a maximum amount of color gamut expansion such that after color-gamut expansion the video data will still be within the color gamut of television 50. Block 77 outputs the smaller of the output of block 76 and the maximum safe color gamut expansion 58E. Block 78 configures color gamut expander 55B to expand the color gamut of video data 57 by the amount determined by block 77 or a smaller amount.
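  • The corresponding gamut decision (blocks 70 to 78) can be sketched in the same way. Representing each gamut by a single scalar "size" is a simplification made only for this illustration; real gamut comparison works on two- or three-dimensional gamut boundaries:

        def gamut_decision(source_gamut_size: float, target_gamut_size: float,
                           max_safe_gamut_expansion: float, tolerance: float = 1e-6):
            # Block 70: compare the source display gamut (58C) with the target display gamut (59C).
            if source_gamut_size > target_gamut_size + tolerance:
                # Block 72: compress (clip, remap, etc.) into the target gamut 59C.
                return ("compress", target_gamut_size / source_gamut_size)
            if abs(source_gamut_size - target_gamut_size) <= tolerance:
                return ("bypass", 1.0)                                     # block 74
            # Block 76: largest expansion that still fits within the target gamut.
            max_available = target_gamut_size / source_gamut_size
            # Blocks 77 and 78: apply the smaller of that and the safe limit 58E.
            return ("expand", min(max_available, max_safe_gamut_expansion))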
  • The image data is applied to display driver 80 which drives display 82 to display images based on the adjusted image data.
  • FIG. 2 shows an optional safe expansion calculator stage 56. Safe expansion calculator 56 is configured to calculate maximum safe dynamic range expansion and/or maximum safe color gamut expansion based on information other than metadata 58D and 58E. For example, safe expansion calculator 56 may calculate maximum safe dynamic range expansion 58D and maximum safe color gamut expansion 58E in the case that metadata indicating these values is not present in metadata 58. Safe expansion calculator may determine maximum safe dynamic range expansion from information such as: bit depth of the video signal; and/or encoding quality of the video signal. Metadata indicating values for these parameters may be included in metadata 58. Safe expansion calculator may, for example, comprise a lookup table which receives as inputs bit depth and encoding quality of the video signal and produces a maximum safe dynamic range expansion as an output. In some embodiments, safe expansion calculator 56 is integrated with or associated with decoder 54. In some embodiments, safe expansion calculator 56 is connected to receive from decoder 54 information about a received video signal such as bit depth of the video signal, encoding quality of the video signal, and/or other information useful for estimating a maximum safe expansion of the video signal.
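  • The lookup-table variant of safe expansion calculator 56 might be as simple as the following; every number in the table is an invented placeholder, not a value from the patent:

        # (bit depth, encoding quality) -> maximum safe dynamic range expansion.
        # Deeper bit depths and better encoding tolerate more expansion before banding,
        # quantization artifacts or macroblock edges become objectionable.
        SAFE_DR_EXPANSION_TABLE = {
            (8, "low"): 1.0,
            (8, "high"): 1.2,
            (10, "low"): 1.5,
            (10, "high"): 2.0,
            (12, "high"): 3.0,
        }

        def estimate_safe_dr_expansion(bit_depth: int, encoding_quality: str) -> float:
            # Fallback used when metadata 58D is absent: estimate the limit from encoding parameters.
            return SAFE_DR_EXPANSION_TABLE.get((bit_depth, encoding_quality), 1.0)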
  • FIG. 3 illustrates apparatus 50A according to an alternative embodiment in which a target display is separate from the apparatus that includes the image processing system 55. FIG. 3 applies the same reference numbers used in FIG. 2 for elements having the same or similar functions.
  • Apparatus 50A includes an interface 90 for data communication with a separate target display 50B by way of a data communication path 91. Apparatus 50A also includes a user interface 94. Apparatus 50A is configured to obtain characteristics of target display 50B by way of interface 90 (for example by reading information from an EDID, E-EDID or DisplayID data structure hosted in target display 50B). In addition or in the alternative, apparatus 50A may receive information characterizing target display 50B by way of user interface 94. Data processed by image processing system 55 is passed to target display 50B by way of a data communication path 96 that may be wired, wireless, electronic, optical or the like. Data communication path 91 and data communication path 96 may be separate or combined.
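  • A sketch of the fallback order described above. Here query_display() stands in for whatever EDID, E-EDID or DisplayID query the platform actually provides; it is not a real API:

        def target_display_characteristics(query_display, user_supplied):
            # Prefer characteristics read from the display over interface 90 (e.g. from an
            # EDID, E-EDID or DisplayID data structure hosted in the target display);
            # otherwise fall back to values entered through user interface 94.
            info = query_display()       # hypothetical platform-specific call; may return None
            return info if info is not None else user_supplied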
  • Apparatus, systems, modules and components described herein (including without limitation inputs, tuners, decoders, readers, expanders, controllers, calculators, drivers, interfaces, and the like) may comprise software, firmware, hardware, or any combination(s) of software, firmware, or hardware suitable for the purposes described herein. Such software, firmware, hardware and combinations thereof may reside on personal computers, set top boxes, media players, video projectors, servers, displays (such as televisions, computer monitors, and the like) and other devices suitable for the purposes described herein. Furthermore, aspects of the system can be embodied in a special purpose computer or data processor that is specifically programmed, configured, or constructed to perform one or more of the computer-executable instructions explained in detail herein.
  • Image processing and processing steps as described above may be performed in hardware, software, or suitable combinations of hardware and software. For example, such image processing may be performed by a data processor (such as one or more microprocessors, graphics processors, digital signal processors or the like) executing software and/or firmware instructions which cause the data processor to implement methods as described herein. Such methods may also be performed by logic circuits which may be hard-configured or configurable (for example, logic circuits provided by a field-programmable gate array (“FPGA”)). Image processing and processing steps as described above may operate on and/or produce image data (including without limitation video data, tone mapping curves, metadata, and the like) embodied in computer-readable signals carried on non-transitory media.
  • Certain implementations of the invention comprise computer processors which execute software instructions which cause the processors to perform a method of the invention. For example, one or more processors in a display, personal computer, set top box, media player, video projector, server, or the like may implement methods as described herein by executing software instructions in a program memory accessible to the processors.
  • Some aspects of the invention may also be provided in the form of a program product. The program product may comprise any non-transitory medium which carries a set of computer-readable signals comprising instructions which, when executed by a data processor, cause the data processor to execute a method of the invention. For example, such a program product may comprise instructions which cause a data processor in a display to adjust the image data for display on the display. Program products according to the invention may be in any of a wide variety of forms. The program product may comprise, for example, magnetic data storage media (including floppy diskettes and hard disk drives), optical data storage media (including CD-ROMs and DVDs), electronic data storage media (including ROMs and flash RAM), hardwired or preprogrammed chips (e.g., EEPROM semiconductor chips), nanotechnology memory, or the like. The computer-readable signals on the program product may optionally be compressed or encrypted. Computer instructions, data structures, and other data used in the practice of the technology may be distributed over the Internet or over other networks (including wireless networks) as a propagated signal on a propagation medium (e.g., an electromagnetic wave, a sound wave, etc.) over a period of time, or they may be provided on any analog or digital network (packet switched, circuit switched, or other scheme).
  • The teachings of the technology provided herein can be applied to other systems, not necessarily the system described above. The elements and acts of the various examples described above can be combined to provide further examples. Aspects of the system can be modified, if necessary, to employ the systems, functions, and concepts of the various references described above to provide yet further examples of the technology.
  • Where a component (e.g. a software module, processor, assembly, device, circuit, input, tuner, decoder, reader, expander, controller, calculator, driver, interface, etc.) is referred to above, unless otherwise indicated, reference to that component (including a reference to a “means”) should be interpreted as including as equivalents of that component any component which performs the function of the described component (i.e., that is functionally equivalent), including components which are not structurally equivalent to the disclosed structure which performs the function in the illustrated exemplary embodiments of the invention.
  • These and other changes can be made to the system in light of the above Description. While the above description describes certain examples of the system, and describes the best mode contemplated, no matter how detailed the above appears in text, the system can be practiced in many ways. Details of the systems and methods described herein may vary considerably in their implementation, while still being encompassed by the system disclosed herein. As noted above, particular terminology used when describing certain features or aspects of the system should not be taken to imply that the terminology is being redefined herein to be restricted to any specific characteristics, features, or aspects of the system with which that terminology is associated. In general, the terms used in the following claims should not be construed to limit the system to the specific examples disclosed in the specification, unless the above Description section explicitly and restrictively defines such terms. Accordingly, the actual scope of the system encompasses not only the disclosed examples, but also all equivalent ways of practicing or implementing the technology under the claims.
  • From the foregoing, it will be appreciated that specific examples of systems and methods have been described herein for purposes of illustration, but that various modifications, alterations, additions and permutations may be made without deviating from the spirit and scope of the invention. The embodiments described herein are only examples. Those skilled in the art will appreciate that certain features of embodiments described herein may be used in combination with features of other embodiments described herein, and that embodiments described herein may be practiced or implemented without all of the features ascribed to them herein. Such variations on described embodiments that would be apparent to the skilled addressee, including variations comprising mixing and matching of features from different embodiments, are within the scope of this invention.
  • As will be apparent to those skilled in the art in the light of the foregoing disclosure, many alterations and modifications are possible in the practice of this invention without departing from the spirit or scope thereof. Accordingly, the scope of the invention is to be construed in accordance with the substance defined by the following claims.

Claims (14)

1. A method for image processing, the method comprising:
obtaining image data and metadata associated with the image data;
processing the metadata with information characterizing a target display to determine a maximum safe expansion for an attribute of the image data and a maximum available expansion for the attribute of the image data; and,
processing the image data to expand the attribute of the image data by the lesser of the maximum safe expansion and the maximum available expansion.
2. A method according to claim 1 wherein the attribute is dynamic range and processing the image data comprises applying a tone mapping curve to the image data.
3. A method according to claim 2 wherein the metadata comprises metadata indicative of a bit depth of the image data and the method comprises determining the maximum safe expansion based at least in part on the bit depth.
4. A method according to claim 3 wherein the metadata comprises metadata indicative of an encoding quality level of the image data and the method comprises determining the maximum safe expansion based at least in part on the encoding quality level.
5. A method according to claim 3 wherein determining the maximum safe expansion comprises using the bit depth as a key to look up a value for the maximum safe expansion in a lookup table.
6. A method according to claim 2 wherein the metadata comprises metadata indicative of a dynamic range of a source display and determining the maximum available expansion comprises computing a ratio of a dynamic range of the target display and the dynamic range of the source display.
7. A method according to claim 1 wherein the attribute comprises a color gamut.
8. A method according to claim 1 wherein the metadata comprises metadata directly indicative of the maximum safe expansion.
9. A method according to claim 8 wherein the attribute is one of a plurality of attributes and the metadata comprises metadata directly indicative of the maximum safe expansion for each of the plurality of attributes.
10. A method according to claim 9 wherein the plurality of attributes comprises dynamic range and color gamut.
11. Image processing apparatus comprising:
a decoder configured to extract image data and metadata associated with the image data from a signal;
a controller configured to process the metadata with information characterizing a target display to determine a maximum safe expansion for an attribute of the image data and a maximum available expansion for the attribute of the image data and to set an expansion amount equal to the smaller of the maximum safe expansion and the maximum available expansion; and
an expander configured to expand the attribute of the image data by the expansion amount to yield modified image data.
12. Apparatus according to claim 11 integrated with the target display wherein the target display is connected to display the modified image data.
13. Apparatus according to claim 12 wherein the target display comprises a television or a digital cinema projector.
14. Apparatus according to claim 11 comprising an interface for data communication with the target display wherein the controller is configured to access the information characterizing the target display by way of the interface.
US13/442,708 2011-04-08 2012-04-09 Image Range Expansion Control Methods and Apparatus Abandoned US20120256943A1 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
US13/442,708 US20120256943A1 (en) 2011-04-08 2012-04-09 Image Range Expansion Control Methods and Apparatus
US14/721,345 US9501817B2 (en) 2011-04-08 2015-05-26 Image range expansion control methods and apparatus
US15/332,454 US10395351B2 (en) 2011-04-08 2016-10-24 Image range expansion control methods and apparatus

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201161473691P 2011-04-08 2011-04-08
US13/442,708 US20120256943A1 (en) 2011-04-08 2012-04-09 Image Range Expansion Control Methods and Apparatus

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US14/721,345 Continuation US9501817B2 (en) 2011-04-08 2015-05-26 Image range expansion control methods and apparatus

Publications (1)

Publication Number Publication Date
US20120256943A1 true US20120256943A1 (en) 2012-10-11

Family

ID=46229176

Family Applications (3)

Application Number Title Priority Date Filing Date
US13/442,708 Abandoned US20120256943A1 (en) 2011-04-08 2012-04-09 Image Range Expansion Control Methods and Apparatus
US14/721,345 Active US9501817B2 (en) 2011-04-08 2015-05-26 Image range expansion control methods and apparatus
US15/332,454 Active 2032-06-26 US10395351B2 (en) 2011-04-08 2016-10-24 Image range expansion control methods and apparatus

Family Applications After (2)

Application Number Title Priority Date Filing Date
US14/721,345 Active US9501817B2 (en) 2011-04-08 2015-05-26 Image range expansion control methods and apparatus
US15/332,454 Active 2032-06-26 US10395351B2 (en) 2011-04-08 2016-10-24 Image range expansion control methods and apparatus

Country Status (2)

Country Link
US (3) US20120256943A1 (en)
EP (1) EP2518719B1 (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10200571B2 (en) 2016-05-05 2019-02-05 Nvidia Corporation Displaying an adjusted image according to ambient light conditions
US10354394B2 (en) 2016-09-16 2019-07-16 Dolby Laboratories Licensing Corporation Dynamic adjustment of frame rate conversion settings
WO2019041112A1 (en) * 2017-08-28 2019-03-07 华为技术有限公司 Image processing method and apparatus
US10977809B2 (en) 2017-12-11 2021-04-13 Dolby Laboratories Licensing Corporation Detecting motion dragging artifacts for dynamic adjustment of frame rate conversion settings
CN108733818B (en) * 2018-05-21 2021-04-02 上海世脉信息科技有限公司 Big data sample expansion method based on multi-scene multi-data-source verification
US11676260B2 (en) * 2019-09-26 2023-06-13 Kla Corporation Variation-based segmentation for wafer defect detection
WO2022250696A1 (en) * 2021-05-28 2022-12-01 Hewlett-Packard Development Company, L.P. Static display metadata modifications

Family Cites Families (35)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2853727B2 (en) 1994-02-22 1999-02-03 日本ビクター株式会社 Reproduction protection method and protection reproduction device
US6335983B1 (en) * 1998-09-28 2002-01-01 Eastman Kodak Company Representing an extended color gamut digital image in a limited color gamut color space
US6771323B1 (en) 1999-11-15 2004-08-03 Thx Ltd. Audio visual display adjustment using captured content characteristics
KR100844816B1 (en) 2000-03-13 2008-07-09 소니 가부시끼 가이샤 Method and apparatus for generating compact transcoding hints metadata
EP1327360A1 (en) 2000-10-11 2003-07-16 Koninklijke Philips Electronics N.V. Scalable coding of multi-media objects
JP3814494B2 (en) 2001-04-27 2006-08-30 松下電器産業株式会社 Color management apparatus and color management system
US7035460B2 (en) * 2002-05-31 2006-04-25 Eastman Kodak Company Method for constructing an extended color gamut digital image from a limited color gamut digital image
US7176935B2 (en) * 2003-10-21 2007-02-13 Clairvoyante, Inc. Gamut conversion system and methods
WO2005125178A1 (en) 2004-06-14 2005-12-29 Thx, Ltd Content display optimizer
US20070098083A1 (en) 2005-10-20 2007-05-03 Visharam Mohammed Z Supporting fidelity range extensions in advanced video codec file format
US8994744B2 (en) * 2004-11-01 2015-03-31 Thomson Licensing Method and system for mastering and distributing enhanced color space content
US8482614B2 (en) 2005-06-14 2013-07-09 Thx Ltd Content presentation optimizer
US7656462B2 (en) 2005-06-17 2010-02-02 Martin Weston Systems and methods for modifying master film for viewing at different viewing locations
US7583406B2 (en) * 2005-08-23 2009-09-01 Eastman Kodak Company Color transforms for concave device gamuts
JP4102847B2 (en) * 2006-06-30 2008-06-18 シャープ株式会社 Image data providing apparatus, image display apparatus, image display system, image data providing apparatus control method, image display apparatus control method, control program, and recording medium
KR20080031555A (en) 2006-10-04 2008-04-10 삼성전자주식회사 Method and apparatus for transmitting/receiving data
US20080095228A1 (en) 2006-10-20 2008-04-24 Nokia Corporation System and method for providing picture output indications in video coding
KR20080049360A (en) 2006-11-30 2008-06-04 삼성전자주식회사 The method of transmitting color gamut and the image device thereof
CN101543084A (en) * 2006-11-30 2009-09-23 Nxp股份有限公司 Device and method for processing color image data
WO2008095037A2 (en) * 2007-01-30 2008-08-07 Fergason Patent Properties, Llc Image acquistion and display system and method using information derived from an area of interest in a video image implementing system synchronized brightness control and use of metadata
US20080195977A1 (en) 2007-02-12 2008-08-14 Carroll Robert C Color management system
CN101641949B (en) 2007-04-03 2012-02-01 汤姆逊许可公司 Methods and systems for displays with chromatic correction with differing chromatic ranges
WO2009005495A1 (en) * 2007-06-29 2009-01-08 Thomson Licensing System and method for matching colors on displays with different modulation transfer functions
JP4517308B2 (en) 2007-12-13 2010-08-04 ソニー株式会社 Information processing apparatus and method, program, and information processing system
US8207720B2 (en) * 2008-07-18 2012-06-26 Infineon Technologies Austria Ag Methods and apparatus for power supply load dump compensation
JP5361283B2 (en) 2008-08-21 2013-12-04 キヤノン株式会社 Color processing apparatus and method
GB0816768D0 (en) 2008-09-12 2008-10-22 Pandora Int Ltd Colour editing
JP4770921B2 (en) * 2008-12-01 2011-09-14 日本電気株式会社 Gateway server, file management system, file management method and program
FR2933837A1 (en) 2008-12-10 2010-01-15 Thomson Licensing Video images sequence coding method for e.g. satellite distribution network, involves coding auxiliary transcoding aid data e.g. coding parameters, into supplemental enhancement information message of coded image data stream
WO2010083493A1 (en) * 2009-01-19 2010-07-22 Dolby Laboratories Licensing Corporation Image processing and displaying methods for devices that implement color appearance models
KR20100092693A (en) 2009-02-13 2010-08-23 엘지전자 주식회사 A method for processing a video signal and a system thereof
US8760461B2 (en) * 2009-05-13 2014-06-24 Stmicroelectronics, Inc. Device, system, and method for wide gamut color space support
JP5212736B2 (en) 2009-05-22 2013-06-19 ソニー株式会社 Information processing apparatus and method, and program
JP5577415B2 (en) * 2010-02-22 2014-08-20 ドルビー ラボラトリーズ ライセンシング コーポレイション Video display with rendering control using metadata embedded in the bitstream
TWI538474B (en) 2011-03-15 2016-06-11 杜比實驗室特許公司 Methods and apparatus for image data transformation

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6754384B1 (en) * 2000-08-30 2004-06-22 Eastman Kodak Company Method for processing an extended color gamut digital image using an image information parameter
US7011413B2 (en) * 2002-05-09 2006-03-14 Seiko Epson Corporation Image processing system, projector, program, information storage medium, and image processing method
US7911479B2 (en) * 2007-11-20 2011-03-22 Xerox Corporation Gamut mapping
US20100134496A1 (en) * 2008-12-01 2010-06-03 Vasudev Bhaskaran Bit resolution enhancement
US8570438B2 (en) * 2009-04-21 2013-10-29 Marvell World Trade Ltd. Automatic adjustments for video post-processor based on estimated quality of internet video content

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Allan G. Rempel, Matthew Trentacoste, Helge Seetzen, H. David Young, Wolfgang Heidrich, Lorne Whitehead, and Greg Ward. 2007. Ldr2Hdr: on-the-fly reverse tone mapping of legacy video and photographs. In ACM SIGGRAPH 2007 Papers (SIGGRAPH '07). ACM, New York, NY, USA, Article 39. *

Cited By (38)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8890906B2 (en) * 2011-03-11 2014-11-18 Calgary Scientific Inc. Method and system for remotely calibrating display of image data
US20120229526A1 (en) * 2011-03-11 2012-09-13 Calgary Scientific Inc. Method and system for remotely calibrating display of image data
US9420196B2 (en) * 2011-09-26 2016-08-16 Dolby Laboratories Licensing Corporation Image formats and related methods and apparatuses
US20130076974A1 (en) * 2011-09-26 2013-03-28 Dolby Laboratories Licensing Corporation Image Formats and Related Methods and Apparatuses
US9685120B2 (en) * 2011-09-26 2017-06-20 Dolby Laboratories Licensing Corporation Image formats and related methods and apparatuses
US8988552B2 (en) * 2011-09-26 2015-03-24 Dolby Laboratories Licensing Corporation Image formats and related methods and apparatuses
US20150161967A1 (en) * 2011-09-26 2015-06-11 Dolby Laboratories Licensing Corporation Image Formats and Related Methods and Apparatuses
US9202438B2 (en) * 2011-09-26 2015-12-01 Dolby Laboratories Licensing Corporation Image formats and related methods and apparatuses
US20160050353A1 (en) * 2011-09-26 2016-02-18 Dolby Laboratories Licensing Corporation Image formats and related methods and apparatuses
US20160335961A1 (en) * 2011-09-26 2016-11-17 Dolby Laboratories Licensing Corporation Image Formats and Related Methods and Apparatuses
US9288499B2 (en) 2011-12-06 2016-03-15 Dolby Laboratories Licensing Corporation Device and method of improving the perceptual luminance nonlinearity-based image data exchange across different display capabilities
US10242650B2 (en) 2011-12-06 2019-03-26 Dolby Laboratories Licensing Corporation Perceptual luminance nonlinearity-based image data exchange across different display capabilities
US11887560B2 (en) 2011-12-06 2024-01-30 Dolby Laboratories Licensing Corporation Perceptual luminance nonlinearity-based image data exchange across different display capabilities
US11600244B2 (en) 2011-12-06 2023-03-07 Dolby Laboratories Licensing Corporation Perceptual luminance nonlinearity-based image data exchange across different display capabilities
US9521419B2 2011-12-06 2016-12-13 Dolby Laboratories Licensing Corporation Perceptual luminance nonlinearity-based image data exchange across different display capabilities
US11587529B2 (en) 2011-12-06 2023-02-21 Dolby Laboratories Licensing Corporation Perceptual luminance nonlinearity-based image data exchange across different display capabilities
US10957283B2 (en) 2011-12-06 2021-03-23 Dolby Laboratories Licensing Corporation Perceptual luminance nonlinearity-based image data exchange across different display capabilities
US9685139B2 (en) 2011-12-06 2017-06-20 Dolby Laboratories Licensing Corporation Perceptual luminance nonlinearity-based image data exchange across different display capabilities
US10621952B2 (en) 2011-12-06 2020-04-14 Dolby Laboratories Licensing Corporation Perceptual luminance nonlinearity-based image data exchange across different display capabilities
US9959837B2 2011-12-06 2018-05-01 Dolby Laboratories Licensing Corporation Perceptual luminance nonlinearity-based image data exchange across different display capabilities
US9697799B2 (en) 2011-12-06 2017-07-04 Dolby Laboratories Licensing Corporation Perceptual luminance nonlinearity-based image data exchange across different display capabilities
US20140016000A1 (en) * 2012-04-26 2014-01-16 Canon Kabushiki Kaisha Image processing apparatus
US9282249B2 (en) * 2012-04-26 2016-03-08 Canon Kabushiki Kaisha Image processing apparatus that transmits image data outside the apparatus
US9684976B2 (en) 2013-03-13 2017-06-20 Qualcomm Incorporated Operating system-resident display module parameter selection system
US10565695B2 (en) * 2014-07-28 2020-02-18 Sony Corporation Apparatus and method for transmitting and receiving high dynamic range images
JPWO2016038950A1 (en) * 2014-09-11 2017-06-22 ソニー株式会社 Image processing apparatus and image processing method
US10402681B2 (en) 2014-09-11 2019-09-03 Sony Corporation Image processing apparatus and image processing method
US9665964B2 (en) 2014-09-11 2017-05-30 Sony Corporation Image processing apparatus and image processing method
WO2016038950A1 (en) * 2014-09-11 2016-03-17 ソニー株式会社 Image-processing device, and image-processing method
US10032300B2 (en) 2014-09-11 2018-07-24 Sony Corporation Image processing apparatus and image processing method
US9501855B2 (en) 2014-09-11 2016-11-22 Sony Corporation Image processing apparatus and image processing method
US10902567B2 (en) * 2015-11-24 2021-01-26 Koninklijke Philips N.V. Handling multiple HDR image sources
US10250862B2 (en) 2016-09-12 2019-04-02 Onkyo Corporation Video processing device
US11202050B2 (en) * 2016-10-14 2021-12-14 Lg Electronics Inc. Data processing method and device for adaptive image playing
WO2019098778A1 (en) * 2017-11-17 2019-05-23 Samsung Electronics Co., Ltd. Display apparatus, method for controlling the same and image providing apparatus
US11122245B2 (en) 2017-11-17 2021-09-14 Samsung Electronics Co., Ltd. Display apparatus, method for controlling the same and image providing apparatus
US20220164931A1 (en) * 2019-04-23 2022-05-26 Dolby Laboratories Licensing Corporation Display management for high dynamic range images
US11803948B2 (en) * 2019-04-23 2023-10-31 Dolby Laboratories Licensing Corporation Display management for high dynamic range images

Also Published As

Publication number Publication date
US10395351B2 (en) 2019-08-27
EP2518719A2 (en) 2012-10-31
EP2518719A3 (en) 2013-03-06
US20170039690A1 (en) 2017-02-09
EP2518719B1 (en) 2016-05-18
US9501817B2 (en) 2016-11-22
US20150254823A1 (en) 2015-09-10

Similar Documents

Publication Publication Date Title
US10395351B2 (en) Image range expansion control methods and apparatus
US9984446B2 (en) Video tone mapping for converting high dynamic range (HDR) content to standard dynamic range (SDR) content
US10235946B2 (en) Apparatus and method for controlling liquid crystal display brightness, and liquid crystal display device
US9230338B2 (en) Graphics blending for high dynamic range video
US20190304407A1 (en) Transitioning between video priority and graphics priority
US20170061901A1 (en) Apparatus and method for controlling liquid crystal display brightness, and liquid crystal display device
US20180061026A1 (en) Display method and display device
US9894314B2 (en) Encoding, distributing and displaying video data containing customized video content versions
US11122245B2 (en) Display apparatus, method for controlling the same and image providing apparatus
US11496798B2 (en) Video rendering system
EP3296985A1 (en) Image-processing device, image-processing method, and program
CN112334970B (en) Source side tone mapping based on native gamut and brightness of a display
CN108141576B (en) Display device and control method thereof
KR20190132072A (en) Electronic apparatus, method for controlling thereof and recording media thereof
US20230197035A1 (en) Color gamut compression and extension
US10929954B2 (en) Methods and apparatus for inline chromatic aberration correction
US20140181158A1 (en) Media file system with associated metadata
US20130258199A1 (en) Video processor and video processing method
US10863215B2 (en) Content providing apparatus, method of controlling the same, and recording medium thereof
KR20140061103A (en) Display apparatus and method for image output thereof
US20210224963A1 (en) Information processing apparatus, information processing method, image processing apparatus, image processing method, and program
US20230141114A1 (en) Display apparatus and control method thereof
JP2015141370A (en) Image display device and control method of image display device
JP2019211519A (en) Display device
US20240153426A1 (en) Electronic device and an operating method thereof

Legal Events

Date Code Title Description
AS Assignment

Owner name: DOLBY LABORATORIES LICENSING CORPORATION, CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:ATKINS, ROBIN;MARGERM, STEVE;REEL/FRAME:028015/0582

Effective date: 20110415

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION