US9613409B1 - Image converting method and related image converting module - Google Patents


Publication number
US9613409B1
US9613409B1
Authority
US
United States
Prior art keywords
pixel
display device
specular
image
segment
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
US15/012,871
Inventor
Wan-Ching Tsai
Chao-Wei Ho
Chia-Jung Hsu
Chih-Chia Kuo
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Novatek Microelectronics Corp
Original Assignee
Novatek Microelectronics Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Novatek Microelectronics Corp filed Critical Novatek Microelectronics Corp
Priority to US15/012,871
Assigned to NOVATEK MICROELECTRONICS CORP. reassignment NOVATEK MICROELECTRONICS CORP. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: HO, CHAO-WEI, HSU, CHIA-JUNG, KUO, CHIH-CHIA, TSAI, WAN-CHING
Application granted granted Critical
Publication of US9613409B1


Classifications

    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/90Dynamic range modification of images or parts thereof
    • G06T5/92Dynamic range modification of images or parts thereof based on global image properties
    • G06T5/009
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G3/00Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
    • G09G3/20Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix no fixed position being assigned to or needed to be assigned to the individual characters or partial characters
    • G09G3/34Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix no fixed position being assigned to or needed to be assigned to the individual characters or partial characters by control of light from an independent source
    • G09G3/3406Control of illumination source
    • G09G3/342Control of illumination source using several illumination sources separately controlled corresponding to different display panel areas, e.g. along one dimension such as lines
    • G09G3/3426Control of illumination source using several illumination sources separately controlled corresponding to different display panel areas, e.g. along one dimension such as lines the different display panel areas being distributed in two dimensions, e.g. matrix
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G3/00Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
    • G09G3/20Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix no fixed position being assigned to or needed to be assigned to the individual characters or partial characters
    • G09G3/34Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix no fixed position being assigned to or needed to be assigned to the individual characters or partial characters by control of light from an independent source
    • G09G3/3406Control of illumination source
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20172Image enhancement details
    • G06T2207/20208High dynamic range [HDR] image processing
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2320/00Control of display operating conditions
    • G09G2320/06Adjustment of display parameters
    • G09G2320/0626Adjustment of display parameters for control of overall brightness
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2320/00Control of display operating conditions
    • G09G2320/08Arrangements within a display terminal for setting, manually or automatically, display parameters of the display terminal
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2340/00Aspects of display data processing
    • G09G2340/06Colour space transformation

Definitions

  • the present invention relates to an image converting method and related image converting module, and more particularly, to an image converting method capable of converting a low dynamic range image to a high dynamic range image and related image converting module.
  • the dynamic range of a scene is defined to be a ratio of the highest scene luminance to the lowest scene luminance.
  • a conventional display device such as a liquid crystal display (LCD) or a television has a dynamic range of about 250:1, and covers about half of the visible color gamut.
  • the human visual system (HVS) has a dynamic range greater than 10,000:1, and is capable of distinguishing about 10,000 colors at a given brightness.
  • the image displayed on the conventional display device corresponds to intensities spanning 256 gray levels. That is, each color channel (red, green, and blue) is determined by 8 bits. Therefore, the minimum gray level is equal to 0, and the maximum gray level is equal to 255. From the above description, it is obvious that the dynamic range in the real-world environment far exceeds the representable dynamic range shown on the conventional display device.
  • a high-end display device may feature high dynamic range (HDR), to expand its contrast ratio and to display more realistic, natural images.
  • the present invention discloses an image converting method capable of converting a low dynamic range image to a high dynamic range image and related image converting module.
  • the present invention discloses an image converting method for a display device.
  • the image converting method comprises calculating a plurality of segment averages of pixel values in a plurality of segments of an input image of a low dynamic range of luminance; acquiring the maximum among the plurality of segment averages as a first threshold; calculating a plurality of local averages of pixel values in adjacent segments of the input image; acquiring the maximum among the plurality of local averages as a second threshold; counting the number of pixel values exceeding the first threshold in the plurality of segments as a plurality of first pixel counts and counting the number of pixel values exceeding the second threshold in the plurality of segments as a plurality of second pixel counts; generating a specular map and a confidence according to the plurality of first pixel counts and the plurality of second pixel counts; mapping the input image according to the specular map, to generate an intermediate image of a high dynamic range of luminance; and blending the input image and the intermediate image according to the confidence, to generate an output image.
  • the present invention discloses an image converting module for a display device.
  • the image converting module comprises a metering unit, a detecting unit, a mapping unit, and a blending unit.
  • the metering unit is utilized to calculate a plurality of segment averages of pixel values in a plurality of segments of the input image, calculate a plurality of local averages of pixel values in adjacent segments of the input image, count the number of pixel values exceeding a first threshold in the plurality of segments as a plurality of first pixel counts, and count the number of pixel values exceeding a second threshold in the plurality of segments as a plurality of second pixel counts.
  • the detecting unit is utilized to acquire the maximum among the plurality of segment averages as the first threshold, acquire the maximum among the plurality of local averages as the second threshold, generate a control signal to the metering unit to indicate the first threshold and the second threshold, and generate a specular map and a confidence according to the plurality of first pixel counts and the plurality of second pixel counts.
  • the mapping unit is utilized to map the input image according to the specular map, to generate an intermediate image.
  • the blending unit is utilized to blend the input image and the intermediate image according to the confidence, to generate an output image, wherein the ratio between the input image and the intermediate image is determined by the confidence.
  • the computing cost of the image converting module is reduced.
  • the power efficiency of the image converting module is therefore improved, and the manufacturing cost of the image converting module is decreased.
  • FIG. 1 is a schematic diagram of an image converting module according to an example of the present invention.
  • FIG. 2 is a schematic diagram of related signals of the image converting module shown in FIG. 1 .
  • FIG. 3 is a schematic diagram of relationships between related signals of the image converting module shown in FIG. 1 .
  • FIG. 4 is a schematic diagram of relationships between related signals of the image converting module shown in FIG. 1 .
  • FIG. 5 is a schematic diagram of an image converting module according to an example of the present invention.
  • FIG. 6 is a schematic diagram of relationships between related signals of the image converting module shown in FIG. 5 .
  • FIG. 7 is a schematic diagram of relationships between related signals of the image converting module shown in FIG. 1 .
  • FIG. 8 is a flow chart of a process according to an example of the present invention.
  • FIG. 1 is a schematic diagram of an image converting module 10 according to an example of the present invention.
  • the image converting module 10 may be utilized in a display device such as a liquid crystal display (LCD), a tablet, or a television, and is not limited herein.
  • the image converting module 10 comprises a metering unit 100 , a detecting unit 102 , a mapping unit 104 , and a blending unit 106 .
  • the metering unit 100 is utilized to determine segment based information SBI of a plurality of segments in an input image IMG_I, which has a low dynamic range (LDR) of luminance or brightness, according to a control signal CS.
  • based on the segment based information SBI, the detecting unit 102 generates the control signal CS to the metering unit 100 , generates a specular map SM to the mapping unit 104 , and generates a confidence CONF to the blending unit 106 .
  • the mapping unit 104 is utilized to select mapping curves according to the specular map SM and accordingly map pixel data of the input image IMG_I of the low dynamic range to an intermediate image IMG_H of high dynamic range (HDR) of luminance or brightness.
  • the blending unit 106 is utilized to blend the input image IMG_I of the low dynamic range and the intermediate image IMG_H of the high dynamic range according to the confidence CONF, to generate an output image IMG_O capable of being displayed by a display device featuring a high dynamic range of luminance or brightness.
  • the metering unit 100 analyzes the input image IMG_I based on the information of each segment (i.e. segment based information SBI) rather than each pixel in the input image IMG_I.
  • the computing cost of converting the input image IMG_I with the low dynamic range to the output image IMG_O with the high dynamic range is therefore reduced, so as to improve the power efficiency and reduce the manufacturing cost of the image converting module 10 .
  • FIG. 2 is a schematic diagram of related signals of the image converting module 10 shown in FIG. 1 .
  • the metering unit 100 divides the input image IMG_I into a plurality of segments SEG_11-SEG_mn forming an m*n matrix, wherein each of the segments SEG_11-SEG_mn comprises multiple pixels of the input image IMG_I.
  • m and n are 24 and 12, respectively.
  • the metering unit 100 calculates segment averages AVG_11-AVG_mn of pixel values (e.g. gray levels) in the segments SEG_11-SEG_mn, respectively.
  • the detecting unit 102 acquires the maximum value among the segment averages AVG_11-AVG_mn as a threshold TH_HI.
  • the threshold TH_HI is the average pixel value of the brightest segment among the segments SEG_11-SEG_mn.
  • the detecting unit 102 calculates averages of multiple neighborhood/adjacent averages among the segment averages AVG_11-AVG_mn as local averages AVGL_11-AVGL_hk, and acquires the maximum value among the local averages AVGL_11-AVGL_hk as a threshold TH_LO. In an example, the detecting unit 102 calculates the average of segment averages AVG_11, AVG_12, AVG_21, and AVG_22 as the local average AVGL_11 (i.e. AVGL_11 = (AVG_11+AVG_12+AVG_21+AVG_22)/4), calculates the average of segment averages AVG_31, AVG_32, AVG_41, and AVG_42 as the local average AVGL_21, and so on. That is, the detecting unit 102 averages the pixel values of every 4 segments that form a 2*2 matrix among the segments SEG_11-SEG_mn as the local averages AVGL_11-AVGL_hk. Under such a condition, h becomes m/2 and k becomes n/2.
  • the number of neighborhood/adjacent segments for calculating each of the local averages AVGL_11-AVGL_hk may be altered according to different applications and design concepts. Because the local averages AVGL_11-AVGL_hk are acquired by averaging multiple neighborhood/adjacent segment averages among the segment averages AVG_11-AVG_mn, the threshold TH_LO (i.e. the maximum value among the local averages AVGL_11-AVGL_hk) is smaller than the threshold TH_HI (i.e. the maximum value among the segment averages AVG_11-AVG_mn), and the threshold TH_LO is closer to the average pixel value of the input image IMG_I than the threshold TH_HI is.
  • after acquiring the thresholds TH_HI and TH_LO, the detecting unit 102 transmits the control signal CS to the metering unit 100 to indicate the thresholds TH_HI and TH_LO.
  • the metering unit 100 counts the number of pixel values exceeding the threshold TH_HI in each of the segments SEG_11-SEG_mn to acquire pixel counts PC_HI_11-PC_HI_mn, and counts the number of pixel values exceeding the threshold TH_LO in each of the segments SEG_11-SEG_mn to acquire pixel counts PC_LO_11-PC_LO_mn.
  • the metering unit 100 uses a simple method to acquire the pixel counts PC_HI_11-PC_HI_mn and PC_LO_11-PC_LO_mn as references for determining the brightness information of the segments SEG_11-SEG_mn.
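The segment-based metering described above (segment averages, the thresholds TH_HI and TH_LO, and the per-segment pixel counts) can be sketched as follows. This is a minimal illustration, not the patent's implementation: the NumPy layout, the default 24*12 grid, and the cropping of image sizes that are not divisible by the grid are all assumptions.

```python
import numpy as np

def metering(img, m=24, n=12):
    """Sketch of the segment-based metering: img is a 2-D array of
    pixel values; m*n is the segment grid (m columns, n rows)."""
    h, w = img.shape
    # Split the image into an n-by-m grid of segments and average the
    # pixel values of each segment (AVG_11-AVG_mn).
    seg = img[:h - h % n, :w - w % m].reshape(n, h // n, m, w // m)
    seg_avg = seg.mean(axis=(1, 3))                 # shape (n, m)
    th_hi = seg_avg.max()                           # TH_HI
    # Local averages over every non-overlapping 2*2 block of segment
    # averages (AVGL_11-AVGL_hk); their maximum is TH_LO <= TH_HI.
    loc = seg_avg[:n - n % 2, :m - m % 2].reshape(n // 2, 2, m // 2, 2)
    th_lo = loc.mean(axis=(1, 3)).max()             # TH_LO
    # Per-segment counts of pixels exceeding each threshold.
    pc_hi = (seg > th_hi).sum(axis=(1, 3))          # PC_HI_11-PC_HI_mn
    pc_lo = (seg > th_lo).sum(axis=(1, 3))          # PC_LO_11-PC_LO_mn
    return seg_avg, th_hi, th_lo, pc_hi, pc_lo
```

Because TH_LO averages over neighboring segments, it can never exceed TH_HI, so each segment's PC_LO count is at least its PC_HI count.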
  • the detecting unit 102 receives the pixel counts PC_HI_11-PC_HI_mn and PC_LO_11-PC_LO_mn from the metering unit 100 and accordingly generates specular values SM_11-SM_mn of the specular map SM.
  • the specular values SM_11-SM_mn correspond to the segments SEG_11-SEG_mn, respectively, and positively correlate to the pixel counts PC_HI_11-PC_HI_mn and PC_LO_11-PC_LO_mn.
  • SM_ij_tmp is proportional to the pixel counts PC_HI_ij and PC_LO_ij
  • SPE_SLOP is utilized for adjusting the relation between SM_ij_tmp and the specular values SM_ij.
  • FIG. 3 is a schematic diagram of relationships between SM_ij_tmp and specular value SM_ij.
  • the specular value SM_ij increases with SM_ij_tmp at the slope SPE_SLOP.
  • once reaching the value SPE_MAX, the specular value SM_ij keeps at the value SPE_MAX.
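The FIG. 3 relationship reduces to a clamped linear function. A minimal sketch, treating SM_ij_tmp, SPE_SLOP, and SPE_MAX as plain numbers (their concrete values and scaling are not specified here and are assumptions):

```python
def specular_value(sm_ij_tmp, spe_slop, spe_max):
    # SM_ij grows linearly with SM_ij_tmp at slope SPE_SLOP and
    # saturates at SPE_MAX, matching the curve of FIG. 3.
    return min(sm_ij_tmp * spe_slop, spe_max)
```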
  • the detecting unit 102 generates the specular values SM_11-SM_mn corresponding to the segments SEG_11-SEG_mn.
  • the mapping unit 104 selects the mapping curve for each of the pixel values in the input image IMG_I according to the specular map SM.
  • the mapping unit 104 converts the segment-based specular map SM into a pixel-based specular map SM_PIX by interpolating the specular values SM_11-SM_mn according to the relationships among the segments SEG_11-SEG_mn.
  • the mapping unit 104 determines the mapping curve utilized for adjusting (e.g. mapping) each pixel value of the input image IMG_I and performs the adjusting process to generate an intermediate image IMG_H of the high dynamic range.
  • the mapping unit 104 may interpolate among 16 mapping curves according to each specular value in the specular map SM_PIX.
  • the mapping curve corresponding to a specular value greater than 959 is the 16th mapping curve, and the mapping curve corresponding to a specular value smaller than 959 is acquired by interpolating the 1st to 15th mapping curves according to its value.
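One way to realize the curve selection above is a bank of 16 look-up tables indexed by a 10-bit specular value. The 64-step spacing between curves below is an assumption chosen only to be consistent with the 959 breakpoint mentioned in the text; the actual curve shapes and spacing are design choices.

```python
import numpy as np

def map_pixel(p, sm, curves):
    """curves: list of 16 LUTs (each 256 entries). A specular value
    above 959 selects the 16th curve outright; lower values blend the
    two nearest curves among the 1st-15th."""
    if sm > 959:
        return float(curves[15][p])
    x = sm / 64.0                      # assumed spacing between curves
    lo = int(min(x, 14))               # lower neighboring curve index
    frac = x - lo                      # interpolation weight
    hi = min(lo + 1, 14)               # upper neighboring curve index
    return (1.0 - frac) * curves[lo][p] + frac * curves[hi][p]
```

In practice the higher-indexed curves would boost highlights more aggressively, so brighter (more specular) regions are expanded further into the HDR range.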
  • the method of interpolating mapping curves or the number of mapping curves may vary according to different applications and design concepts, and are not limited herein.
  • the blending unit 106 blends the input image IMG_I of the low dynamic range with the intermediate image IMG_H of the high dynamic range to generate the output image IMG_O of the high dynamic range, wherein the ratio between the input image IMG_I and the intermediate image IMG_H is determined by the confidence CONF.
  • the ratio of the intermediate image IMG_H to the input image IMG_I decreases when the bright area in the input image IMG_I increases. That is, the confidence CONF and the bright area in the input image IMG_I are negatively correlated.
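The blending step can be sketched as a confidence-weighted mix of the two images. Normalizing CONF by C_MAX into a 0-to-1 weight is an illustrative assumption; the patent only states that the ratio is determined by the confidence.

```python
import numpy as np

def blend(img_i, img_h, conf, c_max=255.0):
    # Higher confidence gives the HDR intermediate image IMG_H more
    # weight; conf = 0 returns the LDR input IMG_I unchanged.
    w = conf / c_max
    return w * np.asarray(img_h, float) + (1.0 - w) * np.asarray(img_i, float)
```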
  • the detecting unit 102 counts the number of relatively bright segments among the segments SEG_11-SEG_mn to determine the bright area in the input image IMG_I.
  • the detecting unit 102 may determine whether a segment is relatively bright based on the pixel counts PC_HI_11-PC_HI_mn and PC_LO_11-PC_LO_mn.
  • SPE_RA = (number of sums among PC_S_11-PC_S_mn greater than TH_CONF_PIX) / (number of segments SEG_11-SEG_mn) (4)
  • CONF = C_MAX when SPE_RA ≤ TH_RA; CONF = C_MAX - MIN((SPE_RA - TH_RA) × SPE_RA_Slop, C_MAX) when SPE_RA > TH_RA (5)
  • the detecting unit 102 calculates sums PC_S_11-PC_S_mn of the segments SEG_11-SEG_mn by respectively adding the pixel counts PC_HI_11-PC_HI_mn and PC_LO_11-PC_LO_mn, acquires a value SPE_RA by dividing the number of sums that are greater than a threshold TH_S (i.e. the number of bright segments) by the number of segments SEG_11-SEG_mn, and generates the confidence CONF according to the value SPE_RA.
  • the confidence CONF keeps at a maximum confidence C_MAX, to maximize the ratio of the intermediate image IMG_H to the input image IMG_I for generating the output image IMG_O, when the value SPE_RA is not greater than a threshold TH_RA. When the value SPE_RA is greater than the threshold TH_RA, the ratio of the intermediate image IMG_H to the input image IMG_I for generating the output image IMG_O decreases with the value SPE_RA, wherein the minimum value of the confidence CONF is 0.
  • the ratio of the intermediate image IMG_H to the input image IMG_I for generating the output image IMG_O is inversely proportional to the number of bright segments in the input image IMG_I.
  • a bright segment is defined as a segment whose number of pixel values greater than the threshold TH_HI or TH_LO is greater than the threshold TH_CONF_PIX.
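Equations (4) and (5) above can be sketched directly. Parameter names follow the text; the concrete values passed in would be tuning choices, and flattening the per-segment counts into simple lists is an assumption for brevity.

```python
def confidence(pc_hi, pc_lo, th_conf_pix, th_ra, spe_ra_slop, c_max):
    # Equation (4): fraction of bright segments, where a segment is
    # bright if its summed pixel count PC_S exceeds TH_CONF_PIX.
    pc_s = [hi + lo for hi, lo in zip(pc_hi, pc_lo)]
    spe_ra = sum(s > th_conf_pix for s in pc_s) / len(pc_s)
    # Equation (5): CONF stays at C_MAX up to TH_RA, then falls off
    # linearly with slope SPE_RA_Slop, floored at 0.
    if spe_ra <= th_ra:
        return c_max
    return c_max - min((spe_ra - th_ra) * spe_ra_slop, c_max)
```

So an image with few bright segments keeps full confidence (maximal HDR contribution), while a mostly bright image falls back toward the original LDR input.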
  • the method of determining the bright segments in the input image IMG_I can be altered according to different applications and design concepts.
  • the image converting module 10 divides the input image IMG_I into the segments SEG_11-SEG_mn and analyzes the input image IMG_I on the basis of the segments SEG_11-SEG_mn, to generate the specular map SM utilized for mapping the input image IMG_I of the low dynamic range to the intermediate image IMG_H of the high dynamic range, and the confidence CONF utilized for adjusting the ratio between the intermediate image IMG_H and the input image IMG_I for generating the output image IMG_O.
  • the computing cost of analyzing the input image IMG_I is therefore decreased, so as to improve the power efficiency and reduce the manufacturing cost of the image converting module 10 .
  • the detecting unit 102 filters the pixel counts PC_HI_11-PC_HI_mn and PC_LO_11-PC_LO_mn before generating the specular map SM.
  • the detecting unit 102 may use an infinite impulse response (IIR) filter on each of the pixel counts PC_HI_11-PC_HI_mn and PC_LO_11-PC_LO_mn, to adjust the time domain stability of the pixel counts PC_HI_11-PC_HI_mn and PC_LO_11-PC_LO_mn.
  • the detecting unit 102 may use a low pass filter on the pixel counts PC_HI_11-PC_HI_mn and PC_LO_11-PC_LO_mn corresponding to adjacent segments among the segments SEG_11-SEG_mn, to adjust the image stability among the pixel counts PC_HI_11-PC_HI_mn and PC_LO_11-PC_LO_mn.
  • FIG. 5 is a schematic diagram of an image converting module 50 according to an example of the present invention.
  • the image converting module 50 is similar to the image converting module 10 shown in FIG. 1 ; thus, the components and signals with similar functions use the same symbols.
  • the image converting module 50 may be utilized in a display device such as an LCD, a tablet, or a television, and is not limited herein.
  • a local dimming control unit 508 is added in the image converting module 50 , to generate a local dimming control signal LD_CON according to the segment based information SBI and/or the specular map SM, wherein the local dimming control signal LD_CON is utilized to control backlight intensities BL_11-BL_mn of backlights in the display device.
  • the detecting unit 102 may also acquire the local dimming control signal LD_CON and accordingly adjust the specular map SM. That is, the image converting module 50 not only adjusts the local dimming control signal LD_CON based on the segment based information SBI but also changes the specular map SM according to the local dimming control signal LD_CON.
  • the local dimming control unit 508 generates the local dimming control signal LD_CON according to the averages AVG_11-AVG_mn of the segments SEG_11-SEG_mn, the pixel counts PC_HI_11-PC_HI_mn and PC_LO_11-PC_LO_mn, the thresholds TH_HI and TH_LO, and/or the specular map SM.
  • the local dimming control unit 508 may adjust the backlight intensities BL_11-BL_mn of the backlights corresponding to the segments SEG_11-SEG_mn according to the averages AVG_11-AVG_mn of the segments SEG_11-SEG_mn.
  • the backlight intensities BL_11-BL_mn of the backlights corresponding to the segments SEG_11-SEG_mn are proportional to the averages AVG_11-AVG_mn, respectively.
  • the local dimming control unit 508 may use the pixel counts PC_HI_11-PC_HI_mn as the reference for generating the local dimming control signal LD_CON.
  • when the pixel counts PC_HI_11-PC_HI_mn are greater, the number of pixel values exceeding the threshold TH_HI in each segment is greater.
  • the local dimming control unit 508 makes the backlight intensity BL_ij of the backlight corresponding to the segment SEG_ij proportional to the corresponding pixel count PC_HI_ij, wherein i ≤ m and j ≤ n.
  • the local dimming control unit 508 may further refer to the threshold TH_HI when adopting the pixel counts PC_HI_11-PC_HI_mn as the reference for generating the local dimming control signal LD_CON.
  • the correlation between each of the pixel counts PC_HI_11-PC_HI_mn and the brightness of the corresponding segment among the segments SEG_11-SEG_mn is stronger when the threshold TH_HI is greater.
  • the local dimming control unit 508 increases the backlight intensities BL_11-BL_mn of the backlights corresponding to the segments SEG_11-SEG_mn for the same pixel counts PC_HI_11-PC_HI_mn when the threshold TH_HI becomes greater.
  • the local dimming control unit 508 generates the local dimming control signal LD_CON according to the averages AVG_11-AVG_mn, the pixel counts PC_HI_11-PC_HI_mn, and the threshold TH_HI.
  • WGT is proportional to the pixel count PC_HI_ij and the maximum of WGT is 1.
  • FIG. 6 is a schematic diagram of relationship between WGT and the pixel count PC_HI_ij shown in equation (6).
  • WGT increases with the pixel count PC_HI_ij at a slope WGT_SLOP before reaching the maximum of 1, and keeps at 1 when the pixel count PC_HI_ij is greater than a threshold TH_WGT.
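The FIG. 6 weighting is another clamped ramp. Taking the clip point to be TH_WGT = 1/WGT_SLOP is an assumption that keeps the sketch consistent with the description (the ramp reaching 1 exactly where it stops rising); equation (6) itself is not reproduced in this text.

```python
def backlight_weight(pc_hi_ij, wgt_slop):
    # WGT rises with PC_HI_ij at slope WGT_SLOP and clips at 1; the
    # clip point plays the role of the threshold TH_WGT in FIG. 6.
    return min(pc_hi_ij * wgt_slop, 1.0)
```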
  • the local dimming control unit 508 adjusts the backlight intensities BL_11-BL_mn corresponding to the segments SEG_11-SEG_mn according to the segment based information SBI.
  • the local dimming control unit 508 generates the local dimming control signal LD_CON according to the specular map SM.
  • the backlight intensities BL_11-BL_mn instructed by the local dimming control signal LD_CON are proportional to the specular values SM_11-SM_mn, respectively. As a result, the contrast ratio of the display device is improved.
  • the detecting unit 502 adjusts the specular map SM according to the local dimming control signal LD_CON. Because the backlight intensities BL_11-BL_mn are proportional to the brightness of the segments SEG_11-SEG_mn, the detecting unit 502 adjusts the specular values SM_11-SM_mn according to the backlight intensities BL_11-BL_mn, to make the specular values SM_11-SM_mn positively correlated with the backlight intensities BL_11-BL_mn. As a result, the contrast ratio of the display device is further improved.
  • the detecting unit 102 directly generates the specular map SM_PIX without referring to the segment based information SBI.
  • the image converting module 10 uses an edge preserved filter (e.g. a bilateral filter) to blur the input image IMG_I while preserving edge details in the input image IMG_I and accordingly acquires a filtered image IMG_F.
  • the detecting unit 102 generates the specular map SM according to pixel values of the filtered image IMG_F.
  • FIG. 7 is a schematic diagram of the relationship between a pixel value P in the filtered image IMG_F and the corresponding specular value SM_P of the specular map SM_PIX.
  • as shown in FIG. 7 , the specular value SM_P is 0 when the pixel value P is smaller than a threshold TH0, increases with the pixel value P at a constant slope SW_SLOP when the pixel value P is between the thresholds TH0 and TH1, and keeps at the maximum SM_MAX when the pixel value P is greater than the threshold TH1.
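The FIG. 7 mapping from a filtered pixel value P to SM_P is a three-piece function. Expressing the slope SW_SLOP as SM_MAX/(TH1 - TH0) is an assumption that makes the ramp meet SM_MAX exactly at TH1; the actual threshold values would be design parameters.

```python
def pixel_specular(p, th0, th1, sm_max):
    # Zero below TH0, linear ramp between TH0 and TH1, saturated at
    # SM_MAX above TH1, as in FIG. 7.
    if p < th0:
        return 0.0
    if p > th1:
        return float(sm_max)
    return (p - th0) * sm_max / (th1 - th0)
```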
  • the detecting unit 102 generates the specular map SM_PIX, and the mapping unit 104 accordingly selects the mapping curve for mapping each pixel value in the input image IMG_I to generate the intermediate image IMG_H of the high dynamic range.
  • the confidence CONF is determined by a difference between the thresholds TH_HI and TH_LO, wherein the confidence CONF is proportional to the difference between the thresholds TH_HI and TH_LO.
  • the blending unit 106 therefore can decide the ratio of blending the intermediate image IMG_H and the input image IMG_I and accordingly generate the output image IMG_O.
  • the processes of the image converting modules 10 and 50 converting the input image IMG_I with the low dynamic range to the output image IMG_O with the high dynamic range can be summarized into a process 80 shown in FIG. 8 .
  • the process 80 can be utilized in a display device, such as a TV, an LCD, or an electronic product with a display panel, to convert an input image with the low dynamic range of luminance or brightness to an output image with the high dynamic range of luminance or brightness.
  • the process 80 comprises the following steps:
  • Step 800 Start.
  • Step 802 Calculate a plurality of segment averages of pixel values in a plurality of segments of the input image.
  • Step 804 Acquire the maximum among the plurality of segment averages as a first threshold.
  • Step 806 Calculate a plurality of local averages of pixel values in adjacent segments of the input image.
  • Step 808 Acquire the maximum among the plurality of local averages as a second threshold.
  • Step 810 Count the number of pixel values exceeding the first threshold in the plurality of segments as a plurality of first pixel counts and count the number of pixel values exceeding the second threshold in the plurality of segments as a plurality of second pixel counts.
  • Step 812 Generate a specular map and a confidence according to the plurality of first pixel counts and the plurality of second pixel counts.
  • Step 814 Map the input image according to the specular map, to generate an intermediate image.
  • Step 816 Blend the input image and the intermediate image according to the confidence, to generate the output image.
  • Step 818 End.
  • the input image is divided into a plurality of segments, wherein each of the plurality of segments comprises multiple pixels of the input image.
  • the average pixel values in the plurality of segments are acquired as a plurality of segment averages, and the average pixel values in adjacent segments among the plurality of segments are acquired as a plurality of local averages.
  • each of the plurality of local averages corresponds to adjacent segments forming a rectangular matrix (e.g. a 2*2 matrix).
  • the maximum among the plurality of segment averages is defined as a first threshold and the maximum among the plurality of local averages is defined as a second threshold.
  • the number of pixel values exceeding the first threshold in each of the plurality of segments is counted, to generate a plurality of first pixel counts.
  • the number of pixel values exceeding the second threshold in each of the plurality of segments is counted, to generate a plurality of second pixel counts.
  • each specular value in the specular map corresponding to the plurality of segments is proportional to the first pixel count and the second pixel count corresponding to the same segment.
  • the confidence is generated by calculating the ratio between the number of the segments with a sum of the first pixel count and the second pixel count exceeding a confidence threshold among the plurality of segments and the total number of the plurality of segments.
  • the input image is mapped according to the specular map, to generate an intermediate image of the high dynamic range of luminance or brightness.
  • the specular values in the specular map are interpolated to generate the interpolated specular values corresponding to pixels of the input image, and the mapping curves of mapping pixel values of the input image are selected according to the interpolated specular values.
  • the output image is generated by blending the input image and the intermediate image.
  • the ratio between the input image and the intermediate image for generating the output image is determined by the confidence generated according to the plurality of first pixel counts and the plurality of second pixel counts. Because the information of converting the input image to the output image is acquired by analyzing the plurality of segments rather than pixel values in the input image, the computing cost of converting the input image to the output image is reduced. The power efficiency is improved and manufacturing cost is reduced, therefore. Note that, the information of the plurality of segments (e.g.
  • the segment averages, the local averages, the first threshold, the second threshold, the first pixel counts, or the second pixel counts) or the specular map also can be utilized to adjust backlight intensities of backlights configured in the display device.
  • the detailed operations of the process 80 can be referred to the above descriptions and are not narrated herein for brevity.
  • the above mentioned steps of the processes including suggested steps can be realized by means that could be hardware, firmware known as a combination of a hardware device and computer instructions and data that reside as read-only software on the hardware device, or an electronic system.
  • hardware can include analog, digital and mixed circuits such as microcircuits, microchips, or silicon chips.
  • the electronic system can include system on chip (SOC), system in package (Sip), computer on module (COM), and the image converting modules 10 and 50 .
  • the image converting modules of the above example converts the input image with the low dynamic range to the output image with the high dynamic range by analyzing the information of the segments rather than pixels of the input image.
  • the computing cost is therefore decreased.
  • the analyzed information of segments can be utilized in controlling the backlight intensities of the display device.
  • the contrast ratio of the display device is further improved.


Abstract

An image converting method includes calculating a plurality of segment averages of pixel values in a plurality of segments of an input image; acquiring the maximum among the segment averages as a first threshold; calculating a plurality of local averages of pixel values in adjacent segments of the input image; acquiring the maximum among the local averages as a second threshold; counting the number of pixel values exceeding the first threshold in the segments as a plurality of first pixel counts and counting the number of pixel values exceeding the second threshold in the segments as a plurality of second pixel counts; generating a specular map and a confidence according to the first pixel counts and the second pixel counts; mapping the input image according to the specular map, to generate an intermediate image; and blending the input image and the intermediate image according to the confidence, to generate an output image.

Description

BACKGROUND OF THE INVENTION
1. Field of the Invention
The present invention relates to an image converting method and related image converting module, and more particularly, to an image converting method capable of converting a low dynamic range image to a high dynamic range image and related image converting module.
2. Description of the Prior Art
The dynamic range of a scene is defined to be a ratio of the highest scene luminance to the lowest scene luminance. Generally speaking, a conventional display device such as a liquid crystal display (LCD) or a television has a dynamic range of about 250:1, and covers about half of the visible color gamut. However, the human vision system (HVS) has a dynamic range greater than 10,000:1, and is capable of distinguishing about 10,000 colors at a given brightness.
Typically, the image displayed on the conventional display device corresponds to intensities spanning 256 gray levels. That is, each color channel (red, green, and blue) is determined by 8 bits. Therefore, the minimum gray level is equal to 0, and the maximum gray level is equal to 255. From the above description, it is obvious that the dynamic range in the real-world environment far exceeds the representable dynamic range shown on the conventional display device.
In recent years, a high-end display device may feature high dynamic range (HDR), to expand its contrast ratio and to display more realistic, natural images. Although the high-end display device is able to display HDR images, most image content is still stored in the conventional format (e.g. as low dynamic range (LDR) images). Thus, how to convert the LDR images to HDR images capable of being displayed by an HDR display device becomes a topic to be discussed.
SUMMARY OF THE INVENTION
In order to solve the above problems, the present invention discloses an image converting method capable of converting a low dynamic range image to a high dynamic range image and related image converting module.
In an aspect, the present invention discloses an image converting method for a display device. The image converting method comprises calculating a plurality of segment averages of pixel values in a plurality of segments of an input image of a low dynamic range of luminance; acquiring the maximum among the plurality of segment averages as a first threshold; calculating a plurality of local averages of pixel values in adjacent segments of the input image; acquiring the maximum among the plurality of local averages as a second threshold; counting the number of pixel values exceeding the first threshold in the plurality of segments as a plurality of first pixel counts and counting the number of pixel values exceeding the second threshold in the plurality of segments as a plurality of second pixel counts; generating a specular map and a confidence according to the plurality of first pixel counts and the plurality of second pixel counts; mapping the input image according to the specular map, to generate an intermediate image of a high dynamic range of luminance; and blending the input image and the intermediate image according to the confidence, to generate an output image. The ratio between the input image and the intermediate image is determined by the confidence.
In another aspect, the present invention discloses an image converting module for a display device. The image converting module comprises a metering unit, a detecting unit, a mapping unit, and a blending unit. The metering unit is utilized to calculate a plurality of segment averages of pixel values in a plurality of segments of the input image, calculate a plurality of local averages of pixel values in adjacent segments of the input image, count the number of pixel values exceeding a first threshold in the plurality of segments as a plurality of first pixel counts, and count the number of pixel values exceeding a second threshold in the plurality of segments as a plurality of second pixel counts. The detecting unit is utilized to acquire the maximum among the plurality of segment averages as the first threshold, acquire the maximum among the plurality of local averages as the second threshold, generate a control signal to the metering unit to indicate the first threshold and the second threshold, and generate a specular map and a confidence according to the plurality of first pixel counts and the plurality of second pixel counts. The mapping unit is utilized to map the input image according to the specular map, to generate an intermediate image. The blending unit is utilized to blend the input image and the intermediate image according to the confidence, to generate an output image, wherein the ratio between the input image and the intermediate image is determined by the confidence.
Because the information of the input image of the low dynamic range of luminance is acquired by analyzing segments rather than pixels of the input image, the computing cost of the image converting module is reduced. Power efficiency of the image converting module is therefore improved and its manufacturing cost decreased.
These and other objectives of the present invention will no doubt become obvious to those of ordinary skill in the art after reading the following detailed description of the preferred embodiment that is illustrated in the various figures and drawings.
BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 is a schematic diagram of an image converting module according to an example of the present invention.
FIG. 2 is a schematic diagram of related signals of the image converting module shown in FIG. 1.
FIG. 3 is a schematic diagram of relationships between related signals of the image converting module shown in FIG. 1.
FIG. 4 is a schematic diagram of relationships between related signals of the image converting module shown in FIG. 1.
FIG. 5 is a schematic diagram of an image converting module according to an example of the present invention.
FIG. 6 is a schematic diagram of relationships between related signals of the image converting module shown in FIG. 5.
FIG. 7 is a schematic diagram of relationships between related signals of the image converting module shown in FIG. 1.
FIG. 8 is a flow chart of a process according to an example of the present invention.
DETAILED DESCRIPTION
Please refer to FIG. 1, which is a schematic diagram of an image converting module 10 according to an example of the present invention. The image converting module 10 may be utilized in a display device such as a liquid crystal display (LCD), a tablet, or a television, and is not limited herein. As shown in FIG. 1, the image converting module 10 comprises a metering unit 100, a detecting unit 102, a mapping unit 104, and a blending unit 106. The metering unit 100 is utilized to determine segment based information SBI of a plurality of segments in an input image IMG_I, which has a low dynamic range (LDR) of luminance or brightness, according to a control signal CS. Based on the segment based information SBI, the detecting unit 102 generates the control signal CS to the metering unit 100, generates a specular map SM to the mapping unit 104, and generates a confidence CONF to the blending unit 106. The mapping unit 104 is utilized to select mapping curves according to the specular map SM and accordingly map pixel data of the input image IMG_I of the low dynamic range to an intermediate image IMG_H of high dynamic range (HDR) of luminance or brightness. The blending unit 106 is utilized to blend the input image IMG_I of the low dynamic range and the intermediate image IMG_H of the high dynamic range according to the confidence CONF, to generate an output image IMG_O capable of being displayed by the display device featuring high dynamic range of luminance or brightness. In this example, the metering unit 100 analyzes the input image IMG_I based on the information of each segment (i.e. the segment based information SBI) rather than each pixel in the input image IMG_I. The computing cost of converting the input image IMG_I with low dynamic range to the output image IMG_O with high dynamic range is therefore reduced, so as to improve power efficiency and reduce manufacturing cost of the image converting module 10.
Please refer to FIG. 2, which is a schematic diagram of related signals of the image converting module 10 shown in FIG. 1. As shown in FIG. 2, the metering unit 100 divides the input image IMG_I into a plurality of segments SEG_11-SEG_mn forming an m*n matrix, wherein each of the segments SEG_11-SEG_mn comprises multiple pixels of the input image IMG_I. In an example, m and n are 24 and 12, respectively. Next, the metering unit 100 calculates segment averages AVG_11-AVG_mn of pixel values (e.g. luminance or brightness) in the segments SEG_11-SEG_mn, respectively, and indicates the segment averages AVG_11-AVG_mn to the detecting unit 102 via the segment based information SBI. The detecting unit 102 acquires the maximum value among the segment averages AVG_11-AVG_mn as a threshold TH_HI. In other words, the threshold TH_HI is the average pixel value of the brightest segment among the segments SEG_11-SEG_mn. In addition, the detecting unit 102 calculates averages of multiple neighborhood/adjacent averages among the segment averages AVG_11-AVG_mn as local averages AVGL_11-AVGL_hk, and acquires the maximum value among the local averages AVGL_11-AVGL_hk as a threshold TH_LO. In an example, the detecting unit 102 calculates the average of the segment averages AVG_11, AVG_12, AVG_21, and AVG_22 as the local average AVGL_11 (i.e. AVGL_11=(AVG_11+AVG_12+AVG_21+AVG_22)/4), calculates the average of the segment averages AVG_31, AVG_32, AVG_41, and AVG_42 as the local average AVGL_21, and so on. That is, the detecting unit 102 averages the segment averages of every 4 segments that form a 2*2 matrix among the segments SEG_11-SEG_mn as the local averages AVGL_11-AVGL_hk. Under such a condition, h becomes m/2 and k becomes n/2. Note that, the number of neighborhood/adjacent segments for calculating each of the local averages AVGL_11-AVGL_hk may be altered according to different applications and design concepts.
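The segment and local averaging described above can be sketched as follows (an illustrative Python sketch, not the patent's implementation; the helper names, the 2*2 segment size, and the sample 4*4 image are assumptions):

```python
def segment_averages(image, seg_h, seg_w):
    """Average pixel value (e.g. luminance) of each seg_h*seg_w segment."""
    m, n = len(image) // seg_h, len(image[0]) // seg_w
    avgs = [[0.0] * n for _ in range(m)]
    for i in range(m):
        for j in range(n):
            pixels = [image[i * seg_h + r][j * seg_w + c]
                      for r in range(seg_h) for c in range(seg_w)]
            avgs[i][j] = sum(pixels) / len(pixels)
    return avgs

def local_averages(avgs):
    """Average every non-overlapping 2*2 block of segment averages."""
    m, n = len(avgs), len(avgs[0])
    return [[(avgs[i][j] + avgs[i][j + 1] +
              avgs[i + 1][j] + avgs[i + 1][j + 1]) / 4
             for j in range(0, n - 1, 2)]
            for i in range(0, m - 1, 2)]

image = [[10, 20, 30, 40],
         [20, 30, 40, 50],
         [ 5, 10, 90, 80],
         [10, 15, 80, 70]]
avgs = segment_averages(image, 2, 2)          # [[20.0, 40.0], [10.0, 80.0]]
th_hi = max(v for row in avgs for v in row)   # TH_HI: brightest segment average
th_lo = max(v for row in local_averages(avgs) for v in row)  # TH_LO
```

Consistent with the text, TH_LO (37.5 here) comes out smaller than TH_HI (80.0), since each local average blends a bright segment with its neighbors.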
Because the local averages AVGL_11-AVGL_hk are acquired by averaging multiple neighborhood/adjacent segment averages among the segment averages AVG_11-AVG_mn, the threshold TH_LO (i.e. the maximum value among the local averages AVGL_11-AVGL_hk) is smaller than the threshold TH_HI (i.e. the maximum value among the segment averages AVG_11-AVG_mn), and the threshold TH_LO is closer to the average pixel value of the input image IMG_I than the threshold TH_HI is.
After acquiring the thresholds TH_HI and TH_LO, the detecting unit 102 transmits the control signal CS to the metering unit 100 to indicate the thresholds TH_HI and TH_LO. The metering unit 100 counts the number of pixel values exceeding the threshold TH_HI in each of the segments SEG_11-SEG_mn to acquire pixel counts PC_HI_11-PC_HI_mn and counts the number of pixel values exceeding the threshold TH_LO in each of the segments SEG_11-SEG_mn to acquire pixel counts PC_LO_11-PC_LO_mn. By appropriately setting the thresholds TH_HI and TH_LO, the metering unit 100 uses a simple method to acquire the pixel counts PC_HI_11-PC_HI_mn and PC_LO_11-PC_LO_mn as references for determining the brightness information of the segments SEG_11-SEG_mn. The detecting unit 102 receives the pixel counts PC_HI_11-PC_HI_mn and PC_LO_11-PC_LO_mn from the metering unit 100 and accordingly generates specular values SM_11-SM_mn of the specular map SM. The specular values SM_11-SM_mn correspond to the segments SEG_11-SEG_mn, respectively, and positively correlate with the pixel counts PC_HI_11-PC_HI_mn and PC_LO_11-PC_LO_mn. In an example, the equations of generating the specular values SM_11-SM_mn can be expressed as:
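The per-segment counting against the two thresholds can be sketched as follows (hypothetical helper name and example thresholds; in the module the thresholds come from the detecting unit via the control signal CS):

```python
def pixel_counts(image, seg_h, seg_w, threshold):
    """Count, per segment, the pixels whose value exceeds `threshold`."""
    m, n = len(image) // seg_h, len(image[0]) // seg_w
    counts = [[0] * n for _ in range(m)]
    for i in range(m):
        for j in range(n):
            counts[i][j] = sum(
                1 for r in range(seg_h) for c in range(seg_w)
                if image[i * seg_h + r][j * seg_w + c] > threshold)
    return counts

image = [[10, 20, 90, 95],
         [20, 30, 85, 99]]
pc_hi = pixel_counts(image, 2, 2, 90)   # e.g. TH_HI = 90 -> [[0, 2]]
pc_lo = pixel_counts(image, 2, 2, 60)   # e.g. TH_LO = 60 -> [[0, 4]]
```

Because TH_LO is the smaller threshold, PC_LO of a segment is always at least as large as its PC_HI.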
SM_ij_tmp = (PC_HI_ij + PC_LO_ij) × (PC_HI_ij + 1), i ≦ m, j ≦ n  (1)
SM_ij = MIN(SM_ij_tmp × SPE_SLOP, SM_MAX), i ≦ m, j ≦ n  (2)
In the equations (1) and (2), SM_ij_tmp is proportional to the pixel counts PC_HI_ij and PC_LO_ij, and SPE_SLOP is utilized for adjusting the relation between SM_ij_tmp and the specular value SM_ij. Please refer to FIG. 3, which is a schematic diagram of the relationship between SM_ij_tmp and the specular value SM_ij. As shown in FIG. 3, before reaching the value SM_MAX, the specular value SM_ij increases with SM_ij_tmp at the slope SPE_SLOP. After reaching the value SM_MAX, the specular value SM_ij keeps at the value SM_MAX. According to the equations (1) and (2), the detecting unit 102 generates the specular values SM_11-SM_mn corresponding to the segments SEG_11-SEG_mn.
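Equations (1) and (2) translate directly into code (a sketch; the SPE_SLOP and SM_MAX settings below are example values, not values fixed by the text):

```python
def specular_value(pc_hi, pc_lo, spe_slop=2, sm_max=1023):
    """Equations (1)-(2): SM grows with both pixel counts, clamped at SM_MAX."""
    sm_tmp = (pc_hi + pc_lo) * (pc_hi + 1)    # equation (1)
    return min(sm_tmp * spe_slop, sm_max)     # equation (2)

assert specular_value(0, 0) == 0        # dark segment
assert specular_value(10, 30) == 880    # (10 + 30) * (10 + 1) * 2
assert specular_value(50, 80) == 1023   # clamped at SM_MAX, as in FIG. 3
```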
Next, the mapping unit 104 selects the mapping curve of each of the pixel values in the input image IMG_I according to the specular map SM. In an example, the mapping unit 104 converts the segment-based specular map SM into a pixel-based specular map SM_PIX by interpolating the specular values SM_11-SM_mn according to the relationships among the segments SEG_11-SEG_mn. According to the pixel-based specular map SM_PIX, the mapping unit 104 determines the mapping curve utilized for adjusting (e.g. mapping) each pixel value of the input image IMG_I and performs the adjusting process to generate the intermediate image IMG_H of the high dynamic range. For example, the mapping unit 104 may select among 16 mapping curves according to each specular value in the specular map SM_PIX. In an example, the range of specular values in the specular map SM_PIX is 0-1023 (i.e. SM_MAX=1023) and the 16 mapping curves correspond to 0, 63, 127, . . . , 959, respectively. A specular value greater than 959 uses the 16th mapping curve, while a specular value smaller than 959 uses a curve acquired by interpolating the 1st to 16th mapping curves according to its value. Note that, the method of interpolating mapping curves or the number of mapping curves may vary according to different applications and design concepts, and is not limited herein.
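The curve selection can be sketched as below. The uniform 64-step anchor spacing and the gain-style curves are simplifying assumptions (the text anchors the 16 curves at 0, 63, 127, . . . , 959 and leaves the curve shapes unspecified); only the bracketing-and-interpolating structure follows the text:

```python
CURVE_STEP = 64    # assumed uniform spacing between curve anchors
NUM_CURVES = 16

def curve(index, pixel):
    """Hypothetical mapping curve: higher-index curves boost highlights more."""
    gain = 1.0 + index * 0.25
    return min(pixel * gain, 255.0)

def map_pixel(pixel, specular):
    """Blend the two mapping curves that bracket `specular` (0..1023)."""
    if specular >= (NUM_CURVES - 1) * CURVE_STEP:   # above the last anchor
        return curve(NUM_CURVES - 1, pixel)
    lo = int(specular // CURVE_STEP)
    frac = (specular % CURVE_STEP) / CURVE_STEP
    return (1 - frac) * curve(lo, pixel) + frac * curve(lo + 1, pixel)

assert map_pixel(100, 0) == 100.0     # specular 0: identity-like 1st curve
assert map_pixel(100, 960) == 255.0   # beyond 959: the 16th curve, clipped
assert map_pixel(100, 32) == 112.5    # halfway between the 1st and 2nd curves
```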
After the intermediate image IMG_H of the high dynamic range is generated, the blending unit 106 blends the input image IMG_I of the low dynamic range with the intermediate image IMG_H of the high dynamic range to generate the output image IMG_O of the high dynamic range, wherein the ratio between the input image IMG_I and the intermediate image IMG_H is determined by the confidence CONF. When the bright area in the input image IMG_I is greater, the input image IMG_I needs less highlighting. Thus, the ratio of the intermediate image IMG_H to the input image IMG_I decreases when the bright area in the input image IMG_I increases. That is, the confidence CONF and the bright area in the input image IMG_I are negatively correlated. In an example, the detecting unit 102 counts the number of relatively bright segments among the segments SEG_11-SEG_mn to determine the bright area in the input image IMG_I. The detecting unit 102 may determine whether a segment is relatively bright based on the pixel counts PC_HI_11-PC_HI_mn and PC_LO_11-PC_LO_mn. In an example, the equations of generating the confidence CONF can be expressed as:
PC_S_ij = PC_HI_ij + PC_LO_ij, i ≦ m, j ≦ n  (3)

SPE_RA = (number of PC_S_11-PC_S_mn exceeding TH_CONF_PIX) / (number of SEG_11-SEG_mn)  (4)

CONF = C_MAX, for SPE_RA ≦ TH_RA
CONF = C_MAX − MIN((SPE_RA − TH_RA) × SPE_RA_SLOP, C_MAX), for SPE_RA > TH_RA  (5)
According to the equations (3)-(5), the detecting unit 102 calculates sums PC_S_11-PC_S_mn of the segments SEG_11-SEG_mn by respectively adding the pixel counts PC_HI_11-PC_HI_mn and PC_LO_11-PC_LO_mn, acquires a value SPE_RA by dividing the number of sums that are greater than a threshold TH_CONF_PIX (i.e. the number of bright segments) by the number of segments SEG_11-SEG_mn, and generates the confidence CONF according to the value SPE_RA. Please refer to FIG. 4, which is a schematic diagram of the relationship between the value SPE_RA and the confidence CONF. According to FIG. 4 and the equation (5), the confidence CONF keeps at a maximum confidence C_MAX to maximize the ratio of the intermediate image IMG_H to the input image IMG_I for generating the output image IMG_O when the value SPE_RA is not greater than a threshold TH_RA; and the ratio of the intermediate image IMG_H to the input image IMG_I for generating the output image IMG_O decreases with the value SPE_RA when the value SPE_RA is greater than the threshold TH_RA, wherein the minimum value of the confidence CONF is 0. In other words, the ratio of the intermediate image IMG_H to the input image IMG_I for generating the output image IMG_O is inversely proportional to the number of bright segments in the input image IMG_I. In this example, a bright segment is defined as a segment whose sum of the pixel counts of pixel values exceeding the thresholds TH_HI and TH_LO is greater than the threshold TH_CONF_PIX. The method of determining the bright segments in the input image IMG_I can be altered according to different applications and design concepts.
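Equations (3)-(5) can be sketched as follows (the threshold and slope values are illustrative, and the per-segment counts are flattened into lists for brevity):

```python
def confidence(pc_hi, pc_lo, th_conf_pix, th_ra, spe_ra_slop, c_max):
    """Equations (3)-(5): CONF falls as the share of bright segments grows."""
    sums = [hi + lo for hi, lo in zip(pc_hi, pc_lo)]              # equation (3)
    spe_ra = sum(1 for s in sums if s > th_conf_pix) / len(sums)  # equation (4)
    if spe_ra <= th_ra:                                           # equation (5)
        return c_max
    return c_max - min((spe_ra - th_ra) * spe_ra_slop, c_max)

# four segments, two of them bright (hypothetical counts)
pc_hi = [0, 0, 12, 30]
pc_lo = [2, 1, 20, 45]
conf = confidence(pc_hi, pc_lo, th_conf_pix=16, th_ra=0.25,
                  spe_ra_slop=1.0, c_max=1.0)   # SPE_RA = 0.5 -> CONF = 0.75
```

With half of the segments bright and TH_RA at 0.25, the confidence drops below C_MAX, so less of the intermediate image is blended into the output.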
In the above example, the image converting module 10 divides the input image IMG_I into the segments SEG_11-SEG_mn and analyzes the input image IMG_I on the basis of the segments SEG_11-SEG_mn, to generate the specular map SM utilized for mapping the input image IMG_I of the low dynamic range to the intermediate image IMG_H of the high dynamic range, and the confidence CONF utilized for adjusting the ratio between the intermediate image IMG_H and the input image IMG_I for generating the output image IMG_O. Because the information of the input image IMG_I is acquired by analyzing the segments SEG_11-SEG_mn rather than the pixels in the input image IMG_I, the computing cost of analyzing the input image IMG_I is decreased, so as to improve the power efficiency and reduce the manufacturing cost of the image converting module 10.
According to different applications and design concepts, those with ordinary skill in the art may make appropriate alterations and modifications. In an example, the detecting unit 102 filters the pixel counts PC_HI_11-PC_HI_mn and PC_LO_11-PC_LO_mn before generating the specular map SM. For example, the detecting unit 102 may apply an infinite impulse response (IIR) filter to each of the pixel counts PC_HI_11-PC_HI_mn and PC_LO_11-PC_LO_mn, to adjust the time domain stability of the pixel counts PC_HI_11-PC_HI_mn and PC_LO_11-PC_LO_mn. In addition, the detecting unit 102 may apply a low pass filter to the pixel counts PC_HI_11-PC_HI_mn and PC_LO_11-PC_LO_mn corresponding to adjacent segments among the segments SEG_11-SEG_mn, to adjust the spatial stability among the pixel counts PC_HI_11-PC_HI_mn and PC_LO_11-PC_LO_mn.
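As a sketch of the temporal smoothing, a one-pole IIR filter (an assumed filter form; the text does not fix the filter order or coefficient) could be applied to each pixel count across frames:

```python
def iir_filter(prev, current, alpha=0.25):
    """One-pole IIR update: out = prev + alpha * (current - prev).
    A smaller alpha gives more time domain stability."""
    return prev + alpha * (current - prev)

# a pixel count jumping from 0 to 40 settles gradually instead of stepping
smoothed, history = 0.0, []
for frame_count in (40, 40, 40):
    smoothed = iir_filter(smoothed, frame_count)
    history.append(smoothed)
# history -> [10.0, 17.5, 23.125]
```

The gradual settling keeps the specular map, and hence the mapping curves, from flickering frame to frame.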
Please refer to FIG. 5, which is a schematic diagram of an image converting module 50 according to an example of the present invention. The image converting module 50 is similar to the image converting module 10 shown in FIG. 1; thus, the components and signals with similar functions use the same symbols. The image converting module 50 may be utilized in a display device such as an LCD, a tablet, or a television, and is not limited herein. In FIG. 5, a local dimming control unit 508 is added in the image converting module 50, to generate a local dimming control signal LD_CON according to the segment based information SBI and/or the specular map SM, wherein the local dimming control signal LD_CON is utilized to control backlight intensities BL_11-BL_mn of backlights in the display device. In addition, the detecting unit 502 may also acquire the local dimming control signal LD_CON and accordingly adjust the specular map SM. That is, the image converting module 50 not only adjusts the local dimming control signal LD_CON based on the segment based information SBI but also changes the specular map SM according to the local dimming control signal LD_CON.
In an example, the local dimming control unit 508 generates the local dimming control signal LD_CON according to the averages AVG_11-AVG_mn of the segments SEG_11-SEG_mn, the pixel counts PC_HI_11-PC_HI_mn, PC_LO_11-PC_LO_mn, the thresholds TH_HI, TH_LO, and/or the specular map SM. For example, the local dimming control unit 508 may adjust the backlight intensities BL_11-BL_mn of the backlights corresponding to segments SEG_11-SEG_mn according to the averages AVG_11-AVG_mn of the segments SEG_11-SEG_mn. Because the averages AVG_11-AVG_mn represent the average brightness of the segments SEG_11-SEG_mn, the backlight intensities BL_11-BL_mn of the backlights corresponding to the segments SEG_11-SEG_mn are proportional to the averages AVG_11-AVG_mn, respectively.
In addition, the local dimming control unit 508 may use the pixel counts PC_HI_11-PC_HI_mn as the reference for generating the local dimming control signal LD_CON. When the pixel counts PC_HI_11-PC_HI_mn are greater, the number of pixel values exceeding the threshold TH_HI in each segment is greater. Thus, the local dimming control unit 508 makes the backlight intensity BL_ij of the backlight corresponding to the segment SEG_ij proportional to the corresponding pixel count PC_HI_ij, wherein i≦m and j≦n. The local dimming control unit 508 may further refer to the threshold TH_HI when adopting the pixel counts PC_HI_11-PC_HI_mn as the reference for generating the local dimming control signal LD_CON. The correlation between each of the pixel counts PC_HI_11-PC_HI_mn and the brightness of the corresponding segment among the segments SEG_11-SEG_mn is stronger when the threshold TH_HI is greater. Thus, the local dimming control unit 508 increases the backlight intensities BL_11-BL_mn of the backlights corresponding to the segments SEG_11-SEG_mn for the same pixel counts PC_HI_11-PC_HI_mn when the threshold TH_HI becomes greater.
In an example, the local dimming control unit 508 generates the local dimming control signal LD_CON according to the averages AVG_11-AVG_mn, the pixel counts PC_HI_11-PC_HI_mn, and the threshold TH_HI. The equation of the backlight intensities BL_11-BL_mn instructed by the local dimming control signal LD_CON can be expressed as:
BL_ij = AVG_ij × (1 − WGT) + TH_HI × WGT, i ≦ m and j ≦ n  (6)
wherein WGT is proportional to the pixel count PC_HI_ij and the maximum of WGT is 1. Please refer to FIG. 6, which is a schematic diagram of the relationship between WGT and the pixel count PC_HI_ij shown in the equation (6). As shown in FIG. 6, WGT increases with the pixel count PC_HI_ij at a slope WGT_SLOP before reaching the maximum 1, and keeps at 1 when the pixel count PC_HI_ij is greater than a threshold TH_WGT. According to the equation (6) and FIG. 6, the local dimming control unit 508 adjusts the backlight intensities BL_11-BL_mn corresponding to the segments SEG_11-SEG_mn according to the segment based information SBI.
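Equation (6) together with the WGT curve of FIG. 6 can be sketched as follows (TH_WGT and the slope are assumed values, chosen so the ramp saturates exactly at the threshold):

```python
def weight(pc_hi, th_wgt=64, wgt_slop=1 / 64):
    """FIG. 6: WGT rises with PC_HI at slope WGT_SLOP and saturates at 1."""
    if pc_hi >= th_wgt:
        return 1.0
    return min(pc_hi * wgt_slop, 1.0)

def backlight(avg, th_hi, pc_hi):
    """Equation (6): BL_ij = AVG_ij * (1 - WGT) + TH_HI * WGT."""
    wgt = weight(pc_hi)
    return avg * (1 - wgt) + th_hi * wgt

assert backlight(avg=80, th_hi=200, pc_hi=0) == 80.0    # no bright pixels
assert backlight(avg=80, th_hi=200, pc_hi=64) == 200.0  # fully driven to TH_HI
assert backlight(avg=80, th_hi=200, pc_hi=32) == 140.0  # halfway blend
```

A segment with no bright pixels keeps its average as the backlight level, while a segment full of bright pixels is driven toward TH_HI.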
In an example, the local dimming control unit 508 generates the local dimming control signal LD_CON according to the specular map SM. In this example, the backlight intensities BL_11-BL_mn instructed by the local dimming control signal LD_CON are proportional to the specular values SM_11-SM_mn, respectively. As a result, the contrast ratio of the display device is improved.
In an example, the detecting unit 502 adjusts the specular map SM according to the local dimming control signal LD_CON. Because the backlight intensities BL_11-BL_mn are proportional to the brightness of the segments SEG_11-SEG_mn, the detecting unit 502 adjusts the specular values SM_11-SM_mn according to the backlight intensities BL_11-BL_mn to make the specular values SM_11-SM_mn positively correlated with the backlight intensities BL_11-BL_mn. As a result, the contrast ratio of the display device is further improved.
In an example, the detecting unit 102 directly generates the specular map SM_PIX without referring to the segment based information SBI. In this example, the image converting module 10 uses an edge preserving filter (e.g. a bilateral filter) to blur the input image IMG_I while preserving edge details in the input image IMG_I, and accordingly acquires a filtered image IMG_F. Next, the detecting unit 102 generates the specular map SM_PIX according to pixel values of the filtered image IMG_F. Please refer to FIG. 7, which is a schematic diagram of the relationship between a pixel value P in the filtered image IMG_F and the corresponding specular value SM_P of the specular map SM_PIX. As shown in FIG. 7, the specular value SM_P is 0 when the pixel value P is smaller than a threshold TH0, increases with the pixel value P at a constant slope SW_SLOP when the pixel value P is between the thresholds TH0 and TH1, and keeps at the maximum SM_MAX when the pixel value P is greater than the threshold TH1. According to FIG. 7, the detecting unit 102 generates the specular map SM_PIX and the mapping unit 104 accordingly selects the mapping curve for mapping each pixel value in the input image IMG_I to generate the intermediate image IMG_H of the high dynamic range. In this example, the confidence CONF is determined by a difference between the thresholds TH_HI and TH_LO, wherein the confidence CONF is proportional to the difference between the thresholds TH_HI and TH_LO. The blending unit 106 therefore can decide the ratio of blending the intermediate image IMG_H and the input image IMG_I and accordingly generate the output image IMG_O.
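The piecewise relation of FIG. 7 can be sketched as follows (the TH0, TH1, and SM_MAX values are example settings, not values given in the text):

```python
def pixel_specular(p, th0=100, th1=200, sm_max=1023):
    """FIG. 7: 0 below TH0, linear between TH0 and TH1, SM_MAX above TH1."""
    if p <= th0:
        return 0.0
    if p >= th1:
        return float(sm_max)
    return (p - th0) / (th1 - th0) * sm_max   # constant-slope region

assert pixel_specular(50) == 0.0       # below TH0
assert pixel_specular(250) == 1023.0   # above TH1
assert pixel_specular(150) == 511.5    # midpoint of the linear region
```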
The processes of the image converting modules 10 and 50 converting the input image IMG_I with low dynamic range to the output image IMG_O with high dynamic range can be summarized into a process 80 shown in FIG. 8. The process 80 can be utilized in a display device, such as a TV, an LCD, or an electronic product with a display panel, to convert an input image with the low dynamic range of luminance or brightness to an output image with the high dynamic range of luminance or brightness. The process 80 comprises the following steps:
Step 800: Start.
Step 802: Calculate a plurality of segment averages of pixel values in a plurality of segments of the input image.
Step 804: Acquire the maximum among the plurality of segment averages as a first threshold.
Step 806: Calculate a plurality of local averages of pixel values in adjacent segments of the input image.
Step 808: Acquire the maximum among the plurality of local averages as a second threshold.
Step 810: Count the number of pixel values exceeding the first threshold in the plurality of segments as a plurality of first pixel counts and count the number of pixel values exceeding the second threshold in the plurality of segments as a plurality of second pixel counts.
Step 812: Generate a specular map and a confidence according to the plurality of first pixel counts and the plurality of second pixel counts.
Step 814: Map the input image according to the specular map, to generate an intermediate image.
Step 816: Blend the input image and the intermediate image according to the confidence, to generate the output image.
Step 818: End.
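The steps of process 80 can be tied together in a single end-to-end sketch (all tuning constants, the gain-style mapping stage, and the sample image are illustrative assumptions; only the step structure follows the process):

```python
def convert(image, seg=2, spe_slop=1.0, sm_max=1023,
            th_conf_pix=2, th_ra=0.5, c_max=1.0):
    m, n = len(image) // seg, len(image[0]) // seg

    def seg_pixels(i, j):
        return [image[i * seg + r][j * seg + c]
                for r in range(seg) for c in range(seg)]

    # Steps 802-808: segment averages, local averages, and both thresholds
    avgs = [[sum(seg_pixels(i, j)) / seg ** 2 for j in range(n)]
            for i in range(m)]
    th_hi = max(v for row in avgs for v in row)
    th_lo = max((avgs[i][j] + avgs[i][j + 1] +
                 avgs[i + 1][j] + avgs[i + 1][j + 1]) / 4
                for i in range(0, m - 1, 2) for j in range(0, n - 1, 2))

    # Step 810: per-segment pixel counts above each threshold
    pc_hi = [[sum(p > th_hi for p in seg_pixels(i, j)) for j in range(n)]
             for i in range(m)]
    pc_lo = [[sum(p > th_lo for p in seg_pixels(i, j)) for j in range(n)]
             for i in range(m)]

    # Step 812: specular map (equations (1)-(2)) and confidence ((3)-(5))
    sm = [[min((pc_hi[i][j] + pc_lo[i][j]) * (pc_hi[i][j] + 1) * spe_slop,
               sm_max) for j in range(n)] for i in range(m)]
    bright = sum(pc_hi[i][j] + pc_lo[i][j] > th_conf_pix
                 for i in range(m) for j in range(n))
    spe_ra = bright / (m * n)
    conf = c_max if spe_ra <= th_ra else c_max - min(spe_ra - th_ra, c_max)

    # Step 814: a hypothetical specular-driven gain stands in for the curves
    inter = [[min(image[r][c] * (1 + sm[r // seg][c // seg] / sm_max), 255)
              for c in range(len(image[0]))] for r in range(len(image))]

    # Step 816: blend input and intermediate images by the confidence
    out = [[(1 - conf) * image[r][c] + conf * inter[r][c]
            for c in range(len(image[0]))] for r in range(len(image))]
    return out, conf

image = [[10, 20, 30, 40],
         [20, 30, 40, 50],
         [ 5, 10, 90, 80],
         [10, 15, 80, 70]]
out, conf = convert(image)   # the bright lower-right region gets boosted
```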
According to the process 80, the input image is divided into a plurality of segments, wherein each of the plurality of segments comprises multiple pixels of the input image. The average pixel values in the plurality of segments are acquired as a plurality of segment averages, and the average pixel values in adjacent segments among the plurality of segments are acquired as a plurality of local averages. In an example, each of the plurality of local averages corresponds to adjacent segments forming a rectangular matrix (e.g. a 2*2 matrix). The maximum among the plurality of segment averages is defined as a first threshold and the maximum among the plurality of local averages is defined as a second threshold. After acquiring the first threshold, the number of pixel values exceeding the first threshold in each of the plurality of segments is counted, to generate a plurality of first pixel counts. Similarly, the number of pixel values exceeding the second threshold in each of the plurality of segments is counted, to generate a plurality of second pixel counts.
According to the plurality of first pixel counts and the plurality of second pixel counts, a specular map and a confidence are generated. In an example, each specular value in the specular map corresponding to the plurality of segments is proportional to the first pixel count and the second pixel count corresponding to the same segment. In addition, the confidence is generated by calculating the ratio between the number of segments whose sum of the first pixel count and the second pixel count exceeds a confidence threshold and the total number of the plurality of segments. Next, the input image is mapped according to the specular map, to generate an intermediate image of the high dynamic range of luminance or brightness. In an example, the specular values in the specular map are interpolated to generate interpolated specular values corresponding to pixels of the input image, and the mapping curves for mapping pixel values of the input image are selected according to the interpolated specular values.
Finally, the output image is generated by blending the input image and the intermediate image. The ratio between the input image and the intermediate image for generating the output image is determined by the confidence generated according to the plurality of first pixel counts and the plurality of second pixel counts. Because the information for converting the input image to the output image is acquired by analyzing the plurality of segments rather than individual pixel values of the input image, the computing cost of the conversion is reduced; power efficiency is therefore improved and manufacturing cost is reduced. Note that the information of the plurality of segments (e.g., the segment averages, the local averages, the first threshold, the second threshold, the first pixel counts, or the second pixel counts) or the specular map can also be utilized to adjust backlight intensities of backlights configured in the display device. The detailed operations of the process 80 can be found in the above descriptions and are not narrated herein for brevity.
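The final blend admits a simple linear reading, sketched below. Whether the confidence weights the intermediate image directly or inversely is an implementation choice; the direction shown here (higher confidence means more of the HDR-mapped result) is our assumption.

```python
import numpy as np

def blend_images(input_image, intermediate_image, confidence):
    """Weighted blend of the LDR input and the HDR-mapped intermediate
    image; confidence in [0, 1] sets the share of the mapped result."""
    return (1.0 - confidence) * input_image + confidence * intermediate_image
```

With confidence 0 the output is the unmodified input image; with confidence 1 it is the intermediate image alone.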
Please note that the above-mentioned steps of the processes, including the suggested steps, can be realized by means that may be hardware, firmware (known as a combination of a hardware device and computer instructions and data residing as read-only software on the hardware device), or an electronic system. Examples of hardware include analog, digital and mixed circuits such as microcircuits, microchips, or silicon chips. Examples of the electronic system include system on chip (SoC), system in package (SiP), computer on module (COM), and the image converting modules 10 and 50.
To sum up, the image converting modules of the above examples convert the input image with the low dynamic range to the output image with the high dynamic range by analyzing the information of the segments rather than individual pixels of the input image. The computing cost is therefore decreased. In addition, the analyzed information of the segments can be utilized in controlling the backlight intensities of the display device, further improving the contrast ratio of the display device.
Those skilled in the art will readily observe that numerous modifications and alterations of the device and method may be made while retaining the teachings of the invention. Accordingly, the above disclosure should be construed as limited only by the metes and bounds of the appended claims.

Claims (9)

What is claimed is:
1. An image converting method for a display device, comprising:
calculating, by a computing module of the display device, a plurality of segment averages of pixel values in a plurality of segments of an input image of a low dynamic range of luminance;
acquiring, by the computing module of the display device, the maximum among the plurality of segment averages as a first threshold;
calculating, by the computing module of the display device, a plurality of local averages of pixel values in adjacent segments of the input image;
acquiring, by the computing module of the display device, the maximum among the plurality of local averages as a second threshold;
counting, by the computing module of the display device, the number of pixel values exceeding the first threshold in the plurality of segments as a plurality of first pixel counts and counting the number of pixel values exceeding the second threshold in the plurality of segments as a plurality of second pixel counts;
generating, by the computing module of the display device, a specular map and a confidence according to the plurality of first pixel counts and the plurality of second pixel counts;
mapping, by the computing module of the display device, the input image according to the specular map, to generate an intermediate image of a high dynamic range of luminance;
blending, by the computing module of the display device, the input image and the intermediate image according to the confidence, to generate an output image, wherein the ratio between the input image and the intermediate image is determined by the confidence; and
displaying, by the display device, the output image.
2. The image converting method of claim 1, wherein the specular map comprises a plurality of specular values corresponding to the plurality of segments and each of the plurality of specular values is proportional to at least one of the first pixel counts and the second pixel counts corresponding to the same segment.
3. The image converting method of claim 2, wherein a first specular value corresponding to a first segment is calculated by the following equations:

SM_tmp=(PC_HI+PC_LO)×(PC_HI+1)

first specular value=MIN(SM_tmp×SPE_SLOP,SM_MAX)
wherein PC_HI is the first pixel count of the first segment, PC_LO is the second pixel count of the first segment, SM_MAX is a predetermined maximum value of the plurality of specular values, and SPE_SLOP is a coefficient adjusting the relationship among the first specular value, PC_HI and PC_LO.
4. The image converting method of claim 1, wherein the step of generating, by the computing module of the display device, the confidence according to the plurality of first pixel counts and the plurality of second pixel counts comprises:
summing, by the computing module of the display device, the plurality of first pixel counts and the plurality of second pixel counts corresponding to the same segment, to acquire a plurality of sums;
dividing, by the computing module of the display device, the number of sums exceeding a confidence threshold by the number of the plurality of segments, to acquire a confidence ratio; and
generating, by the computing module of the display device, the confidence according to the confidence ratio.
5. The image converting method of claim 4, wherein the confidence is generated by:
confidence = { C_MAX, if SPE_RA ≤ TH_RA
             { C_MAX − MIN((SPE_RA − TH_RA) × SPE_RA_Slop, C_MAX), if SPE_RA > TH_RA
wherein C_MAX is a predetermined maximum value of the confidence, TH_RA is the confidence threshold, SPE_RA is the confidence ratio, and SPE_RA_Slop is a coefficient adjusting the relationship between the confidence and the confidence ratio.
6. The image converting method of claim 1, wherein the step of mapping, by the computing module of the display device, the input image according to the specular map to generate the intermediate image comprises:
interpolating, by the computing module of the display device, a plurality of specular values of the specular map, to generate a plurality of pixel specular values corresponding to pixel values of the input image;
interpolating, by the computing module of the display device, a plurality of mapping curves according to the plurality of pixel specular values, to generate a plurality of pixel mapping curves corresponding to the pixel values of the input image; and
utilizing, by the computing module of the display device, the plurality of pixel mapping curves to map the pixel values of the input image, to generate the intermediate image.
7. The image converting method of claim 1, further comprising:
adjusting, by the computing module of the display device, a plurality of backlight intensities of backlights, which are configured in the display device and corresponding to the plurality of segments, according to at least one of the plurality of segment averages, the plurality of first pixel counts, the plurality of second pixel counts, the first threshold, the second threshold and the specular map.
8. The image converting method of claim 7, wherein a first backlight intensity of a first segment is calculated by:

first backlight intensity=AVG×(1−WGT)+TH_HI×WGT
wherein AVG is the segment average of the first segment, TH_HI is the first threshold, and WGT is a value proportional to the first pixel count of the first segment.
9. The image converting method of claim 7, further comprising:
adjusting, by the computing module of the display device, the specular map according to the plurality of backlight intensities.
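As a numeric illustration of the backlight formula in claim 8 (the input values below are arbitrary examples, not values from the patent):

```python
def backlight_intensity(avg, th_hi, wgt):
    """Claim 8's backlight rule: interpolates from the segment average
    toward the first threshold as WGT (tied to the segment's first
    pixel count) grows."""
    return avg * (1.0 - wgt) + th_hi * wgt
```

With AVG = 10, TH_HI = 20 and WGT = 0.5 the intensity lands midway at 15; WGT = 0 keeps the segment average, and WGT = 1 pins the backlight to the first threshold, so brighter, more specular segments receive stronger backlight.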
US15/012,871 2016-02-02 2016-02-02 Image converting method and related image converting module Active US9613409B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US15/012,871 US9613409B1 (en) 2016-02-02 2016-02-02 Image converting method and related image converting module

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US15/012,871 US9613409B1 (en) 2016-02-02 2016-02-02 Image converting method and related image converting module

Publications (1)

Publication Number Publication Date
US9613409B1 true US9613409B1 (en) 2017-04-04

Family

ID=58419116

Family Applications (1)

Application Number Title Priority Date Filing Date
US15/012,871 Active US9613409B1 (en) 2016-02-02 2016-02-02 Image converting method and related image converting module

Country Status (1)

Country Link
US (1) US9613409B1 (en)


Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7612804B1 (en) * 2005-02-15 2009-11-03 Apple Inc. Methods and apparatuses for image processing
US20100271512A1 (en) * 2009-04-23 2010-10-28 Haim Garten Multiple exposure high dynamic range image capture
US20130076947A1 (en) * 2011-08-30 2013-03-28 Keigo Hirakawa Single-shot high dynamic range imaging
US20140307960A1 (en) * 2013-04-15 2014-10-16 Qualcomm Incorporated Generation of ghost-free high dynamic range images
US20150356904A1 (en) * 2014-06-05 2015-12-10 Canon Kabushiki Kaisha Image processing apparatus and image processing method


Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10388231B2 (en) * 2016-05-27 2019-08-20 Boe Technology Group Co., Ltd. Method for controlling display device, control apparatus for display device and display device
US20190075263A1 (en) * 2017-09-01 2019-03-07 Semiconductor Components Industries, Llc Methods and apparatus for high dynamic range imaging
US10708524B2 (en) * 2017-09-01 2020-07-07 Semiconductor Components Industries, Llc Methods and apparatus for high dynamic range imaging
US20200167903A1 (en) * 2018-11-27 2020-05-28 Novatek Microelectronics Corp. Image processing apparatus and method for high dynamic range effect
US10943337B2 (en) * 2018-11-27 2021-03-09 Novatek Microelectronics Corp. Image processing apparatus and method for high dynamic range effect
US20220415272A1 (en) * 2021-06-23 2022-12-29 HKC Corporation Limited Driving method of backlight module and display device
US11715432B2 (en) * 2021-06-23 2023-08-01 HKC Corporation Limited Driving method of backlight module and display device
US12542975B2 (en) * 2022-03-07 2026-02-03 Canon Kabushiki Kaisha Image processing apparatus to generate composite image, control method, and recording medium
US20240404143A1 (en) * 2023-06-02 2024-12-05 Reinbow LLC Systems and methods for dynamic content placement based on pixel count confidences in pre-existing content
US12536723B2 (en) * 2023-06-02 2026-01-27 Reinbow LLC Systems and methods for dynamic content placement based on pixel count confidences in pre-existing content

Similar Documents

Publication Publication Date Title
US9613409B1 (en) Image converting method and related image converting module
US8552946B2 (en) Display device, display driver and image display method
EP2413309B1 (en) Display device and display control method
US8314767B2 (en) Methods and systems for reducing view-angle-induced color shift
RU2471214C2 (en) Apparatus for controlling liquid crystal display, liquid crystal display, method of controlling liquid crystal display, program and data medium
US8761539B2 (en) System for high ambient image enhancement
US8643593B2 (en) Method and apparatus of compensating image in a backlight local dimming system
KR100843090B1 (en) Display device and method for improving image flicker
US20110115815A1 (en) Methods and Systems for Image Enhancement
EP2237258A1 (en) Image display device and image display method
US9076391B2 (en) High dynamic range display with rear modulator control
US9214015B2 (en) System for image enhancement
EP3731221B1 (en) Image brightness adjustment method, device and system, and computer-readable storage medium
KR101267304B1 (en) Methods and systems for reducing power consumption in dual modulation displays
US20130257886A1 (en) System for image enhancement
US20160267631A1 (en) Adaptive contrast enhancement apparatus and method
JP2017046045A (en) Image processing system
US20190057659A1 (en) Brightness compensation method and circuit
EP4372730A1 (en) Image enhancement method and apparatus, computer device, and storage medium
US7245308B2 (en) Display control device and display device
US8957845B2 (en) Display device
CN110648640A (en) Pixel compensation method, pixel compensation device and display device
US20110242420A1 (en) Smart grey level magnifier for digital display
KR20190073516A (en) Image processing apparatus, digital camera, image processing program, and recording medium
US8154663B2 (en) System and method for adaptive contrast enhancement of video signals

Legal Events

Date Code Title Description
AS Assignment

Owner name: NOVATEK MICROELECTRONICS CORP., TAIWAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:TSAI, WAN-CHING;HO, CHAO-WEI;HSU, CHIA-JUNG;AND OTHERS;REEL/FRAME:037638/0342

Effective date: 20160126

STCF Information on status: patent grant

Free format text: PATENTED CASE

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 4

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1552); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 8