WO2021156924A1 - 表示制御装置、画像表示システム及び表示制御方法 - Google Patents

Display control device, image display system, and display control method

Info

Publication number
WO2021156924A1
Authority
WO
WIPO (PCT)
Prior art keywords
display
image
unit
correction
history information
Prior art date
Application number
PCT/JP2020/003984
Other languages
English (en)
French (fr)
Japanese (ja)
Inventor
俊明 久保
Original Assignee
三菱電機株式会社
Priority date
Filing date
Publication date
Application filed by 三菱電機株式会社 (Mitsubishi Electric Corporation)
Priority to PCT/JP2020/003984
Priority to CN202080094426.1A
Priority to JP2021575117A
Publication of WO2021156924A1

Classifications

    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G3/00Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
    • G09G3/20Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix no fixed position being assigned to or needed to be assigned to the individual characters or partial characters
    • G09G3/34Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix no fixed position being assigned to or needed to be assigned to the individual characters or partial characters by control of light from an independent source
    • G09G3/36Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix no fixed position being assigned to or needed to be assigned to the individual characters or partial characters by control of light from an independent source using liquid crystals

Definitions

  • This disclosure relates to a display control device and method, and an image display system.
  • the present disclosure particularly relates to a technique for adjusting the life of an image display provided with a non-light emitting display panel such as a liquid crystal panel.
  • In an image display provided with a non-light emitting display panel, the backlight converts the power supplied from the power source into light, and an image is displayed by controlling the amount of that light transmitted through the display panel.
  • the life of such an image display is often determined by the life of the backlight.
  • the life of the backlight is determined by the deterioration over time due to light emission, and the deterioration depends on the amount of light emission. If the amount of light emitted is reduced, the life of the backlight is extended, but the visibility of the displayed image is reduced as the brightness of the screen is reduced.
  • In Patent Document 1, in order to reduce power consumption, it has been proposed to reduce the amount of light emitted from the backlight and to correct the gradation values of the image data so as to compensate for the decrease, thereby suppressing the change in the brightness of the displayed image.
  • The display control device of the present disclosure is a display control device that controls an image display including a backlight and a non-emission type display panel, and includes an image processing device and a life controller.
  • The image processing device includes: a brightness correction unit that corrects the brightness of the input image and outputs a corrected image; a corrected image brightness calculation unit that calculates the average brightness of the corrected image output from the brightness correction unit;
  • an input image brightness calculation unit that calculates the average brightness of the input image; a dimming rate calculation unit that calculates the dimming rate of the backlight based on the average brightness of the corrected image and the average brightness of the input image;
  • a display history information generation unit that generates display history information based on the dimming rate calculated by the dimming rate calculation unit; and a display history information storage unit that stores the display history information generated by the display history information generation unit.
  • the image processing device causes the backlight to emit light at the dimming rate, and displays the corrected image on the display panel.
  • the display history information represents the cumulative display time from the start of use of the image display to the present time, and the average value of the dimming rate during the period of display on the image display.
  • The life controller controls the degree of correction in the brightness correction unit based on the display history information stored in the display history information storage unit, the life of the backlight at the start of use, and the target life of the backlight.
  • FIG. 1 is a block diagram showing the image display system according to Embodiment 1.
  • FIG. 2 is a block diagram showing a configuration example of the brightness correction unit of FIG. 1. FIG. 3 is a diagram showing an example of the gradation conversion curve used in the gradation conversion unit of FIG. 2. FIG. 4 is a block diagram showing another configuration example of the brightness correction unit of FIG. 1. FIG. 5 is a block diagram showing another configuration example of the brightness correction unit of FIG. 1. FIG. 6 is a diagram showing examples of the gradation conversion curves selected by the conversion curve selection unit of FIG. 5. FIG. 7 is a block diagram showing another configuration example of the brightness correction unit of FIG. 1. FIGS. 8(a) to 8(f) are diagrams showing the process of generating the gradation conversion curve in the conversion curve generation unit of FIG. 7.
  • FIG. 9 is a block diagram showing a computer that realizes all the functions of the display control device of FIG. 1, together with the backlight and the display panel. FIG. 10 is a flowchart showing the processing procedure in the computer 9 of FIG. 9. FIG. 11 is a block diagram showing the image display system according to Embodiment 2.
  • FIG. 12 is a block diagram showing a computer that realizes all the functions of the image processing device of FIG. 11 and a computer that realizes all the functions of the life controller, together with the backlight and the display panel.
  • FIG. 1 shows an image display system according to the first embodiment.
  • the image display system shown in FIG. 1 includes a display control device 2 and an image display 6.
  • the display control device 2 includes an image processing device 3 and a life controller 4.
  • The image processing device 3 includes an image input unit 11, a brightness correction unit 12, a corrected image brightness calculation unit 13, an input image brightness calculation unit 14, a dimming rate calculation unit 15, a display history information generation unit 17, and a display history information storage unit 18.
  • the life controller 4 includes a life prediction unit 41 and a correction degree adjusting unit 42.
  • the image display 6 includes a backlight 7 and a display panel 8.
  • the backlight 7 emits light by electric power supplied from a power source (not shown).
  • the display panel 8 is composed of, for example, a liquid crystal panel, and spatially modulates the light from the backlight 7 to display an image.
  • the display panel 8 has a plurality of pixels, and each pixel has an R (red) sub-pixel, a G (green) sub-pixel, and a B (blue) sub-pixel.
  • the image input unit 11 receives an analog format or digital format image signal and outputs a digital format input image data Da.
  • the input image data Da is data in a format suitable for processing in the image processing device 3.
  • the image input unit 11 may include an A / D converter.
  • the input image data Da represents, for example, a color image.
  • the input image data Da has, for example, an R component value, a G component value, and a B component value for each pixel.
  • the input image data Da may have a luminance component value and a color difference component value for each pixel.
  • the brightness correction unit 12 corrects the input image data Da and outputs the corrected image data Db.
  • The correction in the brightness correction unit 12 is performed so that the image (corrected image) Db represented by the corrected image data Db is brighter than the image (input image) Da represented by the input image data Da.
  • the degree of correction in the brightness correction unit 12 is changed according to the control by the life controller 4 as described later.
  • the control information for control by the life controller 4 is indicated by Ce.
  • the display panel 8 displays an image based on the corrected image data Db output from the brightness correction unit 12.
  • the display panel 8 displays an image by spatially modulating the light from the backlight 7 based on the corrected image data Db.
  • the transmittance of the sub-pixels of the corresponding pixels of the display panel 8 is controlled according to the R component value, the G component value, and the B component value for each pixel included in the corrected image data Db.
  • FIG. 2 shows a configuration example of the brightness correction unit 12.
  • the brightness correction unit 12 shown in FIG. 2 includes a gradation conversion unit 201.
  • the gradation conversion unit 201 converts the gradation value of the input image data Da to brighten the image.
  • The R component value, the G component value, and the B component value may be gradation-converted separately, or the values may be converted once into a luminance component value and a color difference component value, the luminance component value may be gradation-converted, and the result may then be inversely converted into an R component value, a G component value, and a B component value.
  • the gradation conversion curve TC shown in FIG. 3 is used for the gradation conversion.
  • The farther the gradation conversion curve TC lies above the line on which the output gradation value equals the input gradation value, the larger the degree of correction, that is, the degree to which the image is brightened.
  • the gradation conversion curve shown in FIG. 3 is an example, and does not limit the present embodiment.
  • Any gradation conversion curve TC may be used as long as, for most input images Da, the average brightness of the corrected image Db becomes larger than the average brightness of the input image Da.
  • The gradation conversion unit 201 may be capable of deforming the gradation conversion curve TC according to the control information Ce from the life controller 4. Alternatively, the gradation conversion unit 201 may store a plurality of gradation conversion curves TC and select one of them according to the control information Ce from the life controller 4.
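  • As a rough illustration of the table-based gradation conversion just described, the following Python sketch applies a gradation conversion curve stored as a 256-entry lookup table to 8-bit R, G, and B component values. The particular curve (a power curve with an exponent below 1, which lies above the identity line and brightens the image) and the function names are illustrative assumptions, not definitions taken from the patent.

```python
# Minimal sketch of lookup-table gradation conversion (illustrative assumptions only).

def make_brightening_curve(gamma: float = 0.8, max_level: int = 255) -> list[int]:
    """Assumed example of a gradation conversion curve TC: gamma < 1 lies above the identity line."""
    return [round(((v / max_level) ** gamma) * max_level) for v in range(max_level + 1)]

def apply_gradation_conversion(pixels, curve):
    """Apply the curve separately to the R, G, and B component values of each pixel."""
    return [(curve[r], curve[g], curve[b]) for (r, g, b) in pixels]

tc = make_brightening_curve()
input_image = [(10, 20, 30), (120, 130, 140), (250, 251, 252)]
print(apply_gradation_conversion(input_image, tc))  # each component maps to an equal or larger value
```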
  • FIG. 4 shows another configuration example of the brightness correction unit 12.
  • the brightness correction unit 12 shown in FIG. 4 includes a maximum component value calculation unit 202, an in-frame maximum value calculation unit 203, a gain determination unit 204, and a gain multiplication unit 205.
  • the maximum component value calculation unit 202 calculates the maximum value among the R component value Dar, the G component value Dag, and the B component value Dab, that is, the maximum component value M for each pixel of the input image data Da.
  • The in-frame maximum value calculation unit 203 calculates the maximum value of the maximum component values M calculated for all the pixels in the frame by the maximum component value calculation unit 202, that is, the in-frame maximum value Mmax.
  • the gain determination unit 204 determines the gain coefficient Ka for each frame based on the in-frame maximum value Mmax calculated by the in-frame maximum value calculation unit 203.
  • the gain coefficient Ka is calculated by, for example, the following equation (1).
  • Ka = (Tmax / Mmax) × α   (1)
  • Here, Tmax is the maximum value (maximum gradation value) that each component value can take. For example, when each component value is represented by 8 bits, the maximum gradation value is 255.
  • α is an adjustment coefficient.
  • the gain multiplication unit 205 corrects the input image data Da with the gain coefficient Ka determined by the gain determination unit 204, and generates the corrected image data Db.
  • the gain coefficient Ka is determined for each frame, and the gain coefficient Ka determined for each frame is multiplied by the gradation value of the input image Da of the frame.
  • The R component value, the G component value, and the B component value may be multiplied by the gain coefficient Ka separately, or the values may be converted once into a luminance component value and a color difference component value, the luminance component value may be multiplied by the gain coefficient Ka, and the result may then be inversely converted into an R component value, a G component value, and a B component value.
  • The gain determination unit 204 may change the adjustment coefficient α according to the control information Ce from the life controller 4.
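  • A minimal sketch of the gain-based correction of FIG. 4, under the reconstructed equation (1) Ka = (Tmax / Mmax) × α, is shown below. The adjustment coefficient α is the quantity the life controller would change through the control information Ce; the function names and the clipping are illustrative assumptions.

```python
# Sketch of FIG. 4 style gain correction (assumed equation (1): Ka = (Tmax / Mmax) * alpha).

T_MAX = 255  # maximum gradation value for 8-bit component values

def frame_gain(pixels, alpha: float = 1.0) -> float:
    """Determine the gain coefficient Ka for one frame from the in-frame maximum Mmax."""
    m_max = max(max(p) for p in pixels)   # in-frame maximum of the per-pixel maximum component values
    m_max = max(m_max, 1)                 # guard against an all-black frame
    return (T_MAX / m_max) * alpha        # assumed equation (1)

def apply_gain(pixels, ka: float):
    """Multiply every component value by Ka, clipping to the valid gradation range."""
    return [tuple(min(T_MAX, max(0, round(c * ka))) for c in p) for p in pixels]

frame = [(10, 40, 30), (200, 180, 90)]
ka = frame_gain(frame, alpha=0.9)
print(ka, apply_gain(frame, ka))
```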
  • FIG. 5 shows another configuration example of the brightness correction unit 12.
  • the brightness correction unit 12 shown in FIG. 5 includes a maximum component value calculation unit 202, an in-frame average value calculation unit 206, a conversion curve selection unit 207, and a gradation conversion unit 201c.
  • the maximum component value calculation unit 202 of FIG. 5 is the same as the maximum component value calculation unit 202 of FIG.
  • the in-frame average value calculation unit 206 calculates the average value of the maximum component value M calculated for all the pixels in the frame by the maximum component value calculation unit 202, that is, the in-frame average value Mave.
  • The conversion curve selection unit 207 stores a plurality of gradation conversion curves in advance, and selects one of the stored gradation conversion curves based on the in-frame average value Mave calculated by the in-frame average value calculation unit 206. For example, the conversion curve selection unit 207 selects a gradation conversion curve with a stronger degree of correction as the in-frame average value Mave becomes smaller.
  • FIG. 6 shows an example of the gradation conversion curve stored in the conversion curve selection unit 207.
  • two gradation conversion curves TC1 and TC2 are stored.
  • Each of the two gradation conversion curves TC1 and TC2 has an upwardly convex shape.
  • the gradation conversion curve TC2 is located at the same position as or above the gradation conversion curve TC1. Therefore, it can be said that the gradation conversion curve TC2 has a stronger degree of brightness correction than the gradation conversion curve TC1.
  • The conversion curve selection unit 207 selects the gradation conversion curve TC1 if the in-frame average value Mave is larger than the threshold value Mth, and selects the gradation conversion curve TC2 if the in-frame average value Mave is equal to or less than the threshold value Mth.
  • the selected gradation conversion curve is represented by TCs.
  • the gradation conversion unit 201c converts the gradation value of the input image data Da using the gradation conversion curve TCs selected by the conversion curve selection unit 207, and generates the corrected image data Db.
  • the gradation conversion curve is selected for each frame, and the gradation conversion curve selected for each frame is used for gradation conversion of the input image Da of the frame.
  • the gradation conversion may be performed on the R component value, the G component value, and the B component value, or may be performed on the luminance component value, as in the gradation conversion unit 201 of FIG.
  • the conversion curve selection unit 207 may be capable of transforming the gradation conversion curve selected based on the in-frame average value Mave according to the control information Ce from the life controller 4.
  • Alternatively, the conversion curve selection unit 207 may store three or more gradation conversion curves and select any one of the stored gradation conversion curves according to the in-frame average value Mave and the control information Ce from the life controller 4. For example, for the same in-frame average value Mave, when the control information Ce requests an increase in the degree of correction, a gradation conversion curve with a larger degree of correction may be selected than when the control information Ce requests no change in the degree of correction. Conversely, for the same in-frame average value Mave, when the control information Ce requests a reduction in the degree of correction, a gradation conversion curve with a smaller degree of correction may be selected than when the control information Ce requests no change in the degree of correction.
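  • The selection behaviour of FIGS. 5 and 6 can be sketched as follows. The two curve names, the threshold Mth, and the way the control information Ce is encoded and shifts the choice are assumptions made only for illustration.

```python
# Sketch of conversion-curve selection based on the in-frame average Mave (FIGS. 5 and 6).

def in_frame_average(pixels) -> float:
    """Average, over the frame, of the per-pixel maximum component value M."""
    return sum(max(p) for p in pixels) / len(pixels)

def select_curve(mave: float, mth: float = 128.0, ce_request: str = "keep") -> str:
    """Pick a curve: TC2 corrects more strongly than TC1, as in FIG. 6.

    ce_request is an assumed encoding of the control information Ce:
    "increase", "decrease", or "keep" the degree of correction.
    """
    stronger = mave <= mth          # darker frames get the stronger curve TC2
    if ce_request == "increase":
        stronger = True             # bias the choice toward the stronger curve
    elif ce_request == "decrease":
        stronger = False            # bias the choice toward the weaker curve
    return "TC2" if stronger else "TC1"

frame = [(30, 60, 20), (90, 100, 80)]
print(select_curve(in_frame_average(frame)))
```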
  • FIG. 7 shows another configuration example of the brightness correction unit 12.
  • the brightness correction unit 12 shown in FIG. 7 includes a maximum component value calculation unit 202, a histogram generation unit 208, a conversion curve generation unit 209, and a gradation conversion unit 201c.
  • the maximum component value calculation unit 202 of FIG. 7 is the same as the maximum component value calculation unit 202 of FIG.
  • The histogram generation unit 208 generates, from the maximum component values M calculated by the maximum component value calculation unit 202 for all the pixels in the frame, a histogram showing the frequency of appearance of each gradation value, that is, of each value of the maximum component value M.
  • FIG. 8A shows an example of the generated histogram H (M).
  • the gradation value M on the horizontal axis is a value within the range of values that the maximum component value M can take, for example, a value from 0 to 255.
  • the appearance frequency H (M) corresponding to each gradation value M represents the number of pixels whose maximum component value M is equal to the gradation value.
  • the range of values that the appearance frequency H (M) can take is from 0 to the total number of pixels in the frame.
  • the conversion curve generation unit 209 generates a gradation conversion curve based on the histogram H (M) generated by the histogram generation unit 208.
  • Generation of the gradation conversion curve includes normalization of the histogram and generation of the cumulative histogram.
  • the conversion curve generation unit 209 normalizes the histogram H (M) shown in FIG. 8 (a) as shown in FIG. 8 (b), and generates a normalized histogram P (r). For example, both the gradation value on the horizontal axis and the frequency of appearance on the vertical axis are normalized so that the maximum value is 1.
  • the normalized gradation value is represented by r
  • the normalized frequency of appearance is represented by P (r).
  • the conversion curve generation unit 209 generates the normalized cumulative histogram T (r) of FIG. 8 (c) from the normalized histogram P (r) of FIG. 8 (b).
  • the normalized cumulative histogram T (r) referred to here represents the cumulative value of the normalized appearance frequency P (r) from 0 to the gradation value r for each normalized gradation value r.
  • The histogram H(M) shown in FIG. 8(a) represents the distribution function of the maximum component value M, the normalized histogram P(r) shown in FIG. 8(b) represents the distribution function of the normalized maximum component value r, and the normalized cumulative histogram T(r) shown in FIG. 8(c) represents the cumulative distribution function of the normalized maximum component value r.
  • The conversion curve generation unit 209 combines the normalized cumulative histogram T(r) of FIG. 8(c) with the correction curve G(r) illustrated in FIG. 8(d) to obtain the normalized gradation conversion curve T2(r) shown in FIG. 8(e).
  • the correction curve G (r) in FIG. 8 (d) is created separately and is for brightening the image.
  • the horizontal axis and the vertical axis take values from 0 to 1.
  • The correction curve shown in FIG. 8(d) is an example and does not limit the present embodiment.
  • The conversion curve generation unit 209 then performs inverse normalization on the normalized gradation conversion curve T2(r) shown in FIG. 8(e) to generate the gradation conversion curve TC shown in FIG. 8(f).
  • the horizontal axis represents the input (before conversion) gradation value
  • the vertical axis represents the output (after conversion) gradation value.
  • Inverse normalization is performed so that the value 1 on the horizontal axis and the value 1 on the vertical axis in FIG. 8(e) are both mapped to the maximum gradation value, for example 255.
  • the gradation conversion unit 201c uses the gradation conversion curve TC generated by the conversion curve generation unit 209 to convert the gradation value of the input image data Da to generate the corrected image data Db.
  • the gradation conversion curve is generated for each frame, and the gradation conversion curve generated for each frame is used for the gradation conversion of the input image Da of the frame.
  • As in the gradation conversion unit 201 of FIG. 2 and the gradation conversion unit 201c of FIG. 5, the gradation conversion may be performed on the R component value, the G component value, and the B component value, or may be performed on the luminance component value.
  • The conversion curve generation unit 209 may be capable of deforming the correction curve G(r) according to the control information Ce from the life controller 4. Alternatively, the conversion curve generation unit 209 may store a plurality of correction curves G(r) and select one of them according to the control information Ce from the life controller 4.
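  • The curve generation of FIGS. 7 and 8 (histogram, normalized cumulative histogram, combination with a correction curve, inverse normalization) can be sketched as below. The combination is assumed to be the function composition G(T(r)), and G(r) is assumed to be a power curve that brightens the image; the text only states that T(r) and G(r) are combined and that G(r) is for brightening.

```python
# Sketch of gradation-conversion-curve generation from a histogram (FIGS. 7 and 8).

T_MAX = 255

def build_conversion_curve(pixels, g_gamma: float = 0.8) -> list[int]:
    m_values = [max(p) for p in pixels]        # per-pixel maximum component value M
    hist = [0] * (T_MAX + 1)
    for m in m_values:
        hist[m] += 1                           # histogram H(M), FIG. 8(a)

    total = len(m_values)
    cum, t = 0, []
    for h in hist:
        cum += h
        t.append(cum / total)                  # normalized cumulative histogram T(r), FIG. 8(c)

    def g(r: float) -> float:
        return r ** g_gamma                    # assumed correction curve G(r), FIG. 8(d)

    t2 = [g(v) for v in t]                     # assumed combination T2(r) = G(T(r)), FIG. 8(e)
    return [round(v * T_MAX) for v in t2]      # inverse normalization -> curve TC, FIG. 8(f)

frame = [(10, 40, 30), (200, 180, 90), (90, 100, 80)]
tc = build_conversion_curve(frame)
print(tc[10], tc[200])  # output gradation values for input levels 10 and 200
```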
  • As the brightness correction unit 12, a unit other than the configuration examples shown in FIGS. 2, 4, 5, and 7 may be used. In short, it suffices that, for most input images, the correction makes the average brightness of the corrected image higher than the average brightness of the input image.
  • the corrected image brightness calculation unit 13 calculates the average brightness Lb of the corrected image Db.
  • the average brightness Lb of the corrected image Db is an average value of the brightness of all the pixels constituting the corrected image Db.
  • For example, the brightness of each pixel is obtained from the R component value, the G component value, and the B component value of each of the pixels constituting the corrected image Db, and the brightness values of all the pixels are averaged to obtain the average brightness Lb.
  • the input image brightness calculation unit 14 calculates the average brightness La of the input image Da.
  • the average brightness La of the input image Da is an average value of the brightness of all the pixels constituting the input image Da.
  • For example, the brightness of each pixel is obtained from the R component value, the G component value, and the B component value of each of the pixels constituting the input image Da, and the brightness values of all the pixels are averaged to obtain the average brightness La.
  • the ratio of the average brightness Lb of the corrected image Db to the average brightness La of the input image Da indicates the degree of brightness correction in the brightness correction unit 12.
  • the dimming rate calculation unit 15 calculates the dimming rate E based on the average brightness La calculated by the input image brightness calculation unit 14 and the average brightness Lb calculated by the corrected image brightness calculation unit 13.
  • the calculated dimming rate E is supplied to the backlight 7.
  • the backlight 7 adjusts the amount of light emitted based on the supplied dimming rate E. That is, it emits light with an adjusted amount of light emission. For example, the value obtained by multiplying the reference light emission amount by the dimming rate E is used as the adjusted light emission amount.
  • the reference light emission amount is, for example, the maximum light emission amount.
  • the adjustment of the light emission amount of the backlight 7 is generally performed so as to cancel the brightness correction by the brightness correction unit 12.
  • When the adjustment parameter β is 1, the average brightness of the image displayed by the image display 6 is the same whether the brightness correction unit 12 does not correct the brightness and the backlight 7 does not adjust the amount of light emission, or the brightness correction unit 12 corrects the brightness and the backlight 7 adjusts the amount of light emission.
  • When the adjustment parameter β is a value smaller than 1, the average brightness of the image displayed by the image display 6 becomes lower.
  • The adjustment parameter β is made smaller than 1 when it is desired to prolong the life of the backlight even at the cost of a lower average brightness.
  • The process of determining the dimming rate E based on the average brightness La of the input image Da and the average brightness Lb of the corrected image Db, and the process of changing the adjustment parameter β, are both adjustments of the degree of correction in the dimming rate calculation unit 15.
  • The process of making the adjustment parameter β in the dimming rate calculation unit 15 smaller than 1 can be said to be an additional adjustment for further increasing the degree of correction in the dimming rate calculation unit 15.
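  • The equations defining the dimming rate E are not reproduced in the text above, so the sketch below assumes the simplest rate that cancels the brightness correction, E = β × La / Lb, clamped to at most 1, with β being the adjustment parameter just described. This formula and the luminance weights are assumptions consistent with the stated behaviour, not the patent's exact definition.

```python
# Sketch of the dimming rate calculation (assumed form E = beta * La / Lb, clamped to [0, 1]).

def average_luminance(pixels) -> float:
    """Average luminance of all pixels, using Rec.601 weights as an assumed luminance formula."""
    return sum(0.299 * r + 0.587 * g + 0.114 * b for (r, g, b) in pixels) / len(pixels)

def dimming_rate(la: float, lb: float, beta: float = 1.0) -> float:
    """Dimming rate E that cancels the brightness correction; beta < 1 dims further to extend life."""
    if lb <= 0.0:
        return beta
    return min(1.0, beta * la / lb)

input_image = [(10, 20, 30), (120, 130, 140)]
corrected_image = [(20, 40, 60), (150, 160, 170)]
la, lb = average_luminance(input_image), average_luminance(corrected_image)
print(dimming_rate(la, lb))  # below 1, because the corrected image is brighter than the input image
```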
  • The life of the backlight 7 can be extended by reducing the amount of light emitted from the backlight 7. Further, since the brightness correction unit 12 generates the corrected image data Db representing the corrected image Db, which is brighter than the input image Da, and the display panel 8 is modulated with the corrected image data Db, the visibility of the displayed image can be maintained.
  • the image input unit 11, the brightness correction unit 12, the corrected image brightness calculation unit 13, the input image brightness calculation unit 14, and the dimming rate calculation unit 15 operate for each frame period.
  • the display history information generation unit 17 generates display history information.
  • the "generation" referred to here includes the first generation after the start of the image processing device 3 and the second and subsequent generations, that is, updates.
  • The display history information is generated, for example, every fixed period TLa.
  • The first display history information after start-up may be generated, for example, immediately after start-up, or when the fixed period TLa has elapsed from start-up.
  • the display history information represents the cumulative display time from the start of use of the image display 6 to the present time and the average value of the dimming rate.
  • the average value of the dimming rate is the average value of the dimming rate during the period of display up to the present time.
  • the display history information generation unit 17 stores the generated display history information in the display history information storage unit 18. If the display history information is already stored, the information is rewritten.
  • Here, D_ADD denotes the length of the period over which the display history information is updated at one time; the period D_ADD may be a one-frame period or a longer period.
  • the period D_ADD is equal to the period TLa.
  • the display history information generation unit 17 calculates the equation (5) for each frame period.
  • the display history information generation unit 17 measures the length of the period D_ADD.
  • the display history information generation unit 17 also stores the dimming rate for each frame period during the period D_ADD, and obtains the average value of the stored dimming rate for each frame over the period D_ADD.
  • The D_NOW' and E_NOW' calculated in this way are used to rewrite the display history information storage unit 18. By this rewriting, D_NOW' and E_NOW' become the new D_NOW and E_NOW.
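  • Equation (5) for updating the history is not reproduced in the text above; the sketch below assumes the natural update in which the new cumulative display time is D_NOW' = D_NOW + D_ADD and the new average dimming rate E_NOW' is the time-weighted average of E_NOW and the average dimming rate over the period D_ADD. The variable names follow the text, but the update formula itself is an assumption.

```python
# Sketch of the display history update (assumed form of equation (5)).

def update_display_history(d_now: float, e_now: float,
                           d_add: float, e_add_avg: float) -> tuple[float, float]:
    """Return (D_NOW', E_NOW'): new cumulative display time and new average dimming rate.

    d_add is the length of the period just finished, and e_add_avg is the average of the
    per-frame dimming rates stored during that period.
    """
    d_new = d_now + d_add
    e_new = (e_now * d_now + e_add_avg * d_add) / d_new if d_new > 0 else e_now
    return d_new, e_new

# Example: 1000 h displayed at an average rate of 0.8, followed by 10 h at a rate of 0.6.
print(update_display_history(1000.0, 0.8, 10.0, 0.6))
```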
  • the display history information storage unit 18 is composed of a non-volatile memory that can hold data even if it is not connected to a power source.
  • The writing or rewriting of the information in the display history information storage unit 18 may be performed every time the information is generated in the display history information generation unit 17, may be performed every fixed period TLb, or may be performed when a predetermined event occurs, for example when the image display system is turned off.
  • When the writing is performed every fixed period TLb, the first writing after the start of the image processing device 3 may be performed immediately after start-up, or may be performed when the fixed period TLb has elapsed from start-up.
  • If the information generated by the display history information generation unit 17 is not immediately written to the display history information storage unit 18, the generated information is temporarily held in the display history information generation unit 17.
  • The life controller 4 performs life control based on the display history information stored in the display history information storage unit 18, and on the life L_ORG of the backlight 7 at the start of use and the target life TRGT, which are stored in advance.
  • the life control is performed, for example, by adjusting the degree of correction in the brightness correction unit 12.
  • Control information Ce is supplied for controlling the brightness correction unit 12.
  • the dimming rate calculation unit 15 also adjusts the degree of correction accordingly.
  • The life control may include an additional adjustment of the degree of correction in the dimming rate calculation unit 15.
  • The control information for controlling the dimming rate calculation unit 15 is indicated by Cf.
  • Here, the life means the time until the emission brightness drops to a certain value.
  • The certain value is, for example, half of the emission brightness at the start of use.
  • As the life L_ORG at the start of use, it is assumed that the life L_ORG(100) when the backlight is continuously lit at a dimming rate of 100% is stored.
  • As the target life TRGT, it is assumed that the target life TRGT(E) when use is continued while adjusting the dimming rate E is stored.
  • Life control is performed, for example, every fixed period TLc.
  • The first life control after the start of the life controller 4 may be performed immediately after the start, or may be performed when the fixed period TLc has elapsed from the start.
  • The life prediction unit 41 uses the display history information stored in the display history information storage unit 18 and the life L_ORG(100) at the start of use, stored in advance in the life prediction unit 41, to calculate the remaining life L_RST of the backlight.
  • As the remaining life L_RST, the remaining life L_RST(100) when the backlight is continuously lit at a dimming rate of 100% is calculated by the following equation (6).
  • L_RST(100) = L_ORG(100) − D_NOW × E_NOW   (6)
  • L_ORG (100) is the life at the start of use when the light continues to be lit at a dimming rate of 100%.
  • D_NOW is the cumulative display time up to the present time.
  • E_NOW is the average dimming rate up to the present time.
  • The remaining life L_RST(Eave) when the backlight continues to be lit at the average dimming rate up to the present time is given by L_RST(Eave) = L_RST(100) / E_NOW   (7). From the above, the life of the backlight 7 can be predicted.
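  • Equations (6) and (7) translate directly into the following sketch; the values of L_ORG(100), D_NOW, and E_NOW used in the example are arbitrary illustrative numbers.

```python
# Sketch of the remaining-life prediction, equations (6) and (7).

def remaining_life_at_100(l_org_100: float, d_now: float, e_now: float) -> float:
    """Equation (6): remaining life if the backlight is lit continuously at a dimming rate of 100%."""
    return l_org_100 - d_now * e_now

def remaining_life_at_average(l_rst_100: float, e_now: float) -> float:
    """Equation (7): remaining life if use continues at the average dimming rate E_NOW."""
    return l_rst_100 / e_now if e_now > 0 else float("inf")

l_rst_100 = remaining_life_at_100(l_org_100=50000.0, d_now=10000.0, e_now=0.8)
print(l_rst_100, remaining_life_at_average(l_rst_100, e_now=0.8))
```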
  • The correction degree adjusting unit 42 calculates the target dimming rate E_RST based on the remaining life L_RST(100) and the target life TRGT(E) stored in advance in the correction degree adjusting unit 42, and controls the brightness correction unit 12 based on the calculated target dimming rate.
  • the target dimming rate E_RST is calculated as follows, for example.
  • First, the target value TRGT_RST(E) of the remaining life is calculated by the following equation (8).
  • TRGT_RST(E) = TRGT(E) − D_NOW   (8)
  • TRGT (E) is the target life at the start of use when the use is continued while adjusting the dimming rate E.
  • the target value of the remaining life TRGT_RST (E) calculated by the formula (8) is a target value of the future life when the use is continued while adjusting the dimming rate E.
  • The target dimming rate E_RST is then calculated by the following equation (9): E_RST = L_RST(100) / TRGT_RST(E)   (9)
  • In the above description, the life L_ORG(100) when the backlight is continuously lit at a dimming rate of 100% is stored as the life L_ORG at the start of use and is used in the process of calculating the target dimming rate E_RST.
  • this point is not essential.
  • the life when the light is continuously lit at a dimming rate other than 100% may be stored. In short, it suffices to store the life when the light is continuously lit at a constant dimming rate.
  • For example, the life L_ORG(γ) at the start of use when the backlight is continuously lit at a constant dimming rate of γ% may be stored, and a value obtained by converting L_ORG(γ) to its 100%-dimming equivalent (L_ORG(γ) × γ / 100) may be used instead of L_ORG(100) in equation (6).
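  • Equations (8) and (9) can be sketched as follows; the example numbers are arbitrary, and the final clamp of E_RST to the range [0, 1] is an added safeguard that the text does not state.

```python
# Sketch of the target dimming rate calculation, equations (8) and (9).

def target_dimming_rate(l_rst_100: float, trgt_e: float, d_now: float) -> float:
    """E_RST = L_RST(100) / (TRGT(E) - D_NOW), per equations (8) and (9)."""
    trgt_rst = trgt_e - d_now              # equation (8): target value of the remaining life
    if trgt_rst <= 0:
        return 1.0                         # target life already reached; no dimming margin left
    e_rst = l_rst_100 / trgt_rst           # equation (9)
    return max(0.0, min(1.0, e_rst))       # clamp: added safeguard, not stated in the text

# Example: 42000 h of 100%-equivalent life left, 60000 h target life, 10000 h already displayed.
print(target_dimming_rate(42000.0, 60000.0, 10000.0))
```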
  • the correction degree adjusting unit 42 controls the degree of brightness correction in the brightness correction unit 12 based on the target dimming rate E_RST calculated as described above. In controlling the degree of correction, the degree of brightness correction is adjusted as necessary.
  • the control for the brightness correction unit 12 is performed as follows, for example. If the dimming rate should be smaller, the degree of brightness correction should be increased. This is because if the degree of brightness correction is increased, the corrected image Db becomes brighter, and as a result, the dimming rate E calculated by the dimming rate calculation unit 15 becomes smaller.
  • Conversely, if the dimming rate should be made larger, the degree of brightness correction should be reduced.
  • This is because if the degree of brightness correction is reduced, the corrected image Db becomes darker (closer to the brightness of the input image Da), and as a result the dimming rate E calculated by the dimming rate calculation unit 15 becomes larger.
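  • The adjustment direction described above, comparing the target dimming rate E_RST with the dimming rate currently being achieved and nudging the degree of brightness correction up or down, might look like the following sketch; the step size and the scalar correction-strength parameter are illustrative assumptions.

```python
# Sketch of the correction-degree adjustment decision (assumed parameter and step size).

def adjust_correction_degree(strength: float, e_current: float,
                             e_target: float, step: float = 0.05) -> float:
    """Nudge the brightness-correction strength so that the dimming rate moves toward E_RST.

    If the dimming rate should become smaller (e_target < e_current), strengthen the correction;
    if it should become larger, weaken it. 'strength' is an assumed scalar standing in for the
    degree of correction in the brightness correction unit.
    """
    if e_target < e_current:
        return strength + step             # brighter corrected image -> smaller dimming rate
    if e_target > e_current:
        return max(0.0, strength - step)   # darker corrected image -> larger dimming rate
    return strength

print(adjust_correction_degree(strength=0.3, e_current=0.9, e_target=0.84))
```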
  • When the brightness correction unit 12 has the configuration shown in FIG. 2, the control for the brightness correction unit 12 is performed, for example, as follows.
  • When the degree of brightness correction is to be increased, a gradation conversion curve TC having a larger degree of correction (degree of brightening the image) is used in the gradation conversion in the gradation conversion unit 201.
  • For example, the gradation conversion curve TC stored in the gradation conversion unit 201 may be deformed so that the degree of correction becomes larger.
  • Alternatively, when a plurality of gradation conversion curves TC are stored, a gradation conversion curve TC having a larger degree of correction may be selected.
  • When the degree of brightness correction is to be reduced, a gradation conversion curve TC having a smaller degree of correction (degree of brightening the image) is used in the gradation conversion in the gradation conversion unit 201.
  • For example, the gradation conversion curve TC stored in the gradation conversion unit 201 may be deformed so that the degree of correction becomes smaller.
  • Alternatively, when a plurality of gradation conversion curves TC are stored, a gradation conversion curve TC having a smaller degree of correction may be selected.
  • When the brightness correction unit 12 has the configuration shown in FIG. 4, the control for the brightness correction unit 12 is performed, for example, as follows.
  • When the degree of brightness correction is to be increased, the adjustment coefficient α used in the gain determination unit 204 is increased.
  • When the degree of brightness correction is to be reduced, the adjustment coefficient α used in the gain determination unit 204 is made smaller.
  • When the brightness correction unit 12 has the configuration shown in FIG. 5, the control for the brightness correction unit 12 is performed, for example, as follows.
  • When the degree of brightness correction is to be increased, a gradation conversion curve having a larger degree of correction is used in the gradation conversion in the gradation conversion unit 201c.
  • For example, the threshold value Mth used in the conversion curve selection unit 207 may be made smaller so that a gradation conversion curve having a larger degree of correction is selected more easily.
  • Alternatively, the conversion curve selection unit 207 may deform the gradation conversion curve selected based on the in-frame average value Mave so that its degree of correction becomes larger, according to the control information Ce from the life controller 4.
  • Alternatively, when the conversion curve selection unit 207 stores three or more gradation conversion curves, a gradation conversion curve having a larger degree of correction may be selected than in the case where the control information Ce does not request a change in the degree of correction.
  • When the degree of brightness correction is to be reduced, the conversion curve selection unit 207 may deform the gradation conversion curve selected based on the in-frame average value Mave so that its degree of correction becomes smaller, according to the control information Ce from the life controller 4.
  • Alternatively, when the conversion curve selection unit 207 stores three or more gradation conversion curves, a gradation conversion curve having a smaller degree of correction may be selected than in the case where the control information Ce does not request a change in the degree of correction.
  • When the brightness correction unit 12 has the configuration shown in FIG. 7, the control for the brightness correction unit 12 is performed, for example, as follows.
  • When the degree of brightness correction is to be increased, the conversion curve generation unit 209 uses a correction curve G(r) having a larger degree of correction (degree of brightening the image).
  • For example, the correction curve G(r) may be deformed by the conversion curve generation unit 209 so that the degree of correction becomes larger.
  • Alternatively, when the conversion curve generation unit 209 stores a plurality of correction curves G(r), a correction curve having a larger degree of correction may be selected.
  • When the degree of brightness correction is to be reduced, the conversion curve generation unit 209 uses a correction curve G(r) having a smaller degree of correction (degree of brightening the image).
  • For example, the correction curve G(r) may be deformed by the conversion curve generation unit 209 so that the degree of correction becomes smaller.
  • Alternatively, when the conversion curve generation unit 209 stores a plurality of correction curves G(r), a correction curve having a smaller degree of correction may be selected.
  • When the degree of correction in the brightness correction unit 12 is changed, the dimming rate calculated by the dimming rate calculation unit 15 also changes.
  • the correction degree adjusting unit 42 may control not only the brightness correction unit 12 but also the dimming rate calculation unit 15.
  • Specifically, the adjustment parameter β used in the dimming rate calculation unit 15 may be adjusted. For example, when it is desired to lengthen the time until the end of the life (the time until the emission brightness drops to a certain value), the adjustment parameter β may be set to a smaller value.
  • It can be said that the process of reducing the adjustment parameter β in the dimming rate calculation unit 15 is an additional process for further increasing the degree of correction in the dimming rate calculation unit 15.
  • When the adjustment parameter β is reduced, the image displayed by the image display 6 becomes darker. Therefore, the process of reducing the adjustment parameter β should be limited to the case where priority is given to keeping the image display usable until the target life is reached, at the expense of the brightness of the displayed image.
  • the display control device 2 of FIG. 1 may be partially or wholly composed of a processing circuit.
  • the functions of each part of the display control device 2 may be realized by separate processing circuits, or the functions of a plurality of parts may be collectively realized by one processing circuit.
  • the processing circuit may be composed of hardware or software, that is, a programmed computer. Of the functions of each part of the display control device 2, a part may be realized by hardware and the other part may be realized by software.
  • FIG. 9 shows a computer 9 that realizes all the functions of the display control device 2 together with the backlight 7 and the display panel 8.
  • the computer 9 has a processor 91 and a memory 92.
  • the memory 92 stores a program for realizing the functions of each part of the display control device 2.
  • the memory 92 is also used for storing information stored in each part of the display control device 2 of FIG.
  • the memory 92 is also used as, for example, a display history information storage unit 18.
  • As the processor 91, for example, a CPU (Central Processing Unit), a GPU (Graphics Processing Unit), a microprocessor, a microcontroller, a DSP (Digital Signal Processor), or the like is used.
  • As the memory 92, for example, a semiconductor memory such as a RAM (Random Access Memory), a ROM (Read Only Memory), a flash memory, an EPROM (Erasable Programmable Read Only Memory), or an EEPROM (Electrically Erasable Programmable Read Only Memory), or a magnetic disk, an optical disk, a magneto-optical disk, or the like is used.
  • the processor 91 and the memory 92 may be realized by an LSI (Large Scale Integration) integrated with each other.
  • the processor 91 realizes the function of the display control device 2 by executing the program stored in the memory 92.
  • the functions of the display control device 2 include control of the amount of light emitted from the backlight 7 and control of the display on the display panel 8 (supply of a signal to the display panel 8).
  • The program may be provided over a network, or may be recorded and provided on a recording medium, for example a non-transitory recording medium. That is, the program may be provided, for example, as a program product.
  • the computer of FIG. 9 includes a single processor, but may include two or more processors.
  • step ST1 the processor 91 corrects the input image data Da and generates the corrected image data Db. This process is equivalent to the process in the brightness correction unit 12 of FIG.
  • step ST2 the processor 91 calculates the average brightness Lb of the corrected image Db represented by the corrected image data Db generated in step ST1. This process is equivalent to the process in the corrected image luminance calculation unit 13 of FIG.
  • step ST3 the processor 91 calculates the average luminance La of the input image Da represented by the input image data Da. This process is equivalent to the process in the input image luminance calculation unit 14 of FIG. The process of step ST3 can be performed in parallel with the process of steps ST1 and ST2.
  • step ST4 the processor 91 calculates the dimming rate E based on the average luminance La calculated in step ST3 and the average luminance Lb calculated in step ST2. This process is equivalent to the process in the dimming rate calculation unit 15 of FIG.
  • step ST5 the processor 91 determines whether or not the display history information should be generated. For example, it is determined whether or not it is time to generate the display history information. If NO in step ST5, the process returns to steps ST1 and ST3, and if YES, the process proceeds to step ST6.
  • step ST6 the processor 91 generates display history information based on the dimming rate E calculated in step ST4.
  • the processing in steps ST5 and ST6 is equivalent to the processing in which the display history information generation unit 17 of FIG. 1 generates display history information.
  • step ST7 the processor 91 determines whether or not the display history information generated in step ST6 should be stored. For example, it is determined whether or not it is time to store the display history information. If NO in step ST7, the process returns to steps ST1 and ST3, and if YES, the process proceeds to step ST8.
  • step ST8 the processor 91 writes the display history information generated in step ST6 to the memory 92.
  • the processing of steps ST7 and ST8 is equivalent to the processing in which the display history information generation unit 17 of FIG. 1 stores the display history information in the display history information storage unit 18.
  • step ST9 the processor 91 determines whether or not the life control should be performed. For example, it is determined whether or not it is time to control the life. If NO in step ST9, the process returns to steps ST1 and ST3, and if YES, the process proceeds to step ST10.
  • step ST10 the processor 91 calculates the remaining life of the backlight based on the display history information stored in the memory 92 and the life L_ORG at the start of use.
  • the processing of steps ST9 and ST10 is equivalent to the processing in the life prediction unit 41 of FIG.
  • In step ST11, the processor 91 calculates the target dimming rate E_RST based on the remaining life L_RST(100) calculated in step ST10 and the target life TRGT(E) stored in the memory 92, and adjusts the degree of correction based on the calculated target dimming rate. In adjusting the degree of correction, for example, the degree of correction in the brightness correction of step ST1 is adjusted. The adjustment parameter β used in step ST4 may also be adjusted.
  • the process of step ST11 is equivalent to the process of the correction degree adjusting unit 42 in FIG. 1 and the process of adjusting the correction degree performed by the brightness correction unit 12 according to the output of the correction degree adjusting unit 42.
  • The process of step ST11 may include a process of adjusting the degree of correction performed by the dimming rate calculation unit 15 according to the output of the correction degree adjusting unit 42.
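  • The per-frame procedure of steps ST1 to ST11 corresponds roughly to the following loop. Every helper here is a trivial stub standing in for the processing of the corresponding step, and the modulo tests simply model the fixed periods TLa, TLb, and TLc; only the control flow is meant to be illustrative.

```python
# Sketch of the per-frame processing loop corresponding to steps ST1 to ST11 (stubs only).

def correct_brightness(image, strength):                  # ST1 (stub)
    return [tuple(min(255, round(c * (1 + strength))) for c in p) for p in image]

def average_luminance(image):                              # ST2 / ST3 (stub, equal RGB weights)
    return sum(sum(p) / 3 for p in image) / len(image)

def life_control(state):                                   # ST10 + ST11 (stub)
    state["strength"] = min(1.0, state["strength"] + 0.01)

def run(frames, state):
    for i, image in enumerate(frames):
        corrected = correct_brightness(image, state["strength"])              # ST1
        la, lb = average_luminance(image), average_luminance(corrected)       # ST3, ST2
        state["dimming"] = min(1.0, state["beta"] * la / lb) if lb else 1.0   # ST4
        if i % state["tla"] == 0:                                             # ST5 -> ST6
            state["history"].append(state["dimming"])
        if i % state["tlb"] == 0:                                             # ST7 -> ST8
            state["stored_history"] = list(state["history"])
        if i % state["tlc"] == 0:                                             # ST9 -> ST10, ST11
            life_control(state)

state = {"strength": 0.1, "beta": 1.0, "tla": 1, "tlb": 5, "tlc": 10,
         "history": [], "stored_history": []}
run([[(10, 20, 30)], [(120, 130, 140)]] * 10, state)
print(round(state["strength"], 2), round(state["dimming"], 3))
```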
  • In the above description, the life controller 4 performs life control every fixed period TLc.
  • the life controller 4 may instead perform life control when a predetermined event occurs, for example, when the power of the image display system is turned on.
  • the life controller 4 may perform life control every time the display history information of the display history information storage unit 18 is rewritten. In that case, the display history information generation unit 17 notifies the life prediction unit 41 that the information has been rewritten, and the life prediction unit 41 may calculate the remaining life L_RST based on this notification. The control information for this notification is indicated by Cw. Similar control information may be supplied to the correction degree adjusting unit 42.
  • the correction degree adjustment in the correction degree adjustment unit 42 may be performed at a timing independent of the calculation of the target dimming rate E_RST in the life prediction unit 41. For example, when the target dimming rate E_RST is calculated for each TLc for a certain period, the degree of correction may be adjusted for each TLd for a certain period different from the period TLc. The period TLd may be longer than the period TLc.
  • the adjustment of the degree of correction for the first time after the start of the image processing device 3 and the life controller 4 may be performed immediately after the start, and for a certain period from the start. It may be performed when the TLd has elapsed. Further, the degree of correction may be adjusted when a predetermined event occurs, for example, when the power of the image display system is turned on.
  • As described above, it becomes possible to adjust the backlight 7 so that it can be used until the target life while maintaining the visibility of the displayed image or minimizing the decrease in the visibility of the displayed image.
  • FIG. 11 shows an image display system according to the second embodiment.
  • the image display system shown in FIG. 11 includes a display control device 2b and an image display 6.
  • the display control device 2b includes an image processing device 3b and a life controller 4b.
  • The image processing device 3b and the life controller 4b each have the same internal configuration and the same functions as the image processing device 3 and the life controller 4 of Embodiment 1, but they differ in that the life controller 4b is provided so as to be separable from the image processing device 3b.
  • the image processing device 3b and the image display device 6 constitute the image display device 1b of the present embodiment.
  • The image processing device 3b and the life controller 4b are provided with connectors 301 and 401, respectively. The connectors 301 and 401 can be connected by a cable 501, and when the cable 501 is not connected (non-connected state), the life controller 4b can be separated from the image processing device 3b.
  • The life controller 4b is normally kept separated from the image processing device 3b and is connected to the image processing device 3b only when necessary. In the connected state, the life controller 4b performs life control.
  • The life control may be performed, for example, as soon as the life controller 4b is connected to the image processing device 3b, that is, when the connection is detected. After that, as long as the connected state continues, it may be performed every fixed period TLc.
  • In the life control, the life controller 4b reads the information stored in the display history information storage unit 18, calculates the required degree of correction, and adjusts the degree of correction based on the calculation result.
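  • As a rough illustration of this embodiment, the sketch below models the detachable life controller as an object that performs life control only while connected, reading the stored display history from the image processing device and adjusting its correction strength. The interface (method names, the single scalar correction strength, the concrete numbers) is entirely an assumption for illustration.

```python
# Sketch of a detachable life controller (Embodiment 2); the interface is assumed.

class ImageProcessor:
    """Stands in for the image processing device 3b and its display history information storage unit 18."""
    def __init__(self):
        self.history = {"d_now": 10000.0, "e_now": 0.8}   # cumulative display time [h], average dimming rate
        self.strength = 0.1                                # assumed scalar degree of brightness correction

class LifeController:
    """Performs life control only while connected to an image processor."""
    def __init__(self, l_org_100: float, target_life: float):
        self.l_org_100 = l_org_100
        self.target_life = target_life
        self.connected_to = None

    def connect(self, processor: ImageProcessor) -> None:
        self.connected_to = processor
        self.run_life_control()                            # e.g. as soon as the connection is detected

    def disconnect(self) -> None:
        self.connected_to = None

    def run_life_control(self) -> None:
        p = self.connected_to
        if p is None:
            return                                         # separated: no life control is performed
        h = p.history
        l_rst_100 = self.l_org_100 - h["d_now"] * h["e_now"]           # equation (6)
        e_rst = l_rst_100 / max(1.0, self.target_life - h["d_now"])    # equations (8) and (9)
        if e_rst < h["e_now"]:
            p.strength += 0.05                             # ask for a stronger brightness correction
        elif e_rst > h["e_now"]:
            p.strength = max(0.0, p.strength - 0.05)       # ask for a weaker brightness correction

controller = LifeController(l_org_100=50000.0, target_life=60000.0)
processor = ImageProcessor()
controller.connect(processor)   # the same controller could be connected to several processors in turn
print(processor.strength)
```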
  • The correction degree adjustment in the correction degree adjustment unit 42 in the life controller 4b may be performed at a timing independent of the calculation of the target dimming rate E_RST in the life prediction unit 41. For example, when the target dimming rate E_RST is calculated by the life prediction unit 41 every fixed period TLc, the correction degree adjustment by the correction degree adjustment unit 42 may be performed every period TLd different from the period TLc. The period TLd may be longer than the period TLc.
  • In the above description, the connector 301 and the connector 401 are connected via the cable 501, but connectors 301 and 401 that can be directly connected to each other without a cable may be used.
  • Alternatively, interfaces capable of mutual wireless communication may be provided so that data can be exchanged between the image processing device 3b and the life controller 4b by wireless communication.
  • In that case, it can be said that the life controller 4b is separated from the image processing device 3b in the non-communication state, and that the life controller 4b is connected to the image processing device 3b in the communication state.
  • A part or the whole of each of the image processing device 3b and the life controller 4b of the display control device 2b of FIG. 11 may also be composed of a processing circuit.
  • The functions of each part of the image processing device 3b and the life controller 4b may be realized by separate processing circuits, or the functions of a plurality of parts may be collectively realized by one processing circuit.
  • the processing circuit may be composed of hardware or software, that is, a programmed computer. Of the functions of each part of the image processing device 3b and the life controller 4b, a part may be realized by hardware and the other part may be realized by software.
  • FIG. 12 shows a first computer 9b that realizes all the functions of the image processing device 3b and a second computer 9c that realizes all the functions of the life controller 4b, together with the backlight 7 and the display panel 8.
  • the computer 9b has a processor 91b and a memory 92b.
  • the memory 92b stores a program for realizing the functions of each part of the image processing device 3b.
  • the computer 9c has a processor 91c and a memory 92c.
  • the memory 92c stores a program for realizing the functions of each part of the life controller 4b.
  • the memory 92b is also used for storing information stored in each part of the image processing device 3b of FIG.
  • the memory 92b is also used, for example, as the display history information storage unit 18 of FIG.
  • the memory 92c is also used to store information stored in each part of the life controller 4b of FIG.
  • As each of the processors 91b and 91c, for example, a CPU (Central Processing Unit), a GPU (Graphics Processing Unit), a microprocessor, a microcontroller, a DSP (Digital Signal Processor), or the like is used.
  • As each of the memory 92b and the memory 92c, for example, a semiconductor memory such as a RAM (Random Access Memory), a ROM (Read Only Memory), a flash memory, an EPROM (Erasable Programmable Read Only Memory), or an EEPROM (Electrically Erasable Programmable Read Only Memory), or a magnetic disk, an optical disk, a magneto-optical disk, or the like is used.
  • the processor 91b and the memory 92b may be realized by an LSI integrated with each other.
  • the processor 91c and the memory 92c may be realized by LSIs integrated with each other.
  • the processor 91b realizes the function of the image processing device 3b by executing the program stored in the memory 92b.
  • the functions of the image processing device 3b include control of the amount of light emitted from the backlight 7 and control of display on the display panel 8 (supply of a signal to the display panel 8).
  • the processor 91c realizes the function of the life controller 4b by executing the program stored in the memory 92c.
  • the function of the life controller 4b includes calculation of the degree of correction.
  • The program may be provided over a network, or may be recorded and provided on a recording medium, for example a non-transitory recording medium. That is, the program may be provided, for example, as a program product.
  • Each of the computers 9b and 9c includes a single processor, but may include more than one processor.
  • The computer 9b and the computer 9c are separably connected to each other by the input/output interfaces 93b and 93c and the cable 94.
  • The procedure of the processing performed by the processor 91b when the above-described image processing device 3b is configured by the computer 9b of FIG. 12 will be described with reference to FIG. 13.
  • The process of FIG. 13 can be performed even when the computer 9c is not connected to the computer 9b, and is started each time one frame of the input image data Da is input.
  • The processing of steps ST1 to ST8 in FIG. 13 is the same as that of the correspondingly numbered steps described earlier, except that the processing is performed in the image processing device 3b of FIG. 11 instead of in the image processing device 3 of FIG. 1.
  • The process of FIG. 14 is performed with the computer 9c connected to the computer 9b, as shown in FIG. 12.
  • The process of FIG. 14 may be started, for example, when the computer 9c is connected to the computer 9b, that is, as soon as the connection is detected; thereafter, as long as the connected state continues, it may be started at every fixed time interval TLc (see the second sketch below, after these remarks).
  • The processing of steps ST10 and ST11 in FIG. 14 is the same as that of steps ST10 and ST11 in FIG. 10, respectively, except that the processing is performed in the life controller 4b of FIG. 11 instead of in the life controller 4 of FIG. 1.
  • In FIG. 12, the input/output interface 93b and the input/output interface 93c are connected via the cable 94, but input/output interfaces 93b and 93c provided with connectors that can be connected directly, without a cable, may be used instead.
  • Alternatively, interfaces capable of wireless communication may be provided so that the computer 9b and the computer 9c can exchange data with each other wirelessly.
  • In that case, the computer 9c can be regarded as being separated from the computer 9b while they are not communicating, and as being connected to the computer 9b while they are communicating.
  • As described above, since the life controller 4b is separable from the image processing device 3b, the life controller 4b can be detached when it is not needed, which improves the convenience of the image display device 1b provided with the image processing device 3b and of the image display system provided with the image display device 1b. For example, when the image display system or the image display device 1b is portable, detaching the life controller 4b reduces the size or weight accordingly and makes the device easier to carry.
  • Furthermore, the life controller 4b does not have to be dedicated to the image display device 1b.
  • For example, a single life controller 4b may be prepared for a plurality of image display devices 1b having the same specifications, and by connecting that life controller 4b to the plurality of image display devices 1b in turn, the life of each image processing device 3b can be controlled (see the last sketch below, after these remarks).
  • Alternatively, a plurality of image display devices 1b having the same specifications and a plurality of life controllers 4b having the same specifications may be prepared, and any one of the plurality of life controllers 4b may be connected to any one of the plurality of image display devices 1b to control its life.
  • When the image processing device 3b and the life controller 4b are configured by the computers 9b and 9c, respectively, the computer 9c can likewise be separated from the computer 9b, so that the same effects as described above are obtained.
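
The per-frame flow described above (control of the backlight emission amount, supply of a signal to the display panel 8, and accumulation of display history information) can be illustrated with a small sketch. This is not the patent's implementation: the class names, the averaging rule, and the way the degree of correction is applied are illustrative assumptions only.

```python
from dataclasses import dataclass, field


@dataclass
class DisplayHistoryStore:
    """Stand-in for the display history information kept in memory 92b."""
    cumulative_emission: float = 0.0
    frames_displayed: int = 0

    def record(self, emission_amount: float) -> None:
        self.cumulative_emission += emission_amount
        self.frames_displayed += 1


@dataclass
class ImageProcessor:
    """Rough stand-in for the image processing device 3b running on computer 9b."""
    history: DisplayHistoryStore = field(default_factory=DisplayHistoryStore)
    correction_degree: float = 1.0  # updated from outside by the life controller

    def process_frame(self, input_image_da):
        """Handle one frame of input image data Da (a flat list of pixel levels, 0.0 to 1.0)."""
        # Derive a backlight emission amount from the average frame level,
        # scaled by the current degree of correction.
        frame_level = sum(input_image_da) / len(input_image_da)
        emission_amount = frame_level * self.correction_degree
        # Signal supplied to the display panel: here simply the corrected pixel levels.
        panel_signal = [min(p * self.correction_degree, 1.0) for p in input_image_da]
        # Accumulate display history information for later use by the life controller.
        self.history.record(emission_amount)
        return emission_amount, panel_signal
```

Under these assumptions, calling `ImageProcessor().process_frame([0.2, 0.5, 0.8])` once per input frame plays the role of steps ST1 to ST8 performed on computer 9b.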
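
The periodic behaviour of the life controller (start once the connection between computer 9c and computer 9b is detected, then repeat every fixed time interval TLc while the connection continues) might look like the following sketch. The callables read_cumulative_emission, write_correction_degree, and is_connected, the interval value, and the wear formula are assumptions introduced for illustration; the patent does not specify them.

```python
import time


def run_life_controller(read_cumulative_emission, write_correction_degree, is_connected,
                        interval_tlc=60.0, rated_total_emission=1.0e6):
    """Recompute the degree of correction from the display history every interval TLc
    for as long as the connection between computer 9c and computer 9b is detected."""
    while is_connected():
        cumulative = read_cumulative_emission()      # display history obtained from device 3b
        # Illustrative rule: compensate for backlight degradation as usage accumulates.
        wear = min(cumulative / rated_total_emission, 0.5)
        write_correction_degree(1.0 + wear)          # hand the new degree of correction back
        time.sleep(interval_tlc)                     # restarted every fixed interval TLc
```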
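
Finally, the usage pattern in which one life controller serves several image display devices of the same specifications in turn can be sketched as below; again, the names and the correction rule are illustrative assumptions only.

```python
from dataclasses import dataclass


@dataclass
class DisplayDevice:
    """Stand-in for one image display device 1b; all serviced devices share the same specifications."""
    cumulative_emission: float = 0.0
    correction_degree: float = 1.0


def service_displays_in_turn(devices, rated_total_emission=1.0e6):
    """One life controller visits several identical display devices in order and
    updates the degree of correction of each from its own display history."""
    for device in devices:                  # "connect" to each device in turn
        wear = min(device.cumulative_emission / rated_total_emission, 0.5)
        device.correction_degree = 1.0 + wear
```

For example, `service_displays_in_turn([DisplayDevice(cumulative_emission=2.0e5), DisplayDevice()])` would, under these assumptions, raise the first device's correction degree to 1.2 and leave the second at 1.0.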

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Chemical & Material Sciences (AREA)
  • Crystallography & Structural Chemistry (AREA)
  • Control Of Indicators Other Than Cathode Ray Tubes (AREA)
PCT/JP2020/003984 2020-02-03 2020-02-03 表示制御装置、画像表示システム及び表示制御方法 WO2021156924A1 (ja)

Priority Applications (3)

Application Number Priority Date Filing Date Title
PCT/JP2020/003984 WO2021156924A1 (ja) 2020-02-03 2020-02-03 表示制御装置、画像表示システム及び表示制御方法
CN202080094426.1A CN115039164A (zh) 2020-02-03 2020-02-03 显示控制装置、图像显示系统及显示控制方法
JP2021575117A JP7258190B2 (ja) 2020-02-03 2020-02-03 表示制御装置、画像表示システム及び表示制御方法

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2020/003984 WO2021156924A1 (ja) 2020-02-03 2020-02-03 表示制御装置、画像表示システム及び表示制御方法

Publications (1)

Publication Number Publication Date
WO2021156924A1 true WO2021156924A1 (ja) 2021-08-12

Family

ID=77199817

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2020/003984 WO2021156924A1 (ja) 2020-02-03 2020-02-03 表示制御装置、画像表示システム及び表示制御方法

Country Status (3)

Country Link
JP (1) JP7258190B2 (zh)
CN (1) CN115039164A (zh)
WO (1) WO2021156924A1 (zh)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113627039A (zh) * 2021-10-11 2021-11-09 广州中大中鸣科技有限公司 一种灯光照明系统能耗的预测方法、装置及存储介质

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2008224780A (ja) * 2007-03-08 2008-09-25 Funai Electric Co Ltd テレビジョン装置および表示装置
US20130010001A1 (en) * 2011-07-04 2013-01-10 Shenzhen China Star Optoelectronics Technology Co. Ltd. Lcd display, a driving device for driving the lcd display, and a driving method for driving the lcd display
JP2013097140A (ja) * 2011-10-31 2013-05-20 Sanyo Electric Co Ltd 映像表示装置
JP2015118144A (ja) * 2013-12-17 2015-06-25 三菱電機株式会社 映像表示装置

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2011104835A1 (ja) 2010-02-24 2011-09-01 富士通株式会社 表示装置、表示管理システム及び管理方法

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2008224780A (ja) * 2007-03-08 2008-09-25 Funai Electric Co Ltd テレビジョン装置および表示装置
US20130010001A1 (en) * 2011-07-04 2013-01-10 Shenzhen China Star Optoelectronics Technology Co. Ltd. Lcd display, a driving device for driving the lcd display, and a driving method for driving the lcd display
JP2013097140A (ja) * 2011-10-31 2013-05-20 Sanyo Electric Co Ltd 映像表示装置
JP2015118144A (ja) * 2013-12-17 2015-06-25 三菱電機株式会社 映像表示装置

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113627039A (zh) * 2021-10-11 2021-11-09 广州中大中鸣科技有限公司 一种灯光照明系统能耗的预测方法、装置及存储介质
CN113627039B (zh) * 2021-10-11 2022-02-22 广州中大中鸣科技有限公司 一种灯光照明系统能耗的预测方法、装置及存储介质

Also Published As

Publication number Publication date
JP7258190B2 (ja) 2023-04-14
JPWO2021156924A1 (zh) 2021-08-12
CN115039164A (zh) 2022-09-09

Similar Documents

Publication Publication Date Title
US10761371B2 (en) Display device
US20110074803A1 (en) Methods and Systems for Ambient-Illumination-Selective Display Backlight Modification and Image Enhancement
US9685109B2 (en) Luminance boost method and system
JP5121464B2 (ja) ディスプレイ装置およびその輝度調整方法
US9607577B2 (en) Dynamic power and brightness control for a display screen
JP5666163B2 (ja) 光源駆動方法
JP5374802B2 (ja) 画像表示装置、画像表示方法、画像表示プログラムおよび画像表示プログラムを記録した記録媒体
EP1650736A1 (en) Backlight modulation for display
JP4969135B2 (ja) 画像調整方法
WO2011036692A1 (ja) 画像処理装置、画像表示装置
KR101073006B1 (ko) 표시장치 및 표시장치의 이미지 밝기조절방법
JP2007322882A (ja) 表示装置および表示制御方法
JP6395990B1 (ja) 表示装置
US20110001737A1 (en) Methods and Systems for Ambient-Adaptive Image Display
WO2021156924A1 (ja) 表示制御装置、画像表示システム及び表示制御方法
JP2006293328A (ja) トーンスケール調整および一定ハイパスゲインを用いてディスプレイ輝度を強化する方法およびシステム
JP2006308631A (ja) 画像表示装置、画像表示方法、画像表示プログラムおよび画像表示プログラムを記録した記録媒体
JP6648932B2 (ja) 表示装置及びその制御方法
JPWO2011007438A1 (ja) 表示装置および制御方法
JP6896507B2 (ja) 表示装置およびその制御方法
JP2000352953A (ja) 輝度むら低減装置および画像表示装置
CN114566116A (zh) 图像亮度控制方法及装置和显示控制器
JP2022175370A (ja) 表示装置及びその制御方法
JP2019161491A (ja) 表示装置および表示方法、プログラム、記憶媒体
CN115311989A (zh) 显示屏控制方法及其装置、系统、电子设备及存储介质

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20917577

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2021575117

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20917577

Country of ref document: EP

Kind code of ref document: A1