CN117894266A - Integrated circuit, display device and driving method of display device

Integrated circuit, display device and driving method of display device

Info

Publication number
CN117894266A
CN117894266A
Authority
CN
China
Prior art keywords
compensation value
input
display
value
compensation
Prior art date
Legal status
Pending
Application number
CN202311327418.9A
Other languages
Chinese (zh)
Inventor
文东元
朴玟奎
李东焕
崔元准
Current Assignee
Samsung Display Co Ltd
Original Assignee
Samsung Display Co Ltd
Priority date
Filing date
Publication date
Priority claimed from KR1020230061361A (KR20240053510A)
Application filed by Samsung Display Co Ltd
Publication of CN117894266A


Landscapes

  • Control Of Indicators Other Than Cathode Ray Tubes (AREA)

Abstract

The invention discloses an integrated circuit, a display device, and a driving method of the display device. The display device includes: a compensation value determination unit that generates a final compensation value for the input image; a timing control unit that receives an input gradation for the input image, and applies the final compensation value to the input gradation to generate an output gradation; and a pixel unit that displays an output image corresponding to the output gradation using pixels, wherein the compensation value determination unit determines a weight value based on a display frequency, a display luminance, and the input gradation, wherein the compensation value determination unit determines a compensation value based on the display frequency and a position of the pixels, and wherein the compensation value determination unit applies the weight value to the compensation value to generate the final compensation value.

Description

Integrated circuit, display device and driving method of display device
Technical Field
The invention relates to an integrated circuit, a display device and a driving method of the display device.
Background
With the development of information technology, the importance of display devices as a connection medium between users and information has grown. Accordingly, the use of display devices such as liquid crystal display devices and organic light emitting display devices has increased.
The display device may include a plurality of pixels having the same circuit configuration. However, the larger the display device, the larger the process variation between the plurality of pixels may be. As a result, the plurality of pixels may emit light with different luminances from each other for the same input gray scale. In addition to process variation, the plurality of pixels may also emit light with different luminances from each other for the same input gray scale depending on other driving conditions of the display device.
Therefore, even for the same image, different compensation values need to be applied for each case. However, measuring and storing compensation values in advance for every case has the disadvantage of increased cost due to a longer takt time, a larger memory capacity, and the like.
Disclosure of Invention
The technical problem to be solved is to provide an integrated circuit, a display device, and a driving method of the display device that can calculate an appropriate image compensation value at minimum cost for various driving conditions.
A display device according to an embodiment of the present invention includes: a compensation value determination unit that generates a final compensation value for the input image; a timing control unit that receives an input gradation for the input image, and applies the final compensation value to the input gradation to generate an output gradation; and a pixel unit that displays an output image corresponding to the output gradation using pixels, wherein the compensation value determination unit determines a weight value based on a display frequency, a display luminance, and the input gradation, wherein the compensation value determination unit determines a compensation value based on the display frequency and a position of the pixels, and wherein the compensation value determination unit applies the weight value to the compensation value to generate the final compensation value.
The compensation value determining unit may include: a first weight value lookup table in which weight values based on a first display frequency, a reference display luminance, and a reference input gray scale are stored in advance; and a second weight value lookup table in which weight values based on a second display frequency, the reference display luminance, and the reference input gray scale are stored in advance, the first display frequency being different from the second display frequency.
The compensation value determining unit may further include: a first compensation value lookup table in which compensation values based on the first display frequency and a reference position of the pixel are stored in advance; and a second compensation value lookup table in which compensation values based on the second display frequency and the reference position are stored in advance.
The compensation value determining unit may further include: a first multiplexer that receives an input display frequency and outputs the weight value included in the first weight value lookup table as a first weight value in the case where the input display frequency is the same as the first display frequency, and outputs the weight value included in the second weight value lookup table as the first weight value in the case where the input display frequency is the same as the second display frequency.
The compensation value determining unit may further include: and a luminance compensation unit configured to receive an input display luminance, select two reference display luminances having the smallest difference from the input display luminance, and interpolate the first weighted values corresponding to the two selected reference display luminances for each of the reference input grayscales, thereby generating a second weighted value for the input display luminance.
The compensation value determining unit may further include: and a gray level compensation unit configured to receive the input gray levels, select, for each input gray level, two reference input gray levels having the smallest difference from that input gray level, and interpolate the two second weighted values of the two selected reference input gray levels, thereby generating a third weighted value for that input gray level.
The compensation value determining unit may further include: a second multiplexer that receives the input display frequency and outputs the compensation value included in the first compensation value lookup table as a first compensation value if the input display frequency is the same as the first display frequency, and outputs the compensation value included in the second compensation value lookup table as the first compensation value if the input display frequency is the same as the second display frequency.
The compensation value determining unit may further include: and a position compensation unit that interpolates the first compensation value to generate a second compensation value for a pixel that is not located at the reference position.
The compensation value determining unit may further include: and a final compensation value generation unit configured to apply the third weighted value to the second compensation value to generate the final compensation value.
The final compensation value generation unit may multiply the third weighted value by the second compensation value to generate the final compensation value, and the timing control unit may add the final compensation value to the input gradation to generate the output gradation.
The driving method of the display device according to an embodiment of the present invention may include: a step of generating a final compensation value for the input image; a step of applying the final compensation value to an input gray scale of the input image to generate an output gray scale; and displaying an output image corresponding to the output gray scale using pixels, the step of generating the final compensation value including: a step of determining a weighting value based on a display frequency, a display luminance, and the input gray scale; a step of determining a compensation value based on the display frequency and the position of the pixel; and a step of generating the final compensation value by applying the weighted value to the compensation value.
The display device may include: a first weight value lookup table in which weight values based on a first display frequency, a reference display luminance, and a reference input gray scale are stored in advance; and a second weight value lookup table in which weight values based on a second display frequency, the reference display luminance, and the reference input gray scale are stored in advance, the first display frequency being different from the second display frequency.
The display device may further include: a first compensation value lookup table in which compensation values based on the first display frequency and a reference position of the pixel are stored in advance; and a second compensation value lookup table in which compensation values based on the second display frequency and the reference position are stored in advance.
The step of determining the weighted value may include: and outputting the weight value included in the first weight value lookup table as a first weight value when the input display frequency is the same as the first display frequency, and outputting the weight value included in the second weight value lookup table as the first weight value when the input display frequency is the same as the second display frequency.
The step of determining the weighted value may further comprise: and a step of selecting two reference display luminances having the smallest difference from the input display luminance, and interpolating the first weighted values corresponding to the two selected reference display luminances for each of the reference input grayscales, thereby generating a second weighted value for the input display luminance.
The step of determining the weighted value may further comprise: and a step of selecting, for each input grayscale, two reference input grayscales having the smallest difference from that input grayscale, and interpolating the two second weighted values of the two selected reference input grayscales, thereby generating a third weighted value for that input grayscale.
The step of determining the compensation value may comprise: and outputting the compensation value included in the first compensation value lookup table as a first compensation value when the input display frequency is the same as the first display frequency, and outputting the compensation value included in the second compensation value lookup table as the first compensation value when the input display frequency is the same as the second display frequency.
The step of determining the compensation value may further comprise: and a step of interpolating the first compensation value to generate a second compensation value for a pixel not located at the reference position.
In the generating of the final compensation value, the third weighted value may be applied to the second compensation value to generate the final compensation value.
The third weighted value may be multiplied by the second compensation value to generate the final compensation value, and the final compensation value may be added to the input gradation to generate the output gradation.
An integrated circuit according to an embodiment of the present invention includes: a first circuit unit that generates a final compensation value for an input image; and a second circuit section that receives an input gradation for the input image and applies the final compensation value to the input gradation to generate an output gradation, the first circuit section determining a weight value based on a display frequency, a display luminance, and the input gradation, the first circuit section determining a compensation value based on the display frequency and a position of a pixel, the first circuit section applying the weight value to the compensation value to generate the final compensation value.
The first circuit portion may include: a first weight value lookup table in which weight values based on a first display frequency, a reference display luminance, and a reference input gray scale are stored in advance; and a second weight value lookup table in which weight values based on a second display frequency, the reference display luminance, and the reference input gray scale are stored in advance, the first display frequency being different from the second display frequency.
The first circuit portion may further include: a first compensation value lookup table in which compensation values based on the first display frequency and a reference position of the pixel are stored in advance; and a second compensation value lookup table in which compensation values based on the second display frequency and the reference position are stored in advance.
The first circuit portion may further include: a first multiplexer that receives an input display frequency and outputs the weight value included in the first weight value lookup table as a first weight value in the case where the input display frequency is the same as the first display frequency, and outputs the weight value included in the second weight value lookup table as the first weight value in the case where the input display frequency is the same as the second display frequency.
The first circuit portion may further include: and a luminance compensation unit configured to receive an input display luminance, select two reference display luminances having the smallest difference from the input display luminance, and interpolate the first weighted values corresponding to the two selected reference display luminances for each of the reference input grayscales, thereby generating a second weighted value for the input display luminance.
The first circuit portion may further include: and a gray level compensation unit configured to receive the input gray levels, select, for each input gray level, two reference input gray levels having the smallest difference from that input gray level, and interpolate the two second weighted values of the two selected reference input gray levels, thereby generating a third weighted value for that input gray level.
The first circuit portion may further include: a second multiplexer that receives the input display frequency and outputs the compensation value included in the first compensation value lookup table as a first compensation value if the input display frequency is the same as the first display frequency, and outputs the compensation value included in the second compensation value lookup table as the first compensation value if the input display frequency is the same as the second display frequency.
The first circuit portion may further include: and a position compensation unit that interpolates the first compensation value to generate a second compensation value for a pixel that is not located at the reference position.
The first circuit portion may further include: and a final compensation value generation unit configured to apply the third weighted value to the second compensation value to generate the final compensation value.
The final compensation value generation unit may generate the final compensation value by multiplying the third weighted value by the second compensation value, and the second circuit unit may generate the output gradation by adding the final compensation value to the input gradation.
According to the display device and the driving method thereof of the present invention, an appropriate image compensation value can be calculated for various driving conditions at a minimum cost.
Drawings
Fig. 1 is a diagram for explaining a display device according to an embodiment of the present invention.
Fig. 2 is a diagram for explaining a sub-pixel according to an embodiment of the present invention.
Fig. 3 is a diagram for explaining an exemplary driving method of the sub-pixel of fig. 2.
Fig. 4 is a diagram for explaining the compensation value determining section according to an embodiment of the present invention.
Fig. 5 is a diagram for explaining a first weight value according to an embodiment of the present invention.
Fig. 6 is a diagram for explaining the second weighting value according to an embodiment of the present invention.
Fig. 7 is a diagram for explaining a third weight value according to an embodiment of the present invention.
Fig. 8 is a diagram for explaining a first compensation value according to an embodiment of the present invention.
Fig. 9 and 10 are diagrams for explaining a second compensation value according to an embodiment of the present invention.
Fig. 11 is a block diagram of an electronic device according to an embodiment of the invention.
Detailed Description
Hereinafter, various embodiments of the present invention will be described in detail with reference to the accompanying drawings so that persons having ordinary knowledge in the art to which the present invention pertains can easily practice the present invention. The present invention may be embodied in a variety of different forms and is not limited to the embodiments described herein.
For the sake of clarity of the description of the present invention, parts not related to the description are omitted, and the same reference numerals are given to the same or similar constituent elements throughout the description. Thus, the previously described reference numerals may also be used in other figures.
The dimensions and thickness of each structure shown in the drawings are arbitrarily shown for convenience of explanation, and therefore the present invention is not necessarily limited to the illustration. In the drawings, thicknesses may be exaggerated for clarity of presentation of various layers and regions.
In addition, the expression "same" in this description may mean "substantially the same", that is, the same to an extent that a person having ordinary knowledge in the art would accept as equal. Other expressions may be expressions from which "substantially" has been omitted.
Fig. 1 is a diagram for explaining a display device according to an embodiment of the present invention.
Referring to fig. 1, a display device 10 according to an embodiment of the present invention may include a processor 9, a timing control part 11, a data driving part 12, a scan driving part 13, a pixel part 14, a light emission driving part 15, and a compensation value determining part 16.
The processor 9 may provide input gray scales for an input image (or image frame). The input gray scale may include a first color gray scale, a second color gray scale, and a third color gray scale for each pixel. The first color gray scale may be a gray scale for expressing the first color, the second color gray scale may be a gray scale for expressing the second color, and the third color gray scale may be a gray scale for expressing the third color. The processor 9 may be an application processor (AP), a central processing unit (CPU), a graphics processing unit (GPU), or the like.
In addition, the processor 9 may provide control signals for the input image. Such control signals may include a horizontal synchronization signal (Hsync), a vertical synchronization signal (Vsync), and a data enable signal. The vertical synchronization signal may include a plurality of pulses, and each pulse may indicate that the previous frame period ends and the current frame period starts at the point in time at which that pulse is generated. The interval between adjacent pulses of the vertical synchronization signal may correspond to one frame period. The horizontal synchronization signal may include a plurality of pulses, and each pulse may indicate that the previous horizontal period ends and a new horizontal period starts at the point in time at which that pulse is generated. The interval between adjacent pulses of the horizontal synchronization signal may correspond to one horizontal period. The data enable signal may have an enable level for specific horizontal periods and a disable level for the remaining periods. When the data enable signal is at the enable level, it may indicate that color gray scales are supplied in the corresponding horizontal period.
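As a purely hypothetical numerical illustration (the frame rate and row count below are assumptions, not values from this disclosure): at a display frequency of 60 Hz, the interval between adjacent vertical synchronization pulses corresponds to one frame period of 1/60 s ≈ 16.7 ms, and if a frame contains roughly 2400 pixel rows, each interval between adjacent horizontal synchronization pulses corresponds to a horizontal period on the order of 16.7 ms / 2400 ≈ 7 µs; the data enable signal then marks the horizontal periods in which color gray scales are actually supplied.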
The timing control section 11 may receive an input gradation for an input image. The timing control unit 11 may be configured as an integrated circuit integrated with the compensation value determination unit 16. In this case, the compensation value determining unit 16 may be referred to as a first circuit unit, and the timing control unit 11 may be referred to as a second circuit unit. However, in the integrated circuit, the first circuit portion and the second circuit portion may not always be physically separated from each other, and a part of the elements may be shared with each other. In another example, the timing control unit 11 and the compensation value determination unit 16 may be configured as separate circuits. In this case, the timing control section 11 may supply the input gradation and the necessary control signal to the compensation value determination section 16.
The compensation value determination section 16 may generate a final compensation value for the input image. The compensation value determination section 16 may determine the weighting value based on the display frequency, the display luminance, and the input gray scale. In addition, the compensation value determination section 16 may determine the compensation value based on the display frequency and the position of the pixel. The compensation value determination unit 16 may generate the final compensation value by applying the weighting value to the compensation value.
The timing control unit 11 may apply the final compensation value to the input gradation to generate the output gradation. For example, the timing control section 11 may add the final compensation value to the input gradation to generate the output gradation.
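As an informal illustration of the data flow described above (not part of the disclosure; the function names, stub values, and 8-bit clamp are assumptions), the combined operation of the compensation value determination section 16 and the timing control section 11 can be sketched as follows:

```python
# Illustrative sketch only. The stub functions stand in for the lookup and
# interpolation steps detailed later (Fig. 4) and return hypothetical values.

def determine_weight(freq_hz, display_luminance, input_gray):
    return 0.5          # stub: weight value from frequency, luminance, and gray

def determine_compensation(freq_hz, pixel_position):
    return 8.0          # stub: compensation value from frequency and position

def generate_output_gray(input_gray, freq_hz, display_luminance, position,
                         max_gray=255):
    """Compensation value determination (16) followed by timing control (11)."""
    weight = determine_weight(freq_hz, display_luminance, input_gray)
    compensation = determine_compensation(freq_hz, position)
    final_compensation = weight * compensation
    return max(0, min(max_gray, round(input_gray + final_compensation)))

print(generate_output_gray(100, 120, 600, (10, 20)))  # hypothetical -> 104
```

The sections below describe how the weight value and the compensation value themselves are obtained from lookup tables and interpolation.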
The timing control section 11 may supply the output gradation to the data driving section 12. In addition, the timing control section 11 may supply a clock signal, a scan start signal, or the like to the scan driving section 13. The timing control section 11 may supply a clock signal, a light emission suspension signal, or the like to the light emission driving section 15.
The data driving section 12 may generate data voltages to be supplied to the data lines DL1, DL2, DL3, and DLn using the output gradation and the control signal received from the timing control section 11. For example, the data driving unit 12 may sample the output gradation using a clock signal, and apply a data voltage corresponding to the output gradation to the data lines DL1 to DLn in pixel row units. n may be an integer greater than 0. The pixel row means sub-pixels connected to the same scan line and light emitting line.
According to the embodiment, the timing control section 11, the data driving section 12, and the compensation value determining section 16 may be configured as an integrated circuit. In this case, the compensation value determining unit 16 may be referred to as a first circuit unit, and the timing control unit 11 may be referred to as a second circuit unit. On the other hand, the data driving section 12 may be referred to as a third circuit section. However, in the integrated circuit, the first circuit portion, the second circuit portion, and the third circuit portion may not always be physically separated from one another, and some elements may be shared with one another.
The scan driving section 13 may generate scan signals to be supplied to the scan lines SL0, SL1, SL2, and SLm by receiving a clock signal, a scan start signal, and the like from the timing control section 11. For example, the scan driving unit 13 may sequentially supply the scan signals having the pulses of the on level to the scan lines SL1 to SLm. For example, the scan driving unit 13 may be configured in a shift register format, and may generate the scan signal so as to sequentially transmit the scan start signal in the form of a pulse as an on level to the next stage circuit according to the control of the clock signal. m may be an integer greater than 0.
The light emission driving section 15 may receive a clock signal, a light emission suspension signal, or the like from the timing control section 11 to generate light emission signals to be supplied to the light emission lines EL1, EL2, EL3, and ELo. For example, the light emission driving unit 15 may sequentially supply the light emission signals of the pulses having the off-level to the light emission lines EL1 to ELo. For example, the light emission driving unit 15 may be configured in a shift register form, and may generate the light emission signal so as to sequentially transmit the light emission suspension signal in a pulse form as an off level to the next stage circuit according to the control of the clock signal. o may be an integer greater than 0.
The pixel section 14 includes sub-pixels SPij. Each sub-pixel SPij may be connected to a corresponding data line, scan line, and light emitting line. i and j may each be integers greater than 0. The sub-pixel SPij may mean a sub-pixel in which a scan transistor is connected to an i-th scan line and a j-th data line.
The pixel section 14 may include a first subpixel that emits light of a first color, a second subpixel that emits light of a second color, and a third subpixel that emits light of a third color. The first color, the second color, and the third color may be different colors from each other. For example, the first color may be one of red, green, and blue, the second color may be one of red, green, and blue that is not the first color, and the third color may be the remaining colors of red, green, and blue that are not the first and second colors. In addition, as the first to third colors, magenta (magenta), cyan (cyan), and yellow (yellow) may be used instead of red, green, and blue. However, in the present embodiment, for convenience of explanation, it is assumed that the first color is red, the second color is green, and the third color is blue. The first sub-pixel, the second sub-pixel, and the third sub-pixel may constitute one pixel. However, adjacent pixels may share one sub-pixel according to the structure of the pixel portion 14.
The pixel portion 14 may be configured in various forms, such as a diamond PENTILE, RGB-Stripe, S-Stripe, real RGB, or normal PENTILE arrangement.
The sub-pixels SPij of the pixel section 14 are assumed to be arranged in the first direction DR1 and the second direction DR2 perpendicular to the first direction DR 1. In addition, the light emission direction of the sub-pixel SPij is assumed to be a third direction DR3 perpendicular to the first direction DR1 and the second direction DR 2.
Fig. 2 is a diagram for explaining a sub-pixel according to an embodiment of the present invention.
Referring to fig. 2, the subpixel SPij includes transistors T1, T2, T3, T4, T5, T6, T7, a storage capacitor Cst, and a light emitting element LD.
A circuit including P-type transistors will be described below as an example. However, a person skilled in the art can design a circuit composed of N-type transistors by changing the polarity of the voltage applied to the gate terminals. Similarly, a person skilled in the art can design a circuit composed of a combination of P-type and N-type transistors. A P-type transistor generally refers to a transistor in which the amount of conducted current increases as the voltage difference between the gate electrode and the source electrode increases in the negative direction. An N-type transistor generally refers to a transistor in which the amount of conducted current increases as the voltage difference between the gate electrode and the source electrode increases in the positive direction. The transistors may be formed in various forms, such as thin film transistors (TFTs), field effect transistors (FETs), or bipolar junction transistors (BJTs).
The gate electrode of the first transistor T1 may be connected to the first node N1, the first electrode is connected to the second node N2, and the second electrode is connected to the third node N3. The first transistor T1 may be named as a driving transistor.
The gate electrode of the second transistor T2 may be connected to the scan line SLi1, the first electrode may be connected to the data line DLj, and the second electrode may be connected to the second node N2. The second transistor T2 may be named as a scan transistor.
The gate electrode of the third transistor T3 may be connected to the scan line SLi2, the first electrode is connected to the first node N1, and the second electrode is connected to the third node N3. The third transistor T3 may be named diode-connected transistor.
The gate electrode of the fourth transistor T4 may be connected to the scan line SLi3, the first electrode is connected to the first node N1, and the second electrode is connected to the initialization line INTL. The fourth transistor T4 may be named as a gate initialization transistor.
The gate electrode of the fifth transistor T5 may be connected to the i-th light emitting line ELi, the first electrode is connected to the first power line ELVDDL, and the second electrode is connected to the second node N2. The fifth transistor T5 may be named as a light emitting transistor. In another embodiment, the gate electrode of the fifth transistor T5 may also be connected to a light emitting line different from the light emitting line connected to the gate electrode of the sixth transistor T6.
The sixth transistor T6 may have a gate electrode connected to the i-th light emitting line ELi, a first electrode connected to the third node N3, and a second electrode connected to the anode of the light emitting element LD. The sixth transistor T6 may be named as a light emitting transistor. In another embodiment, the gate electrode of the sixth transistor T6 may also be connected to a light emitting line different from the light emitting line connected to the gate electrode of the fifth transistor T5.
The seventh transistor T7 may have a gate electrode connected to the scan line SLi4, a first electrode connected to the initialization line INTL, and a second electrode connected to the anode of the light emitting element LD. The seventh transistor T7 may be named as a light emitting element initializing transistor.
The first electrode of the storage capacitor Cst may be connected to the first power line ELVDDL, and the second electrode is connected to the first node N1.
The anode of the light emitting element LD may be connected to the second electrode of the sixth transistor T6, and the cathode may be connected to the second power line ELVSSL. The light emitting element LD may be a light emitting diode. The light emitting element LD may be composed of an organic light emitting diode, an inorganic light emitting diode, a quantum dot/well light emitting diode, or the like. In the present embodiment, only one light emitting element LD is provided for each pixel, but in other embodiments, a plurality of light emitting elements may be provided for each pixel. In this case, the plurality of light emitting elements may be connected in series, in parallel, in series-parallel, or the like. The light emitting element LD of each sub-pixel SPij may emit light in one of the first color, the second color, and the third color.
The first power supply voltage may be applied to the first power supply line ELVDDL, the second power supply voltage may be applied to the second power supply line ELVSSL, and the initialization voltage may be applied to the initialization line INTL. For example, the first supply voltage may be greater than the second supply voltage. For example, the initialization voltage may be equal to or greater than the second power supply voltage. For example, the initialization voltage may correspond to the smallest data voltage among the data voltages corresponding to the output gray scale. In another example, the magnitude of the initialization voltage may be smaller than the magnitude of the data voltage corresponding to the color gray scale.
Fig. 3 is a diagram for explaining an exemplary driving method of the sub-pixel of fig. 2.
Hereinafter, for convenience of explanation, it is assumed that the scanning lines SLi1, SLi2, and SLi4 are the ith scanning line SLi, and that the scanning line SLi3 is the i-1 th scanning line SL (i-1). However, the connection relationship of the scanning lines SLi1, SLi2, SLi3, SLi4 may be varied according to the embodiment. For example, the scan line SLi4 may be the i-1 st scan line or the i+1 th scan line.
First, a light emission signal of an off-level (logic high level) is applied to the i-th light emission line ELi, a DATA voltage DATA (i-1) j for the i-1-th subpixel is applied to the DATA line DLj, and a scan signal of an on-level (logic low level) is applied to the scan line SLi 3. The high/low logic level may vary depending on whether the transistor is P-type or N-type.
At this time, the scan signal of the off level is applied to the scan lines SLi1, SLi2, and thus the second transistor T2 is in the off state, preventing the DATA voltage DATA (i-1) j for the i-1 th subpixel from being introduced to the i-th subpixel SPij.
At this time, the fourth transistor T4 is in an on state, and thus the first node N1 is connected to the initialization line INTL, thereby initializing the voltage of the first node N1. Since the off-level light emission signal is applied to the light emitting line ELi, the transistors T5 and T6 are turned off, and unnecessary light emission of the light emitting element LD is prevented from being generated in accordance with the initialization voltage application process.
Next, the data voltage DATAij for the i-th subpixel SPij is applied to the data line DLj, and the scan signal of the on level is applied to the scan lines SLi1 and SLi2. Thus, the transistors T2, T1, and T3 are turned on, and the data line DLj is electrically connected to the first node N1. Accordingly, a compensation voltage, reduced from the data voltage DATAij by the threshold voltage of the first transistor T1, is applied to the second electrode of the storage capacitor Cst (i.e., the first node N1), and the storage capacitor Cst maintains a voltage corresponding to the difference between the first power supply voltage and the compensation voltage. Such a period may be named a threshold voltage compensation period or a data writing period.
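The following is a minimal sketch, not part of the disclosure, of why this period makes the drive current insensitive to the threshold voltage; it assumes an idealized square-law P-type driving transistor with transconductance coefficient k and neglects leakage:

```latex
% Sketch under idealized assumptions (square-law device, no leakage, P-type T1).
% During the data writing period, T1 is diode-connected through T3, so the first
% node N1 charges until T1 turns off:
V_{N1} = V_{DATA} - |V_{th}|
% The storage capacitor therefore holds
V_{Cst} = V_{ELVDD} - \left( V_{DATA} - |V_{th}| \right)
% During the light emission period, the source of T1 is tied to ELVDD through T5, so
V_{SG} = V_{ELVDD} - V_{N1} = V_{ELVDD} - V_{DATA} + |V_{th}|
% and the driving current
I_{D} = \tfrac{k}{2}\left( V_{SG} - |V_{th}| \right)^{2} = \tfrac{k}{2}\left( V_{ELVDD} - V_{DATA} \right)^{2}
% no longer depends on the threshold voltage of T1.
```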
In addition, in the case where the scanning line SLi4 is the i-th scanning line, the seventh transistor T7 is in an on state, and therefore the anode of the light emitting element LD is connected to the initialization line INTL, and the light emitting element LD is initialized to an amount of charge corresponding to a voltage difference between the initialization voltage and the second power supply voltage.
Thereafter, the transistors T5, T6 may be turned on according to the light emission signal of the on level applied to the i-th light emission line ELi. Accordingly, a driving current path connecting the first power line ELVDDL, the fifth transistor T5, the first transistor T1, the sixth transistor T6, the light emitting element LD, and the second power line ELVSSL is formed.
The amount of driving current flowing to the first electrode and the second electrode of the first transistor T1 is adjusted according to the voltage held in the storage capacitor Cst. The light emitting element LD emits light with a luminance corresponding to the amount of driving current. The light emitting element LD emits light until a light emitting signal of an off level is applied to the i-th light emitting line ELi.
When the light emitting signal is at an on level, the sub-pixel receiving the corresponding light emitting signal may be in a display state. Therefore, a period in which the light emission signal is at the on level may be referred to as a light emission period EP (or a light emission permission period). In addition, when the light emission signal is at the off level, the sub-pixel receiving the corresponding light emission signal may be in a non-display state. Therefore, a period in which the light emission signal is at the off level may be referred to as a non-light emission period NEP (or a light emission non-permission period).
The non-light emission period NEP illustrated in fig. 3 is for preventing the sub-pixel SPij from emitting light with an undesired luminance while the initialization period and the data writing period elapse.
The non-light emission period NEP may be provided one or more additional times during the period (e.g., one frame period) in which the data written in the sub-pixel SPij is maintained. This may be done to effectively represent a low gray scale by reducing the light emission period EP of the sub-pixel SPij, or to smoothly blur the motion of the image being processed.
Fig. 4 is a diagram for explaining the compensation value determining section according to an embodiment of the present invention. Fig. 5 is a diagram for explaining a first weight value according to an embodiment of the present invention. Fig. 6 is a diagram for explaining the second weighting value according to an embodiment of the present invention. Fig. 7 is a diagram for explaining a third weight value according to an embodiment of the present invention. Fig. 8 is a diagram for explaining a first compensation value according to an embodiment of the present invention. Fig. 9 and 10 are diagrams for explaining a second compensation value according to an embodiment of the present invention.
Referring to fig. 4, the compensation value determining part 16 according to an embodiment of the present invention may include a first weight value lookup table 161, a second weight value lookup table 162, a first multiplexer 163, a brightness compensating part 164, a gray compensating part 165, a first compensation value lookup table 166, a second compensation value lookup table 167, a second multiplexer 168, a position compensating part 169, and a final compensation value generating part MTP.
The first weight value lookup table 161 may store weight values 161i based on the first display frequency, the reference display luminance, and the reference input gray scale in advance. The second weight value lookup table 162 may previously store a weight value 162i based on the second display frequency, the reference display luminance, and the reference input gray scale. The first weight lookup table 161 and the second weight lookup table 162 may also mean a portion of the memory space of one memory device. Alternatively, the first weight lookup table 161 and the second weight lookup table 162 may be implemented by separate storage devices.
The display frequency may mean the number of image frames displayed per second in the display device 10. The first display frequency may be different from the second display frequency. The first display frequency may be a frequency suitable for displaying a moving image. For example, the first display frequency may be a high frequency above 60 Hz. The second display frequency may be a frequency suitable for displaying a still image. For example, the second display frequency may be a low frequency of less than 60 Hz.
The reference display luminance may be a part of the plurality of display luminances set in the display device 10. The display luminance (display brightness) may be set manually by a user operating the display device 10 or may be set automatically by an algorithm associated with an illuminance sensor or the like. The magnitude of the display brightness may limit the maximum brightness of the light emitted from the pixel. For example, the display luminance may be luminance information of light emitted from a pixel set to a maximum gradation. For example, the display luminance may be a luminance of white light generated by the entire pixel of the pixel unit 14 emitting light in accordance with the white gradation. The unit of brightness may be nit (Nits). For example, the maximum value of the plurality of display luminances may be 3000 nits, and the minimum value of the plurality of display luminances may be 4 nits. The maximum value and the minimum value of the plurality of display brightnesses may be variously set according to products. Even in the same gradation, the data voltage changes according to the display luminance, and thus the light emission luminance of the pixel also changes.
The reference input gray scale may be a part of a plurality of input gray scales set in the display device 10. For example, the minimum value of the plurality of input grayscales may be 0 and the maximum value may be 255. The maximum value and the minimum value of the plurality of input grayscales can be variously set according to products.
According to the present embodiment, it is not necessary to store weighting values for all display luminances and all input grayscales, and thus an increase in takt time and an increase in memory capacity can be prevented.
The first multiplexer 163 may receive the input display frequency FREQi. The input display frequency FREQi may be a display frequency set for the current input image. In the case where the input display frequency FREQi is the same as the first display frequency, the first multiplexer 163 may output the weight value 161i included in the first weight value lookup table 161 as the first weight value 163i. On the other hand, in the case where the input display frequency FREQi is the same as the second display frequency, the first multiplexer 163 may output the weight value 162i included in the second weight value lookup table 162 as the first weight value 163i.
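A minimal software analogy of this selection step is sketched below (illustrative only; the frequencies, table contents, and dictionary layout are assumptions, not values from this disclosure):

```python
# Illustrative sketch of the first multiplexer 163: output the weight values of
# the lookup table whose display frequency matches the input display frequency
# FREQi. All frequencies and table entries below are hypothetical.

FIRST_DISPLAY_FREQ_HZ = 120    # assumed high-frequency mode (60 Hz or above)
SECOND_DISPLAY_FREQ_HZ = 30    # assumed low-frequency mode (below 60 Hz)

# weight_lut[reference_display_luminance][reference_input_gray] -> weight value
first_weight_lut = {500: {0: 0.0, 128: 0.4, 255: 1.0}}
second_weight_lut = {500: {0: 0.0, 128: 0.6, 255: 1.2}}

def first_multiplexer(freq_in_hz: int) -> dict:
    """Select the weight values to be output as the first weight value 163i."""
    if freq_in_hz == FIRST_DISPLAY_FREQ_HZ:
        return first_weight_lut
    if freq_in_hz == SECOND_DISPLAY_FREQ_HZ:
        return second_weight_lut
    raise ValueError("unsupported input display frequency")

first_weight_value = first_multiplexer(120)   # selects first_weight_lut
```

The second multiplexer 168 described later selects between the first and second compensation value lookup tables in the same manner.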
Referring to fig. 5, a graph of the first weighting values 163i is exemplarily shown. The horizontal axis of the graph represents the display luminance DBV, and the vertical axis represents the weight. The first weighting values 163i may be weighting values corresponding to the reference input grayscales (for example, seven of them) at the respective reference display luminances DBV1, DBV2, DBV3, DBV4, DBV5, ..., DBVk.
The luminance compensation unit 164 may receive the input display luminance DBVi. The input display luminance DBVi may be the display luminance currently set in the display device 10. Referring to fig. 5, the luminance compensation unit 164 may select the two reference display luminances DBV3 and DBV4 having the smallest differences from the input display luminance DBVi. The selected reference display luminance DBV3 may be the reference display luminance having the smallest difference from the input display luminance DBVi among the reference display luminances DBV1 to DBV3 that are smaller than the input display luminance DBVi. The selected reference display luminance DBV4 may be the reference display luminance having the smallest difference from the input display luminance DBVi among the reference display luminances DBV4 to DBVk that are larger than the input display luminance DBVi.
The luminance compensation unit 164 may interpolate (for example, linearly interpolate) the first weighted values 163i corresponding to the two selected reference display luminances DBV3 and DBV4 for each of the reference input grayscales G1, G2, G3, G4, G5, G6, and G7, thereby generating the second weighted values 164i for the input display luminance DBVi. The second weighted values 164i may include weighted values w1, w2, w3, w4, w5, w6, and w7 corresponding to the reference input grayscales G1, G2, G3, G4, G5, G6, and G7, respectively.
The gray compensation part 165 may receive the input grayscales DATAi. The gray compensation part 165 may select, for each of the input grayscales Gi1, Gi2, G3, Gi4, Gi5, and G6, the two reference input grayscales having the smallest differences from that input grayscale, and interpolate (for example, linearly interpolate) the second weighted values of the two selected reference input grayscales, thereby generating the third weighted values 165i for the input grayscales DATAi.
Hereinafter, the operation of the gray compensation part 165 for the input grayscale Gi1 will be described as an example. One of the two selected reference input grayscales, G2, may be the reference input grayscale having the smallest difference from the input grayscale Gi1 among the reference input grayscales G1 and G2 that are smaller than the input grayscale Gi1. The other selected reference input grayscale, G3, may be the reference input grayscale having the smallest difference from the input grayscale Gi1 among the reference input grayscales G3, G4, G5, G6, and G7 that are larger than the input grayscale Gi1. The gray compensation part 165 may interpolate (for example, linearly interpolate) the second weighted values w2 and w3 of the two selected reference input grayscales G2 and G3, thereby generating the third weighted value wi1 for the input grayscale Gi1. Similarly, the gray compensation part 165 may generate third weighted values for the other input grayscales. For input grayscales of the input grayscales DATAi that are equal to reference input grayscales, the corresponding second weighted values (for example, w3 and w6) may be used directly as the third weighted values.
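A compact software sketch of these two interpolation steps is shown below (illustrative only; the reference luminances, reference grayscales, weight values, and helper names are assumptions, not values from this disclosure):

```python
# Illustrative sketch of the luminance compensation (164) and gray compensation
# (165) steps: bracket the input with the two nearest reference points and
# linearly interpolate between them. All numeric values are hypothetical.

def lerp(x0, y0, x1, y1, x):
    """Linearly interpolate y at position x between (x0, y0) and (x1, y1)."""
    if x1 == x0:
        return y0
    return y0 + (y1 - y0) * (x - x0) / (x1 - x0)

def bracket(sorted_refs, value):
    """Return the nearest reference at or below and at or above the value."""
    lower = max((r for r in sorted_refs if r <= value), default=sorted_refs[0])
    upper = min((r for r in sorted_refs if r >= value), default=sorted_refs[-1])
    return lower, upper

# first_weight[reference_display_luminance][reference_input_gray] -> weight value
first_weight = {
    400: {0: 0.0, 64: 0.2, 128: 0.5, 255: 1.0},   # e.g. reference luminance DBV3
    800: {0: 0.0, 64: 0.3, 128: 0.7, 255: 1.2},   # e.g. reference luminance DBV4
}
reference_grays = [0, 64, 128, 255]

def second_weight(dbv_in):
    """Interpolate the weight over luminance, separately for each reference gray."""
    lo, hi = bracket(sorted(first_weight), dbv_in)
    return {g: lerp(lo, first_weight[lo][g], hi, first_weight[hi][g], dbv_in)
            for g in reference_grays}

def third_weight(dbv_in, input_gray):
    """Interpolate the second weighted values over the input grayscale."""
    weights = second_weight(dbv_in)
    lo, hi = bracket(reference_grays, input_gray)
    return lerp(lo, weights[lo], hi, weights[hi], input_gray)

print(third_weight(600, 100))  # third weighted value for DBVi = 600, gray = 100
```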
The compensation value 166i based on the first display frequency and the reference position of the pixel may be stored in advance in the first compensation value lookup table 166. The compensation value 167i based on the second display frequency and the reference position may be stored in advance in the second compensation value lookup table 167. The first compensation value lookup table 166 and the second compensation value lookup table 167 may also mean a portion of the memory space of one memory device. Alternatively, the first compensation value lookup table 166 and the second compensation value lookup table 167 may be implemented by separate storage devices.
According to the present embodiment, it is not necessary to store compensation values for all positions of the pixels, and thus an increase in takt time and an increase in memory capacity can be prevented.
The second multiplexer 168 may receive an input display frequency FREQi. In case the input display frequency FREQi is the same as the first display frequency, the second multiplexer 168 may output the compensation value 166i included in the first compensation value lookup table 166 as the first compensation value 168i. On the other hand, in the case where the input display frequency FREQi is the same as the second display frequency, the second multiplexer 168 may output the compensation value 167i included in the second compensation value lookup table 167 as the first compensation value 168i.
Referring to fig. 8, for ease of understanding, the first compensation values 168i are shown with reference to the first direction DR1 and the second direction DR2, which are the same as the arrangement directions of the pixels. As described above, the first compensation values 168i may not include values for all pixel positions, but only first compensation values for the reference positions.
The position compensation unit 169 may interpolate (for example, bilinearly interpolate) the first compensation values 168i to generate second compensation values 169i for pixels that are not located at the reference positions. For example, referring to fig. 9, the position compensation unit 169 may interpolate the first compensation value S11 and the first compensation value S14 located in the first direction DR1 from the first compensation value S11, thereby generating the second compensation value S13. Likewise, the position compensation unit 169 may interpolate the first compensation value S41 and the first compensation value S44 located in the first direction DR1 from the first compensation value S41, thereby generating the second compensation value S43. Next, the position compensation unit 169 may interpolate the second compensation value S43 and the second compensation value S13 located in the second direction DR2 from the second compensation value S43, thereby generating the second compensation value S23. The position compensation unit 169 may repeat this process to calculate second compensation values for the pixels that are not located at the reference positions (see fig. 10). For pixels located at the reference positions, the position compensation unit 169 may use the first compensation values S11, S14, S41, and S44 as they are.
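A minimal sketch of this bilinear step follows (illustrative only; the grid coordinates and compensation values are hypothetical):

```python
# Illustrative sketch of the position compensation unit 169: bilinear
# interpolation of first compensation values that are stored only at the
# reference positions. Coordinates and values below are hypothetical.

def bilinear(x0, x1, y0, y1, c_tl, c_tr, c_bl, c_br, x, y):
    """Interpolate along DR1 (x) on two rows, then along DR2 (y) between them."""
    tx = 0.0 if x1 == x0 else (x - x0) / (x1 - x0)
    ty = 0.0 if y1 == y0 else (y - y0) / (y1 - y0)
    top = c_tl + (c_tr - c_tl) * tx      # e.g. interpolating S11 and S14 along DR1
    bottom = c_bl + (c_br - c_bl) * tx   # e.g. interpolating S41 and S44 along DR1
    return top + (bottom - top) * ty     # e.g. interpolating the two results along DR2

# First compensation values at four reference positions (hypothetical).
S11, S14, S41, S44 = 2.0, 4.0, -1.0, 1.0
x0, x1 = 0, 30      # DR1 coordinates of the S11/S41 and S14/S44 columns
y0, y1 = 0, 30      # DR2 coordinates of the S11/S14 and S41/S44 rows

# Second compensation value for a pixel that is not at a reference position.
print(bilinear(x0, x1, y0, y1, S11, S14, S41, S44, x=10, y=20))  # ~0.67
```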
The final compensation value generation unit MTP may apply the third weighted value 165i to the second compensation value 169i to generate the final compensation value MTPi. For example, the final compensation value generation unit MTP may generate the final compensation value MTPi by multiplying the third weighted value 165i by the second compensation value 169 i. The timing control unit 11 may generate an output gradation by adding the final compensation value MTPi to the input gradation DATAi (see fig. 1).
Therefore, according to the present embodiment, an appropriate image compensation value can be calculated for every pixel position, every display luminance, and every input grayscale using a minimum of memory capacity.
Fig. 11 is a block diagram of an electronic device according to an embodiment of the invention.
The electronic device 101 outputs various information through the display module 140 in the operating system. If the processor 110 runs the application program stored in the memory 180, the display module 140 provides the application program information to the user through the display panel 141.
The processor 110 obtains an external input through the input module 130 or the sensor module 191, and runs an application corresponding to the external input. For example, in case that the user selects the camera icon displayed on the display panel 141, the processor 110 obtains a user input through the input sensor 191-2 and activates the camera module 171. The processor 110 transmits image data corresponding to the photographed image obtained through the camera module 171 to the display module 140. The display module 140 may display an image corresponding to the photographed image through the display panel 141.
As still another example, in the case where personal information authentication is performed in the display module 140, the fingerprint sensor 191-1 obtains input fingerprint information as input data. The processor 110 compares input data obtained through the fingerprint sensor 191-1 with authentication data stored in the memory 180, and runs an application program according to the comparison result. The display module 140 may display information according to the logical operation of the application program through the display panel 141.
As yet another example, in case of selecting a music streaming icon displayed on the display module 140, the processor 110 obtains a user input through the input sensor 191-2 and activates a music streaming application stored in the memory 180. If a music operation command is input in the music streaming application, the processor 110 activates the sound output module 193 to provide the user with sound information conforming to the music operation command.
The operation of the electronic device 101 is briefly described above. The following describes the structure of the electronic device 101 in detail. Some of the structures of the electronic device 101 described later may be provided as one structure integrally, or one structure may be provided as two or more structures separated from each other.
Referring to fig. 11, an electronic device 101 may communicate with an external electronic device 102 through a network (e.g., a short-range wireless communication network or a long-range wireless communication network). According to one embodiment, the electronic device 101 may include a processor 110, a memory 180, an input module 130, a display module 140, a power module 150, an internal module 190, and an external module 170. According to an embodiment, the electronic device 101 may omit at least one of the above-described constituent elements, or add one or more other constituent elements. According to one embodiment, some of the above constituent elements (e.g., the sensor module 191, the antenna module 192, or the sound output module 193) may be integrated into another constituent element (e.g., the display module 140).
The processor 110 may execute software to control at least one other constituent element (e.g., hardware or software constituent element) of the electronic device 101 connected to the processor 110, and may perform various data processing or operations. According to one embodiment, the processor 110 may store commands or data received from other constituent elements (e.g., the input module 130, the sensor module 191, or the communication module 173) in the volatile memory 181 as at least a portion of data processing or operation, and process the commands or data stored in the volatile memory 181, with the resulting data stored in the non-volatile memory 182.
The processor 110 may include a main processor 111 and an auxiliary processor 112. The main processor 111 may include one or more of a central processing unit (CPU) 111-1 and an application processor (AP). The main processor 111 may further include any one or more of a graphics processing unit (GPU) 111-2, a communication processor (CP), and an image signal processor (ISP). The main processor 111 may also include a neural processing unit (NPU) 111-3. The neural processing unit 111-3 is a processor specialized in processing artificial intelligence models, and such models may be generated by machine learning. The artificial intelligence model may include a plurality of artificial neural network layers. The artificial neural network may be one of a deep neural network (DNN), a convolutional neural network (CNN), a recurrent neural network (RNN), a restricted Boltzmann machine (RBM), a deep belief network (DBN), a bidirectional recurrent deep neural network (BRDNN), a deep Q-network, or a combination of two or more of the foregoing, but is not limited to these examples.
The auxiliary processor 112 may include a controller 112-1. The controller 112-1 may include interface conversion circuitry and timing control circuitry. The controller 112-1 receives the image signal from the main processor 111, and converts the data format of the image signal to match the interface specification of the display module 140 to output image data. The controller 112-1 may output various control signals required for driving the display module 140.
The auxiliary processor 112 may further include a data conversion circuit 112-2, a gamma correction circuit 112-3, a rendering circuit 112-4, and the like. The data conversion circuit 112-2 may receive image data from the controller 112-1 and compensate the image data according to characteristics of the electronic device 101, settings of a user, or the like so that an image is displayed with a desired brightness, or may convert the image data to reduce power consumption, compensate for afterimages, or the like. The gamma correction circuit 112-3 may transform the image data, gamma reference voltages, or the like so that an image displayed on the electronic device 101 has a desired gamma characteristic. The rendering circuit 112-4 may receive image data from the controller 112-1 and render the image data in consideration of the pixel configuration or the like applied to the display panel 141 of the electronic device 101. At least one of the data conversion circuit 112-2, the gamma correction circuit 112-3, and the rendering circuit 112-4 may be integrated into another constituent element (e.g., the main processor 111 or the controller 112-1). At least one of the data conversion circuit 112-2, the gamma correction circuit 112-3, and the rendering circuit 112-4 may also be integrated into the data driver 143 described later.
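For illustration only, the kind of per-gray-level transform that a gamma correction circuit such as the gamma correction circuit 112-3 may apply can be sketched in Python as below; the power-law gamma value, the 8-bit gray range, and the function name are assumptions of this sketch rather than details taken from the disclosure.

# Minimal sketch of a power-law gamma transform over 8-bit gray levels.
# The gamma exponent and bit depth are assumed values, not disclosed ones.
def gamma_correct(gray_levels, gamma=2.2, max_level=255):
    corrected = []
    for g in gray_levels:
        normalized = g / max_level              # map to 0.0 .. 1.0
        corrected.append(round((normalized ** gamma) * max_level))
    return corrected

# Mid grays are pulled down by the 2.2 exponent; end points are preserved.
print(gamma_correct([0, 64, 128, 192, 255]))    # [0, 12, 56, 137, 255]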
The memory 180 may store various data used by at least one constituent element of the electronic device 101 (e.g., the processor 110 or the sensor module 191), for example, input data or output data and commands related thereto. The memory 180 may include one or more of a volatile memory 181 and a non-volatile memory 182.
The input module 130 may receive, from outside the electronic device 101 (e.g., from a user or the external electronic device 102), commands or data to be used by constituent elements of the electronic device 101 (e.g., the processor 110, the sensor module 191, or the sound output module 193).
The input module 130 may include a first input module 131 that receives commands or data from a user and a second input module 132 that receives commands or data from the external electronic device 102. The first input module 131 may include a microphone, a mouse, a keyboard, keys (e.g., buttons), or a pen (e.g., a passive pen or an active pen). The second input module 132 may support a specified protocol that allows a wired or wireless connection with the external electronic device 102. According to an embodiment, the second input module 132 may include a high definition multimedia interface (HDMI), a universal serial bus (USB) interface, an SD card interface, or an audio interface. The second input module 132 may include a connector capable of being physically connected with the external electronic device 102, such as an HDMI connector, a USB connector, an SD card connector, or an audio connector (e.g., a headphone connector).
The display module 140 visually provides information to the user. The display module 140 may include a display panel 141, a scan driver 142, and a data driver 143. The display module 140 may further include a window, a chassis, and a bracket for protecting the display panel 141.
The display panel 141 may include a liquid crystal display panel, an organic light emitting display panel, or an inorganic light emitting display panel, and the kind of the display panel 141 is not particularly limited. The display panel 141 may be of a rigid type, or of a flexible type that can be rolled or folded. The display module 140 may further include a supporter, a bracket, a heat dissipation member, or the like to support the display panel 141.
The scan driver 142 may be mounted on the display panel 141 as a driving chip. Alternatively, the scan driver 142 may be integrated into the display panel 141. For example, the scan driver 142 may include an amorphous silicon TFT gate driver circuit (ASG), a low temperature polycrystalline silicon (LTPS) TFT gate driver circuit, or an oxide semiconductor TFT gate driver circuit (OSG) built into the display panel 141. The scan driver 142 receives a control signal from the controller 112-1 and outputs scan signals to the display panel 141 in response to the control signal.
The display panel 141 may further include a light emitting driver. The light emitting driver outputs a light emission control signal to the display panel 141 in response to a control signal received from the controller 112-1. The light emitting driver may be formed separately from the scan driver 142 or may be integrated into the scan driver 142.
The data driver 143 receives a control signal from the controller 112-1, converts image data into an analog voltage (e.g., a data voltage) in response to the control signal, and then outputs the data voltage to the display panel 141.
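Purely as a sketch, the digital-to-analog step performed by the data driver can be pictured as choosing a voltage between two reference voltages according to the gray level; the linear mapping, the reference voltage values, and the function name below are simplifying assumptions and do not describe the actual circuit.

# Simplified, assumed-linear mapping from an 8-bit gray level to a data voltage
# between two illustrative reference voltages; real source drivers typically
# use a non-linear (gamma-tapped) resistor ladder instead.
def gray_to_data_voltage(gray, v_low=1.0, v_high=4.5, max_level=255):
    ratio = gray / max_level
    return v_low + ratio * (v_high - v_low)

print(gray_to_data_voltage(0))      # 1.0
print(gray_to_data_voltage(255))    # 4.5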
The data driver 143 may be integrated into another constituent element (e.g., the controller 112-1). The functions of the interface conversion circuit and the timing control circuit of the controller 112-1 described above may also be integrated into the data driver 143.
The display module 140 may further include a light emitting driver, a voltage generating circuit, and the like. The voltage generating circuit may output various voltages required for driving the display panel 141.
The power module 150 supplies power to the constituent elements of the electronic device 101. The power module 150 may include a battery for storing power. The battery may include a primary battery that is not rechargeable, a secondary battery that is rechargeable, or a fuel cell. The power module 150 may include a power management integrated circuit (PMIC). The PMIC supplies optimized power to each of the modules described above and below. The power module 150 may include a wireless power transmitting and receiving part electrically connected with the battery. The wireless power transmitting and receiving part may include a plurality of antenna radiators in the form of coils.
The electronic device 101 may further include an internal module 190 and an external module 170. The internal module 190 may include a sensor module 191, an antenna module 192, and a sound output module 193. The external module 170 may include a camera module 171, an illumination module 172, and a communication module 173.
The sensor module 191 may sense an input made through the body of a user or an input made through the pen of the first input module 131, and may generate an electrical signal or a data value corresponding to the input. The sensor module 191 may include any one or more of a fingerprint sensor 191-1, an input sensor 191-2, and a digitizer 191-3.
The fingerprint sensor 191-1 may generate a data value corresponding to a fingerprint of the user. The fingerprint sensor 191-1 may include either an optical fingerprint sensor or a capacitive fingerprint sensor.
The input sensor 191-2 may generate a data value corresponding to coordinate information of an input made through the body of the user or through the pen. The input sensor 191-2 converts a change in capacitance caused by the input into a data value. The input sensor 191-2 may sense an input from a passive pen, or may transmit data to and receive data from an active pen.
The input sensor 191-2 may also measure biological signals such as blood pressure, moisture, or body fat. For example, when the user touches a body part to the sensor layer or the sensing panel and does not move it for a certain time, the input sensor 191-2 may sense a biological signal based on a change in the electric field caused by the body part and output information desired by the user to the display module 140.
The digitizer 191-3 may generate a data value corresponding to coordinate information of an input made through the pen. The digitizer 191-3 converts an electromagnetic variation caused by the input into a data value. The digitizer 191-3 may sense an input from a passive pen, or may transmit data to and receive data from an active pen.
At least one of the fingerprint sensor 191-1, the input sensor 191-2, and the digitizer 191-3 may be implemented as a sensor layer formed on the display panel 141 through a continuous process. The fingerprint sensor 191-1, the input sensor 191-2, and the digitizer 191-3 may be disposed on an upper side of the display panel 141, and any one of the fingerprint sensor 191-1, the input sensor 191-2, and the digitizer 191-3 (e.g., the digitizer 191-3) may be disposed on a lower side of the display panel 141.
Two or more of the fingerprint sensor 191-1, the input sensor 191-2, and the digitizer 191-3 may be integrated into one sensing panel through the same process. When integrated into one sensing panel, the sensing panel may be disposed between the display panel 141 and a window disposed on the upper side of the display panel 141. According to an embodiment, the sensing panel may also be disposed on the window, and the position of the sensing panel is not particularly limited.
At least one of the fingerprint sensor 191-1, the input sensor 191-2, and the digitizer 191-3 may be built into the display panel 141. That is, at least one of the fingerprint sensor 191-1, the input sensor 191-2, and the digitizer 191-3 may be formed simultaneously through the process of forming the elements (e.g., light emitting elements, transistors, and the like) included in the display panel 141.
In addition, the sensor module 191 may generate an electrical signal or a data value corresponding to an internal state or an external state of the electronic device 101. The sensor module 191 may further include, for example, a gesture sensor, a gyro sensor, a barometric pressure sensor, a magnetic sensor, an acceleration sensor, a grip sensor, a proximity sensor, a color sensor, an infrared (IR) sensor, a biometric sensor, a temperature sensor, a humidity sensor, or an illuminance sensor.
The antenna module 192 may include one or more antennas for transmitting signals or power to the outside or receiving signals or power from the outside. According to an embodiment, the communication module 173 may transmit signals to the external electronic device 102 or receive signals from the external electronic device 102 through an antenna suitable for the communication scheme. An antenna pattern of the antenna module 192 may also be integrated into a structure of the display module 140 (e.g., the display panel 141), the input sensor 191-2, or the like.
The sound output module 193 is a device for outputting sound signals to the outside of the electronic device 101, and may include, for example, a speaker used for general purposes such as multimedia playback or audio playback, and a receiver dedicated to receiving calls. According to an embodiment, the receiver may be formed integrally with the speaker or may be formed separately. A sound output pattern of the sound output module 193 may also be integrated into the display module 140.
The camera module 171 may capture still images and moving images. According to an embodiment, the camera module 171 may include one or more lenses, image sensors, or image signal processors. The camera module 171 may further include an infrared camera capable of detecting the presence or absence of a user, the position of the user, the line of sight of the user, and the like.
The illumination module 172 may provide light. The illumination module 172 may include a light emitting diode or a xenon lamp. The illumination module 172 may operate in conjunction with the camera module 171 or may operate independently.
The communication module 173 may support establishment of a wired or wireless communication channel between the electronic device 101 and the external electronic device 102 and communication through the established communication channel. The communication module 173 may include either or both of a wireless communication module, such as a cellular communication module, a short-range wireless communication module, or a global navigation satellite system (GNSS) communication module, and a wired communication module, such as a local area network (LAN) communication module or a power line communication module. The communication module 173 may communicate with the external electronic device 102 through a short-range communication network such as Bluetooth, WiFi Direct, or infrared data association (IrDA), or through a long-range communication network such as a cellular network, the Internet, or a computer network (e.g., a LAN or WAN). The various types of communication modules 173 described above may be implemented as a single chip or as separate chips.
The input module 130, the sensor module 191, the camera module 171, etc. may be used to control the operation of the display module 140 in conjunction with the processor 110.
The processor 110 outputs commands or data to the display module 140, the sound output module 193, the camera module 171, or the illumination module 172 according to input data received from the input module 130. For example, the processor 110 may generate image data corresponding to input data applied through a mouse, an active pen, or the like and output the image data to the display module 140, or may generate command data corresponding to the input data and output the command data to the camera module 171 or the illumination module 172. When no input data is received from the input module 130 for a certain period of time, the processor 110 may reduce the power consumed by the electronic device 101 by switching the operating mode of the electronic device 101 to a low power mode or a sleep mode.
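A minimal sketch of this idle-timeout behavior is given below; the timeout length, the mode names, and the class name are assumptions made only for this example.

import time

IDLE_TIMEOUT_S = 30.0    # assumed timeout; the disclosure does not specify a value

class PowerManager:
    def __init__(self):
        self.last_input_time = time.monotonic()
        self.mode = "normal"

    def on_input(self):
        # Any input (mouse, pen, key, ...) returns the device to normal mode.
        self.last_input_time = time.monotonic()
        self.mode = "normal"

    def tick(self):
        # Switch to a low power mode when no input arrives for IDLE_TIMEOUT_S.
        if time.monotonic() - self.last_input_time > IDLE_TIMEOUT_S:
            self.mode = "low_power"
        return self.mode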
The processor 110 outputs commands or data to the display module 140, the sound output module 193, the camera module 171, or the illumination module 172 according to sensing data received from the sensor module 191. For example, the processor 110 may compare authentication data applied through the fingerprint sensor 191-1 with authentication data stored in the memory 180 and then run an application according to the comparison result. The processor 110 may run a command according to sensing data sensed through the input sensor 191-2 or the digitizer 191-3, or may output corresponding image data to the display module 140. In the case where the sensor module 191 includes a temperature sensor, the processor 110 may receive temperature data for the measured temperature from the sensor module 191 and perform brightness correction or the like on the image data based on the temperature data.
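One way such a temperature-based brightness correction could look is sketched below; the temperature breakpoints and gain values are illustrative assumptions, not measured or disclosed data.

# Illustrative temperature-to-gain lookup with linear interpolation between
# assumed breakpoints; the gain scales the gray level before display.
TEMP_GAIN_LUT = [(-20, 1.10), (0, 1.05), (25, 1.00), (45, 0.95)]

def temperature_gain(temp_c):
    if temp_c <= TEMP_GAIN_LUT[0][0]:
        return TEMP_GAIN_LUT[0][1]
    if temp_c >= TEMP_GAIN_LUT[-1][0]:
        return TEMP_GAIN_LUT[-1][1]
    for (t0, g0), (t1, g1) in zip(TEMP_GAIN_LUT, TEMP_GAIN_LUT[1:]):
        if t0 <= temp_c <= t1:
            return g0 + (g1 - g0) * (temp_c - t0) / (t1 - t0)

def correct_brightness(gray, temp_c, max_level=255):
    return min(max_level, round(gray * temperature_gain(temp_c)))

print(correct_brightness(200, 10))    # gain at 10 degrees C is 1.03 -> 206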
The processor 110 may receive, from the camera module 171, measurement data regarding the presence or absence of a user, the position of the user, the line of sight of the user, and the like. The processor 110 may perform brightness correction or the like on the image data based on the measurement data. For example, the processor 110, having determined through an input from the camera module 171 whether a user is present, may output brightness-corrected image data to the display module 140 through the data conversion circuit 112-2 or the gamma correction circuit 112-3.
Some of the constituent elements may be connected to each other through an inter-peripheral communication scheme (e.g., a bus, general purpose input/output (GPIO), a serial peripheral interface (SPI), a mobile industry processor interface (MIPI), or an ultra path interconnect (UPI) link) to exchange signals (e.g., commands or data) with each other. The processor 110 may communicate with the display module 140 through a mutually agreed interface; for example, any one of the above communication schemes may be used, and the interface is not limited to the above communication schemes.
The electronic device 101 according to various embodiments disclosed herein may be a device of one of various forms. The electronic device 101 may include, for example, at least one of a portable communication device (e.g., a smartphone), a computer device, a portable multimedia device, a portable medical device, a camera, a wearable device, or a home appliance. The electronic device 101 according to embodiments herein is not limited to the aforementioned devices.
The drawings referred to heretofore and the detailed description of the invention are merely illustrative of the invention; they are used only for the purpose of explaining the invention and are not intended to limit the meaning of the invention or the scope of the invention described in the claims. Therefore, those skilled in the art will understand that various modifications and equivalent other embodiments are possible therefrom. Accordingly, the true technical protection scope of the invention should be determined by the technical idea of the appended claims.

Claims (30)

1. A display device, comprising:
a compensation value determination unit that generates a final compensation value for an input image;
a timing control unit that receives an input gray scale for the input image and applies the final compensation value to the input gray scale to generate an output gray scale; and
a pixel unit that displays an output image corresponding to the output gray scale using pixels,
wherein the compensation value determination unit determines a weight value based on a display frequency, a display luminance, and the input gray scale,
wherein the compensation value determination unit determines a compensation value based on the display frequency and a position of the pixels, and
wherein the compensation value determination unit applies the weight value to the compensation value to generate the final compensation value.
2. The display device according to claim 1, wherein,
the compensation value determination unit includes:
a first weight value lookup table in which weight values based on a first display frequency, a reference display luminance, and a reference input gray scale are stored in advance; and
a second weight value lookup table in which weight values based on a second display frequency, the reference display luminance, and the reference input gray scale are stored in advance,
the first display frequency is different from the second display frequency.
3. The display device according to claim 2, wherein,
the compensation value determination unit further includes:
a first compensation value lookup table in which compensation values based on the first display frequency and a reference position of the pixels are stored in advance; and
a second compensation value lookup table in which compensation values based on the second display frequency and the reference position are stored in advance.
4. The display device according to claim 3, wherein,
the compensation value determination unit further includes:
a first multiplexer that receives an input display frequency, outputs the weight value included in the first weight value lookup table as a first weight value when the input display frequency is the same as the first display frequency, and outputs the weight value included in the second weight value lookup table as the first weight value when the input display frequency is the same as the second display frequency.
5. The display device according to claim 4, wherein,
the compensation value determination unit further includes:
a luminance compensation unit configured to receive an input display luminance, select two reference display luminances having the smallest difference from the input display luminance, and interpolate the first weight values corresponding to the two selected reference display luminances for each of the reference input gray scales, thereby generating a second weight value for the input display luminance.
6. The display device according to claim 5, wherein,
the compensation value determination unit further includes:
a gray scale compensation unit configured to receive the input gray scale, select two reference input gray scales having the smallest difference from the input gray scale, and interpolate the two second weight values of the two selected reference input gray scales, thereby generating a third weight value for the input gray scale.
7. The display device according to claim 6, wherein,
the compensation value determination unit further includes:
a second multiplexer that receives the input display frequency, outputs the compensation value included in the first compensation value lookup table as a first compensation value when the input display frequency is the same as the first display frequency, and outputs the compensation value included in the second compensation value lookup table as the first compensation value when the input display frequency is the same as the second display frequency.
8. The display device according to claim 7, wherein,
the compensation value determination unit further includes:
a position compensation unit that interpolates the first compensation value to generate a second compensation value for a pixel that is not located at the reference position.
9. The display device according to claim 8, wherein,
the compensation value determination unit further includes:
a final compensation value generation unit configured to apply the third weight value to the second compensation value to generate the final compensation value.
10. The display device according to claim 9, wherein,
the final compensation value generation unit generates the final compensation value by multiplying the third weight value by the second compensation value,
the timing control unit generates the output gray scale by adding the final compensation value to the input gray scale.
11. A driving method of a display device, comprising:
a step of generating a final compensation value for an input image;
a step of applying the final compensation value to an input gray scale of the input image to generate an output gray scale; and
a step of displaying an output image corresponding to the output gray scale using pixels,
wherein the step of generating the final compensation value comprises:
a step of determining a weight value based on a display frequency, a display luminance, and the input gray scale;
a step of determining a compensation value based on the display frequency and a position of the pixels; and
a step of generating the final compensation value by applying the weight value to the compensation value.
12. The driving method of a display device according to claim 11, wherein,
the display device includes:
a first weight value lookup table in which weight values based on a first display frequency, a reference display luminance, and a reference input gray scale are stored in advance; and
a second weight value lookup table in which weight values based on a second display frequency, the reference display luminance, and the reference input gray scale are stored in advance,
the first display frequency is different from the second display frequency.
13. The driving method of a display device according to claim 12, wherein,
the display device further includes:
a first compensation value lookup table in which compensation values based on the first display frequency and a reference position of the pixels are stored in advance; and
a second compensation value lookup table in which compensation values based on the second display frequency and the reference position are stored in advance.
14. The driving method of a display device according to claim 13, wherein,
the step of determining the weight value comprises:
a step of outputting, when an input display frequency is the same as the first display frequency, the weight value included in the first weight value lookup table as a first weight value, and outputting, when the input display frequency is the same as the second display frequency, the weight value included in the second weight value lookup table as the first weight value.
15. The driving method of a display device according to claim 14, wherein,
the step of determining the weight value further comprises:
a step of selecting two reference display luminances having the smallest difference from an input display luminance, and interpolating the first weight values corresponding to the two selected reference display luminances for each of the reference input gray scales, thereby generating a second weight value for the input display luminance.
16. The driving method of a display device according to claim 15, wherein,
the step of determining the weight value further comprises:
a step of selecting two reference input gray scales having the smallest difference from the input gray scale, and interpolating the two second weight values of the two selected reference input gray scales, thereby generating a third weight value for the input gray scale.
17. The driving method of a display device according to claim 16, wherein,
the step of determining the compensation value comprises:
a step of outputting the compensation value included in the first compensation value lookup table as a first compensation value when the input display frequency is the same as the first display frequency, and outputting the compensation value included in the second compensation value lookup table as the first compensation value when the input display frequency is the same as the second display frequency.
18. The driving method of a display device according to claim 17, wherein,
the step of determining the compensation value further comprises:
a step of interpolating the first compensation value to generate a second compensation value for a pixel that is not located at the reference position.
19. The driving method of a display device according to claim 18, wherein,
in the step of generating the final compensation value,
the third weight value is applied to the second compensation value to generate the final compensation value.
20. The driving method of a display device according to claim 19, wherein,
the third weight value is multiplied by the second compensation value to generate the final compensation value, and
the final compensation value is added to the input gray scale to generate the output gray scale.
21. An integrated circuit, comprising:
a first circuit section that generates a final compensation value for an input image; and
a second circuit section that receives an input gray scale for the input image and applies the final compensation value to the input gray scale to generate an output gray scale,
wherein the first circuit section determines a weight value based on a display frequency, a display luminance, and the input gray scale,
wherein the first circuit section determines a compensation value based on the display frequency and a position of a pixel, and
wherein the first circuit section applies the weight value to the compensation value to generate the final compensation value.
22. The integrated circuit of claim 21, wherein,
the first circuit section includes:
a first weight value lookup table in which weight values based on a first display frequency, a reference display luminance, and a reference input gray scale are stored in advance; and
a second weight value lookup table in which weight values based on a second display frequency, the reference display luminance, and the reference input gray scale are stored in advance,
the first display frequency is different from the second display frequency.
23. The integrated circuit of claim 22, wherein,
the first circuit section further includes:
a first compensation value lookup table in which compensation values based on the first display frequency and a reference position of the pixel are stored in advance; and
a second compensation value lookup table in which compensation values based on the second display frequency and the reference position are stored in advance.
24. The integrated circuit of claim 23, wherein,
the first circuit section further includes:
a first multiplexer that receives an input display frequency, outputs the weight value included in the first weight value lookup table as a first weight value when the input display frequency is the same as the first display frequency, and outputs the weight value included in the second weight value lookup table as the first weight value when the input display frequency is the same as the second display frequency.
25. The integrated circuit of claim 24, wherein,
the first circuit section further includes:
a luminance compensation unit configured to receive an input display luminance, select two reference display luminances having the smallest difference from the input display luminance, and interpolate the first weight values corresponding to the two selected reference display luminances for each of the reference input gray scales, thereby generating a second weight value for the input display luminance.
26. The integrated circuit of claim 25, wherein,
the first circuit section further includes:
a gray scale compensation unit configured to receive the input gray scale, select two reference input gray scales having the smallest difference from the input gray scale, and interpolate the two second weight values of the two selected reference input gray scales, thereby generating a third weight value for the input gray scale.
27. The integrated circuit of claim 26, wherein,
the first circuit section further includes:
a second multiplexer that receives the input display frequency, outputs the compensation value included in the first compensation value lookup table as a first compensation value when the input display frequency is the same as the first display frequency, and outputs the compensation value included in the second compensation value lookup table as the first compensation value when the input display frequency is the same as the second display frequency.
28. The integrated circuit of claim 27, wherein,
the first circuit section further includes:
a position compensation unit that interpolates the first compensation value to generate a second compensation value for a pixel that is not located at the reference position.
29. The integrated circuit of claim 28, wherein,
the first circuit section further includes:
a final compensation value generation unit configured to apply the third weight value to the second compensation value to generate the final compensation value.
30. The integrated circuit of claim 29, wherein,
the final compensation value generation unit generates the final compensation value by multiplying the third weight value by the second compensation value,
the second circuit section generates the output gray scale by adding the final compensation value to the input gray scale.
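To make the relationship among the weight value, the compensation value, and the final compensation value recited in claims 1 to 10 (and mirrored in claims 11 to 30) easier to follow, the following Python sketch walks through one possible reading of the pipeline: a frequency-dependent weight table is selected, interpolated over display luminance and then over input gray scale, a frequency-dependent compensation table is interpolated over pixel position, and the product is added to the input gray scale. All table contents, reference frequencies, luminances, gray scales, and positions below are illustrative assumptions, not values from the disclosure.

from bisect import bisect_left

def lerp(x, x0, x1, y0, y1):
    # Linear interpolation between (x0, y0) and (x1, y1).
    if x1 == x0:
        return y0
    return y0 + (y1 - y0) * (x - x0) / (x1 - x0)

# Assumed weight value lookup tables: WEIGHT_LUT[frequency][reference luminance]
# holds one weight value per reference input gray scale in REF_GRAYS.
REF_GRAYS = [0, 128, 255]
REF_LUMAS = [100, 500]
WEIGHT_LUT = {
    60:  {100: [0.0, 0.5, 1.0], 500: [0.0, 0.8, 1.2]},
    120: {100: [0.0, 0.3, 0.6], 500: [0.0, 0.4, 0.8]},
}
# Assumed compensation value lookup tables: one compensation value per reference row.
REF_ROWS = [0, 1080]
COMP_LUT = {60: [2.0, 6.0], 120: [1.0, 3.0]}

def weight_value(freq, luminance, gray):
    table = WEIGHT_LUT[freq]                        # table selected by display frequency
    l0, l1 = REF_LUMAS[0], REF_LUMAS[-1]            # two reference display luminances
    per_gray = [lerp(luminance, l0, l1, w0, w1)     # luminance interpolation
                for w0, w1 in zip(table[l0], table[l1])]
    i = max(1, bisect_left(REF_GRAYS, gray))
    g0, g1 = REF_GRAYS[i - 1], REF_GRAYS[i]         # two nearest reference gray scales
    return lerp(gray, g0, g1, per_gray[i - 1], per_gray[i])   # gray scale interpolation

def compensation_value(freq, row):
    c = COMP_LUT[freq]                              # table selected by display frequency
    return lerp(row, REF_ROWS[0], REF_ROWS[-1], c[0], c[-1])  # position interpolation

def output_gray(input_gray, freq, luminance, row):
    final = weight_value(freq, luminance, input_gray) * compensation_value(freq, row)
    return input_gray + final                       # multiply, then add to the input gray scale

print(output_gray(input_gray=128, freq=60, luminance=300, row=540))   # approximately 130.6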
CN202311327418.9A 2022-10-14 2023-10-13 Integrated circuit, display device and driving method of display device Pending CN117894266A (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
KR10-2022-0132601 2022-10-14
KR1020230061361A KR20240053510A (en) 2022-10-14 2023-05-11 Integrated circuit, display device, and driving method of display device
KR10-2023-0061361 2023-05-11


