EP1387343B1 - Method and device for processing video data for display on a display device - Google Patents

Method and device for processing video data for display on a display device

Info

Publication number
EP1387343B1
EP1387343B1 (application EP20030102129 / EP03102129A)
Authority
EP
European Patent Office
Prior art keywords
dithering
function
video data
picture
moving object
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Lifetime
Application number
EP20030102129
Other languages
German (de)
French (fr)
Other versions
EP1387343A2 (en)
EP1387343A3 (en)
Inventor
Sebastien Weitbruch
Cedric Thebault
Didier Doyen
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
THOMSON LICENSING
Original Assignee
Thomson Licensing SAS
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from EP02291924A external-priority patent/EP1387340A1/en
Application filed by Thomson Licensing SAS filed Critical Thomson Licensing SAS
Priority to EP20030102129 priority Critical patent/EP1387343B1/en
Publication of EP1387343A2 publication Critical patent/EP1387343A2/en
Publication of EP1387343A3 publication Critical patent/EP1387343A3/en
Application granted granted Critical
Publication of EP1387343B1 publication Critical patent/EP1387343B1/en
Anticipated expiration legal-status Critical
Expired - Lifetime legal-status Critical Current

Classifications

    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G3/00Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
    • G09G3/20Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix no fixed position being assigned to or needed to be assigned to the individual characters or partial characters
    • G09G3/2007Display of intermediate tones
    • G09G3/2044Display of intermediate tones using dithering
    • G09G3/2051Display of intermediate tones using dithering with use of a spatial dither pattern
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G3/00Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
    • G09G3/20Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix no fixed position being assigned to or needed to be assigned to the individual characters or partial characters
    • G09G3/2007Display of intermediate tones
    • G09G3/2018Display of intermediate tones by time modulation using two or more time intervals
    • G09G3/2022Display of intermediate tones by time modulation using two or more time intervals using sub-frames
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2320/00Control of display operating conditions
    • G09G2320/02Improving the quality of display appearance
    • G09G2320/0261Improving the quality of display appearance in the context of movement of objects on the screen or movement of the observer relative to the screen
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2320/00Control of display operating conditions
    • G09G2320/02Improving the quality of display appearance
    • G09G2320/0271Adjustment of the gradation levels within the range of the gradation scale, e.g. by redistribution or clipping
    • G09G2320/0276Adjustment of the gradation levels within the range of the gradation scale, e.g. by redistribution or clipping for the purpose of adaptation to the characteristics of a display device, i.e. gamma correction
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2320/00Control of display operating conditions
    • G09G2320/10Special adaptations of display systems for operation with variable images
    • G09G2320/106Determination of movement vectors or equivalent parameters within the image
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G3/00Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
    • G09G3/20Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix no fixed position being assigned to or needed to be assigned to the individual characters or partial characters
    • G09G3/22Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix no fixed position being assigned to or needed to be assigned to the individual characters or partial characters using controlled light sources
    • G09G3/28Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix no fixed position being assigned to or needed to be assigned to the individual characters or partial characters using controlled light sources using luminous gas-discharge panels, e.g. plasma panels

Description

  • The present invention relates to a method for processing video data for display on a display device having a plurality of luminous elements by applying a dithering function to at least a part of the video data to refine the grey scale portrayal of video pictures of the video data. Furthermore, the present invention relates to a corresponding device for processing video data including dithering means.
  • Background
  • A PDP (Plasma Display Panel) utilizes a matrix array of discharge cells, which can only be "ON" or "OFF". Unlike a CRT or LCD, in which grey levels are expressed by analogue control of the light emission, a PDP controls the grey level by modulating the number of light pulses per frame (sustain pulses). This time modulation is integrated by the eye over a period corresponding to the eye time response. Since the video amplitude is portrayed by the number of light pulses occurring at a given frequency, more amplitude means more light pulses and thus more "ON" time. For this reason, this kind of modulation is also known as PWM (pulse width modulation).
  • This PWM is responsible for one of the PDP image quality problems: poor grey scale portrayal, especially in the darker regions of the picture. The reason is that the displayed luminance is linear in the number of pulses, whereas the eye's response and sensitivity to noise are not linear. In darker areas the eye is more sensitive than in brighter areas. This means that even though modern PDPs can display about 255 discrete video levels, the quantization error is quite noticeable in the darker areas.
  • As mentioned before, a PDP uses PWM (pulse width modulation) to generate the different shades of grey. In contrast to CRTs, where luminance is approximately quadratic in the applied cathode voltage, PDP luminance is linear in the number of discharge pulses. Therefore an approximately quadratic digital gamma function has to be applied to the video before the PWM.
  • Due to this gamma function, for smaller video levels many input levels are mapped to the same output level. In other words, for darker areas the number of output quantization bits is smaller than the number of input bits; in particular, input values smaller than 16 (when working with 8-bit video input) are all mapped to 0. This corresponds to roughly a four-bit resolution in the dark areas, which is unacceptable for video.
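  • As an illustration of this collapse of dark levels, the following minimal Python sketch is given (the exponent 2.0, the 8-bit input and the 256 output levels are assumptions chosen to match the figures quoted above, not values taken from a real panel): the quadratic pre-correction maps every input below 16 to the output level 0.

```python
# Illustrative sketch: an approximately quadratic gamma applied digitally before the PWM
# maps many small input levels onto the same output level (assumed 8-bit in, 256 levels out).

def gamma_to_pwm_level(level_8bit: int, gamma: float = 2.0, out_levels: int = 256) -> int:
    """Quadratic pre-correction followed by truncation to the available drive levels."""
    return int((level_8bit / 255.0) ** gamma * (out_levels - 1))

if __name__ == "__main__":
    mapped = [gamma_to_pwm_level(lvl) for lvl in range(256)]
    collapsed_to_zero = [lvl for lvl in range(256) if mapped[lvl] == 0]
    print("inputs mapped to output 0:", collapsed_to_zero)        # inputs 0 .. 15
    print("distinct outputs for the darkest quarter (inputs 0..63):",
          len(set(mapped[:64])))                                   # only 16 distinct levels
```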
  • One known solution to improve the quality of the displayed pictures is to artificially increase the number of displayed video levels by using dithering. Dithering is a known technique for avoiding the loss of amplitude resolution bits due to truncation. However, this technique only works if the required resolution is available before the truncation step. This is usually the case, since the video data after the gamma operation used for pre-correction of the video signal has 16-bit resolution. In principle, dithering can bring back as many bits as are lost by truncation. However, the dithering noise frequency decreases, and therefore becomes more noticeable, as the number of dithered bits increases.
  • The concept of dithering shall be explained by the following example, in which a quantization step of 1 is to be reduced by dithering. The dithering technique exploits the temporal integration property of the human eye. The quantization step may be reduced to 0.5 by using 1-bit dithering: half of the time within the time response of the human eye the value 1 is displayed and half of the time the value 0. As a result the eye sees the value 0.5.
  • Optionally, the quantization step may be reduced to 0.25. Such dithering requires two bits. For obtaining the value 0.25, the value 1 is shown a quarter of the time and the value 0 three quarters of the time. For obtaining the value 0.5, the value 1 is shown two quarters of the time and the value 0 two quarters of the time. Similarly, the value 0.75 may be generated. In the same manner, quantization steps of 0.125 may be obtained by using 3-bit dithering. Thus 1 bit of dithering multiplies the number of available output levels by 2, 2 bits by 4, and 3 bits by 8. A minimum of 3 bits of dithering may be required to give the grey scale portrayal a 'CRT' look, as the sketch below illustrates.
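  • The following short Python sketch (illustrative only; the error-accumulation ordering of the frames is an assumption, not the dither sequence of the patent) verifies this level multiplication: with n bits of dithering, the eye average over 2^n frames reaches every multiple of 1/2^n.

```python
# Illustrative n-bit temporal dithering: over 2**n frames, displaying only 0 or 1,
# the temporal average seen by the eye reaches every multiple of 1 / 2**n.

def dither_frames(target: float, bits: int) -> list:
    """Binary frame sequence whose average over 2**bits frames equals `target`
    (frames ordered by simple error accumulation, an assumed ordering)."""
    frames, error = [], 0.0
    for _ in range(2 ** bits):
        error += target
        if error >= 1.0:
            frames.append(1)
            error -= 1.0
        else:
            frames.append(0)
    return frames

if __name__ == "__main__":
    for bits in (1, 2, 3):
        step = 1.0 / 2 ** bits
        for k in range(1, 2 ** bits):
            seq = dither_frames(k * step, bits)
            print(f"{bits}-bit, target {k * step}: frames {seq}, "
                  f"eye average {sum(seq) / len(seq)}")
```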
  • Dithering methods proposed in the literature (like error diffusion) were mainly developed to improve the quality of still images (fax applications and newspaper photo portrayal). The results are therefore not optimal when the same dithering algorithms are applied directly to PDPs, especially when displaying video with motion.
  • The dithering best adapted to PDPs until now is the cell-based dithering described in the European patent application EP-A-1 136 974 and the multi-mask dithering described in the European patent application with the filing number 01 250 199.5, which improve grey scale portrayal but add high-frequency, low-amplitude dithering noise. Express reference is made to both documents.
  • Cell-based dithering adds a temporal dithering pattern that is defined for every panel cell and not for every panel pixel, as shown in Fig. 1. A panel pixel is composed of three cells: a red, a green and a blue cell. This has the advantage of rendering the dithering noise finer and thus less noticeable to the human viewer.
  • Because the dithering pattern is defined cell-wise, techniques like error diffusion cannot be used, since a cell diffusing its truncation error into a contiguous cell of a different colour would colour the picture. This is not a big disadvantage, because an undesirable low-frequency moving interference between the diffusion of the truncation error and a moving pattern belonging to the video signal has sometimes been observed; error diffusion works best for static pictures. Instead of error diffusion, a static 3-dimensional dithering pattern is proposed.
  • This static 3-dimensional dithering is based on the spatial (two dimensions x and y) and temporal (third dimension t) integration of the eye. For the following explanations, the matrix dithering is represented as a function of three variables: ϕ(x,y,t). The three parameters x, y and t represent a kind of phase of the dithering. Depending on the number of bits to be rebuilt, the period of these three phases can vary.
  • Figure 2 illustrates the 3-dimensional matrix concept. The values displayed on the picture slightly change for each plasma cell in the vertical and horizontal directions. In addition, the value also changes for each frame.
    In the example of figure 2, for the frame displayed at time to the following dithering values are given:
    ϕ(xo,yo,to) = A
    ϕ(xo+1,yo,to) = B
    ϕ(xo+1,yo+1,to) = A
    ϕ(xo,yo+1,to) = B
  • One frame later, the dithering values at time to+1 are:
    ϕ(xo,yo,to+1) = B
    ϕ(xo+1,yo,to+1) = A
    ϕ(xo+1,yo+1,to+1) = B
    ϕ(xo,yo+1,to+1) = A
  • The spatial resolution of the eye is good enough to see a fixed static pattern A, B, A, B, but if a third dimension, namely time, is added in the form of an alternating function, the eye will only be able to see the average value of each cell.
  • The case of a cell located at the position (xo,yo) shall be considered. The value of this cell changes from frame to frame as follows: ϕ(xo,yo,to)=A, ϕ(xo,yo,to+1)=B, ϕ(xo,yo,to+2)=A and so on.
  • The eye time response of several milliseconds (temporal integration) can then be represented by the following formula:
    Eye(xo,yo) = (1/T) · Σ from t = to to t = to+T of ϕ(xo,yo,t)
    which, in the present example, leads to
    Eye(xo,yo) = (A + B) / 2
  • It should be noted that the proposed pattern, when integrated over time, always gives the same value for all panel cells. If this were not the case, some cells might under certain circumstances acquire an amplitude offset relative to other cells, which would correspond to an undesirable fixed spurious static pattern. The sketch following this paragraph illustrates this behaviour for a simple example pattern.
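  • The behaviour described above can be checked with a toy dithering function in Python. The concrete choice ϕ(x,y,t) = A if (x+y+t) is even, else B, and the amplitudes A and B are illustrative assumptions that merely reproduce the A/B checkerboard of figure 2: a steady gaze on any cell integrates to (A+B)/2, so every cell converges to the same value and no static pattern is seen.

```python
# Toy 3-D dithering function reproducing the A/B checkerboard of figure 2:
# the pattern alternates from cell to cell in x and y and from frame to frame in t.

A, B = 0.0, 1.0    # example dither amplitudes (assumption)

def phi(x: int, y: int, t: int) -> float:
    """Static 3-D dither pattern with period 2 in x, y and t."""
    return A if (x + y + t) % 2 == 0 else B

def eye_static(x: int, y: int, frames: int = 8) -> float:
    """Temporal integration of a steady gaze on cell (x, y) over several frames."""
    return sum(phi(x, y, t) for t in range(frames)) / frames

if __name__ == "__main__":
    # Every cell integrates to (A + B) / 2 = 0.5, i.e. no spurious static pattern.
    print({(x, y): eye_static(x, y) for x in range(2) for y in range(2)})
```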
  • While moving objects are displayed on the plasma screen, the human eye follows the objects and no longer integrates the same PDP cell over time. In that case, the third dimension no longer works perfectly and a dithering pattern can become visible.
  • In order to better understand this problem, the example of a movement V = (1;0) shall be considered, which represents a motion in the x-direction of one pixel per frame. In that case, the eye will look at (xo,yo) at time to, then follow the movement to pixel (xo+1,yo) at time to+1, and so on. The cell value seen by the eye is then:
    Eye = (1/T) · [ϕ(xo,yo,to) + ϕ(xo+1,yo,to+1) + ... + ϕ(xo+T,yo,to+T)]
    which corresponds to
    Eye = (1/T) · (A + A + ... + A) = A.
    In that case, the third dimension aspect of the dithering will not work correctly and only the spatial dithering will be available. Such an effect will make the dithering more or less visible depending on the movement. The dithering pattern is no longer hidden by the spatial and temporal eye integration.
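  • Continuing the toy ϕ(x,y,t) of the previous sketch (still an illustrative assumption), the failure for a tracking eye is easy to reproduce: integrating along the trajectory of an object moving one pixel per frame in x always samples the same dither value A, so the checkerboard is no longer averaged out.

```python
# Eye tracking a moving object: the integration follows phi(x0 + Vx*t, y0 + Vy*t, t),
# and for the toy pattern with V = (1, 0) this stays on the same dither value A.

A, B = 0.0, 1.0

def phi(x: int, y: int, t: int) -> float:
    """Same toy 3-D dither pattern as in the previous sketch."""
    return A if (x + y + t) % 2 == 0 else B

def eye_tracking(x0: int, y0: int, vx: int, vy: int, frames: int = 8) -> float:
    """Temporal integration along the motion trajectory followed by the eye."""
    return sum(phi(x0 + vx * t, y0 + vy * t, t) for t in range(frames)) / frames

if __name__ == "__main__":
    print("static gaze      :", sum(phi(0, 0, t) for t in range(8)) / 8)   # (A+B)/2 = 0.5
    print("tracking V=(1,0) :", eye_tracking(0, 0, 1, 0))                  # stuck at A = 0.0
```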
  • A method and system for digital signal translation, wherein the value of a data word is dithered over time such that the average value of the data word approximates the desired value, thereby increasing the apparent dynamic range of a display system, is already known from EP-A-0 656 616. Beck et al. disclosed "Motion Dithering for Increasing Perceived Image Quality for Low-Resolution Displays" at the SID International Symposium, Digest of Technical Papers, vol. 29, 13 July 1998 (1998-07-13), pages 407-410, XP007008628, Anaheim, CA, USA. Furthermore, differential image data compression systems and techniques are known from WO 9111 0324 A (PCT/US90/07685).
  • In view of that it is the object of the present invention to eliminate a dithering pattern appearing for a viewer observing a moving object on a picture.
  • The invention is disclosed in independent claims 1 and 8.
  • Advantageously, the dithering function or pattern has two spatial dimensions and one temporal dimension. Such a dithering function enables a greater reduction of quantization steps for static pictures than error diffusion.
  • The dithering function may be based on a plurality of masks. Thus, different dither patterns may be provided for different entries in a number of least significant bits of the data word representing the input video level. This makes it possible to suppress the disturbing patterns occurring on the plasma display panel when using the conventional dither patterns.
  • Furthermore, the application of the dithering function or pattern may be based on the single luminous elements, called cells, of the display device; i.e. separate dithering values may be added to each colour component R, G, B of a pixel. Such cell-based dithering has the advantage of rendering the dithering noise finer and thus making it less noticeable to the human viewer.
  • The dithering may be performed by a 1-, 2-, 3- and/or 4-bit function. The number of bits used depends on the processing capability. In general, 3-bit dithering is enough to make most of the quantization noise invisible.
  • Preferably, the motion vector is computed for each pixel individually. By doing so the quality of higher resolution dithering can be enhanced compared to a technique where the motion vector is computed for a plurality of pixels or a complete area.
  • Furthermore, the motion vector should be computed for both spatial dimensions x and y. Thus, any movement of an object observed by the human viewer may be regarded for the dithering process.
  • As already mentioned, a pre-correction by the quadratic gamma function should be performed before the dithering process. Thus, the quantization errors produced by the gamma correction are also reduced with the help of dithering.
  • The temporal component of the dithering function may be introduced by controlling the dithering in the rhythm of picture frames. Thus, no additional synchronisation has to be provided.
  • The dithering according to the present invention may be based on cell-based and/or multi-mask dithering, which consists of adding a dithering signal that is defined for every plasma cell and not for every pixel. In addition, such dithering may be further optimized for each video level. This makes the dithering noise finer and less noticeable to the human viewer.
  • The adaptation of the dithering pattern to the movement of the picture in order to suppress the dithering structure appearing for specific movement may be obtained by using a motion estimator to change the phase or other parameters of the dithering function for each cell. In that case, even if the eye is following the movement, the quality of the dithering will stay constant and a pattern of dithering in case of motion will be suppressed. Furthermore, this invention can be combined with any kind of matrix dithering.
  • Drawings
  • Exemplary embodiments of the invention are illustrated in the drawings and are explained in more detail in the following description. In the drawings:
  • Figure 1
    shows the principle of pixel-based dithering and cell-based dithering;
    Figure 2
    illustrates the concept of 3-dimensional matrix dithering; and
    Figure 3
    shows a block diagram of a hardware implementation for the algorithm according to the present invention.
    Figure 4
    shows another embodiment for the block diagram.
    Exemplary embodiments
  • In order to suppress the visible pattern of a classical matrix dithering in case of moving pictures the motion of the picture is taken into account by using a motion estimator.
  • This will provide, for each pixel M(xo,yo) of the screen, a vector V(xo,yo) = (Vx(xo,yo),Vy(xo,yo)) representing its movement. In that case, this vector can be used to change the phase of the dithering according to the formula:
    ϕ(xo − Vx(xo,yo), yo − Vy(xo,yo), to).
  • More generally, the new dithering pattern will depend on five parameters and can be defined as follows:
    ζ(xo, yo, Vx(xo,yo), Vy(xo,yo), t).
  • A big advantage of such a motion compensated dithering is its robustness regarding the motion vector. In fact, the role of the motion vectors is to avoid any visible pattern of the dithering during a movement that suppresses the temporal integration of the eye. Even if the motion vectors are not exact, they can suppress the pattern.
  • According to a more optimised solution, for each pixel M(xo,yo) of the screen a vector V(xo,yo,to) = (Vx(xo,yo,to),Vy(xo,yo,to)) representing its movement at time to is provided. In that case, this vector is used to change the phase of the dithering according to the formula:
    ϕ(xo − fx(xo,yo,to), yo − fy(xo,yo,to), to)
    where f(x,y,t) is a recursive function defined as follows:
    fx(xo,yo,to) = (Vx(xo,yo,to) + fx(xo,yo,to−1)) mod τ
    and
    fy(xo,yo,to) = (Vy(xo,yo,to) + fy(xo,yo,to−1)) mod τ.
  • In this formula, τ represents the period of the dithering and mod(τ) the modulo-τ function. For instance, if τ=4, the dithering pattern is periodic over 4 frames, which means that ϕ(xo,yo,to)=ϕ(xo,yo,to+4), and the modulo-4 function gives: (0)mod(4)=0, (1)mod(4)=1, (2)mod(4)=2, (3)mod(4)=3, (4)mod(4)=0, (5)mod(4)=1, (6)mod(4)=2, (7)mod(4)=3 and so on.
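  • A minimal Python sketch of this recursive phase accumulation is given below, continuing the toy ϕ from the earlier sketches (the constant motion of one pixel per frame, τ = 2 and the dither amplitudes are assumptions): shifting the dither phase by fx, fy restores the (A+B)/2 average for the tracking eye.

```python
# Motion-compensated dithering: the value displayed at cell (x, y) is phi(x - fx, y - fy, t),
# where fx, fy accumulate the per-frame motion vector modulo the dither period tau.

A, B = 0.0, 1.0
TAU = 2            # temporal (and, for this toy pattern, spatial) period of the dithering

def phi(x: int, y: int, t: int) -> float:
    """Same toy 3-D dither pattern as in the earlier sketches."""
    return A if (x + y + t) % 2 == 0 else B

def eye_tracking_compensated(x0: int, y0: int, vx: int, vy: int, frames: int = 8) -> float:
    """Eye follows the object; the dither phase is shifted by the accumulated motion mod TAU."""
    fx = fy = 0
    seen = []
    for t in range(frames):
        x, y = x0 + vx * t, y0 + vy * t            # cell under the eye in frame t
        seen.append(phi(x - fx, y - fy, t))        # displayed, motion-compensated dither value
        fx = (vx + fx) % TAU                       # f_x(t) = (V_x + f_x(t-1)) mod tau
        fy = (vy + fy) % TAU
    return sum(seen) / len(seen)

if __name__ == "__main__":
    print("tracking V=(1,0), compensated:", eye_tracking_compensated(0, 0, 1, 0))  # back to 0.5
```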
  • More generally, the new dithering pattern will depend on five parameters and can be defined as follows: ζ(xo,yo,Vx(xo,yo,t),Vy(xo,yo,t),t). The only difference now is that the vectors used are taken from more than one frame. Preferably, 3-bit dithering is implemented, so that up to 8 frames are used for dithering. If the number of frames used for dithering is increased, the frequency of the dithering may become too low and flicker will appear. Typically, 3-bit dithering is rendered with a 4-frame cycle and a 2D spatial component.
  • Figure 3 illustrates a possible implementation of the algorithm. The RGB input pictures, indicated by the signals R0, G0 and B0, are forwarded to a gamma function block 10, which can consist of a look-up table (LUT) or be formed by a mathematical function. The outputs R1, G1 and B1 of the gamma function block 10 are forwarded to a dithering block 12, which takes into account the pixel position and the frame parity as temporal component for the computation of the dithering value. The frame parity is based on the frame number within one dithering cycle; for instance, for a 3-bit dithering based on a 4-frame cycle the frame number changes cyclically from 0 to 3. A simplified sketch of this dataflow is given below.
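  • The per-pixel dataflow of figure 3 can be summarised as follows in Python (a structural illustration only; the gamma LUT, the 2×2 dither matrix, the offsets and the way the motion vector shifts the matrix phase are placeholders and assumptions, not the actual plasma processing chain).

```python
# Structural sketch of the figure-3 chain for one colour component of one pixel:
# gamma block 10 -> dithering block 12 (pixel position, frame parity, motion vector).
# Sub-field coding and panel addressing are omitted; all block contents are placeholders.

DITHER_CYCLE = 4                      # 3-bit dithering rendered with a 4-frame cycle

def gamma_block(level_8bit: int) -> int:
    """Placeholder gamma LUT: 8-bit input to a 16-bit pre-corrected value."""
    return round((level_8bit / 255.0) ** 2 * 65535)

def dithering_block(value16: int, x: int, y: int, frame_parity: int, vx: int, vy: int) -> int:
    """Placeholder dithering: a small dither matrix whose phase is shifted by the motion vector."""
    dither_matrix = [[0, 128], [192, 64]]                  # assumed 2x2 spatial dither matrix
    xs, ys = (x - vx) % 2, (y - vy) % 2                    # motion-compensated spatial phase
    offset = (dither_matrix[ys][xs] + 64 * frame_parity) % 256
    return min(255, (value16 + offset) >> 8)               # truncate back to 8 bits

def process_pixel(r0: int, x: int, y: int, frame: int, vx: int, vy: int) -> int:
    frame_parity = frame % DITHER_CYCLE                    # frame number within one dither cycle
    return dithering_block(gamma_block(r0), x, y, frame_parity, vx, vy)

if __name__ == "__main__":
    print([process_pixel(40, 10, 20, frame=f, vx=1, vy=0) for f in range(4)])
```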
  • In parallel to that, the input picture R0, G0 and B0 is also forwarded to a motion estimator 14, which will provide, for each pixel, a motion vector (Vx, Vy). This motion vector will be additionally used by the dithering block 12 for computing the dithering pattern.
  • The video signals R1, G1, B1 subjected to the dithering in the dithering block 12 are output as signals R2, G2, B2 and are forwarded to a sub-field coding unit 16 which performs sub-field coding under the control of the control unit 18. The plasma control unit 18 provides the code for the sub-field coding unit 16 and the dithering pattern DITH for the dithering block 12.
  • As to the sub-field coding, express reference is made to the already mentioned European patent application EP-A-1 136 974.
  • The sub-field signals for each colour output from the sub-field coding unit 16 are indicated by reference signs SFR, SFG, SFB. For plasma display panel addressing, these sub-field code words for one line are all collected in order to create a single very long code word which can be used for the linewise PDP addressing. This is carried out in a serial to parallel conversion unit 20 which is itself controlled by the plasma control unit 18.
  • Furthermore, the control unit 18 generates all scan and sustain pulses for PDP control. It receives horizontal and vertical synchronizing signals for reference timing.
  • Figure 4 illustrates a modification of the embodiment of figure 3. In this case, a frame memory is used at the dithering block level. The additional memory requirement is modest, since the value to be stored is modulo τ, and τ is typically around 4 for standard dithering in order to limit the temporal visibility (low frequency) of the dithering. In that case, 2 bits per cell are enough to store values modulo 4. For instance, a WXGA panel will require 853 × 3 × 480 × 2 bits = 2.34 Mbit.
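  • As a quick check of that memory figure (Python, assuming the stated 853×480 pixel WXGA resolution, three cells per pixel and 2 bits per cell):

```python
# Frame memory needed to store the accumulated dither phase modulo 4 (2 bits per cell).
pixels_x, pixels_y = 853, 480      # WXGA panel resolution used in the text
cells_per_pixel = 3                # one red, one green and one blue cell per pixel
bits_per_cell = 2                  # values modulo 4 fit into 2 bits

total_bits = pixels_x * pixels_y * cells_per_pixel * bits_per_cell
print(f"{total_bits} bits = {total_bits / 2**20:.2f} Mbit")     # 2456640 bits = 2.34 Mbit
```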
  • Although the present embodiment requires the use of a motion estimator, such a motion estimator is already mandatory for other processing tasks such as false contour compensation, sharpness improvement and phosphor lag reduction. Since the same vectors can be reused, the extra cost is limited.
  • Motion compensated dithering is applicable to all colour cell based displays (for instance colour LCDs) where the number of resolution bits is limited.
  • In all cases the present invention brings the advantages of suppressing the visible pattern of classical matrix dithering in case of moving pictures and of strong robustness regarding the motion vector field.

Claims (16)

  1. Method for processing video data (R0, G0, B0) for display on a display device having a plurality of luminous elements by
    applying a dithering function to at least part of said video data (R0, G0, B0) to refine the grey scale portrayal of video pictures of said video data,
    characterized by
    computing at least one motion vector representing the movement of a moving object on a picture from said video data (R0, G0, B0) and
    changing the phase, amplitude, spatial resolution and/or temporal resolution of said dithering function in accordance with said at least one motion vector representing the movement of a moving object on a picture when applying the dithering function to said video data (R0, G0, B0) to eliminate a dithering pattern appearing for a viewer observing said moving object on a picture if his eyes follow said movement.
  2. Method according to claim 1, wherein said dithering function includes two spatial dimensions and one temporal dimension.
  3. Method according to claim 1 or 2, wherein said dithering function includes the application of a plurality of masks.
  4. Method according to claim 1 or 2, wherein said applying of said dithering function is based on single luminous elements called cells of said display device.
  5. Method according to one of the claims 1 to 4, wherein said dithering function is a 1-, 2-, 3- and/or 4- bit dithering function.
  6. Method according to one of the claims 1 to 5, wherein said at least one motion vector representing a moving object on a picture is defined for each pixel or cell individually.
  7. Method according to one of the claims 1 to 6, wherein said at least one motion vector representing a moving object on a picture has two spatial dimensions.
  8. Device for processing video data (R0, G0, B0) for display on a display device having a plurality of luminous elements including dithering means (12) for applying a dithering function to at least a part of said video data (R0, G0, B0) to refine the grey scale portrayal of video pictures of said video data (R0, G0, B0),
    characterized by
    motion estimation means (14) connected to said dithering means (12) for computing at least one motion vector representing the movement of a moving object on a picture (Vx, Vy) from said video data (R0, G0, B0), wherein the phase, amplitude, spatial resolution and/or temporal resolution of said dithering function is changeable in accordance with said at least one motion vector representing the movement of a moving object on a picture (Vx, Vy) to eliminate a dithering pattern appearing for a viewer observing said moving object on a picture if his eyes follow said movement.
  9. Device according to claim 8, wherein said dithering function used by said dithering means (12) includes two spatial dimensions and a temporal dimension.
  10. Device according to claim 8 or 9, wherein said dithering function of said dithering means (12) is based on a plurality of masks.
  11. Device according to claim 8 or 9, wherein said dithering function of said dithering means (12) is based on single luminous elements called cells of said display device.
  12. Device according to one of the claims 8 to 11, wherein said dithering means (12) is able to process a 1-, 2-, 3- and/or 4-bit dithering function.
  13. Device according to one of the claims 8 to 12, wherein said at least one motion vector representing a moving object on a picture (Vx, Vy) is definable for each pixel individually by said motion estimation means (14).
  14. Device according to one of the claims 8 to 13, wherein said at least one motion vector representing a moving object on a picture (Vx, Vy) includes two spatial dimensions.
  15. Device according to one of the claims 8 to 14, further including gamma function means (10) connected to said dithering means (12), so that the input signals of said dithering means (12) are pre-corrected by a gamma function.
  16. Device according to one of the claims 8 to 15, further including controlling means (18) connected to said dithering means (12) for controlling said dithering means (12) temporally in dependence of frames of said video data (R0, G0, B0).
EP20030102129 2002-07-30 2003-07-11 Method and device for processing video data for display on a display device Expired - Lifetime EP1387343B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
EP20030102129 EP1387343B1 (en) 2002-07-30 2003-07-11 Method and device for processing video data for display on a display device

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
EP02291924A EP1387340A1 (en) 2002-07-30 2002-07-30 Method and device for processing video data for a display
EP02291924.5 2002-07-30
EP20030102129 EP1387343B1 (en) 2002-07-30 2003-07-11 Method and device for processing video data for display on a display device

Publications (3)

Publication Number Publication Date
EP1387343A2 EP1387343A2 (en) 2004-02-04
EP1387343A3 EP1387343A3 (en) 2007-07-18
EP1387343B1 true EP1387343B1 (en) 2009-03-25

Family

ID=30116934

Family Applications (1)

Application Number Title Priority Date Filing Date
EP20030102129 Expired - Lifetime EP1387343B1 (en) 2002-07-30 2003-07-11 Method and device for processing video data for display on a display device

Country Status (1)

Country Link
EP (1) EP1387343B1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2012000136A1 (en) 2010-07-02 2012-01-05 Thomson Broadband R&D (Beijing) Co., Ltd. Method for measuring video quality using a reference, and apparatus for measuring video quality using a reference

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0462261A1 (en) * 1989-12-28 1991-12-27 Massachusetts Institute Of Technology Video telephone systems
KR950016387A (en) * 1993-12-02 1995-06-17 윌리엄 이. 힐러 Technology to increase the clear dynamic range of visual displays
US6288698B1 (en) * 1998-10-07 2001-09-11 S3 Graphics Co., Ltd. Apparatus and method for gray-scale and brightness display control
EP1136974A1 (en) * 2000-03-22 2001-09-26 Deutsche Thomson-Brandt Gmbh Method for processing video data for a display device

Also Published As

Publication number Publication date
EP1387343A2 (en) 2004-02-04
EP1387343A3 (en) 2007-07-18

Similar Documents

Publication Publication Date Title
US20070030285A1 (en) Method and device for processing video data for display on a display device
KR100318647B1 (en) Gradation display system
KR100898851B1 (en) Method and apparatus for processing video picture data for display on a display device
EP1269457B1 (en) Method for processing video data for a display device
AU785352B2 (en) Method and apparatus for processing video pictures
JP3354741B2 (en) Halftone display method and halftone display device
EP1356443B1 (en) Method and apparatus for controlling a display device
WO2000043979A1 (en) Apparatus and method for making a gray scale display with subframes
US7023450B1 (en) Data processing method and apparatus for a display device
JP2000347616A (en) Display device and display method
EP1262947B1 (en) Method and apparatus for processing video picture data for a display device
EP1581922B1 (en) Method and device for processing video data for display on a display device
EP1758073A1 (en) Method and device for processing video data to be displayed on a display device
EP1387343B1 (en) Method and device for processing video data for display on a display device
EP1695329B1 (en) Method and apparatus for processing video pictures, in particular in film mode sequences
EP1193672B1 (en) Display and image displaying method
Hoppenbrouwers et al. 29‐1: 100‐Hz Video Upconversion in Plasma Displays
MXPA05007299A (en) Method and device for processing video data for display on a display device
EP1995712A1 (en) Method for applying dithering to video data and display device implementing said method

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

AK Designated contracting states

Kind code of ref document: A2

Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LI LU MC NL PT RO SE SI SK TR

AX Request for extension of the european patent

Extension state: AL LT LV MK

17P Request for examination filed

Effective date: 20040720

RAP1 Party data changed (applicant data changed or rights of an application transferred)

Owner name: THOMSON LICENSING

PUAL Search report despatched

Free format text: ORIGINAL CODE: 0009013

AK Designated contracting states

Kind code of ref document: A3

Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LI LU MC NL PT RO SE SI SK TR

AX Request for extension of the european patent

Extension state: AL LT LV MK

RIC1 Information provided on ipc code assigned before grant

Ipc: G09G 3/28 20060101AFI20031105BHEP

Ipc: G09G 3/20 20060101ALI20070613BHEP

17Q First examination report despatched

Effective date: 20080214

AKX Designation fees paid

Designated state(s): DE FR GB

GRAP Despatch of communication of intention to grant a patent

Free format text: ORIGINAL CODE: EPIDOSNIGR1

GRAS Grant fee paid

Free format text: ORIGINAL CODE: EPIDOSNIGR3

GRAA (expected) grant

Free format text: ORIGINAL CODE: 0009210

AK Designated contracting states

Kind code of ref document: B1

Designated state(s): DE FR GB

REG Reference to a national code

Ref country code: GB

Ref legal event code: FG4D

REF Corresponds to:

Ref document number: 60326793

Country of ref document: DE

Date of ref document: 20090507

Kind code of ref document: P

PLBE No opposition filed within time limit

Free format text: ORIGINAL CODE: 0009261

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT

RAP2 Party data changed (patent owner data changed or rights of a patent transferred)

Owner name: THOMSON LICENSING

26N No opposition filed

Effective date: 20091229

REG Reference to a national code

Ref country code: FR

Ref legal event code: PLFP

Year of fee payment: 14

REG Reference to a national code

Ref country code: FR

Ref legal event code: PLFP

Year of fee payment: 15

REG Reference to a national code

Ref country code: DE

Ref legal event code: R082

Ref document number: 60326793

Country of ref document: DE

Representative's name: DEHNS, DE

Ref country code: DE

Ref legal event code: R082

Ref document number: 60326793

Country of ref document: DE

Representative's name: DEHNS PATENT AND TRADEMARK ATTORNEYS, DE

Ref country code: DE

Ref legal event code: R082

Ref document number: 60326793

Country of ref document: DE

Representative's name: HOFSTETTER, SCHURACK & PARTNER PATENT- UND REC, DE

REG Reference to a national code

Ref country code: FR

Ref legal event code: PLFP

Year of fee payment: 16

REG Reference to a national code

Ref country code: FR

Ref legal event code: TP

Owner name: THOMSON LICENSING DTV, FR

Effective date: 20180830

REG Reference to a national code

Ref country code: GB

Ref legal event code: 732E

Free format text: REGISTERED BETWEEN 20180927 AND 20181005

REG Reference to a national code

Ref country code: DE

Ref legal event code: R082

Ref document number: 60326793

Country of ref document: DE

Representative's name: DEHNS, DE

Ref country code: DE

Ref legal event code: R081

Ref document number: 60326793

Country of ref document: DE

Owner name: INTERDIGITAL MADISON PATENT HOLDINGS, FR

Free format text: FORMER OWNER: THOMSON LICENSING, BOULOGNE-BILLANCOURT, FR

Ref country code: DE

Ref legal event code: R082

Ref document number: 60326793

Country of ref document: DE

Representative's name: DEHNS PATENT AND TRADEMARK ATTORNEYS, DE

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: GB

Payment date: 20220719

Year of fee payment: 20

Ref country code: DE

Payment date: 20220527

Year of fee payment: 20

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: FR

Payment date: 20220727

Year of fee payment: 20

P01 Opt-out of the competence of the unified patent court (upc) registered

Effective date: 20230514

REG Reference to a national code

Ref country code: DE

Ref legal event code: R071

Ref document number: 60326793

Country of ref document: DE

REG Reference to a national code

Ref country code: GB

Ref legal event code: PE20

Expiry date: 20230710

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: GB

Free format text: LAPSE BECAUSE OF EXPIRATION OF PROTECTION

Effective date: 20230710