MXPA05007299A - Method and device for processing video data for display on a display device - Google Patents

Method and device for processing video data for display on a display device

Info

Publication number
MXPA05007299A
MXPA05007299A MXPA/A/2005/007299A
Authority
MX
Mexico
Prior art keywords
oscillation
function
video data
modulation
pattern
Prior art date
Application number
MXPA/A/2005/007299A
Other languages
Spanish (es)
Inventor
Thebault Cedric
Correa Carlos
Weitbruch Sebastien
Original Assignee
Deutsche Thomson-Brandt GmbH
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Deutsche Thomson-Brandt GmbH filed Critical Deutsche Thomson-Brandt GmbH
Publication of MXPA05007299A publication Critical patent/MXPA05007299A/en


Abstract

The visible pattern of classical matrix dithering shall be suppressed more effectively in applications with moving pictures as well as static pictures. To this end, it is proposed to change the dithering pattern in the dithering block (12) non-periodically. This can be effected by a random generator (13), which may be activated by a motion detector (14). The motion detector (14) makes it possible to switch the dithering function individually depending on whether the pictures are moving or static.

Description

METHOD AND DEVICE FOR PROCESSING VIDEO DATA FOR DISPLAY ON A DISPLAY DEVICE FIELD OF THE INVENTION The present invention relates to a method for processing video data for display on a display device having a plurality of luminous elements, by applying a dithering function to at least a part of the video data in order to refine the grey-scale rendition of the video pictures of the video data. The present invention further relates to a corresponding device for processing video data, including dithering means.
BACKGROUND A PDP (Plasma Display Panel) uses a matrix array of discharge cells, which can only be switched "ON" or "OFF". Unlike a CRT or an LCD, in which grey levels are expressed by analogue control of the light emission, a PDP controls the grey level by modulating the number of light pulses per frame (sustain pulses). This time modulation is integrated by the eye over a period corresponding to the eye's time response. Since the video amplitude is represented by the number of light pulses, occurring at a given frequency, a greater amplitude means more light pulses and thus more "ON" time. For this reason, this kind of modulation is also known as PWM, pulse width modulation. This PWM is responsible for one of the PDP image quality problems: the poor grey-scale rendition, especially in the darker regions of the picture. This is due to the fact that the displayed luminance is linear to the number of pulses, whereas the response and the noise sensitivity of the eye are not linear. In darker areas the eye is more sensitive than in brighter areas. This means that even though modern PDPs can display 255 discrete video levels, the quantization error will be very noticeable in the darkest areas. As mentioned above, the PDP uses PWM (pulse width modulation) to generate the different shades of grey. Contrary to a CRT, where the luminance is approximately quadratic to the applied cathode voltage, the luminance is linear to the number of discharge pulses. Therefore, an approximately quadratic digital gamma function has to be applied to the video before the PWM. Owing to this gamma function, for small video levels many input levels are mapped to the same output level. In other words, for darker areas the number of output quantization bits is smaller than the input number; in particular the values less than 16 (when working with an 8-bit video input) are all mapped to 0. This corresponds to a resolution of about four bits, which is unacceptable for video. A known solution to improve the quality of the displayed pictures is to artificially increase the number of displayed video levels by using dithering. Dithering is a known technique for avoiding the loss of bits of amplitude resolution due to truncation. However, this technique only works if the required resolution is available before the truncation step. This is usually the case in most applications, since the video data after the gamma operation used for pre-correction of the video signal has a resolution of 16 bits. In principle, dithering can bring back as many bits as are lost by truncation. However, the frequency of the dithering noise decreases, and the noise therefore becomes more noticeable, with the number of dithering bits. The dithering concept will be explained by the following example. A quantization step of 1 is to be reduced by dithering. The dithering technique makes use of the temporal integration property of the human eye. The quantization step can be reduced to 0.5 by using a 1-bit dithering: half of the time within the temporal response of the human eye the value 1 is displayed, and half of the time the value 0 is displayed. As a result, the eye sees the value 0.5. Optionally, the quantization steps can be reduced to 0.25. This dithering requires two bits. To obtain the value 0.25, the value 1 is shown a quarter of the time and the value 0 three quarters of the time.
To obtain the value 0.5, the value 1 is shown two quarters of the time and the value 0 two quarters of the time. Similarly, the value 0.75 can be generated. In the same way, quantization steps of 0.125 can be obtained by using a 3-bit dithering. This means that 1 dithering bit corresponds to multiplying the number of available output levels by 2, 2 dithering bits multiply it by 4, and 3 dithering bits multiply the number of output levels by 8. A minimum of 3 dithering bits is required to give the grey-scale rendition a "CRT-like" appearance. The dithering methods proposed in the literature (such as error diffusion) were mainly developed to improve the quality of static pictures (fax applications and newspaper photograph reproduction). The results obtained are therefore not optimal for a PDP. The dithering best adapted to the PDP up to now is the cell-based dithering described in European patent application EP-A-1 136 974 and the multi-mask dithering described in the European patent application with the filing number 01 250 199.5, which improve the grey-scale rendition but add a low-frequency dithering amplitude noise. Express reference is made to both documents. The cell-based dithering adds a temporal dithering pattern that is defined for each cell of the panel and not for each pixel of the panel, as shown in Figure 1. One panel pixel is composed of three cells: a red, a green and a blue cell. This has the advantage of making the dithering noise finer and thus less noticeable to the human observer. Because the dithering pattern is defined at the cell level, it is not possible to use techniques such as error diffusion, in order to avoid colouring the picture when a cell diffuses its error into a contiguous cell of a different colour. Instead of using error diffusion, a three-dimensional static dithering pattern is proposed.
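The figures above can be checked with a small sketch (hypothetical Python, not part of the patent; it assumes an exactly quadratic gamma curve and an ideal temporal integration by the eye): part (a) shows how the gamma pre-correction maps the 8-bit input levels below 16 to output level 0, and part (b) shows that an n-bit temporal dithering multiplies the number of displayable levels by 2^n.

```python
# Sketch (hypothetical, not part of the patent): (a) an approximately
# quadratic gamma collapses low 8-bit input levels onto output 0, and (b) an
# n-bit temporal dithering multiplies the number of displayable levels by 2**n.

def gamma_quadratic(level, max_in=255, max_out=255):
    """Assumed quadratic gamma curve followed by truncation (quantization)."""
    return int((level / max_in) ** 2 * max_out)

collapsed = [v for v in range(256) if gamma_quadratic(v) == 0]
print(len(collapsed), "input levels are mapped to output level 0")   # 16

def temporal_dither_sequence(k, n_bits):
    """0/1 frame sequence whose temporal average is k / 2**n_bits."""
    period, seq, acc = 2 ** n_bits, [], 0
    for _ in range(period):
        acc += k
        if acc >= period:               # emit a light-pulse frame
            seq.append(1)
            acc -= period
        else:
            seq.append(0)
    return seq

for n_bits in (1, 2, 3):
    period = 2 ** n_bits
    for k in range(period):
        seq = temporal_dither_sequence(k, n_bits)
        assert sum(seq) / period == k / period   # level seen by the eye
    print(f"{n_bits} dithering bit(s): step 1/{period}, {period} x more levels")
```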
This three-dimensional static dithering is based on the spatial integration (two dimensions x and y) and the temporal integration (third dimension t) of the eye. For the following explanations the matrix dithering can be represented as a function of three variables: f(x, y, t). The three parameters x, y and t represent a kind of phase for the dithering (x → f(x, y, t), y → f(x, y, t) and t → f(x, y, t) are periodic). Depending on the number of bits to be reconstructed, the period of these three phases can change. For each frame, each function f_t: (x, y) → f(x, y, t) represents a (two-dimensional) dithering pattern. Figure 2 illustrates the concept of a three-dimensional matrix. The values added to the picture change slightly from plasma cell to plasma cell in the vertical and horizontal directions. In addition, the value also changes for each frame. In the example of Figure 2, for the frame displayed at time t0 the following dithering values are given: f(x0, y0, t0) = A, f(x0+1, y0, t0) = B, f(x0+1, y0+1, t0) = A, f(x0, y0+1, t0) = B. One frame later, at time t0+1, the dithering values are: f(x0, y0, t0+1) = B, f(x0+1, y0, t0+1) = A, f(x0+1, y0+1, t0+1) = B, f(x0, y0+1, t0+1) = A. The spatial resolution of the eye is good enough to see a fixed static pattern A, B, A, B, but if a third dimension, i.e. time, is added in the form of an alternating function, then the eye will only be able to see the average value of each cell. Consider the case of a cell located at position (x0, y0). The value of this cell changes from frame to frame as follows: f(x0, y0, t0) = A, f(x0, y0, t0+1) = B, f(x0, y0, t0+2) = A and so on. The response time of the eye of several milliseconds (temporal integration) can then be represented by the following formula: Eye(x0, y0) = (1/T) · Σ_{t = t0 .. t0+T} f(x0, y0, t), which, in the present example with T = 2, leads to Eye(x0, y0) = (A + B)/2. It should be noted that the proposed pattern, when integrated over time, always gives the same value for all cells of the panel. If this were not the case, some cells could acquire an amplitude offset with respect to other cells, which would correspond to an undesirable fixed spurious static pattern. When moving objects are displayed on the plasma screen, the human eye follows the objects and no longer integrates the same plasma cell (PDP) over time. In that case, the third dimension no longer works perfectly and a dithering pattern can be observed. To better understand this problem, consider the following example of a motion V = (1; 0), which represents a motion in the x direction of one pixel per frame, as shown in Figure 3. In that case, the eye sees (x0, y0) at time t0, then follows the motion to pixel (x0+1, y0) at time t0+1 and so on. The cell value observed by the eye is then defined as follows: Eye = (1/T) · (f(x0, y0, t0) + f(x0+1, y0, t0+1) + ... + f(x0+T, y0, t0+T)), which corresponds to Eye = (1/T) · (A + A + ... + A) = A. In that case, the three-dimensional aspect of the dithering no longer works properly and only the spatial dithering remains. This effect makes the dithering more or less visible depending on the motion. The dithering pattern is no longer hidden by the spatial or temporal integration of the eye. In particular, for some motions, an unpleasant pattern may appear.
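The failure of the temporal integration under motion tracking described above can be reproduced with a short sketch (hypothetical Python; it assumes the checkerboard function f(x, y, t) = (x + y + t) mod 2 as the 3-D dithering matrix, which matches the A/B example of Figure 2 with A = 0 and B = 1, and an eye integration window of 8 frames):

```python
# Sketch (assumption): f(x, y, t) = (x + y + t) % 2 is used as the 3-D
# dithering matrix (A = 0, B = 1), and the eye averages over T frames.
def f(x, y, t):
    return (x + y + t) % 2

T = 8                                    # assumed integration window, in frames
x0, y0 = 0, 0

# Static picture: the eye keeps integrating the same cell (x0, y0).
eye_static = sum(f(x0, y0, t) for t in range(T)) / T
print("static cell :", eye_static)       # 0.5 = (A + B) / 2, pattern hidden

# Motion of 1 pixel/frame in x: the eye tracks the object, so at frame t it
# integrates the cell (x0 + t, y0) instead of (x0, y0).
eye_tracked = sum(f(x0 + t, y0, t) for t in range(T)) / T
print("tracked cell:", eye_tracked)      # 0.0 = A only, the dithering fails
```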
The same type of problem can also appear, for the same reason, when the picture to be displayed already includes a dithering. This is the case for some PC applications. The two ditherings can then interfere with each other and likewise produce a strong fixed pattern. In view of this, the object of the present invention is to provide a method and a device with an improved dithering function. According to the present invention this object is solved by a method for processing video data for display on a display device having a plurality of luminous elements, including the steps of applying a dithering function to at least a part of the video data in order to refine the grey-scale rendition of the video pictures of the video data, providing a modulation function that is not periodic, and changing the phase or amplitude of the dithering function according to the modulation function when the dithering function is applied to at least a part of the video data. Furthermore, according to the present invention there is provided a device for processing video data for display on a display device having a plurality of luminous elements, including dithering means for applying a dithering function to at least a part of the video data in order to refine the grey-scale rendition of the video pictures of the video data, wherein the dithering means include modulation means for modulating the phase or amplitude of the dithering function with a modulation function that is not periodic. The modulation function of the invention allows a dithering that is less noticeable to the viewer when static or moving pictures are displayed. The reason for this is that the human eye does not integrate non-periodic patterns of the dithering function into visible structures. Advantageously, the modulation function includes a random function. This random function causes the dithering pattern to change non-periodically. This means that at any given time the dithering pattern appears to change, so that the observer will not perceive an unpleasant pattern. The dithering function may include two spatial dimensions in addition to the time dimension given by the modulation function. This structure allows an advanced matrix dithering. Advantageously, the dithering function is a 1-, 2-, 3- and/or 4-bit dithering function. The number of bits used depends on the processing capacity. In general, a 3-bit dithering is sufficient to make most of the quantization noise invisible. As already mentioned, a pre-correction with the quadratic gamma function has to be carried out before the dithering process. In this way, the quantization errors produced by the gamma-function correction are also reduced with the help of dithering. The temporal component of the dithering function can be introduced by clocking the dithering at the rhythm of the picture frames. In this way, no additional synchronization needs to be provided. The dithering according to the present invention can be based on cell-based and/or multi-mask dithering, which consists in adding a dithering signal that is defined for each plasma cell and not for each pixel. Furthermore, this dithering can be further optimized for each video level. This makes the dithering noise finer and less noticeable to the human viewer.
An adaptation of the dithering pattern to the motion of the picture, in order to suppress the dithering structure that appears for a specific motion, can be obtained by using a motion estimator to change the phase or other parameters of the dithering function for each cell, as sketched below. In that case, even if the eye follows the motion, the quality of the dithering remains constant and the dithering pattern in the case of motion is suppressed. Moreover, this invention can be combined with any kind of matrix dithering.
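A minimal sketch of this motion-adaptive variant (hypothetical Python; it assumes a per-cell motion vector (vx, vy) in pixels per frame delivered by a motion estimator, and simply shifts the spatial phase of the dithering function along the motion):

```python
# Sketch (assumption): the spatial phase of the dithering function is shifted
# by the accumulated motion, so a tracked cell still sees the full A/B cycle.
def f(x, y, t):
    return (x + y + t) % 2               # same example dithering function

def f_motion_compensated(x, y, t, vx, vy):
    # Evaluate the pattern in the coordinate system moving with the object.
    return f(x - vx * t, y - vy * t, t)

T, (vx, vy) = 8, (1, 0)                  # assumed motion: 1 pixel/frame in x
x0, y0 = 0, 0
tracked = [f_motion_compensated(x0 + vx * t, y0 + vy * t, t, vx, vy)
           for t in range(T)]
print(tracked, "->", sum(tracked) / T)   # alternates 0, 1, ... -> average 0.5
```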
DRAWINGS Exemplary embodiments of the invention are illustrated in the drawings and are explained in greater detail in the following description. In the drawings: Figure 1 shows the principle of pixel-based dithering and of cell-based dithering; Figure 2 illustrates the concept of three-dimensional matrix dithering; Figure 3 shows the integration principle of the eye for a moving picture when the three-dimensional matrix dithering is applied; and Figure 4 shows a block diagram of a hardware and/or software implementation of the algorithm according to the present invention.
EXEMPLARY EMBODIMENTS The following embodiment has the purpose of eliminating the dithering pattern that appears with cell-based dithering during motion, so as to retain only advantages in comparison with error diffusion. This is achieved by using a random sequence of dithering patterns instead of the predefined sequence of the prior art. Thanks to this principle the overall picture quality is the same for static and moving pictures.
MATRIX DITHERING WITH RANDOM PATTERN SEQUENCE The problem with the fixed matrix dithering is due to its structure, which is completely defined. To avoid these problems, the dithering must be made less visible and its structure more complicated. To obtain this result, the dithering pattern to be applied to the picture can be alternated randomly, in order to achieve a random pattern sequence of the matrix dithering. This can be done by using a random function t → p(t) in place of t. The new dithering function is then defined by f(x, y, p(t)). Consequently, a dithering value f(x, y, p(t)) is assigned to each three-dimensional vector (x, y, t). This can be illustrated by means of an example: f(x, y, t) = (x + y + t) modulo 2. In accordance with Figure 3, it is further assumed that A = 0 and B = 1, in order to generate a level of 0.5.
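A minimal sketch of this random pattern sequence (hypothetical Python; only the two-pattern case is shown, so p(t) takes the values 0 and 1, and the example function f(x, y, t) = (x + y + t) modulo 2 from above is reused):

```python
import random

# Sketch (assumption): two dithering patterns only, selected per frame by a
# random, hence non-periodic, function p(t).
def f(x, y, t):
    return (x + y + t) % 2                 # example dithering function

N_FRAMES = 24
p_values = [random.randint(0, 1) for _ in range(N_FRAMES)]   # p(t0) ... p(t23)

def dither_value(x, y, t):
    return f(x, y, p_values[t])            # f(x, y, p(t)) replaces f(x, y, t)

static_pixel = [dither_value(0, 0, t) for t in range(N_FRAMES)]
moving_pixel = [dither_value(0 + t, 0, t) for t in range(N_FRAMES)]  # 1 px/frame
print(static_pixel, "-> eye sees", sum(static_pixel) / N_FRAMES)
print(moving_pixel, "-> eye sees", sum(moving_pixel) / N_FRAMES)
# Both averages stay close to 0.5: unlike the fixed matrix dithering, the
# random pattern sequence keeps working when the eye tracks a moving object.
```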
If there is no motion, the eye will observe, for a given pixel, an alternating temporal sequence of 0 and 1. If there is a motion of 1 pixel/frame, for example, the eye will continuously observe, depending on the pixel, either 0 or 1, as already explained above. According to a preferred embodiment, a random function p is used which takes the following values for t = t0 ... t23: p(t0), p(t1), ..., p(t23) = 1,1,0,1,0,0,1,0,1,1,0,0,0,1,0,0,1,1,1,0,0,0,1,1. Since there are only two different dithering patterns, the random generator generates the values 0 and 1. If there is no motion, the eye will observe the sequence: 1,1,0,1,0,0,1,0,1,1,0,0,0,1,0,0,1,1,1,0,0,0,1,1 (or the inverse, depending on the pixel: 0,0,1,0,1,1,0,1,0,0,1,1,1,0,1,1,0,0,0,1,1,1,0,0). However, if there is a motion of 1 pixel/frame, for example, the eye will observe the resulting sequence: 1,0,0,0,0,1,1,1,1,0,0,1,0,0,0,1,1,0,1,1,0,1,1,0 (or the inverse, depending on the pixel: 0,1,1,1,1,0,0,0,0,1,1,0,1,1,1,0,0,1,0,0,1,0,0,1). The resulting sequence is obtained by taking the first value of the random sequence, the second value of the inverse random sequence, the third value of the random sequence, and so on. The sequence will look similar for any motion. It will always have the same characteristics as the original dithering sequence. The temporal dithering frequency for a motion of 1 pixel/frame will not be as high as for static pictures, so that lower frequencies will appear. This means that the dithering will be somewhat more noticeable. But the dithering will still work correctly, and there will be no difference between the quality of the dithering in a static picture and in a moving one. Compared to the standard cell-based dithering, static pictures look noisier, but the result is much better for most moving pictures. Optionally, a motion detector or estimator may be employed to decide whether the random dithering has to be used in place of the standard dithering. The random dithering will then be used for moving pictures, the standard one for static pictures. Preferably, a 3-bit dithering is implemented, so that up to 8 dithering frames are used. If the number of frames used for the dithering is increased further, the frequency of the dithering may become very low, and it will thus appear as flicker. In the main embodiment the 3-bit dithering is implemented with a cycle of 8 frames and a 2D spatial component. In this case, the random generator generates the values from 0 to 7, since eight dithering patterns are used. Figure 4 illustrates a possible implementation of the algorithm. The input RGB pictures, indicated by the signals R0, G0 and B0, are sent to a gamma function block 10. This may consist of a look-up table (LUT) or may be formed by a mathematical function. The outputs R1, G1 and B1 of the gamma function block 10 are sent to a dithering block 12, which takes into account the position of the pixel and a random value p given by a random generator 13 for the calculation of the dithering value according to the equation mentioned above. The random generator 13 optionally receives an input signal from the motion detector 14. The input signal serves to activate the random generator 13. If it is not activated, the random generator simply increments the value of p so as to cycle through the dithering patterns in the same order as for the standard cell-based dithering.
The motion detector 14 can take the whole picture or predetermined parts of the picture transmitted in the signals R0, G0, B0 as a basis for forming the input signal for the random generator 13, in order to make the dithering more adaptive to the different kinds of pictures. The video signals R1, G1, B1 subjected to dithering in the dithering block 12 come out as signals R2, G2, B2 and are sent to a subfield coding unit 16, which performs the subfield coding under the control of the plasma control unit 18. The plasma control unit 18 provides the code to the subfield coding unit 16. With respect to the subfield coding, express reference is made to the aforementioned European patent application EP-A-1 136 974. The subfield signals for each colour output by the subfield coding unit 16 are indicated by the reference signs SFR, SFG and SFB. For addressing the Plasma Display Panel, the subfield code words for one line are all collected to create a single very long code word which can be used for line-wise PDP addressing. This is carried out in a serial-to-parallel conversion unit 20, which is itself controlled by the plasma control unit 18. Furthermore, the control unit 18 generates all the scan and sustain pulses for the control of the PDP. It receives horizontal and vertical synchronization signals as the timing reference. In the present embodiment the use of a motion estimator is recommended; such a motion estimator or detector can, however, also be used for other purposes such as false contour compensation, sharpness improvement and phosphor lag reduction. In that case, since the same motion vectors can be reused, the extra costs are limited. The motion-adaptive dithering is applicable to all display devices based on colour cells (for example colour LCDs) in which the number of resolution bits is limited. The present invention has the advantage of suppressing the visible pattern of classical matrix dithering in the case of applications with moving pictures as well as static pictures.
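The processing chain of Figure 4 can be summarized as follows (hypothetical Python; the block names follow the description, but the gamma curve, the example dithering pattern, the 8-pattern random generator behaviour and the motion-detection threshold are illustrative assumptions, not the patented implementation):

```python
import random

def gamma_block(level, max_in=255, max_out=255):
    """Block 10: approximately quadratic gamma pre-correction (assumed curve)."""
    return (level / max_in) ** 2 * max_out      # kept fractional for dithering

class RandomGenerator:
    """Block 13: supplies the pattern index p; counts up unless activated."""
    def __init__(self, n_patterns=8):           # 8 patterns for 3-bit dithering
        self.n, self.counter = n_patterns, 0
    def next_p(self, activated):
        if activated:
            return random.randrange(self.n)     # random pattern sequence
        self.counter = (self.counter + 1) % self.n
        return self.counter                     # standard cyclic sequence

def motion_detector(prev_frame, frame, threshold=2):
    """Block 14: crude detector (assumption) comparing successive frames."""
    if prev_frame is None:
        return False
    diff = sum(abs(a - b) for a, b in zip(frame, prev_frame)) / len(frame)
    return diff > threshold

def dither_block(frame, width, p):
    """Block 12: add a cell-based dithering value f(x, y, p) and truncate."""
    out = []
    for i, v in enumerate(frame):
        x, y = i % width, i // width
        out.append(int(v + ((x + y + p) % 2) * 0.5))   # 1-bit example pattern
    return out

def subfield_coding(frame):
    """Block 16 (placeholder): subfield coding as in EP-A-1 136 974."""
    return frame

# Toy 2x2 single-colour frames: R0 -> R1 -> R2 -> SFR.
rng, prev, width = RandomGenerator(), None, 2
for frame in ([16, 16, 200, 200], [16, 200, 200, 16]):
    r1 = [gamma_block(v) for v in frame]
    p = rng.next_p(motion_detector(prev, frame))
    r2 = dither_block(r1, width, p)
    sfr = subfield_coding(r2)
    prev = frame
    print(p, sfr)
```

The two toy frames differ, so the detector activates the random generator for the second frame, which then draws a random pattern index instead of simply counting up.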

Claims (8)

   CLAIMS 1. Method for processing video data for display on a display device having a plurality of luminous elements, by applying a dithering function to at least a part of the video data in order to refine the grey-scale rendition of the video pictures of the video data, characterized in that a modulation function that is not periodic is provided and the phase or amplitude of the dithering function is changed according to the modulation function when the dithering function is applied to at least a part of the video data.
   2. Method according to claim 1, wherein the modulation function includes a random function.
   3. Method according to claim 1 or 2, wherein the dithering function includes two spatial dimensions in addition to a time dimension given by the modulation function.
   4. Method according to one of claims 1 to 3, wherein the dithering function is a 1-, 2-, 3- and/or 4-bit dithering function.
   5. Device for processing video data for display on a display device having a plurality of luminous elements, including dithering means for applying a dithering function to at least a part of the video data in order to refine the grey-scale rendition of the video pictures of the video data, characterized in that the dithering means include modulation means for modulating the phase or amplitude of the dithering function with a modulation function that is not periodic.
   6. Device according to claim 5, wherein the modulation function is provided by a random generator connected to the dithering means.
   7. Device according to claim 5 or 6, wherein the dithering function includes two spatial dimensions in addition to the time dimension given by the modulation function.
   8. Device according to one of claims 5 to 7, wherein the dithering function is a 1-, 2-, 3- and/or 4-bit dithering function.
MXPA/A/2005/007299A 2003-01-10 2005-07-05 Method and device for processing video data for display on a display device MXPA05007299A (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
EP03290063 2003-01-10

Publications (1)

Publication Number Publication Date
MXPA05007299A true MXPA05007299A (en) 2006-12-13


Similar Documents

Publication Publication Date Title
US20070030285A1 (en) Method and device for processing video data for display on a display device
CN100458883C (en) Method and apparatus for processing video pictures to improve dynamic false contour effect compensation
KR100831234B1 (en) A method for a frame rate control and a liquid crystal display for the method
AU785352B2 (en) Method and apparatus for processing video pictures
KR100825339B1 (en) Image processing circuit and image processing method
US8237751B2 (en) Multi-primary conversion
EP1269457B1 (en) Method for processing video data for a display device
US20080253455A1 (en) High Frame Motion Compensated Color Sequencing System and Method
CN102005171A (en) Image display apparatus and luminance control method thereof
KR20060053933A (en) Method and device for processing video data by combining error diffusion and another dithering
JP4991066B2 (en) Method and apparatus for processing video images
EP1581922B1 (en) Method and device for processing video data for display on a display device
MXPA05007299A (en) Method and device for processing video data for display on a display device
EP1387343B1 (en) Method and device for processing video data for display on a display device
KR19980075493A (en) Adaptive Screen Brightness Correction Device in PDPD and Its Correction Method
JP2003513317A (en) Display circuit for grayscale control
Hoppenbrouwers et al. 29‐1: 100‐Hz Video Upconversion in Plasma Displays
EP1995712A1 (en) Method for applying dithering to video data and display device implementing said method
JP2016045393A (en) Image processor, display device, and display method
KR100561719B1 (en) Apparatus for gray scale conversion of video signal and converting method thereof
Russell P. 31: Image Compression for Color‐Sequential LCOS, with Decompression at the Retina
KR20080092103A (en) Image processing apparatus and image processing method thereof