CN110636304B - YCbCr444 and YCbCr422 conversion method - Google Patents

YCbCr444 and YCbCr422 conversion method

Info

Publication number
CN110636304B
CN110636304B
Authority
CN
China
Prior art keywords
value
mode
component
ycbcr444
ycbcr422
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201911012854.0A
Other languages
Chinese (zh)
Other versions
CN110636304A (en)
Inventor
章波
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Vtron Group Co Ltd
Original Assignee
Vtron Group Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Vtron Group Co Ltd filed Critical Vtron Group Co Ltd
Priority to CN201911012854.0A priority Critical patent/CN110636304B/en
Publication of CN110636304A publication Critical patent/CN110636304A/en
Application granted granted Critical
Publication of CN110636304B publication Critical patent/CN110636304B/en


Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N1/00Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
    • H04N1/46Colour picture communication systems
    • H04N1/56Processing of colour picture signals
    • H04N1/60Colour correction or control
    • H04N1/6002Corrections within particular colour systems
    • H04N1/6008Corrections within particular colour systems with primary colour signals, e.g. RGB or CMY(K)
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/10Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
    • H04N19/169Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding
    • H04N19/186Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding the unit being a colour or a chrominance component
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/64Circuits for processing colour signals

Abstract

The invention provides a method for converting between YCbCr444 and YCbCr422, which comprises the following steps: dividing the image into sub-blocks of 2x2 pixel points, dividing the four pixel points in each sub-block into two groups to obtain 7 grouping modes, and calculating the Y component coding value of each group of pixel points in each grouping mode; calculating the deviation value between the Y component coding values and the Y components of the corresponding pixel points in each grouping mode, and selecting the grouping mode with the minimum deviation value as the coding/decoding mode of the sub-block; carrying out Cb coding and Cr coding on the YCbCr444 data according to the coding mode to obtain YCbCr422 data; performing Cb decoding and Cr decoding on the YCbCr422 data according to the decoding mode to obtain YCbCr444 data. Compared with the traditional conversion method, the YCbCr444 and YCbCr422 conversion method provided by the invention produces a smaller deviation between the converted data and the original data and can effectively improve image quality.

Description

YCbCr444 and YCbCr422 conversion method
Technical Field
The invention relates to the field of color space, in particular to a YCbCr444 and YCbCr422 conversion method.
Background
The RGB model is a commonly used way of expressing color information: it quantifies color by the brightness of the three primary colors red, green and blue, and each pixel in an image is composed of the three RGB components. The YCbCr color space is an internationally standardized variant of YUV, where Y is the luminance component, Cb the blue chrominance component and Cr the red chrominance component; it is used in digital television and image compression. The human eye is more sensitive to the Y component of video, so after the chrominance components are reduced by sub-sampling, the human eye does not perceive a change in image quality. The main sub-sampling formats are YCbCr4:2:0, YCbCr4:2:2 and YCbCr4:4:4: 4:2:0 means 4 luminance components and 2 chrominance components per 4 pixels (YYYYCbCr), with chroma sampled only on odd scan lines, and is the most common format for portable video devices (MPEG-4) and video conferencing (H.263); 4:2:2 means 4 luminance components and 4 chrominance components per 4 pixels (YYYYCbCrCbCr), and is the most common format for DVD, digital television, HDTV and other consumer video devices; 4:4:4 denotes full chroma resolution (YYYYCbCrCbCrCbCrCbCr) and is used for high-quality video applications, studios and professional video production.
In image processing, the RGB color space is usually converted to YCbCr4:4:4 first; then, exploiting the fact that the human eye is insensitive to the color components Cb and Cr, the color information to which the eye is relatively insensitive is compressed and sub-sampled. By compressing the amount of Cb and Cr data, the YCbCr4:4:4 format is converted to YCbCr4:2:2, yielding a smaller file for playback and transmission and thereby reducing the data bandwidth.
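As background only (not part of the claimed method), a minimal C sketch of the commonly used BT.601 full-range RGB-to-YCbCr conversion is shown below; the coefficients are approximate, and the helper names rgb_to_ycbcr and clamp_u8 are assumptions made for illustration, since the exact matrix and value range depend on the standard in use.

#include <stdint.h>

/* Clamp an intermediate result to the 8-bit range. */
static uint8_t clamp_u8(int v) { return v < 0 ? 0 : (v > 255 ? 255 : (uint8_t)v); }

/* Approximate BT.601 full-range RGB -> YCbCr for one pixel (illustrative only). */
static void rgb_to_ycbcr(uint8_t r, uint8_t g, uint8_t b,
                         uint8_t *y, uint8_t *cb, uint8_t *cr)
{
    *y  = clamp_u8((int)( 0.299 * r + 0.587 * g + 0.114 * b + 0.5));
    *cb = clamp_u8((int)(-0.169 * r - 0.331 * g + 0.500 * b + 128.5));
    *cr = clamp_u8((int)( 0.500 * r - 0.419 * g - 0.081 * b + 128.5));
}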
A commonly used method of converting YCbCr4:4:4 to YCbCr4:2:2 is to take the average of two horizontally adjacent points and use that average to replace the Cb value or the Cr value of the original two pixels. For images captured by a camera, this processing does not produce a difference that is obvious to the human eye. For text images or drawings output by a computer, however, the difference is very noticeable: computer-generated images often contain single-pixel colored lines, and after two-point average conversion a single-pixel colored line becomes a two-pixel line, so character boundaries become blurred.
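A minimal C sketch of this conventional two-point-averaging conversion for one scan line is given below; the array names y444, cb444, cr444, y422, cb422, cr422 and the assumption of an even line width are illustrative only.

#include <stdint.h>

/* Conventional YCbCr4:4:4 -> 4:2:2: keep luma, average each horizontal pair of chroma samples. */
static void ycbcr444_to_422_avg(const uint8_t *y444, const uint8_t *cb444,
                                const uint8_t *cr444, int width,
                                uint8_t *y422, uint8_t *cb422, uint8_t *cr422)
{
    for (int x = 0; x < width; x += 2) {
        y422[x]      = y444[x];                                    /* luma unchanged */
        y422[x + 1]  = y444[x + 1];
        cb422[x / 2] = (uint8_t)((cb444[x] + cb444[x + 1]) / 2);   /* one Cb per pixel pair */
        cr422[x / 2] = (uint8_t)((cr444[x] + cr444[x + 1]) / 2);   /* one Cr per pixel pair */
    }
}

It is exactly this fixed horizontal pairing that blurs single-pixel detail, as discussed below.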
Disclosure of Invention
The invention provides a YCbCr444 and YCbCr422 conversion method, which solves the problem of blurred image boundaries caused by the prior-art approach of converting the YCbCr4:4:4 format into the YCbCr4:2:2 format by two-point averaging.
In order to solve the technical problems, the technical scheme of the invention is as follows:
a YCbCr444 and YCbCr422 conversion method is characterized by comprising the following steps:
dividing an image into subblocks of 2x2 pixel points, dividing four pixel points in the subblocks into two groups to obtain 7 grouping modes, and calculating Y component coding values of each group of pixel points in each grouping mode;
calculating the deviation value of the Y component coding value and the Y component of the corresponding pixel point in each grouping mode, and selecting the grouping mode with the minimum deviation value to determine the coding/decoding mode of the subblock;
Cb coding and Cr coding are carried out on the YCbCr444 data according to the coding mode, and the YCbCr422 value of the sub-block is obtained from the original Y component values and the coded CbCr values;
and performing Cb decoding and Cr decoding on the YCbCr422 data according to the decoding mode, and obtaining the YCbCr444 sub-block from the original Y component values and the decoded CbCr values.
Optionally, calculating the Y component encoding value of each group of pixel points in each grouping mode further includes: the Y component encoding value is the average value of the Y components of each group of pixel points in each grouping mode.
Optionally, calculating the deviation value between the Y component encoded value and the Y component of the corresponding pixel point in each grouping mode further includes: the deviation value is the sum of squares of the differences between the Y component encoded value and the Y component of the corresponding pixel point.
Optionally, selecting the grouping mode with the minimum deviation value as the coding/decoding mode of the subblock further includes: the image quality distortion of the sub-blocks is minimal in the coding mode.
Optionally, Cb encoding the YCbCr444 data according to the encoding mode further includes: calculating the Cb encoding value of each group of pixel points in the encoding mode using an averaging method.
Optionally, Cr encoding the YCbCr444 data according to the encoding mode further includes: calculating the Cr encoding value of each group of pixel points in the encoding mode using an averaging method.
Optionally, obtaining the YCbCr422 value of the sub-block from the original Y component value and the encoded CbCr value further includes: obtaining the converted YCbCr422 value from four original values of the Y component, two encoded values of the Cb component and two encoded values of the Cr component.
Optionally, Cb decoding the YCbCr422 data according to the decoding mode further includes: taking the Cb value of each group of pixel points in the decoding mode as the Cb component value of the corresponding pixel points.
Optionally, Cr decoding the YCbCr422 data according to the decoding mode further includes: taking the Cr value of each group of pixel points in the decoding mode as the Cr component value of the corresponding pixel points.
Optionally, obtaining the YCbCr444 sub-block from the original Y component value and the decoded CbCr value further includes: obtaining the converted YCbCr444 value from four original values of the Y component, four decoded values of the Cb component and four decoded values of the Cr component.
The invention provides a YCbCr444 and YCbCr422 conversion method, which comprises: dividing an image into sub-blocks of 2x2 pixel points, dividing the four pixel points in a sub-block into two groups to obtain 7 grouping modes, and calculating the Y component coding value of each group of pixel points in each grouping mode; calculating the deviation value between the Y component coding values and the Y components of the corresponding pixel points in each grouping mode, and selecting the grouping mode with the minimum deviation value as the coding mode of the sub-block; carrying out Cb coding and Cr coding on the YCbCr444 data according to the coding mode, and obtaining the YCbCr422 value of the sub-block from the original Y component values and the coded CbCr values; and performing Cb decoding and Cr decoding on the YCbCr422 data according to the decoding mode, the YCbCr444 value of the sub-block being obtained from the original Y component values and the decoded Cb and Cr values. Compared with the traditional color space conversion method, the YCbCr444 and YCbCr422 conversion method provided by the invention takes the direction of image data variation into account, selects the best multipoint-averaging mode according to the variation characteristics of the image, and uses it to calculate the Cb and Cr values in the YCbCr422 data.
Drawings
Fig. 1 is a schematic diagram of a grouping mode of a YCbCr444 and YCbCr422 conversion method according to an embodiment of the present invention.
Fig. 2 is a schematic flow chart of the YCbCr444 to YCbCr422 conversion in the YCbCr444 and YCbCr422 conversion method according to an embodiment of the present invention.
Fig. 3 is a schematic flow chart of the YCbCr422 to YCbCr444 conversion in the YCbCr444 and YCbCr422 conversion method according to an embodiment of the present invention.
Detailed Description
Compared with the traditional simple multipoint averaging method, the YCbCr444 and YCbCr422 conversion method provided by the embodiment of the invention calculates the averages according to the correlation of the image data, so that the deviation between the converted data and the original data is smaller and the image quality can be effectively improved.
To facilitate an understanding of the invention, the invention will now be described more fully with reference to the accompanying drawings. Preferred embodiments of the present invention are shown in the drawings. This invention may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. The terminology used in the description of the invention herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used herein, the term "and/or" includes any and all combinations of one or more of the associated listed items.
The reason the traditional two-point averaging method for converting YCbCr4:4:4 to YCbCr4:2:2 blurs image boundaries is that, when horizontally adjacent pixels have a large jump in value, the average of the two points deviates considerably from the original values, so the image boundary appears blurred.
For example, as shown in Table 1, the image is four pixels in succession, alternating white, red, white and red, with the first and third being white and the second and fourth being red.
Table 1 (image not reproduced)
The corresponding pixel RGB values, YCbCr444 values, and two-point averaged YCbCr422 values are shown in table 2:
Table 2 (image not reproduced)
In YCbCr4:2:2, the new Cb and Cr values are the averages of the left and right points; here this gives a Cb value of 113 and a Cr value of 192 for every pixel, so all four pixels take on the same color and the original red-and-white stripes are blurred.
The embodiment of the invention provides a method for converting YCbCr444 and YCbCr422, which comprises the following steps:
dividing an image into subblocks of 2x2 pixel points, dividing four pixel points in the subblocks into two groups to obtain 7 grouping modes, and calculating Y component coding values of each group of pixel points in each grouping mode;
calculating the deviation value of the Y component coding value and the Y component of the corresponding pixel point in each grouping mode, and selecting the grouping mode with the minimum deviation value to determine the coding/decoding mode of the subblock;
Cb coding and Cr coding are carried out on the YCbCr444 data according to the coding mode, and the YCbCr422 value of the sub-block is obtained from the original Y component values and the coded CbCr values;
and performing Cb decoding and Cr decoding on the YCbCr422 data according to the decoding mode, and obtaining the YCbCr444 sub-block from the original Y component values and the decoded CbCr values.
The YCbCr444 and YCbCr422 conversion method provided by the invention is explained below in terms of the conversion between YCbCr4:4:4 and YCbCr4:2:2:
the conventional YCbCr4:2:2 conversion method only considers the correlation in the horizontal direction of an image, two-point average processing is carried out on a Cb value and a Cr value, and actually, when the image is locally correlated in the vertical direction or other directions, the conventional conversion strategy can cause image blurring.
The YCbCr444 and YCbCr422 conversion method provided by the invention divides the image into small blocks of 2x2 pixel points and, exploiting the fact that image data tend to be distributed in blocks, places adjacent similar pixels into one group, which gives at most seven possible groupings. Each grouping is encoded with the traditional averaging method to obtain seven coded values; the deviation of each coded value from the original data is then calculated, and the grouping mode with the smallest deviation is selected as the current code, i.e. the YCbCr4:2:2 conversion value. The traditional conversion method always uses only one of these seven cases, whose deviation from the original image is not necessarily the smallest, so the image distortion is larger.
The image mode division used by the YCbCr444 and YCbCr422 conversion method provided by the invention is as follows:
as shown in fig. 1, the 2x2 image data block is composed of four pixels, which are respectively designated as Pix0, Pix1, Pix2 and Pix3, wherein an o-circle pixel is a group of pixels with strong correlation (i.e. the adjacent pixels have close values and the difference is small), and a star pixel is another group of pixels with strong correlation. Because the pixel points of the image cannot be isolated, the pixel points of the image are always related to the peripheral pixels, so that at most seven grouping modes can be divided, as shown in fig. 1.
Mode 1: pix0-Pix1 have strong correlation as a group; pix2-Pix3 have a strong correlation as another group;
mode 2: pix0-Pix2 have strong correlation as a group; pix1-Pix3 have a strong correlation as another group;
mode 3: pix0-Pix3 have strong correlation as a group; pix1-Pix2 have a strong correlation as another group;
mode 4: pix0-Pix1-Pix2 have strong correlation as a group; pix3, as another group;
mode 5: pix0-Pix1-Pix3 have strong correlation as a group; pix2, as another group;
mode 6: pix1-Pix2-Pix3 have strong correlation as a group; pix0, as another group;
mode 7: pix0-Pix2-Pix3 have strong correlation as a group; pix1 as another group.
In this embodiment, since the value of the Y component of each pixel remains unchanged during the conversion of YCbCr4:4:4 to YCbCr4:2:2, the Cb component and the Cr component share the encoding mode (Mode) derived from the Y component when the Cb and Cr values are calculated, which simplifies the processing; the encoding flow is shown in Fig. 2.
If the original YCbCr4:4:4 values of the four pixels are:
[Y0 Cb0 Cr0], [Y1 Cb1 Cr1], [Y2 Cb2 Cr2], [Y3 Cb3 Cr3], the steps for converting the YCbCr4:4:4 of the four pixels into YCbCr4:2:2 format are as follows:
1) grouping according to the seven modes, and respectively calculating the average value of the Y component.
Mode1:Y1_o_u=(Y0+Y1)/2; Y1_e_u=(Y2+Y3)/2;
Mode2:Y2_o_u=(Y0+Y2)/2; Y2_e_u=(Y1+Y3)/2;
Mode3:Y3_o_u=(Y0+Y3)/2; Y3_e_u=(Y1+Y2)/2;
Mode4:Y4_o_u=(Y0+Y1+Y2)/3; Y4_e_u=Y3;
Mode5:Y5_o_u=(Y0+Y1+Y3)/3; Y5_e_u=Y2;
Mode6:Y6_o_u=(Y1+Y2+Y3)/3; Y6_e_u=Y0;
Mode7:Y7_o_u=(Y0+Y2+Y3)/3; Y7_e_u=Y1;
When grouping mode Modei is adopted (1 <= i <= 7, i an integer), Yi_o_u represents the Y component mean value of the first group of pixels and Yi_e_u represents the Y component mean value of the second group of pixels.
2) Calculate the sum of squared differences between the coded data and the original data under the seven modes.
Mode1:diff_1=(Y0-Y1_o_u)²+(Y1-Y1_o_u)²+(Y2-Y1_e_u)²+(Y3-Y1_e_u)²
Mode2:diff_2=(Y0-Y2_o_u)²+(Y2-Y2_o_u)²+(Y1-Y2_e_u)²+(Y3-Y2_e_u)²
Mode3:diff_3=(Y0-Y3_o_u)²+(Y3-Y3_o_u)²+(Y1-Y3_e_u)²+(Y2-Y3_e_u)²
Mode4:diff_4=(Y0-Y4_o_u)²+(Y1-Y4_o_u)²+(Y2-Y4_o_u)²+(Y3-Y4_e_u)²
Mode5:diff_5=(Y0-Y5_o_u)²+(Y1-Y5_o_u)²+(Y3-Y5_o_u)²+(Y2-Y5_e_u)²
Mode6:diff_6=(Y1-Y6_o_u)²+(Y2-Y6_o_u)²+(Y3-Y6_o_u)²+(Y0-Y6_e_u)²
Mode7:diff_7=(Y0-Y7_o_u)²+(Y2-Y7_o_u)²+(Y3-Y7_o_u)²+(Y1-Y7_e_u)²
Here diff_i (1 <= i <= 7, i an integer) is the sum of squared differences between the Y component coded values and the Y components of the corresponding pixels when grouping mode i is adopted, i.e. the Y component deviation value of that grouping mode.
3) Select the mode with the smallest sum of squares from 2) as the encoding mode of the current 2x2 image block, denoted Mode.
Diff_min=Min(diff_1,diff_2,diff_3,diff_4,diff_5,diff_6,diff_7);
If(Diff_min==diff_1) Mode=1;
Else If(Diff_min==diff_2) Mode=2;
Else If(Diff_min==diff_3) Mode=3;
Else If(Diff_min==diff_4) Mode=4;
Else If(Diff_min==diff_5) Mode=5;
Else If(Diff_min==diff_6) Mode=6;
Else If(Diff_min==diff_7) Mode=7;
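Steps 1) to 3) can be summarized in the following C sketch; it assumes the group_of table from the earlier sketch and a Y[4] array holding the four luma values of the 2x2 block, and the function name select_mode is illustrative. Floating-point means are used for clarity, whereas a fixed-point implementation would use the integer divisions shown above.

/* Select the grouping mode whose per-group Y means give the smallest
   sum of squared deviations from the original Y values (steps 1-3). */
static int select_mode(const int Y[4])
{
    int best_mode = 1;
    double best_diff = 1e30;
    for (int m = 0; m < 7; m++) {
        double sum[2] = {0, 0};
        int cnt[2] = {0, 0};
        for (int p = 0; p < 4; p++) {            /* step 1: per-group Y means */
            sum[group_of[m][p]] += Y[p];
            cnt[group_of[m][p]]++;
        }
        double mean[2] = { sum[0] / cnt[0], sum[1] / cnt[1] };
        double diff = 0;
        for (int p = 0; p < 4; p++) {            /* step 2: squared deviations */
            double d = Y[p] - mean[group_of[m][p]];
            diff += d * d;
        }
        if (diff < best_diff) {                  /* step 3: keep the smallest (first wins ties) */
            best_diff = diff;
            best_mode = m + 1;
        }
    }
    return best_mode;                            /* mode number 1..7 */
}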
4) Encoding Cb in the encoding mode determined in 3).
If(Mode==1) Cb_o=(Cb0+Cb1)/2; Cb_e=(Cb2+Cb3)/2;
Else if(Mode==2) Cb_o=(Cb0+Cb2)/2; Cb_e=(Cb1+Cb3)/2;
Else if(Mode==3) Cb_o=(Cb0+Cb3)/2; Cb_e=(Cb1+Cb2)/2;
Else if(Mode==4) Cb_o=(Cb0+Cb1+Cb2)/3; Cb_e=Cb3;
Else if(Mode==5) Cb_o=(Cb0+Cb1+Cb3)/3; Cb_e=Cb2;
Else if(Mode==6) Cb_o=(Cb1+Cb2+Cb3)/3; Cb_e=Cb0;
Else if(Mode==7) Cb_o=(Cb0+Cb2+Cb3)/3; Cb_e=Cb1;
In the Mode encoding Mode, Cb _ o represents the Cb value of the first group of pixels, and Cb _ e represents the Cb value of the second group of pixels.
5) Encoding Cr in the encoding mode determined in 3).
If(Mode==1) Cr_o=(Cr0+Cr1)/2; Cr_e=(Cr2+Cr3)/2;
Else if(Mode==2) Cr_o=(Cr0+Cr2)/2; Cr_e=(Cr1+Cr3)/2;
Else if(Mode==3) Cr_o=(Cr0+Cr3)/2; Cr_e=(Cr1+Cr2)/2;
Else if(Mode==4) Cr_o=(Cr0+Cr1+Cr2)/3; Cr_e=Cr3;
Else if(Mode==5) Cr_o=(Cr0+Cr1+Cr3)/3; Cr_e=Cr2;
Else if(Mode==6) Cr_o=(Cr1+Cr2+Cr3)/3; Cr_e=Cr0;
Else if(Mode==7) Cr_o=(Cr0+Cr2+Cr3)/3; Cr_e=Cr1;
In the Mode encoding Mode, Cr _ o represents the Cr value of the first group of pixels, and Cr _ e represents the Cr value of the second group of pixels.
At this point the encoding is complete, and the YCbCr4:2:2 result for the 2x2 sub-block, i.e. for the four pixels, is [Y0 Cb_o], [Y1 Cr_o], [Y2 Cb_e], [Y3 Cr_e].
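Steps 4) and 5) together with this final packing can be sketched as follows; the sketch assumes the select_mode helper and group_of table from the earlier sketches, Cb[4] and Cr[4] arrays holding the original chroma of the block, and an out422[4][2] layout corresponding to [Y0 Cb_o], [Y1 Cr_o], [Y2 Cb_e], [Y3 Cr_e].

/* Encode one 2x2 block: average Cb and Cr within each group of the selected
   mode, then emit the YCbCr4:2:2 samples [Y0 Cb_o][Y1 Cr_o][Y2 Cb_e][Y3 Cr_e]. */
static void encode_block(const int Y[4], const int Cb[4], const int Cr[4],
                         int out422[4][2])
{
    int mode = select_mode(Y);
    int cb_sum[2] = {0, 0}, cr_sum[2] = {0, 0}, cnt[2] = {0, 0};
    for (int p = 0; p < 4; p++) {
        int g = group_of[mode - 1][p];
        cb_sum[g] += Cb[p];
        cr_sum[g] += Cr[p];
        cnt[g]++;
    }
    int cb_o = cb_sum[0] / cnt[0], cb_e = cb_sum[1] / cnt[1];   /* step 4: Cb encoding */
    int cr_o = cr_sum[0] / cnt[0], cr_e = cr_sum[1] / cnt[1];   /* step 5: Cr encoding */
    out422[0][0] = Y[0]; out422[0][1] = cb_o;
    out422[1][0] = Y[1]; out422[1][1] = cr_o;
    out422[2][0] = Y[2]; out422[2][1] = cb_e;
    out422[3][0] = Y[3]; out422[3][1] = cr_e;
}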
The decoding process of converting YCbCr422 into YCbCr444 is shown in the attached figure 3:
Since the Y component is not encoded, the Mode of each 2x2 block is calculated from the Y component in the same way as in the encoding process above, and the CbCr components are restored to the corresponding YCbCr444 values under this Mode (the restoration method is the conventional YCbCr4:2:2 to YCbCr4:4:4 method, i.e. the mean value is used as the decoded value of the corresponding pixels).
The original YCbCr4:2:2 data [Y0 Cb0], [Y1 Cr0], [Y2 Cb2], [Y3 Cr2] are converted into YCbCr4:4:4 format by the following steps:
1) As in steps 1), 2) and 3) of the YCbCr4:4:4 to YCbCr4:2:2 conversion above, determine the decoding Mode of the image block from the Y component;
2) Decode the Cb component according to the decoding mode calculated in 1), outputting the coded value (mean value) as the actual decoded value;
If(Mode==1)
Cb0_o=Cb0;Cb1_o=Cb0;Cb0_e=Cb2;Cb1_e=Cb2;
Else if(Mode==2)
Cb0_o=Cb0;Cb1_o=Cb2;Cb0_e=Cb0;Cb1_e=Cb2;
Else if(Mode==3)
Cb0_o=Cb0;Cb1_o=Cb2;Cb0_e=Cb2;Cb1_e=Cb0;
Else if(Mode==4)
Cb0_o=Cb0;Cb1_o=Cb0;Cb0_e=Cb0;Cb1_e=Cb2;
Else if(Mode==5)
Cb0_o=Cb0;Cb1_o=Cb0;Cb0_e=Cb2;Cb1_e=Cb0;
Else if(Mode==6)
Cb0_o=Cb2;Cb1_o=Cb0;Cb0_e=Cb0;Cb1_e=Cb0;
Else if(Mode==7)
Cb0_o=Cb0;Cb1_o=Cb2;Cb0_e=Cb0;Cb1_e=Cb0;
Cb0_o represents the Cb value of pixel Pix0, Cb1_o the Cb value of pixel Pix1, Cb0_e the Cb value of pixel Pix2, and Cb1_e the Cb value of pixel Pix3.
3) Decoding a Cr component according to the decoding mode calculated in 1), and outputting an encoded value (mean value) as an actual decoded value;
If(Mode==1)
Cr0_o=Cr0;Cr1_o=Cr0;Cr0_e=Cr2;Cr1_e=Cr2;
Else if(Mode==2)
Cr0_o=Cr0;Cr1_o=Cr2;Cr0_e=Cr0;Cr1_e=Cr2;
Else if(Mode==3)
Cr0_o=Cr0;Cr1_o=Cr2;Cr0_e=Cr2;Cr1_e=Cr0;
Else if(Mode==4)
Cr0_o=Cr0;Cr1_o=Cr0;Cr0_e=Cr0;Cr1_e=Cr2;
Else if(Mode==5)
Cr0_o=Cr0;Cr1_o=Cr0;Cr0_e=Cr2;Cr1_e=Cr0;
Else if(Mode==6)
Cr0_o=Cr2;Cr1_o=Cr0;Cr0_e=Cr0;Cr1_e=Cr0;
Else if(Mode==7)
Cr0_o=Cr0;Cr1_o=Cr2;Cr0_e=Cr0;Cr1_e=Cr0;
Cr0_o represents the Cr value of pixel Pix0, Cr1_o the Cr value of pixel Pix1, Cr0_e the Cr value of pixel Pix2, and Cr1_e the Cr value of pixel Pix3.
At this point, decoding is complete, and the corresponding YCbCr4:4:4 data are: [Y0 Cb0_o Cr0_o], [Y1 Cb1_o Cr1_o], [Y2 Cb0_e Cr0_e], [Y3 Cb1_e Cr1_e].
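A matching C decode sketch is given below; it assumes the same group_of table and select_mode helper as the encoding sketches, and an in422[4][2] layout [Y0 Cb0], [Y1 Cr0], [Y2 Cb2], [Y3 Cr2] as produced by the encoder. Because the Y values are unchanged, select_mode returns the same mode as at the encoder, and each pixel simply receives the chroma value of its own group.

/* Decode one 2x2 block back to YCbCr4:4:4: recompute the mode from the
   unencoded Y values, then give every pixel the Cb/Cr of its own group. */
static void decode_block(int in422[4][2], int Y[4], int Cb[4], int Cr[4])
{
    for (int p = 0; p < 4; p++) Y[p] = in422[p][0];      /* Y passes through unchanged */
    int mode = select_mode(Y);                           /* same mode as at the encoder */
    int cb_group[2] = { in422[0][1], in422[2][1] };      /* received Cb_o, Cb_e */
    int cr_group[2] = { in422[1][1], in422[3][1] };      /* received Cr_o, Cr_e */
    for (int p = 0; p < 4; p++) {
        int g = group_of[mode - 1][p];
        Cb[p] = cb_group[g];
        Cr[p] = cr_group[g];
    }
}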
Compared with the traditional color space conversion method, the YCbCr444 and YCbCr422 conversion method provided by the invention takes the direction of image data variation into account, selects the best multipoint-averaging mode according to the variation characteristics of the image, uses it to calculate the CbCr values of the YCbCr422 data, and can effectively improve image quality.
The above examples are only intended to illustrate the technical solution of the present invention, but not to limit it; although the present invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; and such modifications or substitutions do not depart from the spirit and scope of the corresponding technical solutions of the embodiments of the present invention.

Claims (9)

1. A YCbCr444 and YCbCr422 conversion method is characterized by comprising the following steps:
dividing an image into subblocks of 2x2 pixel points, dividing four pixel points in the subblocks into two groups to obtain 7 grouping modes, and calculating Y component coding values of each group of pixel points in each grouping mode;
calculating the deviation value of the Y component coding value and the Y component of the corresponding pixel point in each grouping mode, and selecting the grouping mode with the minimum deviation value to determine the coding/decoding mode of the subblock; the deviation value is the sum of squares of differences between the Y component coding value and the Y component of the corresponding pixel point;
Cb coding and Cr coding are carried out on the YCbCr444 data according to the coding mode, and the YCbCr422 value of the sub-block is obtained from the original Y component values and the coded CbCr values;
and performing Cb decoding and Cr decoding on the YCbCr422 data according to the decoding mode, and obtaining the YCbCr444 sub-block from the original Y component values and the decoded CbCr values.
2. The YCbCr444 and YCbCr422 conversion method according to claim 1, wherein calculating the Y component encoded values for each group of pixels in each grouping mode further comprises: the Y component encoded value is the average value of the Y components of each group of pixel points in each grouping mode.
3. The YCbCr444 and YCbCr422 conversion method according to claim 1, wherein selecting the grouping mode with the smallest deviation value to determine the encoding/decoding mode of said sub-block further comprises: the image quality distortion of the sub-blocks is minimal in the coding mode.
4. The YCbCr444 and YCbCr422 conversion method of claim 1, wherein Cb encoding the YCbCr444 data according to said encoding mode further comprises: calculating the Cb encoding value of each group of pixel points in the encoding mode using an averaging method.
5. The YCbCr444 and YCbCr422 conversion method of claim 1, wherein Cr encoding the YCbCr444 data according to said encoding mode further comprises: calculating the Cr encoding value of each group of pixel points in the encoding mode using an averaging method.
6. The YCbCr444 and YCbCr422 conversion method according to claim 1, wherein obtaining said sub-block YCbCr422 values from the original Y component values and the encoded CbCr values further comprises obtaining the converted YCbCr422 values from four original Y component values, two encoded Cb component values and two encoded Cr component values.
7. The YCbCr444 and YCbCr422 conversion method of claim 1, wherein Cb decoding the YCbCr422 data according to said decoding mode further comprises: taking the Cb value of each group of pixel points in the decoding mode as the Cb component value of the corresponding pixel points.
8. The YCbCr444 and YCbCr422 conversion method of claim 1, wherein Cr decoding the YCbCr422 data according to said decoding mode further comprises: taking the Cr value of each group of pixel points in the decoding mode as the Cr component value of the corresponding pixel points.
9. The YCbCr444 and YCbCr422 conversion method according to claim 1, wherein obtaining said sub-block YCbCr444 values from the original Y component values and the decoded CbCr values further comprises obtaining the converted YCbCr444 values from four original Y component values, four decoded Cb component values and four decoded Cr component values.
CN201911012854.0A 2019-10-23 2019-10-23 YCbCr444 and YCbCr422 conversion method Active CN110636304B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911012854.0A CN110636304B (en) 2019-10-23 2019-10-23 YCbCr444 and YCbCr422 conversion method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911012854.0A CN110636304B (en) 2019-10-23 2019-10-23 YCbCr444 and YCbCr422 conversion method

Publications (2)

Publication Number Publication Date
CN110636304A CN110636304A (en) 2019-12-31
CN110636304B (en) 2021-11-12

Family

ID=68977487

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911012854.0A Active CN110636304B (en) 2019-10-23 2019-10-23 YCbCr444 and YCbCr422 conversion method

Country Status (1)

Country Link
CN (1) CN110636304B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116170592B (en) * 2023-04-21 2023-08-15 深圳市微智体技术有限公司 High-resolution video transmission method, device, equipment and medium

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102970541A (en) * 2012-11-22 2013-03-13 深圳市海思半导体有限公司 Image filtering method and device
CN104768019A (en) * 2015-04-01 2015-07-08 北京工业大学 Adjacent disparity vector obtaining method for multi-texture multi-depth video

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
NO319007B1 (en) * 2003-05-22 2005-06-06 Tandberg Telecom As Video compression method and apparatus
US9979960B2 (en) * 2012-10-01 2018-05-22 Microsoft Technology Licensing, Llc Frame packing and unpacking between frames of chroma sampling formats with different chroma resolutions
CN103747162B (en) * 2013-12-26 2016-05-04 南京洛菲特数码科技有限公司 A kind of image sampling method for external splicer or hybrid matrix
CN106464887B (en) * 2014-03-06 2019-11-29 三星电子株式会社 Picture decoding method and its device and image encoding method and its device
JP6344082B2 (en) * 2014-06-19 2018-06-20 株式会社ニコン Encoding device, decoding device, encoding method, and decoding method


Also Published As

Publication number Publication date
CN110636304A (en) 2019-12-31


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant