Embodiments
Please refer to Fig. 1, which is a flowchart of an image processing method according to an embodiment of the present invention. The method first executes step S11, in which an image comprising a plurality of regions is provided. For example, an image of 1200*900 pixels may be divided into a plurality of small regions each containing 3*1 pixels. This division is virtual; it does not mean that physical dividing lines exist in the image, nor must dividing lines be added to the image.
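As a minimal sketch of this virtual division, the following hypothetical helper enumerates the 3*1-pixel regions of a 1200*900 image as index ranges only; the function name and signature are illustrative, not part of the disclosed method:

```python
def iter_regions(width, height, region_w=3, region_h=1):
    """Yield (x, y, w, h) tuples describing each virtual region.

    No separator lines are drawn; a region is just a range of pixel
    indices, matching the text's notion of a purely virtual division.
    """
    for y in range(0, height, region_h):
        for x in range(0, width, region_w):
            yield (x, y, region_w, region_h)

regions = list(iter_regions(1200, 900))
# 1200/3 = 400 regions per row, over 900 rows of 1-pixel-high regions
assert len(regions) == 400 * 900
```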
Next, step S12 determines the grayscale variation degree corresponding to each of these regions. Step S13 then determines, according to these grayscale variation degrees, whether the image is a converted image that has been converted from low resolution to high resolution. As discussed previously, an image that has been enlarged through resolution conversion usually exhibits milder grayscale variation. Therefore, if the image received in step S11 is a converted image, the grayscale variation degrees found in step S12 for most regions should not be very high. Based on this principle, step S13 can judge whether the image has previously been enlarged through resolution conversion.
After step S13, the image processing method according to the present invention may further comprise a step S14 of enhancing image quality. More specifically, if step S13 determines that the image provided in step S11 is a converted image, the method may perform an image quality enhancement process on the image. In addition, the image may belong to a video stream comprising many images. If step S13 judges the image to be a converted image, the video stream containing the image may have undergone low-to-high resolution conversion. Therefore, the image processing method according to the present invention may perform the image quality enhancement process on the video stream containing the image. In practice, the image quality enhancement process may be, but is not limited to, a sharpening process.
Conversely, if step S13 determines that the image provided in step S11 is not a converted image, the method does not perform the image quality enhancement process on the image or on the video stream containing it.
Suppose a certain region in the image comprises a plurality of pixels, each having a grayscale value. Step S12 may calculate a maximum grayscale gap of the region (that is, the result of subtracting the minimum grayscale value from the maximum grayscale value in the region) according to these grayscale values, and use it as the grayscale variation degree of the region. On the other hand, step S13 may first calculate the sum of the grayscale variation degrees of all regions, and then compare this sum with a sum threshold. If the sum is lower than the sum threshold, the overall grayscale variation of the image is on the low side, so the image may have undergone resolution conversion.
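The per-region gap and the whole-image comparison can be sketched as follows; the function names and the example threshold value are assumptions for illustration only:

```python
def max_gray_gap(region_values):
    """Grayscale variation degree of one region: max value minus min value."""
    return max(region_values) - min(region_values)

def is_converted_image(regions, sum_threshold):
    """Flag the image as converted (upscaled) when the summed gaps are low."""
    total = sum(max_gray_gap(r) for r in regions)
    return total < sum_threshold

# Example: mild variation in every region suggests a converted image
smooth = [[100, 101, 102], [102, 103, 103], [103, 104, 105]]
print(is_converted_image(smooth, sum_threshold=20))  # True (2 + 1 + 2 = 5 < 20)
```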
In one embodiment, each region of the image comprises 3 pixels arranged in sequence. The region being processed in step S12 is referred to as the target region, and the target region comprises, in sequence, a first pixel, a second pixel, and a third pixel. The first pixel has a first grayscale value P1, the second pixel has a second grayscale value P2, and the third pixel has a third grayscale value P3. minmax(P1, P2, P3) represents the maximum grayscale gap obtained by subtracting the minimum of the three grayscale values from the maximum. med(P1, P2, P3) represents the median of the three grayscale values; abs[P2-med(P1, P2, P3)] represents the absolute difference between P2 and the median.
Fig. 2 shows a method for determining the grayscale variation degree of the target region according to the maximum grayscale gap minmax(P1, P2, P3) and the median grayscale value med(P1, P2, P3). First, step S201 calculates the maximum grayscale gap minmax(P1, P2, P3). Then, step S202 compares the maximum grayscale gap minmax(P1, P2, P3) with a first threshold T1. If minmax(P1, P2, P3) is greater than the first threshold T1, the range formed by these three pixels has a certain degree of grayscale variation, and the method executes step S204 to set a first assessed value to A1. If minmax(P1, P2, P3) is less than the first threshold T1, the method executes step S203 to set the first assessed value to A0, where A1 is greater than A0. In other words, when the maximum grayscale gap minmax(P1, P2, P3) is higher, the corresponding first assessed value is larger.
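Steps S201 through S204 can be sketched in a few lines; the concrete values for T1, A0, and A1 are assumptions (the text only requires A1 > A0):

```python
def first_assessed_value(p1, p2, p3, t1=10, a0=0, a1=1):
    """Map the maximum grayscale gap of (P1, P2, P3) to A0 or A1."""
    minmax = max(p1, p2, p3) - min(p1, p2, p3)  # step S201
    return a1 if minmax > t1 else a0            # steps S202-S204

print(first_assessed_value(100, 130, 105))  # minmax = 30 > 10, so prints 1
print(first_assessed_value(100, 104, 102))  # minmax = 4 < 10, so prints 0
```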
Step S205 calculates the absolute difference abs[P2-med(P1, P2, P3)]. If this result equals zero, P2 equals the median grayscale value med(P1, P2, P3). That is to say, although the range formed by these three pixels has a certain degree of grayscale variation (the determination result of step S202), the three values P1, P2, P3 are arranged in sequence from low to high or from high to low. It can thus be seen that when P2 equals the median grayscale value med(P1, P2, P3), no low-high-low or high-low-high grayscale variation occurs within this range. Conversely, if the absolute difference abs[P2-med(P1, P2, P3)] is higher, the low-high-low or high-low-high grayscale variation in the region is more drastic.
Step S206 compares the absolute difference abs[P2-med(P1, P2, P3)] with a second threshold T2. If abs[P2-med(P1, P2, P3)] is less than the second threshold T2, the method executes step S207 to set a second assessed value to B0. If abs[P2-med(P1, P2, P3)] is greater than the second threshold T2, the method executes step S208 to compare abs[P2-med(P1, P2, P3)] with a third threshold T3, where the third threshold T3 is higher than the second threshold T2.
If absolute difference abs[P2-med (P1, P2, P3)] greater than the 3rd threshold value T3, second assessed value can be set as B3 in step S209.If absolute difference abs[P2-med (P1, P2, P3)] less than the 3rd threshold value T3, this method can continue execution in step S210, comparing difference absolute value abs[P2-med (P1, P2, P3)] with the size of one the 4th threshold value T4.The 4th threshold value T4 is between the second threshold value T2 and the 3rd threshold value T3.
If absolute difference abs[P2-med (P1, P2, P3)] greater than the 4th threshold value T4, second assessed value can be set as B2 in step S211.If absolute difference abs[P2-med (P1, P2, P3)] less than the 4th threshold value T4, this method meeting execution in step S212 is made as B1 with this second assessed value.
B0, B1, B2, B3 are four values that increase in sequence. For example, these four values may be set to 0, 1, 4, and 32, respectively. As can be seen from Fig. 2, the higher the absolute difference abs[P2-med(P1, P2, P3)], the larger the second assessed value generated. The second assessed value can be regarded as a weighted value corresponding to the absolute difference abs[P2-med(P1, P2, P3)].
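The decision chain of steps S205 through S212 can be sketched as a threshold ladder. The threshold values T2, T4, T3 below are assumptions (the text only requires T2 < T4 < T3); B0 through B3 use the 0, 1, 4, 32 example values from the text:

```python
def second_assessed_value(p1, p2, p3, t2=5, t4=15, t3=30, b=(0, 1, 4, 32)):
    """Map abs[P2 - med(P1, P2, P3)] onto the ascending values B0..B3."""
    med = sorted((p1, p2, p3))[1]   # median grayscale value
    diff = abs(p2 - med)            # step S205
    if diff < t2:                   # step S206 -> S207
        return b[0]
    if diff > t3:                   # step S208 -> S209
        return b[3]
    if diff > t4:                   # step S210 -> S211
        return b[2]
    return b[1]                     # step S212

print(second_assessed_value(100, 102, 104))  # diff = 0  -> B0 = 0
print(second_assessed_value(100, 150, 110))  # diff = 40 -> B3 = 32
print(second_assessed_value(100, 125, 105))  # diff = 20 -> B2 = 4
print(second_assessed_value(100, 112, 105))  # diff = 7  -> B1 = 1
```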
The image processing method according to the present invention may assess the grayscale variation degree of the target region according to both the maximum grayscale gap minmax(P1, P2, P3) and the absolute difference abs[P2-med(P1, P2, P3)], or may select only one of these two values as the basis for judging the grayscale variation degree of the target region.
The image provided in step S11 comprises a plurality of regions. In step S12, the image processing method according to the present invention may execute the steps shown in Fig. 2 for each region of the image, so as to find the first assessed value and the second assessed value corresponding to the grayscale variation degree of each region. In step S13, the method may sum the first assessed values and the second assessed values of all regions as the basis for assessing whether the whole image is a converted image. If the summed first assessed values and the summed second assessed values both indicate that the overall grayscale variation of the image is insufficiently large, the image processing method according to the present invention may determine that the image has previously been enlarged through resolution conversion. In one embodiment, the image processing method of the present invention may judge whether the image has been enlarged through resolution conversion according to only a part of the image, for instance, according to 1/4, 1/2, or 2/3 of the image content.
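Putting the per-region assessments and the step S13 aggregation together, a self-contained sketch might look as follows. All thresholds (T1 through T4 and the two sum thresholds) and the A0/A1 values are assumptions; the image is judged converted only when both sums fall below their thresholds:

```python
def judge_image(regions, first_sum_threshold, second_sum_threshold,
                t1=10, t2=5, t4=15, t3=30, a=(0, 1), b=(0, 1, 4, 32)):
    """Return True when the image is judged to be a converted (upscaled) image."""
    s1 = s2 = 0
    for p1, p2, p3 in regions:
        # First assessed value: maximum grayscale gap against T1
        minmax = max(p1, p2, p3) - min(p1, p2, p3)
        s1 += a[1] if minmax > t1 else a[0]
        # Second assessed value: |P2 - median| against T2/T4/T3
        diff = abs(p2 - sorted((p1, p2, p3))[1])
        if diff < t2:
            s2 += b[0]
        elif diff > t3:
            s2 += b[3]
        elif diff > t4:
            s2 += b[2]
        else:
            s2 += b[1]
    return s1 < first_sum_threshold and s2 < second_sum_threshold

smooth = [(100, 101, 102), (102, 103, 103), (103, 104, 105)]
print(judge_image(smooth, first_sum_threshold=2, second_sum_threshold=2))  # True
```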
Please refer to Fig. 3(A), which is a block diagram of an image processing apparatus according to an embodiment of the present invention. The image processing apparatus mainly comprises a judging unit 31 and an adjustment unit 32. The judging unit 31 receives an image comprising a plurality of regions, determines the grayscale variation degree corresponding to each of these regions, and judges, according to these grayscale variation degrees, whether the image is a converted image that has been converted from low resolution to high resolution. If the judging unit 31 judges that the image is a converted image, the adjustment unit 32 performs an image quality enhancement process on the image or on the video stream containing the image. In practice, the image quality enhancement process may be, but is not limited to, a sharpening process.
Fig. 3(B) illustrates an implementation example of the judging unit 31. In this example, the judging unit 31 comprises a first calculating circuit 31A, a first deciding circuit 31B, a first comparing circuit 31C, a second calculating circuit 31D, a second deciding circuit 31E, an adding circuit 31F, and a second comparing circuit 31G.
The first calculating circuit 31A executes step S201 in Fig. 2; that is, it calculates the maximum grayscale gap minmax(P1, P2, P3) of the target region according to the three grayscale values in the target region. The first deciding circuit 31B executes steps S202 through S204; that is, it determines a first indicator of the grayscale variation degree of the target region, i.e., the first assessed value, according to minmax(P1, P2, P3).
The first comparing circuit 31C compares the first grayscale value, the second grayscale value, and the third grayscale value, and selects the median among them. In other words, the first comparing circuit 31C is responsible for calculating med(P1, P2, P3). The second calculating circuit 31D calculates abs[P2-med(P1, P2, P3)]. The second deciding circuit 31E executes steps S206 through S212, determining a second indicator of the grayscale variation degree of the target region, i.e., the second assessed value, according to abs[P2-med(P1, P2, P3)].
The adding circuit 31F calculates the sum of the first assessed values and the sum of the second assessed values over all regions, respectively. The second comparing circuit 31G compares the first assessed-value sum with a first sum threshold, and compares the second assessed-value sum with a second sum threshold. If both sums are lower than their corresponding thresholds, the judging unit 31 judges that the image is a converted image.
The functional block diagram shown in Fig. 3(B) is a preferred embodiment of the judging unit. In practice, the judging unit 31 may comprise only the first calculating circuit 31A, the first deciding circuit 31B, the adding circuit 31F, and the second comparing circuit 31G, assessing whether the image is a converted image by the sum of the first assessed values alone. Alternatively, the judging unit 31 may comprise only the first comparing circuit 31C, the second calculating circuit 31D, the second deciding circuit 31E, the adding circuit 31F, and the second comparing circuit 31G, assessing whether the image is a converted image by the sum of the second assessed values alone.
Please refer to Fig. 4(A), which is a flowchart of an image processing method according to another embodiment of the present invention. The method first executes step S41, in which a video stream is provided. Then, step S42 judges whether the video stream comprises M converted images that have been converted from low resolution to high resolution, where M is a positive integer. If the result of step S42 is affirmative, the method executes step S43 to perform an image quality enhancement process (for example, a sharpening process) on the video stream. Conversely, if the result of step S42 is negative, the method re-executes step S42 and continues to detect whether the video stream comprises converted images.
When M equals 1, the method executes step S43 as soon as one converted image is detected in the video stream. When M is greater than 1, step S42 must detect more than one converted image in the video stream before the method executes step S43. In practice, step S42 has two different possibilities. The first is to judge whether the video stream comprises M consecutive converted images. The second is to judge whether the video stream comprises M converted images in total, regardless of whether those M converted images are consecutive in the video stream.
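The two interpretations of step S42 can be sketched as follows. Here `flags` stands in for the per-image judgment results (True meaning "converted image"), and the function name and `consecutive` parameter are assumptions for illustration:

```python
def has_m_converted(flags, m, consecutive=True):
    """Detect M converted images, either as a consecutive run or in total."""
    if not consecutive:
        return sum(flags) >= m      # M anywhere in the stream
    run = 0
    for is_converted in flags:
        run = run + 1 if is_converted else 0
        if run >= m:                # M in a row
            return True
    return False

flags = [True, False, True, True, True, False]
print(has_m_converted(flags, 3, consecutive=True))   # True: a run of 3 exists
print(has_m_converted(flags, 5, consecutive=False))  # False: only 4 in total
```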
Fig. 4(B) is an extended example of the image processing method shown in Fig. 4(A). In this example, if the result of step S42 is affirmative, then in addition to step S43 the method also executes step S44, which judges whether the video stream comprises P non-converted images after the M converted images, where P is also a positive integer. If the result of step S44 is affirmative, either the previous judgment was erroneous, or the following segment of the video stream is no longer composed of converted images. Therefore, if the result of step S44 is affirmative, the method executes step S45 to stop the image quality enhancement process. Conversely, if the result of step S44 is negative, the method re-executes step S44 and continues to detect whether the video stream comprises non-converted images.
When P equals 1, the method executes step S45 as soon as one non-converted image is detected in the video stream. When P is greater than 1, step S44 must detect more than one non-converted image in the video stream before the method executes step S45. In practice, step S44 has two different possibilities. The first is to judge whether the video stream comprises P consecutive non-converted images. The second is to judge whether the video stream comprises P non-converted images in total, regardless of whether those P non-converted images are consecutive in the video stream.
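The combined behaviour of Fig. 4(B) can be sketched as a simple state machine: enhancement switches on after M converted images (steps S42/S43) and switches off again after P non-converted images (steps S44/S45). Treating both counts as consecutive runs is one of the two options the text allows; the function name is an assumption:

```python
def enhancement_states(flags, m, p):
    """Yield, per image, whether quality enhancement is currently active."""
    enhancing = False
    run_conv = run_non = 0
    for is_converted in flags:
        run_conv = run_conv + 1 if is_converted else 0
        run_non = 0 if is_converted else run_non + 1
        if not enhancing and run_conv >= m:
            enhancing = True          # step S43: start enhancement
        elif enhancing and run_non >= p:
            enhancing = False         # step S45: stop enhancement
        yield enhancing

flags = [True, True, False, False, True]
print(list(enhancement_states(flags, m=2, p=2)))
# [False, True, True, False, False]
```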
In practice, steps S42 and S44 may each use the process steps illustrated in Fig. 1 and Fig. 2 to judge whether each image in the video stream is a converted image that has undergone resolution conversion.
Please refer to Fig. 5, which is a block diagram of an image processing apparatus according to another embodiment of the present invention. The image processing apparatus mainly comprises a judging unit 51 and an adjustment unit 52. The judging unit 51 judges whether a video stream comprises M converted images that have been converted from low resolution to high resolution. If the judging unit 51 judges that the video stream comprises the M converted images, the adjustment unit 52 performs an image quality enhancement process (for example, a sharpening process) on the video stream.
In addition, after judging that the video stream comprises the M converted images, the judging unit 51 may continue to judge whether the video stream comprises P non-converted images after the M converted images. If the judging unit 51 judges that the video stream comprises the P non-converted images, the adjustment unit 52 may stop the image quality enhancement process.
As in the previous embodiments, the M converted images may be consecutive or non-consecutive in the video stream, and the P non-converted images may likewise be consecutive or non-consecutive in the video stream. In addition, the judging unit 51 may use the circuits of Fig. 3(B) to judge whether each image in the video stream is a converted image that has undergone resolution conversion.
As described above, the present invention proposes a method and apparatus for judging whether an image has previously undergone low-to-high resolution conversion, and also proposes a method and apparatus for enhancing the image quality of the video stream containing such an image, so as to solve the problem of poor image quality after resolution conversion.
The above detailed description of the preferred embodiments is intended to more clearly describe the features and spirit of the present invention, and is not intended to limit the scope of the present invention to the preferred embodiments disclosed above. On the contrary, the intention is that various changes and equivalent arrangements within the scope of the claims of the present invention be covered.