CN101626508A - Method for judging blockage area in frame rate promotion - Google Patents

Method for judging blockage area in frame rate promotion

Info

Publication number
CN101626508A
Authority
CN
China
Prior art keywords
block
frame
motion vector
blocks
needs
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN200810132760A
Other languages
Chinese (zh)
Inventor
宋海彬
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
BAOLI MICRO-ELECTRONICS SYSTEM HOLDING Co Ltd KY
Original Assignee
BAOLI MICRO-ELECTRONICS SYSTEM HOLDING Co Ltd KY
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by BAOLI MICRO-ELECTRONICS SYSTEM HOLDING Co Ltd KY filed Critical BAOLI MICRO-ELECTRONICS SYSTEM HOLDING Co Ltd KY
Priority to CN200810132760A priority Critical patent/CN101626508A/en
Publication of CN101626508A publication Critical patent/CN101626508A/en
Pending legal-status Critical Current

Landscapes

  • Image Analysis (AREA)
  • Compression Or Coding Systems Of Tv Signals (AREA)

Abstract

To locate occlusion areas accurately, the invention provides a method for determining occlusion areas during frame rate up-conversion that can locate occlusion areas effectively and further distinguish covering areas from uncovering areas. The method comprises the following steps: divide the video frame to be interpolated into equal blocks; extend the motion vector of the block under test on the interpolated frame to the adjacent previous and next frames to obtain two blocks; and compute the similarity between these two blocks, the similarity being based on statistics of the pixel attribute values in the two blocks from the 1st to the n-th order, with n ≥ 2. If the similarity reaches the preset threshold, that is, when the difference between the two blocks reaches a predetermined degree, the block under test lies in an occlusion area. The method applies to the field of video frame interpolation and gives good processing results.

Description

Method for determining occlusion areas in frame rate up-conversion
Technical field
The present invention relates to video image processing, and in particular to frame rate up-conversion of video.
Background art
The large-scale adoption of digital video and multimedia has brought a diversity of display formats, so conversion between these formats has become very important. Frame rate up-conversion (FRUC) is a technique that converts a low frame-rate video sequence into a high frame-rate one. FRUC has wide applications; two examples follow.
One of application of frame per second method for improving: panel TV (as LCD) is popular on a large scale now, the frame per second of raw video image is 50 or 60Hz, be equivalent to every two field picture and will on screen, have 20 or 16.67ms, when video image content has tangible motion, when watching LCD, people's subjectivity can feel to have jitter phenomenon to occur.And employing FRUC technology, original frame per second can be promoted one times, reach 100 or 120HZ, the demonstration time of so every two field picture just dropped to 10 or 8.3ms about, can promote subjective effect greatly, the motion of video content also can seem more continuously and nature.
Second application of frame rate up-conversion: film is displayed at 24 Hz while TV is broadcast at 60 Hz, so playing film content on TV requires converting it from 24 Hz to 60 Hz, and this conversion must take motion into account. This is also a main application of FRUC.
FRUC methods fall broadly into two classes. The first class uses frame repetition, linear interpolation, or insertion of gray or black frames. These methods share the drawback of ignoring the motion information between adjacent frames, so they produce strong jagging and judder at the edges of moving objects, degrading image quality severely. These shortcomings can be avoided by motion-compensated frame interpolation (MCFI), based on motion estimation and motion compensation, which is the second class of methods introduced below and is what distinguishes it from the first class.
The second class of FRUC methods performs interpolation based on motion estimation and motion compensation: if the motion between two frames can be estimated accurately, interpolation can simply be carried out along the estimated motion direction. In occlusion regions of the image, however, the pixel information to be interpolated exists only in the previous frame or only in the next frame; such pixels find no proper match between the two frames, so the motion vectors (MVs) obtained by ordinary motion estimation methods are unreliable and incorrect in occlusion areas. Fig. 1 illustrates the concept of the occlusion region. The two vertical parallel solid lines in Fig. 1 represent two consecutive frames, frame N-1 and frame N; a frame is to be interpolated at the position of the vertical dashed line. The dashed box in frame N-1 pointed to by reference sign 2 marks a moving object; in frame N the coordinates of this object in the image have changed (suppose the object moves straight upward). The two regions pointed to by reference sign 1 in the frame to be interpolated then have no image information directly available in frame N-1 or frame N respectively; the regions pointed to by sign 1 are the occlusion areas. Occlusion areas can be further divided into the covering area 1.1 and the uncovering area 1.2: the content of the covering area is present in the previous frame but has no corresponding image information in the next frame, while the content of the uncovering area has no corresponding information in the previous frame and appears only in the next frame.
If occlusion areas are simply ignored, the direct consequence for FRUC is an annoying halo artifact at the edges of moving objects. Solving the occlusion problem is therefore essential in FRUC.
Solving the occlusion problem takes two steps: first, locate the occlusion areas; second, within them, distinguish covering areas from uncovering areas. After these two steps, different regions can be assigned different motion vectors (MVs) and interpolation modes, achieving ideal interpolation.
Existing occlusion detection mainly uses the SAD (Sum of Absolute Differences) method. Its principle is block-based matching: a frame of video is divided into blocks, each containing a number of pixels (e.g. 8 × 8), and each block is the basic unit of image processing. The SAD method takes a block on frame A as the reference and looks for the block that matches it on another frame B. The process is as follows: many candidate blocks on frame B (each corresponding to a candidate motion vector) are matched against the reference block on frame A until the candidate with the minimum matching value (in other words, the best match) is found; the motion vector of that candidate is taken as the final motion vector.
In the practical computation of block matching, finding the best match is exactly the process of finding the candidate that minimizes the SAD, the sum of absolute differences. The candidate motion vector that minimizes the SAD is the final motion vector of the block.
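The SAD block-matching search described above can be sketched in Python. This is a minimal full-search illustration under assumed parameters (8 × 8 blocks, a small search window); the function names and ranges are ours, not the patent's:

```python
import numpy as np

def sad(block_a, block_b):
    """Sum of Absolute Differences between two equal-sized blocks."""
    return int(np.abs(block_a.astype(np.int32) - block_b.astype(np.int32)).sum())

def best_motion_vector(frame_a, frame_b, x, y, block=8, search=4):
    """Full search: try every candidate MV in the search window and keep
    the one whose candidate block in frame_b gives the minimum SAD
    against the reference block at (x, y) in frame_a."""
    ref = frame_a[y:y + block, x:x + block]
    h, w = frame_b.shape
    best_mv, best_sad = (0, 0), None
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            yy, xx = y + dy, x + dx
            if 0 <= yy and yy + block <= h and 0 <= xx and xx + block <= w:
                cand = frame_b[yy:yy + block, xx:xx + block]
                s = sad(ref, cand)
                if best_sad is None or s < best_sad:
                    best_mv, best_sad = (dx, dy), s
    return best_mv, best_sad
```

In an occlusion area even the best candidate leaves a large residual SAD, which is exactly the signal the SAD-based detection relies on.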
If a moving object appears in both frames, a proper match can be found between them, and the computed SAD will not be very large. In an occlusion area, however, no identical pixels exist in both frames to be matched, so the motion vector found by block matching in the occlusion area is wrong, and the SAD corresponding to that wrong vector will be very large. In practice the threshold is difficult to choose, and the SAD method cannot pick out all occlusion areas.
Summary of the invention
To locate occlusion areas accurately, the invention provides a method for determining occlusion areas in frame rate up-conversion that locates occlusion areas effectively and, on that basis, distinguishes covering areas from uncovering areas.
The technical scheme of the invention is as follows:
The method for determining occlusion areas in frame rate up-conversion comprises the following steps:
Divide the video frame to be interpolated into blocks. Extend the motion vector of the block under test on the interpolated frame to the adjacent previous and next frames to obtain two blocks. Compute the similarity between these two blocks, the similarity being based on statistics, from the 1st to the n-th order with n ≥ 2, of the pixel attribute values in the two blocks. If the similarity reaches the preset threshold, that is, when the difference between the two blocks reaches a predetermined degree, the block under test lies in an occlusion area.
The similarity is the S index; if the S index is below the preset threshold, the block lies in an occlusion area. The S index is computed by the following steps:
Compute the motion vector of the block under test and obtain the two blocks produced by the extension;
Compute the means $u_X$ and $u_Y$ of the pixel attribute values in the two blocks obtained by the extension;
Compute the variances $\sigma_X^2$ and $\sigma_Y^2$ of the pixel attribute values in the two blocks obtained by the extension;
Compute the covariance $\sigma_{XY}$ of the pixel attribute values of the two blocks obtained by the extension;
Let $l(X,Y)=\frac{2u_X u_Y}{u_X^2+u_Y^2}$, $c(X,Y)=\frac{2\sigma_X \sigma_Y}{\sigma_X^2+\sigma_Y^2}$, $s(X,Y)=\frac{\sigma_{XY}}{\sigma_X \sigma_Y}$;
Compute $S(X,Y)=[l(X,Y)]^{\alpha}\times[c(X,Y)]^{\beta}\times[s(X,Y)]^{\gamma}$, where α, β and γ are empirical values; S(X, Y) is the S index.
The method for determining occlusion areas in frame rate up-conversion further comprises a step of checking whether the motion vector of the block under test is continuous with the motion vectors of its neighboring blocks: compute the motion vector difference between the block under test and its neighbors; if the difference is greater than or equal to a threshold, the motion vector of the block under test is considered discontinuous with those of its neighbors. If the S index indicates that the block lies in an occlusion area, and its motion vector is discontinuous with those of its neighbors, then the confidence that the block lies in an occlusion area is higher than that obtained from the S index alone.
The threshold compared against the motion vector difference is determined as follows: for each block in the frame to be interpolated, compute the motion vector difference between it and its neighboring blocks; let the maximum of these differences be max_mv_dif; the threshold is a × max_mv_dif, where a is an empirical value chosen between 0 and 1.
Preferably, a = 0.25.
After the block under test is determined to lie in an occlusion area, whether it belongs to the covering area or the uncovering area is determined by the following steps:
Four consecutive frames are, in order, frame n-2, frame n-1, frame n and frame n+1, and the block under test is on the frame interpolated between frame n-1 and frame n. Compute the motion vector MV(n-2, n-1), between frames n-2 and n-1, of the block at the same position as the block under test in the interpolated frame, and the motion vector MV(n, n+1), between frames n and n+1, of the block at that same position; the two indices in parentheses denote the frames from which the motion vector is obtained;
Extend the motion direction indicated by MV(n-2, n-1) to frames n-1 and n to obtain the corresponding blocks B1 and B2;
Extend the motion direction indicated by MV(n, n+1) to frames n and n-1 to obtain the corresponding blocks B3 and B4;
Compute the S index S(B1, B2) between B1 and B2;
Compute the S index S(B3, B4) between B3 and B4;
If S(B1, B2) > S(B3, B4), the block belongs to the uncovering area; if S(B1, B2) < S(B3, B4), the block belongs to the covering area; if S(B1, B2) = S(B3, B4), the block may belong to either the uncovering area or the covering area.
Technical effect of the invention:
Fig. 3 shows the result of frame rate up-conversion with the existing SAD method, and Fig. 4 the result with the S-index method of the invention. Comparing Fig. 3 and Fig. 4, the artifacts at the edges of moving objects are clearly reduced in Fig. 4; if the figures were in color, the advantage of Fig. 4 over Fig. 3 would be even more evident.
Description of drawings
Fig. 1 is a schematic diagram introducing the concept of occlusion areas;
Fig. 2 is a schematic diagram explaining the method of distinguishing covering areas from uncovering areas;
Fig. 3 is the image result of frame rate up-conversion with the existing SAD method;
Fig. 4 is the image result of frame rate up-conversion with the method of the invention;
Fig. 5 is a diagram of the positions of a block and its neighboring blocks.
The reference signs in the figures are as follows:
1, occlusion area; 1.1, covering area; 1.2, uncovering area; 2, the image of the moving object in the video.
Detailed description
The technical scheme of the invention is detailed below.
First, the occlusion areas must be located, i.e. the blocks of the interpolated frame (the frame to be inserted, hereinafter) that belong to occlusion areas must be found. To this end, the interpolated frame is divided into blocks: unit blocks containing equal numbers of pixels and having identical shape, for example square blocks of 8 × 8 pixels. After the division, two blocks are selected as follows: extend along the motion vector direction of the block to be judged (the block whose membership in an occlusion area is in question, hereinafter "the block under test") until the extension intersects the previous and next frames, yielding one block in each. Compute the similarity of these two blocks; if the similarity reaches the preset threshold (i.e. the two blocks are dissimilar to a certain degree), the block under test belongs to an occlusion area. The similarity is based on statistics of the pixel attribute values in the two blocks from the 1st to the n-th order, n ≥ 2. That is, the similarity is computed from first-order through higher-order statistics; from mathematical statistics it is known that a result based on multiple orders of statistics has higher confidence, which certainly makes the judgment more accurate.
The computation of the similarity of the invention is illustrated below with the S index as an example:
Continuing the assumption above that a block is a square of 8 × 8 pixels: in the formulas below, the subscript X denotes the block obtained by extending the motion vector of the block under test onto the frame preceding the interpolated frame, and the subscript Y denotes the block obtained on the frame following the interpolated frame. A plain (non-subscript) X denotes the pixel attribute values of the previous-frame block, and a plain Y those of the next-frame block; the attribute can be the luminance, chrominance, gray level, etc. of the pixels.
First, obtain the motion vector of the block under test with an existing method, then obtain the following two blocks: the blocks where the extension along the motion vector of the block under test intersects the previous and next frames.
Compute the means of the pixel attribute values in the two blocks obtained by the motion vector extension: $u_X = \bar{X} = \frac{1}{N \times N}\sum_{i=1}^{N}\sum_{j=1}^{N} X_{i,j}$; $u_Y = \bar{Y} = \frac{1}{N \times N}\sum_{i=1}^{N}\sum_{j=1}^{N} Y_{i,j}$.
Compute the variances of the pixel attribute values in the two blocks: $\sigma_X^2 = \frac{1}{(N-1)\times(N-1)}\sum_{i=1}^{N}\sum_{j=1}^{N}\left(X_{i,j}-u_X\right)^2$; $\sigma_Y^2 = \frac{1}{(N-1)\times(N-1)}\sum_{i=1}^{N}\sum_{j=1}^{N}\left(Y_{i,j}-u_Y\right)^2$.
Compute the covariance of the two blocks: $\sigma_{XY} = \frac{1}{(N-1)\times(N-1)}\sum_{i=1}^{N}\sum_{j=1}^{N}\left(X_{i,j}-u_X\right)\left(Y_{i,j}-u_Y\right)$.
Let $l(X,Y)=\frac{2u_X u_Y}{u_X^2+u_Y^2}$, $c(X,Y)=\frac{2\sigma_X \sigma_Y}{\sigma_X^2+\sigma_Y^2}$, $s(X,Y)=\frac{\sigma_{XY}}{\sigma_X \sigma_Y}$.
Then $S(X,Y)=[l(X,Y)]^{\alpha}\times[c(X,Y)]^{\beta}\times[s(X,Y)]^{\gamma}$ is the S index, where the exponents α, β and γ control the weight of each component in the S index. In general we take α = β = γ = 1, in which case $S(X,Y)=l(X,Y)\,c(X,Y)\,s(X,Y)=\frac{4\,u_X u_Y\,\sigma_{XY}}{\left(u_X^2+u_Y^2\right)\left(\sigma_X^2+\sigma_Y^2\right)}$.
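As an illustration, the S-index computation can be sketched in Python as below. It follows the mean, variance and covariance definitions above for square N × N blocks (normalising variance and covariance by (N-1) × (N-1), as the formulas state); the eps guard against division by zero in flat blocks and the function name are our additions, not part of the patent:

```python
import numpy as np

def s_index(X, Y, alpha=1.0, beta=1.0, gamma=1.0, eps=1e-12):
    """S index between two equal-sized square blocks X and Y."""
    X = np.asarray(X, dtype=np.float64)
    Y = np.asarray(Y, dtype=np.float64)
    n = X.shape[0]
    norm = (n - 1) * (n - 1)                     # patent's (N-1)x(N-1) normaliser
    u_x, u_y = X.mean(), Y.mean()
    var_x = ((X - u_x) ** 2).sum() / norm
    var_y = ((Y - u_y) ** 2).sum() / norm
    cov_xy = ((X - u_x) * (Y - u_y)).sum() / norm
    l = 2 * u_x * u_y / (u_x ** 2 + u_y ** 2 + eps)          # mean (luminance) term
    c = 2 * np.sqrt(var_x * var_y) / (var_x + var_y + eps)   # variance (contrast) term
    s = cov_xy / (np.sqrt(var_x * var_y) + eps)              # covariance (structure) term
    return (l ** alpha) * (c ** beta) * (s ** gamma)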
The S index alone can determine whether the block under test belongs to an occlusion area. To increase the confidence of the judgment further, the following auxiliary check is introduced: whether the motion vector of the block under test is continuous with the motion vectors of its neighboring blocks. The rationale is that occlusion always occurs along the edges of objects, and the motion vector of a block at an edge is clearly discontinuous with, and may differ greatly from, the motion vectors of the surrounding blocks.
Example of judging the discontinuity between a block and its neighbors:
Take a 3 × 3 window containing 9 blocks, with relative positions as shown in Fig. 5:
C is the block under test, and B2, B4, B5 and B7 are the blocks adjacent to C in the up, down, left and right directions; they are used to judge the motion vector discontinuity of C.
Let the motion vector of C be MV(C) = (MVx(C), MVy(C)), where MVx denotes the X component and MVy the Y component of the motion vector, likewise below.
Accordingly, the motion vector of other blocks is:
MV(B2)=(MVx(B2),MVy(B2))
MV(B4)=(MVx(B4),MVy(B4))
MV(B5)=(MVx(B5),MVy(B5))
MV(B7)=(MVx(B7),MVy(B7))
Compute the motion vector difference between the current block C and the surrounding blocks B2, B4, B5 and B7:
MV_dif(C)=abs(MVx(C)-MVx(B2))+abs(MVy(C)-MVy(B2))+
abs(MVx(C)-MVx(B4))+abs(MVy(C)-MVy(B4))+
abs(MVx(C)-MVx(B5))+abs(MVy(C)-MVy(B5))+
abs(MVx(C)-MVx(B7))+abs(MVy(C)-MVy(B7))
where abs(·) denotes the absolute value.
For each block in the frame, we compute the motion vector difference between it and its neighbors, yielding an array MV_dif of motion vector differences; we record its maximum, denoted max_mv_dif.
The decision is then made as follows:
If MV_dif(C) >= max_mv_dif × 25%, the motion vector of the block under test is discontinuous with the motion vectors of its neighbors. The 25% is a chosen empirical value and may vary between 0 and 100%.
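A minimal sketch of this discontinuity check, assuming motion vectors are given as (MVx, MVy) pairs and the four neighbours B2, B4, B5, B7 are supplied in a list (the names and interface are illustrative):

```python
def mv_dif(mv_c, neighbours):
    """MV_dif(C): sum of absolute component differences between the
    centre block's MV and each of its four neighbours' MVs."""
    return sum(abs(mv_c[0] - nx) + abs(mv_c[1] - ny) for nx, ny in neighbours)

def is_discontinuous(mv_c, neighbours, max_mv_dif, a=0.25):
    """Flag the MV as discontinuous when MV_dif(C) >= a * max_mv_dif,
    with a = 0.25 as the preferred empirical value."""
    return mv_dif(mv_c, neighbours) >= a * max_mv_dif
```
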
Combining this auxiliary check yields a higher-confidence method for locating occlusion areas: if the S index between the corresponding blocks in the previous and next frames (the blocks obtained by extending the motion vector of the block under test) is below the threshold, and the motion vector of the block is discontinuous with the motion vectors of the surrounding blocks, then we consider the block to lie in a region where occlusion occurs.
Second, after a block has been judged to belong to an occlusion area, the type of occlusion must also be determined, i.e. whether the block belongs to the covering area or the uncovering area. Because a block in an occlusion area retains information in only one of the two adjacent frames, these two frames alone cannot determine which type it is. The invention uses three or more consecutive frames to determine the occlusion type of a block; a concrete method is given in the following example:
Given consecutive frames n-2, n-1, n and n+1, the type of an occlusion block on the frame interpolated between frames n-1 and n is to be determined; assume the interpolated frame sits exactly midway between the two frames. Compute the motion vector MV(n-2, n-1), between frames n-2 and n-1, of the block at the same position as the block under test in the interpolated frame, and the motion vector MV(n, n+1), between frames n and n+1, of the block at that same position; the two indices in parentheses denote the frames from which the motion vector is obtained.
Extend the motion direction indicated by MV(n-2, n-1) to frames n-1 and n to obtain the corresponding blocks B1 and B2. The coordinates of B1 on frame n-1 are (p + 0.5 × MV(n-2, n-1), q + 0.5 × MV(n-2, n-1)), and those of B2 on frame n are (p + 1.5 × MV(n-2, n-1), q + 1.5 × MV(n-2, n-1)), where (p, q) are the coordinates of the block under test on the interpolated frame.
Extend the motion direction indicated by MV(n, n+1) to frames n and n-1 to obtain the corresponding blocks B3 (on frame n) and B4 (on frame n-1). The coordinates of B3 are (p - 0.5 × MV(n, n+1), q - 0.5 × MV(n, n+1)), and those of B4 are (p - 1.5 × MV(n, n+1), q - 1.5 × MV(n, n+1)).
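The B1..B4 positions can be written out as below. One interpretive assumption: the patent writes a single MV symbol in both coordinates, so we treat each motion vector as an (x, y) pair applied componentwise; the function name is ours:

```python
def occlusion_test_blocks(p, q, mv_prev, mv_next):
    """Block positions used for the covering/uncovering test.
    (p, q): position of the block under test on the interpolated frame;
    mv_prev = MV(n-2, n-1), mv_next = MV(n, n+1), each an (x, y) pair."""
    b1 = (p + 0.5 * mv_prev[0], q + 0.5 * mv_prev[1])  # on frame n-1
    b2 = (p + 1.5 * mv_prev[0], q + 1.5 * mv_prev[1])  # on frame n
    b3 = (p - 0.5 * mv_next[0], q - 0.5 * mv_next[1])  # on frame n
    b4 = (p - 1.5 * mv_next[0], q - 1.5 * mv_next[1])  # on frame n-1
    return b1, b2, b3, b4
```
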
Compute, by the steps described earlier, the S index S(B1, B2) between B1 and B2, and the S index S(B3, B4) between B3 and B4.
If S(B1, B2) > S(B3, B4), the occlusion block under test belongs to the uncovering area; if S(B1, B2) < S(B3, B4), it belongs to the covering area; if S(B1, B2) = S(B3, B4), it may belong to either the uncovering area or the covering area.
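The comparison of the two S indices reduces to a three-way test; a sketch follows (the function name and string labels are illustrative):

```python
def classify_occlusion(s_b1_b2, s_b3_b4):
    """Per the rule above: S(B1,B2) > S(B3,B4) means the uncovering area,
    S(B1,B2) < S(B3,B4) means the covering area, and a tie may be either
    (labelled ambiguous here)."""
    if s_b1_b2 > s_b3_b4:
        return "uncovering"
    if s_b1_b2 < s_b3_b4:
        return "covering"
    return "ambiguous"
```
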
This completes the judgment of the type of a block belonging to an occlusion area.
Once it has been determined whether a block of the interpolated frame belongs to an occlusion area, and if so to which type, targeted processing can produce a well-interpolated frame; the image result processed by the method of the invention is shown in Fig. 4.
The subsequent targeted processing can, for example, proceed as follows:
For blocks outside occlusion areas, the motion vector originally computed from the previous and next frames, i.e. MV(n-1, n), is kept unchanged, and the interpolation is bidirectional.
For the uncovering area, the motion vector of a block is replaced by the motion vector of the block at the same coordinates computed from the next frame pair, i.e. MV(n, n+1), and the interpolation is backward unidirectional.
For the covering area, the motion vector of a block is replaced by the motion vector of the block at the same coordinates computed from the previous frame pair, i.e. MV(n-2, n-1), and the interpolation is forward unidirectional.
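The three interpolation modes can be sketched as follows. Assumptions of ours, not from the patent: the interpolated frame lies midway between the previous and next frames, mv is the (x, y) displacement from the interpolated frame toward the next frame (half the frame-to-frame motion), and the motion-compensated blocks lie inside the frame:

```python
import numpy as np

def interpolate_block(prev_f, next_f, x, y, mv, region, block=8):
    """Motion-compensated interpolation of one block of the inserted frame.
    region: "normal" (bidirectional average), "uncovering" (backward
    unidirectional, next frame only) or "covering" (forward unidirectional,
    previous frame only)."""
    dx, dy = mv
    p = prev_f[y - dy:y - dy + block, x - dx:x - dx + block].astype(np.float64)
    n = next_f[y + dy:y + dy + block, x + dx:x + dx + block].astype(np.float64)
    if region == "normal":
        out = (p + n) / 2.0   # average of both motion-compensated blocks
    elif region == "uncovering":
        out = n               # information exists only in the next frame
    else:
        out = p               # covering: information only in the previous frame
    return out.astype(np.uint8)
```
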

Claims (6)

1. A method for determining occlusion areas in frame rate up-conversion, characterized by comprising the steps of:
dividing the video frame to be interpolated into blocks; extending the motion vector of a block under test on the interpolated frame to the adjacent previous and next frames to obtain two blocks; and computing the similarity between these two blocks, the similarity being based on statistics, from the 1st to the n-th order with n ≥ 2, of the pixel attribute values in the two blocks; if the similarity reaches a preset threshold, that is, when the difference between the two blocks reaches a predetermined degree, the block under test lies in an occlusion area.
2. The method for determining occlusion areas in frame rate up-conversion according to claim 1, characterized in that the similarity is the S index, and if the S index is below the preset threshold, the block lies in an occlusion area; the S index is computed by the steps of:
computing the motion vector of the block under test and obtaining the two blocks produced by the extension;
computing the means $u_X$ and $u_Y$ of the pixel attribute values in the two blocks obtained by the extension;
computing the variances $\sigma_X^2$ and $\sigma_Y^2$ of the pixel attribute values in the two blocks obtained by the extension;
computing the covariance $\sigma_{XY}$ of the pixel attribute values of the two blocks obtained by the extension;
letting $l(X,Y)=\frac{2u_X u_Y}{u_X^2+u_Y^2}$, $c(X,Y)=\frac{2\sigma_X \sigma_Y}{\sigma_X^2+\sigma_Y^2}$, $s(X,Y)=\frac{\sigma_{XY}}{\sigma_X \sigma_Y}$;
computing $S(X,Y)=[l(X,Y)]^{\alpha}\times[c(X,Y)]^{\beta}\times[s(X,Y)]^{\gamma}$, where α, β and γ are empirical values; S(X, Y) is the S index.
3. The method for determining occlusion areas in frame rate up-conversion according to claim 1 or 2, characterized by further comprising a step of checking whether the motion vector of the block under test is continuous with the motion vectors of its neighboring blocks: computing the motion vector difference between the block under test and its neighbors; if the difference is greater than or equal to a threshold, the motion vector of the block under test is considered discontinuous with those of its neighbors; if the S index indicates that the block lies in an occlusion area, and its motion vector is discontinuous with those of its neighbors, then the confidence that the block lies in an occlusion area is higher than that obtained from the S index alone.
4. The method for determining occlusion areas in frame rate up-conversion according to claim 3, characterized in that the threshold compared against the motion vector difference is determined as follows: for each block in the frame to be interpolated, compute the motion vector difference between it and its neighboring blocks; let the maximum of these differences be max_mv_dif; the threshold is a × max_mv_dif, where a is an empirical value chosen between 0 and 1.
5. The method for determining occlusion areas in frame rate up-conversion according to claim 4, characterized in that the value of a is 0.25.
6. The method for determining occlusion areas in frame rate up-conversion according to claim 1, characterized in that, after the block under test is determined to lie in an occlusion area, the steps of determining whether it belongs to the covering area or the uncovering area are as follows:
four consecutive frames are, in order, frame n-2, frame n-1, frame n and frame n+1, and the block under test is on the frame interpolated between frame n-1 and frame n; compute the motion vector MV(n-2, n-1), between frames n-2 and n-1, of the block at the same position as the block under test in the interpolated frame, and the motion vector MV(n, n+1), between frames n and n+1, of the block at that same position; the two indices in parentheses denote the frames from which the motion vector is obtained;
extend the motion direction indicated by MV(n-2, n-1) to frames n-1 and n to obtain the corresponding blocks B1 and B2;
extend the motion direction indicated by MV(n, n+1) to frames n and n-1 to obtain the corresponding blocks B3 and B4;
compute the S index S(B1, B2) between B1 and B2;
compute the S index S(B3, B4) between B3 and B4;
if S(B1, B2) > S(B3, B4), the block belongs to the uncovering area; if S(B1, B2) < S(B3, B4), the block belongs to the covering area; if S(B1, B2) = S(B3, B4), the block may belong to either the uncovering area or the covering area.
CN200810132760A 2008-07-11 2008-07-11 Method for judging blockage area in frame rate promotion Pending CN101626508A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN200810132760A CN101626508A (en) 2008-07-11 2008-07-11 Method for judging blockage area in frame rate promotion

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN200810132760A CN101626508A (en) 2008-07-11 2008-07-11 Method for judging blockage area in frame rate promotion

Publications (1)

Publication Number Publication Date
CN101626508A true CN101626508A (en) 2010-01-13

Family

ID=41522147

Family Applications (1)

Application Number Title Priority Date Filing Date
CN200810132760A Pending CN101626508A (en) 2008-07-11 2008-07-11 Method for judging blockage area in frame rate promotion

Country Status (1)

Country Link
CN (1) CN101626508A (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102413270A (en) * 2011-11-21 2012-04-11 苏州希图视鼎微电子有限公司 Method and device for revising motion vectors of boundary area
CN102665061A (en) * 2012-04-27 2012-09-12 中山大学 Motion vector processing-based frame rate up-conversion method and device
CN103313059A (en) * 2013-06-14 2013-09-18 珠海全志科技股份有限公司 Method for judging occlusion area in process of frame rate up-conversion
CN105517671A (en) * 2015-05-25 2016-04-20 北京大学深圳研究生院 Video frame interpolation method and system based on optical flow method
CN109712078A (en) * 2018-07-23 2019-05-03 永康市巴九灵科技有限公司 Cabinet security protection control platform

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020186889A1 (en) * 1998-08-21 2002-12-12 Gerard De Haan Problem area location in an image signal
CN1922873A (en) * 2004-02-23 2007-02-28 皇家飞利浦电子股份有限公司 Reducing artefacts in scan-rate conversion of image signals by combining interpolation and extrapolation of images
CN1842165A (en) * 2005-03-31 2006-10-04 株式会社东芝 Method and apparatus for generating interpolation frame
CN101047779A (en) * 2006-03-30 2007-10-03 株式会社东芝 Apparatus for creating interpolation frame

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102413270A (en) * 2011-11-21 2012-04-11 苏州希图视鼎微电子有限公司 Method and device for revising motion vectors of boundary area
CN102665061A (en) * 2012-04-27 2012-09-12 中山大学 Motion vector processing-based frame rate up-conversion method and device
CN103313059A (en) * 2013-06-14 2013-09-18 珠海全志科技股份有限公司 Method for judging occlusion area in process of frame rate up-conversion
CN103313059B (en) * 2013-06-14 2016-04-27 珠海全志科技股份有限公司 Method for judging occlusion areas during frame rate up-conversion
CN105517671A (en) * 2015-05-25 2016-04-20 北京大学深圳研究生院 Video frame interpolation method and system based on optical flow method
CN109712078A (en) * 2018-07-23 2019-05-03 永康市巴九灵科技有限公司 Cabinet security protection control platform

Similar Documents

Publication Publication Date Title
CN100405820C (en) Improved motion vector estimation at image borders
CN1694502B (en) Ticker processing in video sequences
CN106210767B (en) Video frame rate up-conversion method and system for intelligently improving motion fluency
KR100902315B1 (en) Apparatus and method for deinterlacing
US9148622B2 (en) Halo reduction in frame-rate-conversion using hybrid bi-directional motion vectors for occlusion/disocclusion detection
US20090208123A1 (en) Enhanced video processing using motion vector data
CN101207707A (en) System and method for advancing frame frequency based on motion compensation
CN101663885A (en) Image processing device and method, and image display device and method
WO2008038419A1 (en) Image display device and method, and image processing device and method
CN101626508A (en) Method for judging blockage area in frame rate promotion
CN103402098A (en) Video frame interpolation method based on image interpolation
CN102025960A (en) Motion compensation de-interlacing method based on adaptive interpolation
US8471962B2 (en) Apparatus and method for local video detector for mixed cadence sequence
CN105049824A (en) Method for automatically detecting three-dimensional video format
US20080239144A1 (en) Frame rate conversion device and image display apparatus
CN102447870A (en) Detection method for static objects and motion compensation device
CN100425054C (en) Film mode extrapolation
KR100968642B1 (en) Method and interpolation device for calculating a motion vector, display device comprising the interpolation device, and computer program
CN103024331B (en) Video de-interlacing method based on edge detection
CN101483790B (en) Movie mode video signal detection method
WO2016199418A1 (en) Frame rate conversion system
CN102186045B (en) Three-field motion detection method and device for deinterlacing processing and deinterlacing system
CN100407795C (en) Frame field self-adaptive detection method
CN102497492B (en) Detection method for subtitle moving in screen
CN113727176B (en) Video motion subtitle detection method

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C02 Deemed withdrawal of patent application after publication (patent law 2001)
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20100113