CN1842165A - Method and apparatus for generating interpolation frame - Google Patents

Method and apparatus for generating interpolation frame

Info

Publication number
CN1842165A
CN1842165A, CN200610071860A, CN 200610071860
Authority
CN
China
Prior art keywords
zone
motion vector
frame
detect
interpolation
Prior art date
Legal status
Pending
Application number
CN 200610071860
Other languages
Chinese (zh)
Inventor
大胁一泰
伊藤刚
三岛直
Current Assignee
Toshiba Corp
Original Assignee
Toshiba Corp
Priority date
Filing date
Publication date
Application filed by Toshiba Corp filed Critical Toshiba Corp
Publication of CN1842165A
Legal status: Pending

Abstract

A method for generating an interpolation frame between first and second reference frames includes: dividing the interpolation frame into several interpolation areas; detecting, for each interpolation area, the most correlated combination from several combinations of first reference areas and second reference areas; obtaining a motion vector from the first and second reference areas; determining whether the first reference areas and the second reference areas are in a high-correlated area or a low-correlated area; giving the motion vector to a motion-vector-detected area, which corresponds to the interpolation area determined to be the high-correlated area in the first and second reference areas; determining a motion vector to be given to a motion-vector-undetected area; and generating the interpolation frame based on the motion vector given to the motion-vector-detected area and the motion vector determined for the motion-vector-undetected area.

Description

Method and apparatus for generating an interpolation frame
Technical field
The present invention relates to an interpolation frame generation method and an interpolation frame generation apparatus for generating an interpolation frame to be inserted between two frames.
Background technology
Image display devices generally fall into two classes: impulse-type displays, which emit light only during the phosphor afterglow period after an image is written (for example, a CRT or a field emission display (FED)), and hold-type displays, which keep displaying the previous frame continuously until a new image is read in (for example, a liquid crystal display (LCD) or an electroluminescent display (ELD)).
One problem with hold-type displays is the blurring that occurs when moving images are displayed. The blurring is caused by the fact that, when a moving object appears across several frames and the observer's eyes follow the motion of the object, the images of those several frames are projected onto the retina in a superimposed manner.
Until the displayed image is switched from the previous frame to the next frame, the same image of the previous frame is displayed continuously; nevertheless, the human eye anticipates the image of the next frame and watches the image while moving in the direction of motion of the moving object in the previous frame. In other words, because the following movement of the eyes is continuous and finer than the sampling interval of the frame period, the human eye sees positions between two consecutive frames, and a blurred image is perceived.
One way to address this problem is to shorten the display frame interval. This can also reduce the unnatural motion that appears in motion pictures having a small number of display frames. As a specific method, it has been considered to generate an interpolated image using the motion compensation employed in MPEG-2 (Moving Picture Experts Group phase 2) and to insert it between consecutive frames.
Motion compensation uses motion vectors detected by block matching. In MPEG-2, however, images are generated on a block basis; when a block contains several objects that move differently, correlated parts and uncorrelated parts arise within the block, and block distortion occurs in the uncorrelated parts.
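As a rough illustration of the block matching mentioned above (not part of the disclosed apparatus), the following sketch in Python with NumPy finds, for one block, the displacement that minimizes the sum of absolute differences (SAD); the block size, search radius, and grayscale frames are assumptions for the example only.

    import numpy as np

    def block_match_sad(cur, ref, top, left, size=16, radius=8):
        # (top, left) is the upper-left corner of a block fully inside `cur`.
        # Returns the (dy, dx) displacement into `ref` with the smallest SAD.
        block = cur[top:top + size, left:left + size].astype(np.int32)
        best_sad, best_mv = None, (0, 0)
        for dy in range(-radius, radius + 1):
            for dx in range(-radius, radius + 1):
                y, x = top + dy, left + dx
                if y < 0 or x < 0 or y + size > ref.shape[0] or x + size > ref.shape[1]:
                    continue
                cand = ref[y:y + size, x:x + size].astype(np.int32)
                sad = np.abs(block - cand).sum()
                if best_sad is None or sad < best_sad:
                    best_sad, best_mv = sad, (dy, dx)
        return best_mv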
A frame interpolation method for solving this problem has been disclosed (see, for example, Japanese Patent Application Publication No. 2000-224593). A block is decomposed into several areas, and a motion vector is obtained for each area. In this way, block distortion can be reduced even when several objects with different motions are included in the block. Furthermore, a motion vector detection method adapted to decomposing a block into a plurality of areas by thresholding, and a motion vector detection method suited to the pixel blocks after decomposition, are used so that an optimum motion vector is detected for each area.
The frame interpolation method disclosed in Japanese Patent Application Publication No. 2000-224593 reduces the degradation in image quality, but it cannot accurately calculate motion vectors for occluded areas.
Summary of the invention
According to an aspect of the present invention, a method for generating an interpolation frame between a first reference frame and a second reference frame comprises: dividing the interpolation frame into several interpolation areas each comprising several pixels; detecting, for each interpolation area, the most correlated combination among several combinations of a first reference area and a second reference area, the first reference area being in the first reference frame and having the same size and shape as the interpolation area, the second reference area being in the second reference frame and having the same size and shape as the interpolation area, and, for each combination, the interpolation area, the first reference area, and the second reference area being arranged on a straight line in the temporal direction; obtaining a motion vector from the first reference area and the second reference area included in the detected combination; determining whether the first reference area and the second reference area are a high-correlated area or a low-correlated area; giving the motion vector to a motion-vector-detected area, the motion-vector-detected area corresponding to the interpolation area for which the first and second reference areas are determined to be high-correlated; determining a motion vector to be given to a motion-vector-undetected area by motion estimation using a first area, a third reference frame, a second area, and a fourth reference frame, the first area being in the first reference frame and determined to be low-correlated, the third reference frame lying, with the interpolation frame as a reference, temporally in the same direction as the first reference frame, the second area being in the second reference frame and determined to be low-correlated, the fourth reference frame lying, with the interpolation frame as a reference, temporally in the same direction as the second reference frame, and the motion-vector-undetected area corresponding to the interpolation area for which the first and second reference areas are determined to be low-correlated; and generating the interpolation frame based on the motion vector given to the motion-vector-detected area and the motion vector determined for the motion-vector-undetected area.
According to another aspect of the present invention, a method for generating an interpolation frame between a first reference frame and a second reference frame comprises: dividing the first reference frame into several first reference areas each composed of several pixels; detecting, in the second reference frame, a second reference area having the same size and shape as the first reference area and most correlated with it, and obtaining a motion vector between the detected second reference area and the first reference area; determining whether the first reference area and the second reference area are a high-correlated area or a low-correlated area; giving the motion vector to a motion-vector-detected area, the motion-vector-detected area being the area determined to be high-correlated; determining a motion vector for a motion-vector-undetected area by motion estimation using the motion-vector-undetected area and a third reference frame, the third reference frame lying, with the first reference frame as a reference, temporally in the direction opposite to the second reference frame, and the motion-vector-undetected area being the area in the first reference area determined to be low-correlated; and generating the interpolation frame based on the motion vector given to the motion-vector-detected area and the motion vector determined for the motion-vector-undetected area.
According to a further aspect of the present invention, a method for generating an interpolation frame between a first reference frame and a second reference frame comprises: dividing the interpolation frame into several interpolation areas each composed of several pixels; detecting, for each interpolation area, the most correlated combination among a plurality of combinations of a first reference area and a second reference area, the first reference area being in the first reference frame and having the same size and shape as the interpolation area, the second reference area being in the second reference frame and having the same size and shape as the interpolation area, and, for each combination, the interpolation area, the first reference area, and the second reference area being arranged on a straight line in the temporal direction; obtaining a motion vector from the first reference area and the second reference area included in the detected combination; determining whether the first reference area and the second reference area are a high-correlated area or a low-correlated area; giving the motion vector to a motion-vector-detected area corresponding to the interpolation area determined to be high-correlated in the first and second reference areas; giving a motion vector to a motion-vector-undetected area corresponding to the interpolation area determined to be low-correlated in the first and second reference areas; and generating the interpolation frame based on the motion vector given to the motion-vector-detected area and the motion vector given to the motion-vector-undetected area.
According to a further aspect of the present invention, a method for generating an interpolation frame between a first reference frame and a second reference frame comprises: dividing the first reference frame into several first reference areas each composed of several pixels; detecting, in the second reference frame, a second reference area having the same size and shape as the first reference area and most correlated with the first reference area, and obtaining a motion vector between the detected second reference area and the first reference area; determining whether the first reference area and the second reference area are a high-correlated area or a low-correlated area; giving the motion vector to a motion-vector-detected area, the motion-vector-detected area being the area in the first reference area determined to be high-correlated; giving, to a motion-vector-undetected area, a motion vector of a motion-vector-detected area located around the motion-vector-undetected area, the motion-vector-undetected area being the area in the first reference area determined to be low-correlated; and generating the interpolation frame based on the motion vector given to the motion-vector-detected area and the motion vector given to the motion-vector-undetected area.
According to a further aspect of the present invention, an apparatus for generating an interpolation frame between a first reference frame and a second reference frame comprises: an interpolation area dividing unit that divides the interpolation frame into several interpolation areas each comprising several pixels; a combination detecting unit that detects, for each interpolation area, the most correlated combination among several combinations of a first reference area and a second reference area, the first reference area being in the first reference frame and having the same size and shape as the interpolation area, the second reference area being in the second reference frame and having the same size and shape as the interpolation area, and, for each combination, the interpolation area, the first reference area, and the second reference area being arranged on a straight line in the temporal direction; a motion estimation unit that obtains a motion vector from the first reference area and the second reference area included in the detected combination; a correlation determining unit that determines whether the first reference area and the second reference area are a high-correlated area or a low-correlated area; a giving unit that gives the motion vector to a motion-vector-detected area, the motion-vector-detected area corresponding to the interpolation area determined to be high-correlated in the first and second reference areas; a motion vector determining unit that determines a motion vector to be given to a motion-vector-undetected area by motion estimation using a first area, a third reference frame, a second area, and a fourth reference frame, the first area being in the first reference frame and determined to be low-correlated, the third reference frame lying, with the interpolation frame as a reference, temporally in the same direction as the first reference frame, the second area being in the second reference frame and determined to be low-correlated, the fourth reference frame lying, with the interpolation frame as a reference, temporally in the same direction as the second reference frame, and the motion-vector-undetected area corresponding to the interpolation area determined to be low-correlated in the first and second reference areas; and a motion compensation unit that generates the interpolation frame based on the motion vector given to the motion-vector-detected area and the motion vector determined for the motion-vector-undetected area.
According to a further aspect of the present invention, an apparatus for generating an interpolation frame between a first reference frame and a second reference frame comprises: an area generation unit that divides the first reference frame into several first reference areas each composed of several pixels; a second reference area detecting unit that detects, in the second reference frame, a second reference area having the same size and shape as the first reference area and most correlated with it; a motion estimation unit that obtains a motion vector between the detected second reference area and the first reference area; a correlation determining unit that determines whether the first reference area and the second reference area are a high-correlated area or a low-correlated area; a giving unit that gives the motion vector to a motion-vector-detected area, the motion-vector-detected area being the area determined to be high-correlated; a motion vector determining unit that determines a motion vector for a motion-vector-undetected area by motion estimation using the motion-vector-undetected area and a third reference frame, the third reference frame lying, with the first reference frame as a reference, temporally in the direction opposite to the second reference frame, and the motion-vector-undetected area being the area in the first reference area determined to be low-correlated; and a motion compensation unit that generates the interpolation frame based on the motion vector given to the motion-vector-detected area and the motion vector determined for the motion-vector-undetected area.
According to a further aspect of the present invention, an apparatus for generating an interpolation frame between a first reference frame and a second reference frame comprises: an area generation unit that divides the interpolation frame into several interpolation areas each composed of several pixels; a combination detecting unit that detects, for each interpolation area, the most correlated combination among a plurality of combinations of a first reference area and a second reference area, the first reference area being in the first reference frame and having the same size and shape as the interpolation area, the second reference area being in the second reference frame and having the same size and shape as the interpolation area, and, for each combination, the interpolation area, the first reference area, and the second reference area being arranged on a straight line in the temporal direction; a motion estimation unit that obtains a motion vector from the first reference area and the second reference area included in the detected combination; a correlation determining unit that determines whether the first reference area and the second reference area are a high-correlated area or a low-correlated area; a first motion vector giving unit that gives the motion vector to a motion-vector-detected area corresponding to the interpolation area determined to be high-correlated in the first and second reference areas; a second motion vector giving unit that gives a motion vector to a motion-vector-undetected area corresponding to the interpolation area determined to be low-correlated in the first and second reference areas; and a motion compensation unit that generates the interpolation frame based on the motion vector given to the motion-vector-detected area and the motion vector given to the motion-vector-undetected area.
According to a further aspect of the present invention, an apparatus for generating an interpolation frame between a first reference frame and a second reference frame comprises: an area generation unit that divides the first reference frame into several first reference areas each composed of several pixels; a second reference area detecting unit that detects, in the second reference frame, a second reference area having the same size and shape as the first reference area and most correlated with the first reference area; a motion estimation unit that obtains a motion vector between the detected second reference area and the first reference area; a correlation determining unit that determines whether the first reference area and the second reference area are a high-correlated area or a low-correlated area; a first motion vector giving unit that gives the motion vector to a motion-vector-detected area, the motion-vector-detected area being the area in the first reference area determined to be high-correlated; a second motion vector giving unit that gives, to a motion-vector-undetected area, a motion vector of a motion-vector-detected area located around the motion-vector-undetected area, the motion-vector-undetected area being the area in the first reference area determined to be low-correlated; and a motion compensation unit that generates the interpolation frame based on the motion vector given to the motion-vector-detected area and the motion vector given to the motion-vector-undetected area.
Description of drawings
Fig. 1 is a block diagram showing the overall structure of an interpolated image generation apparatus;
Fig. 2 is a schematic diagram for explaining an interpolation frame;
Fig. 3 is a schematic diagram for explaining the processing of a block creating unit;
Fig. 4 is a flowchart showing the interpolated image generation procedure performed by the interpolated image generation apparatus;
Fig. 5 is a schematic diagram showing the images included in a first frame and a second frame;
Fig. 6 is a schematic diagram showing the area correlated with a second block of the second frame;
Fig. 7 is a schematic diagram showing a second block;
Fig. 8 is a schematic diagram showing the area correlated with another second block of the second frame;
Fig. 9 is a schematic diagram showing another second block;
Figure 10 is a schematic diagram showing an interpolation frame;
Figure 11 is a schematic diagram showing the hardware configuration of the interpolated image generation apparatus according to the first embodiment;
Figure 12 is a flowchart showing the interpolated image generation procedure of the interpolated image generation apparatus 100 according to the second embodiment;
Figure 13 is a schematic diagram for explaining the processing of extracting the first and second blocks;
Figure 14 is a schematic diagram for explaining the processing procedure for low-correlated areas (step S310 to step S314);
Figure 15 is a schematic diagram for explaining the processing procedure for low-correlated areas (step S310 to step S314);
Figure 16 is a schematic diagram showing an interpolation block;
Figure 17 is a schematic diagram for explaining the processing of assigning an image to an area;
Figure 18 is a schematic diagram for explaining the processing of assigning an image to another area;
Figure 19 is a block diagram showing the overall structure of an interpolated image generation apparatus according to the third embodiment;
Figure 20 is a schematic diagram for explaining an interpolation frame;
Figure 21 is a schematic diagram for explaining reference areas;
Figure 22 is a flowchart showing the high-correlated-area motion vector assignment procedure performed by the interpolated image generation apparatus according to the third embodiment;
Figure 23 is a schematic diagram for explaining the processing of finding, for a reference area in the first reference frame, the position (motion vector) of the area in the second reference frame most correlated with that reference area, and of calculating the motion vector between them;
Figure 24 is a schematic diagram showing frames in which an object moves on a static background;
Figure 25 is a schematic diagram showing the second reference areas extracted in the motion estimation processing (step S505) with respect to the first reference area extracted in the processing-target reference area extraction processing (step S504);
Figure 26 is a schematic diagram showing, on a pixel basis, the first reference area and the second reference area extracted in the motion estimation processing (step S505);
Figure 27 is a schematic diagram showing the correlation calculation results;
Figure 28 is a schematic diagram showing the reference areas classified into high-correlated areas and low-correlated areas in the correlation determination processing (step S507) according to the correlation calculation results shown in Figure 27;
Figure 29 is a schematic diagram showing the high-correlated areas and low-correlated areas of the first reference frame and the second reference frame;
Figure 30 is a schematic diagram showing, when the estimation count t is 2, the first reference area extracted in the reference area extraction processing (step S504) and the second reference areas extracted with respect to the first reference area in the motion estimation processing (step S505);
Figure 31 is a schematic diagram showing the luminance values of each pixel of the first reference area and the second reference area in the second reference frame extracted in the motion estimation processing (step S505);
Figure 32 is a schematic diagram showing the correlation calculation values (absolute differences) in the correlation calculation processing (step S506) for the area;
Figure 33 is a schematic diagram showing the high-correlated areas of the first reference area and the second reference area classified in the correlation determination processing (step S507);
Figure 34 is a schematic diagram showing the correlation between the first reference frame and the second reference frame when the correlation determination processing is finished;
Figure 35 is a flowchart showing the motion-vector-undetected-area processing procedure performed by a motion-vector-undetected-area processing unit;
Figure 36 is a schematic diagram for explaining the motion vectors around a motion-vector-undetected area;
Figure 37 is a schematic diagram showing the third reference frame, which lies temporally on the side of the first reference frame opposite to the second reference frame;
Figure 38 is a schematic diagram showing the area in the third reference frame extracted in the area extraction processing (step S603);
Figure 39 is a schematic diagram for explaining the processing when several motion vectors exist around a motion-vector-undetected area;
Figure 40 is a schematic diagram for explaining the scale transformation of a motion vector;
Figure 41 is a schematic diagram for explaining the processing of dividing the interpolation frame into a plurality of areas;
Figure 42 is a schematic diagram for explaining the processing of dividing the interpolation frame into a plurality of areas;
Figure 43 is a schematic diagram for explaining an example of an object moving horizontally on a static background;
Figure 44 is a flowchart showing the high-correlated-area motion vector assignment procedure performed by the interpolated image generation apparatus according to the fourth embodiment;
Figure 45 is a schematic diagram showing the correlation in each frame after the processing explained with Figure 44;
Figure 46 is a flowchart showing the motion-vector-undetected-area processing procedure performed by the motion-vector-undetected-area processing unit;
Figure 47 is a schematic diagram showing the motion vectors around a motion-vector-undetected area;
Figure 48 is a schematic diagram for explaining the concrete processing in the area extraction processing (step S803);
Figure 49 is a schematic diagram for explaining the processing of assigning a motion vector to a motion-vector-undetected area;
Figure 50 is a schematic diagram for explaining the motion estimation processing, using the low-correlated area, the first reference frame, and the third reference frame, for a motion-vector-undetected area in the interpolation frame;
Embodiment
Exemplary embodiments of the interpolation frame generation method and the interpolation frame generation apparatus according to the present invention will be described below in detail with reference to the accompanying drawings. The present invention is not limited by these embodiments.
Fig. 1 is a block diagram showing the overall structure of an interpolated image generation apparatus 100 (interpolation frame generation apparatus) according to a first embodiment of the present invention. The interpolated image generation apparatus 100 comprises: a block creating unit 102, a frame memory 106, a related block extraction unit 108, a partial area designating unit 120, a high-correlated partial area motion vector calculation unit 124, a low-correlated partial area motion vector calculation unit 128, and a motion compensation unit 130.
The interpolated image generation apparatus 100 according to the present embodiment generates an interpolation frame to be inserted between frames included in the input picture. Fig. 2 shows three successive frames included in the input picture, namely a first frame 210, a second frame 220, and a third frame 230. The present embodiment is described using, as an example, the case of generating an interpolation frame 300 to be inserted between the first frame 210 and the second frame 220.
In the first embodiment, the case where the interpolated image generation apparatus 100 generates an interpolation frame between two successive frames, i.e., between the first frame 210 and the second frame 220, is described; however, the interpolation frame generated by the interpolated image generation apparatus 100 may be inserted between two different frames, and the invention is not limited to the present embodiment.
The block creating unit 102 in the interpolated image generation apparatus 100 shown in Fig. 1 obtains the input motion picture from the outside. The unit then divides a reference frame in the obtained input motion picture into several blocks arranged in a matrix. Here, the reference frame is the frame used for setting blocks when block-based matching is performed. For example, the first frame 210 or the second frame 220 is taken as the reference frame.
Fig. 3 is a schematic diagram for explaining the processing of the block creating unit 102. As shown in Fig. 3, the block creating unit 102 divides the second frame 220 to produce second blocks. In other words, a second block is each block included in the second frame 220. As shown in Fig. 3, the block creating unit 102 divides the second frame 220 into nine second blocks 221 to 229 arranged in a matrix. Here, the second frame 220 is the reference frame.
The frame memory 106 obtains the motion picture from the outside and holds the frames. The related block extraction unit 108 obtains, from the block creating unit 102, the second frame 220 divided into several blocks. In addition, the unit 108 obtains the first frame 210 from the frame memory 106. The unit 108 then extracts, from the first frame 210, the block most correlated with each second block of the second frame 220. Hereinafter, the block extracted from the first frame 210 that is most correlated with a predetermined second block of the second frame 220 is called a first block.
The partial area designating unit 120 designates a high-correlated partial area (motion-vector-detected area) and a low-correlated partial area (motion-vector-undetected area) in the second block. Here, the high-correlated partial area is an area highly correlated with the predetermined area in the first frame 210, and the low-correlated partial area is an area less correlated with the predetermined area in the first frame 210.
For example, when the correlation measure is taken to be a difference, an area in which the difference between the values of the pixels included in the block and the values of the corresponding pixels in the first frame 210 is not less than a predetermined threshold is regarded as a low-correlated partial area. On the other hand, an area in which this difference is less than the threshold is regarded as a high-correlated partial area. The correlation measure is preferably a value determined from the absolute difference of luminance information, the absolute difference of chrominance information, the sum of absolute differences, or the number of highly correlated pixels.
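As an illustration only, the per-pixel classification described above can be sketched as follows (assuming luminance arrays and an absolute-difference measure; the threshold value is arbitrary):

    import numpy as np

    def split_block_by_correlation(second_block, first_block, threshold=10):
        # Per-pixel absolute luminance difference between the matched blocks.
        diff = np.abs(second_block.astype(np.int32) - first_block.astype(np.int32))
        high_mask = diff < threshold          # high-correlated partial area
        low_mask = ~high_mask                 # low-correlated partial area
        return high_mask, low_mask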
The high-correlated partial area motion vector calculation unit 124 calculates a high-correlated partial area motion vector for the high-correlated partial area designated by the partial area designating unit 120. Here, the high-correlated partial area motion vector is the motion vector between the high-correlated partial area and the area in the first frame 210 corresponding to that high-correlated partial area.
The low-correlated partial area motion vector calculation unit 128 calculates a low-correlated partial area motion vector for the low-correlated partial area designated by the partial area designating unit 120. Here, the low-correlated partial area motion vector is the motion vector between the low-correlated partial area and the area in the third frame 230 corresponding to that low-correlated partial area.
The motion compensation unit 130 obtains the high-correlated partial area designated by the partial area designating unit 120 and the area in the first frame 210 corresponding to that high-correlated partial area. In addition, the unit 130 obtains the high-correlated partial area motion vector from the high-correlated partial area motion vector calculation unit 124. The unit 130 then generates the image of the predetermined area in the interpolation frame from the high-correlated partial area, the corresponding area in the first frame 210, and the high-correlated partial area motion vector. At this time, the motion vector undergoes a scale transformation so that the interpolation frame is generated at the predetermined position.
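The scale transformation mentioned here simply scales the detected first-to-second-frame vector to the temporal position of the interpolation frame. A minimal sketch, assuming the interpolation frame lies midway between the two frames so the factor is 0.5:

    def scale_motion_vector(mv, t=0.5):
        # mv is the (dy, dx) vector from the first frame to the second frame;
        # t is the relative temporal position of the interpolation frame (0.5 = midway).
        dy, dx = mv
        return (dy * t, dx * t)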
The motion compensation unit 130 further obtains the low-correlated partial area designated by the partial area designating unit 120 and the area in the third frame 230 corresponding to that low-correlated partial area. In addition, the unit 130 obtains the low-correlated partial area motion vector from the low-correlated partial area motion vector calculation unit 128. The unit 130 then generates the image of the predetermined area in the interpolation frame from the low-correlated partial area, the corresponding area in the third frame 230, and the low-correlated partial area motion vector.
When no image is produced for a predetermined area in the interpolation frame 300 by the above processing, the image of that area can be generated by weighted averaging or median processing based on the images adjacent to the area, the images included in the first frame 210, the second frame 220, and so on, and the motion vectors between these frames.
Specifically, the mean value or the median of the motion vectors given to the areas surrounding the area for which no image has been produced is used to generate the image of that area.
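A sketch of that fallback, assuming the surrounding areas' vectors are available as (dy, dx) pairs (illustration only):

    import numpy as np

    def vector_from_neighbors(neighbor_vectors, use_median=True):
        # Mean or median of the motion vectors of the surrounding areas,
        # used for an area to which no image has been assigned.
        mvs = np.asarray(neighbor_vectors, dtype=float)   # shape (n, 2)
        return np.median(mvs, axis=0) if use_median else mvs.mean(axis=0)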
Fig. 4 is a flowchart showing the interpolated image generation procedure performed by the interpolated image generation apparatus 100. First, the interpolated image generation apparatus 100 obtains the input picture. It is assumed that the apparatus 100 has already obtained the first frame 210 and that the frame memory 106 holds the first frame 210. Then, the block creating unit 102 obtains the second frame 220 that follows the first frame 210 (step S100). The unit 102 then divides the second frame 220 to obtain the several second blocks 221 to 229 (step S102).
Next, the related block extraction unit 108 extracts the block related to a second block, i.e., extracts a first block, from the first frame 210 (step S104). Here, the first block has the same size and shape as the second block.
Specifically, the related block extraction unit 108 calculates the difference between the value of each pixel included in the second block and the value of the corresponding pixel of a candidate block in the first frame 210. Then, the block for which the sum of the differences is smallest is extracted as the first block.
As another example, the number of pixels for which the difference between the value of each pixel included in the second block and the value of the corresponding pixel of the candidate block in the first frame 210 is not greater than a predetermined threshold may be counted. The block with the largest count may be taken as the first block.
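The two criteria (smallest sum of differences, or largest number of pixels within the threshold) can be sketched as follows (an illustration only; the threshold is arbitrary):

    import numpy as np

    def sum_of_differences(block_a, block_b):
        return np.abs(block_a.astype(np.int32) - block_b.astype(np.int32)).sum()

    def count_matching_pixels(block_a, block_b, threshold=10):
        # Number of pixels whose absolute difference does not exceed the threshold;
        # the candidate with the largest count is taken as the first block.
        diff = np.abs(block_a.astype(np.int32) - block_b.astype(np.int32))
        return int((diff <= threshold).sum())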
Next, the partial area designating unit 120 designates the high-correlated partial area and the low-correlated partial area in the first block (step S200). The high-correlated partial area is the area in the first block made up of pixels whose correlation measure indicates a correlation equal to or greater than a threshold. The low-correlated partial area is the area in the first block made up of pixels whose correlation measure indicates a correlation less than the threshold.
The high-correlated partial area and the low-correlated partial area will now be explained with reference to Figs. 5 to 9. The case described here is one in which a second block can be divided into a high-correlated partial area and a low-correlated partial area, although an entire second block may also be a high-correlated partial area. Fig. 5 shows the images included in the first frame 210 and the second frame 220. The first frame 210 shown in Fig. 5 includes a background area 430 and a rectangular moving object 432 that moves horizontally across the frame on the background area 430. The background area 430 is a still image. The second frame 220 includes the background area 430 and the moving object 432, similarly to the first frame 210. In the second frame 220, the moving object 432 has moved to a position further to the right than the moving object 432 in the first frame.
As shown in Fig. 6, for example, the area most correlated with the second block 225 of the second frame 220 is the area 211 in the first frame 210. Its image matches only in the background 430. The matching part is therefore a high-correlated partial area, and the other part is a low-correlated partial area. In other words, as shown in Fig. 7, the second block 225 is divided into a high-correlated partial area 2251 and a low-correlated partial area 2252.
As shown in Fig. 8, the area most correlated with the second block 226 of the second frame 220 is the area 212 in the first frame 210. Its image matches only in the moving object 432. The matching part is therefore a high-correlated partial area, and the other part is a low-correlated partial area. In other words, as shown in Fig. 9, the second block 226 is divided into a high-correlated partial area 2261 and a low-correlated partial area 2262.
Returning to Fig. 4: when the target partial area is a high-correlated partial area (Yes at step S202), the high-correlated partial area motion vector calculation unit 124 calculates the high-correlated partial area motion vector between the high-correlated partial area and the area corresponding to that high-correlated partial area (step S204). Here, the corresponding area is the area in the first frame 210 having the same shape and size as the high-correlated partial area.
In the example illustrated in Figs. 5 to 9, the motion vector of the high-correlated partial area 2251 and the motion vector of the high-correlated partial area 2261 are calculated.
Then, the motion compensation unit 130 generates the image of the predetermined area in the interpolation frame (step S206) from the high-correlated partial area obtained by the partial area designating unit 120, the corresponding area in the first frame 210, and the high-correlated partial area motion vector calculated by the high-correlated partial area motion vector calculation unit 124.
Figure 10 shows the interpolation frame 300 produced by the above processing. As described above, in the interpolation frame 300 the moving object 432 is placed between the position of the moving object 432 in the first frame 210 and the position of the moving object 432 in the second frame 220. In the processing of steps S204 and S206, no image is assigned to the areas 301 and 302 on either side of the moving object 432.
When the target partial area is a low-correlated partial area (No at step S202), the low-correlated partial area motion vector calculation unit 128 extracts the area corresponding to that low-correlated partial area from another frame (step S220). In the present embodiment, the unit 128 extracts this area from the third frame 230. Then, the low-correlated partial area motion vector calculation unit 128 calculates the low-correlated partial area motion vector between the low-correlated partial area and the corresponding area extracted from the third frame 230 (step S222). Here, the corresponding area has the same shape and size as the low-correlated partial area.
In the example explained with Figs. 5 to 9, the motion vector of the low-correlated partial area 2252 and the motion vector of the low-correlated partial area 2262 are calculated.
The processing performed by the low-correlated partial area motion vector calculation unit 128 in step S220 will now be described in detail. As shown in Figure 10, the areas 301 and 302 to which no image has been assigned still remain in parts of the interpolation frame.
These areas are the areas determined from the first frame 210, the second frame 220, and the motion vectors between them, that is, occluded areas. Such an occluded area appears, for example, when an object that is hidden by another object or by the background in the first frame appears in the second frame. To generate an image for such an area, the area corresponding to the low-correlated partial area must be extracted.
For example, in the first frame 210, the moving object 432 is superimposed on the part of the background 430 that is identical to the background 430 represented in the low-correlated partial area 2252. Therefore, an image identical to the low-correlated partial area 2252 does not appear in the first frame 210.
However, because the moving object 432 keeps moving, the background 430 identical to the background 430 represented in the low-correlated partial area 2252 is included in a frame other than the first frame 210. In other words, the identical image can be extracted from a frame other than the first frame 210.
The interpolated image generation apparatus 100 according to the present embodiment extracts, from a frame other than the first frame 210 and the second frame 220, an area (of the same shape and size) containing the background 430 identical to the low-correlated partial area 2252. In the present embodiment, the background 430 identical to the low-correlated partial area 2252 is extracted from the third frame 230, which follows the second frame 220.
Although the third frame 230 following the second frame 220 is taken as the detection target in the first embodiment, the target frame may be any frame other than the first frame 210 and the second frame 220, or a frame that, with the second frame 220 as a reference, lies temporally in the direction opposite to the first frame 210; the target frame is not limited to the third frame.
Then, the motion compensation unit 130 assigns an image to the predetermined area in the interpolation frame 300 (step S224) based on the low-correlated partial area motion vector calculated by the low-correlated partial area motion vector calculation unit 128, the low-correlated partial area, and the corresponding area in the third frame 230.
In this way, when generating the interpolation frame between two frames, a frame outside those two frames is also referenced, so that an image is assigned to the occluded areas of the interpolation frame with good precision.
Figure 11 is a schematic diagram showing the hardware configuration of the interpolated image generation apparatus 100 according to the first embodiment. As hardware, the interpolated image generation apparatus 100 comprises: a ROM 52 that stores the interpolated image generation program that performs the interpolated image generation processing in the apparatus 100; a CPU 51 that controls each unit of the apparatus 100 according to the program in the ROM 52; a RAM 53 that stores the various data needed to control the apparatus 100; a communication I/F 57 for connecting to a network and communicating; and a bus 62 connecting the units.
The interpolation frame generation program in the interpolated image generation apparatus 100 described above may be provided as a file in an installable or executable format recorded on a computer-readable recording medium such as a CD-ROM, a floppy (trademark) disk (FD), or a DVD.
In this case, the interpolation frame generation program is read from the recording medium, executed in the interpolated image generation apparatus 100, and loaded into the main storage, whereby each unit described in the software configuration is created in the main storage.
The interpolation frame generation program according to the first embodiment may be stored on a computer connected to a network such as the Internet and provided by downloading over the network.
The invention has been described with reference to the embodiment above, but various modifications and improvements may be made to the foregoing embodiment.
As a first modification, in the present embodiment a block is divided into a high-correlated partial area and a low-correlated partial area and a motion vector is assigned to each partial area; alternatively, motion vectors may be assigned on a block basis. In other words, each block is determined to be either high-correlated or low-correlated. The high-correlated partial area motion vector calculation unit 124 calculates the motion vectors of the blocks determined to be high-correlated, and the low-correlated partial area motion vector calculation unit 128 calculates the motion vectors of the blocks determined to be low-correlated.
As a second modification, the low-correlated partial area may be further divided into a high-correlated area and a low-correlated area, and the high-correlated partial area motion vector calculation unit 124 and the low-correlated partial area motion vector calculation unit 128 may calculate the motion vector of each resulting area.
Furthermore, the various kinds of processing such as motion estimation, correlation determination, and motion vector calculation may be performed recursively, so that the low-correlated partial area is further divided into a high-correlated area and a low-correlated area, and the low-correlated area obtained at that time is again divided into a high-correlated area and a low-correlated area.
Next, an interpolated image generation apparatus 100 according to a second embodiment will be described. The interpolated image generation apparatus 100 according to the second embodiment divides the interpolation frame 300 into a plurality of blocks and, with the interpolation frame 300 as a reference, searches the first frame 210 and the second frame 220 for high-correlated areas to generate the interpolation frame 300. In this respect it differs from the interpolated image generation apparatus 100 according to the first embodiment.
The block creating unit 102 divides the interpolation frame 300 to be interpolated into several blocks arranged in a matrix, obtaining several interpolation blocks. The related block extraction unit 108 performs correlation calculation on pairs of areas in the first frame 210 and the second frame 220 that lie on a straight line passing through each block to be interpolated. In this way, the most correlated pair of areas is extracted.
Figure 12 is a flowchart showing the interpolated image generation procedure of the interpolated image generation apparatus 100 according to the second embodiment. After the second frame 220 has been obtained, the related block extraction unit 108 divides the interpolation frame 300 and obtains several interpolation blocks (step S300). Then, a first block and a second block having the same size and shape as the block to be interpolated are extracted from the first frame 210 and the second frame 220 (step S302). Here, the first block and the second block are a highly correlated pair of blocks.
Figure 13 is a schematic diagram for explaining the processing of extracting the first and second blocks. As shown in Figure 13, a pair of areas lying on a straight line passing through each interpolation block in the interpolation frame 300 is extracted from the first frame and the second frame as the first block and the second block. In other words, a first block is extracted together with the second block located, with the interpolation block as a reference, at the position corresponding to the first block.
There are a plurality of candidate pairs of first and second blocks. The pair of the first block 212 and the second block 222 is selected from the plurality of candidate pairs by the related block determination method described in the first embodiment.
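The search over candidate pairs lying on a straight line through the interpolation block can be sketched as follows (illustration only, assuming the interpolation frame lies midway between the first and second frames so the two candidate areas are displaced by opposite offsets; block size and search radius are arbitrary):

    import numpy as np

    def find_symmetric_pair(frame1, frame2, top, left, size=16, radius=8):
        # (top, left) is the interpolation block position; the first-frame candidate
        # is shifted by -(dy, dx) and the second-frame candidate by +(dy, dx).
        h, w = frame1.shape
        best_sad, best = None, (0, 0)
        for dy in range(-radius, radius + 1):
            for dx in range(-radius, radius + 1):
                y1, x1 = top - dy, left - dx
                y2, x2 = top + dy, left + dx
                if min(y1, x1, y2, x2) < 0 or y1 + size > h or x1 + size > w \
                   or y2 + size > h or x2 + size > w:
                    continue
                a = frame1[y1:y1 + size, x1:x1 + size].astype(np.int32)
                b = frame2[y2:y2 + size, x2:x2 + size].astype(np.int32)
                sad = np.abs(a - b).sum()
                if best_sad is None or sad < best_sad:
                    best_sad, best = sad, (dy, dx)
        return best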
In step S200, the high-correlated partial area and the low-correlated partial area in the pair of blocks obtained in step S302 are designated. When the area to be processed is a high-correlated partial area (Yes at step S202), the processing of steps S204 and S206 is performed.
On the other hand, when the area to be processed is a low-correlated partial area (No at step S202), the low-correlated partial area motion vector calculation unit 128 extracts the area corresponding to the low-correlated partial area from another frame (step S310). In the present embodiment, this area is extracted from the 0th frame 200 or the third frame 230. The area to be extracted has the same size and shape as the low-correlated partial area.
Then, the low-correlated partial area motion vector calculation unit 128 calculates the motion vector between the low-correlated partial area and the area extracted from the 0th frame 200 or the third frame 230 (step S312). Then, the motion compensation unit 130 assigns an image to the predetermined area in the interpolation frame 300 (step S314) based on the low-correlated partial area motion vector calculated by the low-correlated partial area motion vector calculation unit 128, the low-correlated partial area, and the corresponding area.
Figures 14 and 15 are schematic diagrams for explaining the processing for the low-correlated partial area (step S310 to step S314). As shown in Figure 14, assume that a moving object 434 moves from left to right across the frames from the 0th frame 200 to the third frame 230. In this case, as shown in Figure 15, the interpolation blocks 301, 302, 303, 304, 306, 307, 308, and 309 in the interpolation frame 300 are high-correlated areas. The background 430 is assigned to these areas by the processing of steps S204 and S206.
Figure 16 shows the interpolation block 305 shown in Figure 15. The area 3051 at the center of the block 305 to be interpolated is a high-correlated area. The moving object 434 is assigned to this area by the processing of steps S204 and S206.
The areas 3052 and 3053 on either side of the interpolation block 305 correspond to low-correlated partial areas and are occluded areas. The background 430 is assigned to these areas by the processing from step S310 to step S314, using the 0th frame 200 and the third frame 230.
The images to be assigned to the areas 3052 and 3053 appear in the 0th frame 200 or the third frame 230. The image appearing in the 0th frame 200 or the third frame 230 can therefore be assigned. Processing is further performed to determine whether the image to be assigned is the image of the 0th frame 200 or the image of the third frame 230.
In other words, the high-correlated partial areas in the first frame 210 and the second frame 220 are masked. Then, the correlation between the unmasked areas, i.e., the low-correlated partial areas, and the outer frames (the 0th frame 200 or the third frame 230) is determined to decide the images to be assigned to the areas 3052 and 3053.
Figure 17 is a schematic diagram for explaining the processing of assigning an image to the area 3052. When the image to be assigned to the area 3052 is specified, matching is performed, with the area 3052 as a reference, between the low-correlated partial area 2101 in the first frame 210 and a predetermined area in the 0th frame 200. In addition, with the area 3052 as a reference, matching is performed between the low-correlated partial area 2201 in the second frame 220 and a predetermined area in the third frame 230.
In the matching between the first frame 210 and the 0th frame 200, masking limits the area in the first frame 210 to the low-correlated partial area. Thus, matching is performed only with the area 2001 in the 0th frame 200 that is determined by the motion vector MV10 starting from the area 3052 in the interpolation frame 300 and passing through the low-correlated partial area 2101 in the first frame 210. In this way the first frame 210 is masked, so that the matching is limited to the area 2001 in the 0th frame 200. The correlation between the low-correlated partial area 2101 in the first frame 210 and the area 2001 in the 0th frame 200 is low.
In the matching between the second frame 220 and the third frame 230, masking limits the area in the second frame 220 to the low-correlated partial area 2201. Therefore, matching is performed only with the area 2301 in the third frame 230 that is determined by the motion vector MV12 starting from the area 3052 in the interpolation frame 300 and passing through the low-correlated partial area 2201 in the second frame 220. In this way masking is performed on the second frame 220, so that the matching is limited to the area 2301 in the third frame 230. The left-side area 2202 of the low-correlated partial area 2201 in the second frame 220 and the left-side area 2302 of the area 2301 in the third frame 230 contain the same background 430 and thus have a high correlation.
As described above, a high-correlated area is detected in only one of the 0th frame 200 and the third frame 230. The high-correlated area is therefore assigned to the area 3052.
In this manner, matching with the outer frames is performed only for the low-correlated partial areas in the first frame 210 and the second frame 220, so that a suitable image is assigned to the occluded areas.
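Deciding which outer frame supplies the image for an occluded area can be sketched as a comparison of the two masked matches (illustration only; the regions are assumed to be same-shaped luminance arrays, and a smaller absolute-difference sum means a higher correlation):

    import numpy as np

    def choose_occlusion_source(low_area_frame1, area_frame0, low_area_frame2, area_frame3):
        # Match the low-correlated area of the first frame against the 0th frame,
        # and the low-correlated area of the second frame against the third frame;
        # the side with the smaller difference (higher correlation) supplies the image.
        sad0 = np.abs(low_area_frame1.astype(np.int32) - area_frame0.astype(np.int32)).sum()
        sad3 = np.abs(low_area_frame2.astype(np.int32) - area_frame3.astype(np.int32)).sum()
        return "frame0" if sad0 < sad3 else "frame3"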
Figure 18 is a schematic diagram for explaining the processing of assigning an image to the area 3053. When the image to be assigned to the area 3053 is specified, similarly to the area 3052, matching is performed, with the area 3053 as a reference, between the low-correlated partial area 2101 in the first frame 210 and an area in the 0th frame 200. Furthermore, with the area 3053 as a reference, matching is performed between the low-correlated partial area 2201 in the second frame 220 and an area in the third frame 230.
In the matching between the first frame 210 and the 0th frame 200, matching is performed only with the area 2005 in the 0th frame 200 that is determined by the motion vector MV20 starting from the area 3053 in the interpolation frame 300 and passing through the low-correlated partial area 2101 in the first frame 210.
The low-correlated partial area 2102 in the first frame 210, the left-side area 2302 of the area 2301 in the third frame 230, and the left-side area 2006 of the area 2005 in the 0th frame 200 contain the same background 430 and have a high correlation.
In the matching between the second frame 220 and the third frame 230, matching is performed only with the area 2301 in the third frame 230 that is determined by the motion vector MV22 starting from the area 3053 in the interpolation frame 300 and passing through the low-correlated partial area 2201 in the second frame 220. The correlation between the low-correlated partial area 2201 in the second frame 220 and the area 2301 in the third frame 230 is low. As described above, the area 2006 in the 0th frame, which has a high correlation with the area 2102 in the first frame 210, is assigned to the area 3053.
When a suitable motion vector cannot be specified for an area to which no image has been assigned even by referring to other frames, the mean value or the median of the motion vectors of the areas surrounding that area may be taken as the motion vector of the area, and the image of the area may be generated accordingly.
For the interpolated image generation apparatus 100 according to the second embodiment, the case where images corresponding to two different motion vectors exist in one block has been described, but the invention is not limited to this.
For example, when images corresponding to three or more different motion vectors exist in one block, the corresponding motion vectors may each be calculated, and the corresponding partial areas may each be extracted. In this way, an interpolation frame can be generated with high accuracy from two frames containing images that correspond to several motion vectors.
In addition, several motion vectors may be given to one area. In this case, the mean value of the images in the reference frames respectively designated by the given motion vectors is assigned to the interpolation frame.
As another example, a motion vector may be selected according to the correlation measure between the areas designated by the several given motion vectors. Specifically, the areas to be designated by a given motion vector are specified in several reference frames; for example, the areas are specified in two reference frames. The correlation measure between these designated areas is then obtained. This process is performed for each given motion vector. The motion vector corresponding to the smallest correlation measure, that is, the highest correlation, is selected, and an image is assigned to the area according to the selected motion vector.
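A sketch of this selection (illustration only), assuming each candidate vector designates a pair of same-shaped areas in two reference frames and that a smaller difference means a higher correlation:

    import numpy as np

    def select_vector_by_correlation(candidates):
        # candidates: list of (motion_vector, area_a, area_b) tuples, where area_a
        # and area_b are the areas designated by that vector in two reference frames.
        best_mv, best_score = None, None
        for mv, area_a, area_b in candidates:
            score = np.abs(area_a.astype(np.int32) - area_b.astype(np.int32)).sum()
            if best_score is None or score < best_score:
                best_mv, best_score = mv, score
        return best_mv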
Figure 19 is a block diagram showing the overall structure of an interpolated image generation device 500 according to the third embodiment. The interpolated image generation device 500 according to the third embodiment gives a motion vector to a motion-vector-undetected area, i.e. a low-correlation area, based on the motion vectors around that area.
The interpolated image generation device 500 includes a frame memory 502, an area generation unit 504, a motion estimation unit 506, a correlation determining unit 508, a motion-vector-detected area processing unit 510, a motion-vector-undetected area processing unit 512, and a motion compensation unit 514.
Figure 20 is a schematic diagram for explaining the interpolation frame. Figure 20 shows two consecutive reference frames included in the input image, namely a first reference frame 210 and a second reference frame 220. The case of generating an interpolation frame 300 to be inserted between the first reference frame 210 and the second reference frame 220 is described below.
Although the description below covers the case of generating an interpolation frame between two consecutive frames, namely the first reference frame 210 and the second reference frame 220, the interpolation frame produced by the interpolated image generation device may be inserted between any two different reference frames and is not limited to this embodiment.
The area generation unit 504 extracts the first reference frame 210 from the frame memory 502 and then divides the first reference frame 210 into reference areas each composed of several pixels.
Figure 21 is a schematic diagram for explaining the reference areas. In this embodiment, the area generation unit 504 divides the frame into square areas of five vertical pixels by five horizontal pixels, as shown in Figure 21. In other words, the first reference frame 210 is divided into 16 reference areas, from reference area 2102a to reference area 2102p. The shape and size of the reference areas are not limited to those of this embodiment and may differ from them.
The motion estimation unit 506 calculates a motion vector for each of the several reference areas in the first reference frame 210 with respect to the second reference frame 220. The correlation determining unit 508 determines the degree of correlation between each of the several reference areas in the first reference frame 210 and the corresponding area in the second reference frame 220.
The motion-vector-detected area processing unit 510 processes the parts of the reference areas that the correlation determining unit 508 has determined to be high-correlation areas, i.e. the motion-vector-detected areas. The motion-vector-undetected area processing unit 512 processes the parts of the reference areas that the correlation determining unit 508 has determined to be low-correlation areas, i.e. the motion-vector-undetected areas. The motion compensation unit 514 generates the image of each area of the interpolation frame according to the processing performed by the motion-vector-detected area processing unit 510 and the motion-vector-undetected area processing unit 512.
Figure 22 is a flow chart showing the high-correlation area motion vector assignment process performed by the interpolated image generation device 500 according to the third embodiment.
First, all pixels in the first reference frame 210 and the second reference frame 220 are set to low-correlation pixels (step S501). Then, the first reference frame 210 is divided into n reference areas (step S502). As shown in Figure 21, in this embodiment the first reference frame 210 is divided into 16 reference areas.
Next, an estimation count t is set (step S503). The estimation count t is the number of times that the sequence of processes from the reference area extraction (step S504) to the correlation determination (step S507) described below is repeated.
The interpolated image generation device 500 according to this embodiment divides each reference area into a high-correlation area and a low-correlation area, and the resulting low-correlation area is in turn divided into a high-correlation area and a low-correlation area. The reference area is thus repeatedly divided into finer areas; in other words, the processing is performed recursively. The estimation count t corresponds to the number of repetitions of this processing. In this embodiment the estimation count t is set to 2, and the counter indicating the current repetition is initialized to 1. With t set to 2, the processing from step S504 onward is performed twice.
The estimation count t may of course be set to 1; in that case each reference area is simply classified into a high-correlation area and a low-correlation area. As another example, the estimation count t need not be set at all; instead, the processing may simply be repeated until no further high-correlation area is detected in the reference area.
Next, a reference area to be subjected to motion estimation is extracted (step S504). In this embodiment there are 16 reference areas, as shown in Figure 21, so the number n of reference areas is 16, and the counter indicating the current estimation count is set to 1.
Then, for the part of the reference area extracted in the reference area extraction process (step S504) (hereinafter called the "first reference area") that is set to low correlation, the most highly correlated area is extracted from among the areas of the second reference frame 220 that are set to low correlation (step S505). The extracted area is called the second reference area.
Specifically, several candidate areas with the same size and shape as the first reference area are set in the low-correlation area of the second reference frame. For each candidate area, the sum of absolute differences is calculated, and the area for which this sum is smallest is extracted as the second reference area. Each absolute difference is the absolute value of the difference between a pixel in the candidate area of the second reference frame and the corresponding pixel in the first reference area.
As another example, for each candidate area of the second reference frame 220, the number of pixels whose absolute difference from the corresponding pixel of the first reference area is not greater than a predetermined threshold may be counted, and the candidate area with the largest count may be extracted as the second reference area.
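The following sketch illustrates this search over a limited window using the sum-of-absolute-differences criterion (the counting variant is noted in a comment); the frame layout, window size, and names are illustrative assumptions, not part of the patent:

```python
import numpy as np

def find_second_reference_area(first_area, frame2, top, left, search=8):
    """Search frame2 around (top, left) for the area that minimizes the sum of
    absolute differences with first_area; returns the best motion vector and SAD."""
    h, w = first_area.shape
    best_mv, best_sad = (0, 0), np.inf
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            y, x = top + dy, left + dx
            if y < 0 or x < 0 or y + h > frame2.shape[0] or x + w > frame2.shape[1]:
                continue
            candidate = frame2[y:y + h, x:x + w].astype(float)
            sad = np.abs(candidate - first_area.astype(float)).sum()
            # Counting variant: score = (np.abs(candidate - first_area) <= threshold).sum(),
            # keeping the candidate with the largest score instead.
            if sad < best_sad:
                best_mv, best_sad = (dy, dx), sad
    return best_mv, best_sad
```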
Since all pixels are set to low-correlation pixels in the low-correlation setting process (step S501), the first time the high-correlation area extraction process (step S505) is performed, all reference areas are still set to low-correlation areas and all areas in the second reference frame are likewise set to low-correlation areas.
Next, a motion vector MV is obtained using the low-correlation area in the first reference frame 210 and the low-correlation area in the second reference frame 220 (step S505). The high-correlation area in the second reference frame 220 is not used at this point, so a more appropriate motion vector can be calculated.
In this embodiment, as shown in Figure 23, the position of the correlated area in the second reference frame 220 is found for the reference area in the first reference frame 210, and the motion vector between them is calculated. In the first recursion pass (t = 1), all pixels of the reference area in the first reference frame 210 (a square block of five vertical pixels by five horizontal pixels) are "low-correlation" pixels, so the correlation calculation is in effect the same as ordinary block matching.
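From the second pass onward only the pixels still marked "low correlation" take part in the matching. A sketch of such a masked correlation value, under the assumption that the low-correlation pixels are given as a boolean mask of the same shape as the block:

```python
import numpy as np

def masked_sad(first_area, candidate_area, low_mask):
    """Sum of absolute differences restricted to the pixels that are still
    marked as low-correlation; high-correlation pixels are ignored."""
    diff = np.abs(first_area.astype(float) - candidate_area.astype(float))
    return diff[low_mask].sum()

# In the first pass (t = 1) every pixel is low-correlation, so masked_sad with an
# all-True mask reduces to ordinary block matching.
```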
Next, the in-area correlation calculation (step S506) is performed for each pair of corresponding pixels of the first reference area and the second reference area extracted in the motion estimation process (step S505), in order to determine the degree of correlation on a pixel basis. In this embodiment, the absolute difference is calculated as the correlation value.
Then, based on the result of the in-area correlation calculation (step S506), the degree of correlation is determined on a pixel basis, and the first reference area and the second reference area are each classified into a high-correlation area and a low-correlation area (step S507). The setting of the pixels determined to be highly correlated is changed from "low correlation" to "high correlation", and the motion vector MV obtained in the motion estimation process (step S505) is given to the high-correlation area (step S508).
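A sketch of steps S506 to S508 for one area pair, assuming luminance blocks as NumPy arrays and an illustrative per-pixel absolute-difference threshold:

```python
import numpy as np

def classify_and_assign(first_area, second_area, mv, threshold=10):
    """Per-pixel correlation (absolute difference, step S506), classification into
    high/low correlation (step S507), and assignment of mv to the high-correlation
    pixels (step S508)."""
    diff = np.abs(first_area.astype(float) - second_area.astype(float))
    high = diff <= threshold                 # True where the pixels match well
    mv_dy = np.where(high, mv[0], np.nan)    # per-pixel motion vector components,
    mv_dx = np.where(high, mv[1], np.nan)    # NaN where the pixel stays low-correlation
    return high, mv_dy, mv_dx

# With the values of Figures 26-28: pixels whose absolute difference is 0 become
# high-correlation and receive MV; pixels whose difference is 24 stay low-correlation.
```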
The above processing is performed for all first reference areas (16 areas) in the first reference frame 210 and is repeated as many times as the estimation count t (steps S509, S510). The high-correlation area motion vector assignment process is then complete.
The processing that follows the processing-target reference area extraction (step S504) is now described in detail. Figure 24 is a schematic diagram of a frame in which an object 442 moves over a stationary background 440.
Figure 25 is a schematic diagram showing the second reference area 222f extracted in the motion estimation process (step S505) for the first reference area 212f extracted in the processing-target reference area extraction process (step S504). In this way, the area most highly correlated with the first reference area is extracted as the second reference area.
Figure 26 is a schematic diagram showing, on a pixel basis, the first reference area 212f and the second reference area 222f extracted in the motion estimation process (step S505). The numbers in the figure are the luminance values of the pixels. In the in-area correlation calculation step S506, the correlation calculation is performed for the pair consisting of the first reference area and the second reference area based on the luminance values of the corresponding pixels, in order to determine the degree of correlation on a pixel basis within the reference area. In the illustrated example, the absolute difference is calculated as the correlation value.
Figure 27 is a schematic diagram showing the result of the correlation calculation. Figure 28 is a schematic diagram showing the reference area 212f and the reference area 222f classified into a high-correlation area and a low-correlation area according to the correlation calculation result shown in Figure 27.
To decide between high correlation and low correlation, a threshold is set for the absolute difference obtained in the in-area correlation calculation (step S506): a pixel whose absolute difference is not greater than the threshold is determined to be highly correlated, and a pixel whose absolute difference is greater than the threshold is determined to have low correlation. In the example shown in Figure 28, the pixels with an absolute difference of 24 are determined to be low-correlation pixels and the pixels with an absolute difference of 0 are determined to be high-correlation pixels. The motion vector MV obtained in the motion estimation process (step S505) is then given to the high-correlation area shown in Figure 28.
Figure 29 is a schematic diagram showing the high-correlation areas and low-correlation areas in the first reference frame 210 and the second reference frame 220. The processing from the processing-target reference area extraction (step S504) to the motion vector assignment (step S508) is carried out on the first reference frame 210 from reference area 212a through reference area 212p (step S509), so that the first reference frame 210 is classified into high-correlation areas and low-correlation areas as shown in Figure 29.
In this embodiment, the estimation count is set to 2 in the estimation count setting process (step S503). Therefore the processing from step S504 to step S509 is repeated once more for the low-correlation areas in the first reference frame 210 shown in Figure 29 (step S511).
Figure 30 is a schematic diagram showing, for the second pass (estimation count t = 2), the first reference area 232f extracted in the reference area extraction process (step S504) and the second reference area 242f extracted for the first reference area 232f in the motion estimation process (step S505). Here, the first reference area 232f is the part of the first reference area 212f that was set to a low-correlation area in the correlation classification process (step S507) of the first pass (t = 1).
Figure 31 is a schematic diagram showing the luminance values of the pixels of the first reference area 232f and of the second reference area 242f in the second reference frame 220 extracted in the motion estimation process (step S505). Figure 32 is a schematic diagram showing the correlation values (absolute differences) obtained in the in-area correlation calculation process (step S506).
Figure 33 is a schematic diagram showing the high-correlation areas of the first reference area 232f and the second reference area 242f classified in the correlation determination process (step S507). In this example, as shown in Figure 33, the whole reference area is a high-correlation area. The motion vector MV62 obtained in the motion estimation process (step S505) is given to the whole reference area determined to be highly correlated. The second pass is carried out in the same way for all reference areas, which completes the correlation determination process.
As described above, even when one reference area contains several motions, a motion vector can be obtained for each of those motions.
Figure 34 is a schematic diagram showing the degree of correlation between the first reference frame 210 and the second reference frame 220 when the correlation determination process is complete. As shown in Figure 34, when a reference area is divided using two reference frames and several motion vectors are obtained within that reference area, an area for which no motion vector can be obtained, i.e. a motion-vector-undetected area 2104, arises. Such an area appears because background that was hidden by an object is uncovered by the movement of that object, or disappears because background that was visible becomes hidden by the object; it is an area for which no match can be established between the two reference frames and is called an occluded area. It also includes areas for which no match between the two frames can be established because of noise and the like.
The motion-vector-undetected area processing unit 512 obtains motion vectors for the areas whose motion vectors could not be obtained between the two frames. Figure 35 is a flow chart showing the motion-vector-undetected area processing performed by the motion-vector-undetected area processing unit 512. First, the motion-vector-undetected area processing unit 512 extracts the low-correlation area, i.e. the motion-vector-undetected area, from the first reference frame (step S601).
Next, the motion vectors assigned to the pixels surrounding the motion-vector-undetected area 2104 are extracted as candidates for the motion vector of the motion-vector-undetected area 2104 (step S602). The surrounding pixels are preferably pixels adjacent to the motion-vector-undetected area 2104.
An occluded area is unlikely to have a motion vector that differs from those around it; in other words, an occluded area is more likely to have the same motion vector as the areas surrounding it. Therefore, in step S602, the motion vectors around the motion-vector-undetected area 2104 are extracted as candidates for the motion vector of the motion-vector-undetected area.
Figure 36 is a schematic diagram for explaining the motion vectors around the motion-vector-undetected area 2104. As shown in Figure 36, the motion vectors around the motion-vector-undetected area 2104 include the motion vector MV30 of the area 2105 corresponding to the object 442 and the motion vector MV32 of the high-correlation area 2106 corresponding to the background 440. One of these is to be assigned to the motion-vector-undetected area 2104.
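A sketch of step S602, assuming the per-pixel motion vectors are kept in two arrays (NaN where undetected) and the undetected area is given as a boolean mask; the one-pixel dilation stands in for "adjacent pixels":

```python
import numpy as np

def surrounding_mv_candidates(undetected_mask, mv_dy, mv_dx):
    """Collect the distinct motion vectors of the pixels adjacent to the
    motion-vector-undetected area (step S602)."""
    m = undetected_mask
    ring = np.zeros_like(m)          # dilate the mask by one pixel in the four axis directions
    ring[1:, :] |= m[:-1, :]
    ring[:-1, :] |= m[1:, :]
    ring[:, 1:] |= m[:, :-1]
    ring[:, :-1] |= m[:, 1:]
    ring &= ~m                       # keep only the border just outside the area
    candidates = set()
    for dy, dx in zip(mv_dy[ring], mv_dx[ring]):
        if not (np.isnan(dy) or np.isnan(dx)):   # skip pixels that are themselves undetected
            candidates.add((float(dy), float(dx)))
    return sorted(candidates)
```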
Returning to Figure 35, after the surrounding motion vectors have been extracted, the motion-vector-undetected area 2104 is extracted from the first reference frame 210, and the areas in the third reference frame 230 designated by the vectors obtained by multiplying the surrounding motion vectors extracted in step S602 by -1 are extracted (step S603).
An occluded area is an area whose motion vector cannot be obtained from the two reference frames. In order to select one of the several motion vector candidates as the motion vector of the motion-vector-undetected area 2104, a third reference frame 230 is used which, with the first reference frame as the reference, lies in the temporal direction opposite to the second reference frame, as shown in Figure 37.
Figure 38 is a schematic diagram showing the areas in the third reference frame 230 extracted in the area extraction process (step S603). As shown in Figure 38, area 2304a and area 2304b are extracted from motion vector MV30 and motion vector MV32, respectively.
Returning to Figure 35, the correlation between each of the areas 2304a and 2304b extracted in the area extraction process (step S603) and the motion-vector-undetected area 2104 is calculated (step S604). In this embodiment, the absolute difference is calculated as the correlation value.
Next, the optimum motion vector is selected from the several motion vector candidates (step S605). Specifically, based on the correlation calculation result of step S604, the motion vector candidate corresponding to the most highly correlated area is selected as the motion vector of the motion-vector-undetected area.
In this embodiment, the motion vector candidate corresponding to the area whose absolute difference is smallest is selected as the motion vector of the motion-vector-undetected area.
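Steps S603 to S605 can be sketched as follows, assuming the motion-vector-undetected area is a rectangular block at (top, left) in the first reference frame and the candidate vectors point toward the second reference frame, so that multiplying by -1 points into the third reference frame:

```python
import numpy as np

def select_mv_from_third_frame(undetected_area, frame3, top, left, candidates):
    """For each candidate vector, look at the area of the third reference frame
    designated by -1 * candidate and keep the candidate with the smallest
    sum of absolute differences (steps S603-S605)."""
    h, w = undetected_area.shape
    best_mv, best_sad = None, np.inf
    for dy, dx in candidates:
        y, x = int(top - dy), int(left - dx)          # multiply the vector by -1
        if y < 0 or x < 0 or y + h > frame3.shape[0] or x + w > frame3.shape[1]:
            continue
        area3 = frame3[y:y + h, x:x + w].astype(float)
        sad = np.abs(area3 - undetected_area.astype(float)).sum()  # step S604
        if sad < best_sad:
            best_mv, best_sad = (dy, dx), sad                      # step S605
    return best_mv, best_sad
```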
In the example of Figure 38, the area 2304a designated by motion vector MV30 is an area where the object 442 appears, whereas the area 2304b designated by motion vector MV32 is an area corresponding to the background 440. The area 2304b is therefore highly correlated with the motion-vector-undetected area 2104, and the motion vector MV32 corresponding to area 2304b is selected as the motion vector of the motion-vector-undetected area.
The correlation calculation between the motion-vector-undetected area and the area in the third reference frame 230 is preferably performed on a pixel basis, so that the motion vector can be selected with pixel-level accuracy.
As another example, the correlation calculation may be performed based on the shape of the motion-vector-undetected area. As yet another example, the correlation calculation may be performed on a block basis.
Figure 39 is a schematic diagram for explaining the processing when several motion vectors (mv2, mv11 to mv13) appear around the motion-vector-undetected area 2104. In this case, following the motion-vector-undetected area processing described with reference to Figure 35, the surrounding motion vectors mv2 and mv11 to mv13 are each applied to the pixels in the motion-vector-undetected area 2104. The correlation between the area in the third reference frame 230 designated by each motion vector and the motion-vector-undetected area 2104 is then calculated, and the motion vector candidate corresponding to the most highly correlated area is selected as the motion vector of the motion-vector-undetected area 2104.
In this way, even when three or more motion vectors appear around the motion-vector-undetected area 2104, the optimum motion vector can be selected from among them as the motion vector of the motion-vector-undetected area.
The motion vector of the motion-vector-undetected area 2104 is detected as described above. The third reference frame 230 is used only to select a motion vector from the candidates and not to perform motion estimation, so the amount of calculation is reduced.
An object may be hidden by another object in the first reference frame. That object may have a motion different from both the background and the other object, and it may then appear as an occluded area in the second reference frame.
In this case it is difficult to obtain a correct motion vector for the occluded area with the above method. Therefore, based on the correlation value between the area in the third reference frame 230 designated by the motion vector selected in the motion vector selection process (step S605) and the motion-vector-undetected area 2104, it is determined whether the selected motion vector should actually be assigned to the motion-vector-undetected area.
Specifically, a threshold for the correlation value is set in advance. The correlation between the areas in the third reference frame 230 and in the second reference frame 220 designated by the motion vector selected in the motion vector selection process (step S605) and the motion-vector-undetected area 2104 is then calculated. When the correlation value is smaller than the threshold, i.e. when the images of the two areas are similar (Yes in step S606), the motion vector selected in the motion vector selection process (step S605) is determined to be the motion vector to be assigned to the motion-vector-undetected area (step S608).
On the other hand, when the correlation value is not smaller than the threshold (No in step S606), the motion vector selected in the motion vector selection process (step S605) is not used; instead, motion estimation is performed using the motion-vector-undetected area 2104 and the third reference frame to calculate the motion vector of the motion-vector-undetected area 2104 (step S607). In other words, motion estimation is performed by processing similar to that of the interpolated image generation device 100 according to the first embodiment.
The motion compensation unit 514 performs motion compensation using the motion vectors obtained by the motion estimation unit 506 and the motion-vector-undetected area processing unit 512. As shown in Figure 40, each motion vector is scaled according to the position at which the interpolation frame is to be generated, so that the interpolation frame is generated at the intended position. The motion compensation method is not particularly limited.
When motion compensation is performed by the above processing, the pasted areas may overlap one another, or gaps may occur between the pasted areas. When the pasted areas overlap, the mean or the median of the overlapping areas may be used, or the overlap may always be overwritten with the higher-correlation area between the preceding and following frames. When a gap occurs, intra-frame or inter-frame interpolation is performed.
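A sketch of this compensation step, assuming the interpolation frame lies at fraction `alpha` of the way from the first to the second reference frame (alpha = 0.5 for the midpoint); the accumulation buffers implement the averaging of overlapping pastes, and pixels with a zero count are gaps left for a separate interpolation step:

```python
import numpy as np

def paste_area(accum, count, area, top, left, mv, alpha=0.5):
    """Paste one area of the first reference frame onto the interpolation frame,
    shifting it by the scaled motion vector alpha * mv; overlapping pastes are
    averaged via the accumulation and count buffers."""
    h, w = area.shape
    y = int(round(top + alpha * mv[0]))
    x = int(round(left + alpha * mv[1]))
    y = max(0, min(y, accum.shape[0] - h))
    x = max(0, min(x, accum.shape[1] - w))
    accum[y:y + h, x:x + w] += area
    count[y:y + h, x:x + w] += 1

def finish_interpolation_frame(accum, count):
    """Average the overlaps; pixels with count == 0 are gaps that still need
    intra-frame or inter-frame interpolation."""
    out = np.zeros_like(accum)
    np.divide(accum, count, out=out, where=count > 0)
    return out, count == 0
```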
An interpolated image generation device 500 according to the fourth embodiment is described below. The interpolated image generation device according to the fourth embodiment divides the interpolation frame 300 shown in Figures 41 and 42 into areas and, with the interpolation frame 300 as the reference, performs motion estimation on the first reference frame 210 and the second reference frame 220 to generate the interpolation frame 300.
Because the interpolation frame 300 itself is divided into areas, there is no possibility that images are generated in an overlapping manner on the interpolation frame 300 to be produced, or that areas arise in which no image is generated, so the interpolated image is produced with high accuracy.
In this embodiment, as shown in Figure 41, the case where the interpolation frame 300 is generated at the temporal midpoint between two consecutive frames, i.e. between the first reference frame 210 and the second reference frame 220, is described. However, the interpolation frame does not have to lie at the temporal midpoint of two different frames; it only has to be inserted between two different frames, and its position is not particularly limited.
An example in which an object 442 moves horizontally over a stationary background 440, as shown in Figure 43, is described here.
Figure 44 is a flow chart showing the high-correlation area motion vector assignment process performed by the interpolated image generation device 500 according to the fourth embodiment. As shown in Figures 41 and 42, in this embodiment the motion estimation unit 106 performs motion estimation on the first reference frame 210 and the second reference frame 220 with respect to the interpolation frame 300.
In this embodiment, all pixels in the first reference frame 210 and the second reference frame 220 are first set to low-correlation pixels (step S701), and the interpolation frame 300 is then divided into n interpolation areas (step S702).
The estimation count t is set (step S703), and an interpolation area is extracted from the interpolation frame (step S704). Then, with the interpolation area on the interpolation frame as the reference, the most highly correlated combination of a low-correlation area in the first reference frame 210 and a low-correlation area in the second reference frame 220 is extracted, and the motion vector MV between them is obtained (step S705).
Next, in order to determine the degree of correlation on a pixel basis within the area, the correlation calculation is performed for each pair of corresponding pixels of the first reference area and the second reference area extracted in the motion estimation process (step S705) (step S706). Then, based on the result of the in-area correlation calculation process (step S706), the degree of correlation is determined on a pixel basis, and the first reference area and the second reference area are classified into high-correlation areas and low-correlation areas (step S707).
The setting of the pixels determined to be highly correlated is changed from "low correlation" to "high correlation", and the motion vector MV obtained in the motion estimation process (step S705) is given to the high-correlation area of the reference area in the interpolation frame (step S708).
Here, the setting of the pixels determined to be highly correlated between the first reference frame and the second reference frame is changed from low correlation to high correlation, but this change applies to the pixels in the first reference frame and the second reference frame. The motion vector, on the other hand, is given to the pixels on the interpolation frame.
The above processing is performed for all reference areas (16 areas) in the interpolation frame 300 and is further repeated as many times as the estimation count t (steps S709, S710). The high-correlation area motion vector assignment process is then complete.
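The matching of step S705 can be sketched as a symmetric search about the interpolation area: an offset -mv into the first reference frame is paired with the offset +mv into the second reference frame (midpoint interpolation assumed), and the pair with the smallest sum of absolute differences is kept. Names and the search radius are illustrative assumptions:

```python
import numpy as np

def match_through_interpolation_area(frame1, frame2, top, left, h, w, search=8):
    """For the interpolation area at (top, left), find the most correlated pair of
    areas reached by -mv in the first and +mv in the second reference frame."""
    best_mv, best_sad = (0, 0), np.inf
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            y1, x1 = top - dy, left - dx           # area in the first reference frame
            y2, x2 = top + dy, left + dx           # area in the second reference frame
            if min(y1, x1, y2, x2) < 0:
                continue
            if y1 + h > frame1.shape[0] or x1 + w > frame1.shape[1]:
                continue
            if y2 + h > frame2.shape[0] or x2 + w > frame2.shape[1]:
                continue
            a1 = frame1[y1:y1 + h, x1:x1 + w].astype(float)
            a2 = frame2[y2:y2 + h, x2:x2 + w].astype(float)
            sad = np.abs(a1 - a2).sum()
            if sad < best_sad:
                best_mv, best_sad = (dy, dx), sad
    return best_mv, best_sad
```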
Figure 45 is a schematic diagram showing the degree of correlation in each frame after the processing described with reference to Figure 44 has been performed. As shown, when the processing described with reference to Figure 44 is finished, motion-vector-undetected areas whose motion vectors have not been detected remain.
Figure 46 is a flow chart showing the motion-vector-undetected area processing performed by the motion-vector-undetected area processing unit 512. First, the motion-vector-undetected area processing unit 512 extracts the motion-vector-undetected areas from the interpolation frame 300 (step S801). The unit 512 then extracts the motion vectors around the motion-vector-undetected areas (step S802).
Figure 47 is a schematic diagram showing the motion vectors around the motion-vector-undetected areas 3001a and 3001b. The motion vector MV71 of the object 420 is given to the area between the motion-vector-undetected area 3001a and the motion-vector-undetected area 3001b, and the motion vector MV72 of the background 400 is given to the other areas. Therefore, in the example shown in Figure 47, motion vector MV71 and motion vector MV72 are extracted in step S802.
Next, using the object-area motion vector MV71 and the background-area motion vector MV72 extracted for the motion-vector-undetected areas 3001a and 3001b in the extraction process (step S802) as motion vector candidates, the corresponding areas are extracted from the first reference frame, the second reference frame, the third reference frame, and the fourth reference frame (step S803).
The third reference frame is a reference frame that precedes the first reference frame 210 in time, and the fourth reference frame is a reference frame that follows the second reference frame 220 in time.
Figure 48 is a schematic diagram for explaining the concrete processing in the area extraction process (step S803). As shown in Figure 48, motion vector MV71 is extended in both the positive and the negative direction, and the areas where the motion vector MV71 intersects the first reference frame 210, the second reference frame 220, the third reference frame 230, and the fourth reference frame 240 are extracted. As shown in Figure 48, area 2111 is extracted from the first reference frame 210, area 2211 from the second reference frame 220, area 2311 from the third reference frame, and area 2411 from the fourth reference frame.
Likewise, the areas where motion vector MV72 intersects the first reference frame 210, the second reference frame 220, the third reference frame 230, and the fourth reference frame 240 are extracted. As shown in Figure 48, area 2121 is extracted from the first reference frame 210, area 2221 from the second reference frame 220, and area 2321 from the third reference frame. In the fourth reference frame, motion vector MV72 points outside the frame.
Returning to Figure 46, after the areas have been extracted from each reference frame (step S803), the correlation between the areas extracted from the first reference frame 210 and from the third reference frame 230 is calculated for each motion vector, and the correlation between the areas extracted from the second reference frame 220 and from the fourth reference frame 240 is likewise calculated for each motion vector (step S804). In the fourth embodiment, the absolute difference is used for the correlation calculation.
In the example shown in Figure 48, for motion vector MV71 the correlation is calculated between the area 2111 extracted from the first reference frame 210 and the area 2311 extracted from the third reference frame 230, and between the area 2211 extracted from the second reference frame 220 and the area 2411 extracted from the fourth reference frame 240.
For motion vector MV72, the correlation is calculated between the area 2112 extracted from the first reference frame 210 and the area 2312 extracted from the third reference frame 230, and between the area 2212 extracted from the second reference frame 220 and the area 2412 extracted from the fourth reference frame 240.
Based on the correlation calculation results, the motion vector to be given to the motion-vector-undetected area is selected from the several motion vectors (step S805). In this embodiment, the motion vector of the area pair whose absolute difference is smallest is selected.
In the example shown in Figure 48, one motion vector is selected from motion vector MV71 and motion vector MV72. In this case, the absolute difference between the area 2121 in the first reference frame 210 and the area 2321 in the third reference frame 230 is the smallest, so the motion vector corresponding to these areas, namely motion vector MV72, is given to the motion-vector-undetected area 3001a.
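A sketch of steps S803 to S805 for one undetected interpolation area at (top, left), assuming evenly spaced frames so that a candidate vector mv (expressed as the displacement from the first to the second reference frame) meets the first and second reference frames at ±mv/2 and the third and fourth reference frames at ±3·mv/2 from the interpolation area; the crop helper is the same kind of illustrative clipping window used in the earlier sketches:

```python
import numpy as np

def crop(frame, y, x, h, w):
    # Illustrative clipping window helper (assumption, not part of the patent).
    y = int(np.clip(round(y), 0, frame.shape[0] - h))
    x = int(np.clip(round(x), 0, frame.shape[1] - w))
    return frame[y:y + h, x:x + w].astype(float)

def select_mv_four_frames(frames, top, left, h, w, candidates):
    """frames = (frame1, frame2, frame3, frame4). For each candidate vector the
    frame1/frame3 pair and the frame2/frame4 pair are compared (step S804), and the
    candidate with the smallest absolute difference over all pairs is kept (step S805)."""
    f1, f2, f3, f4 = frames
    best_mv, best_sad = None, np.inf
    for dy, dx in candidates:
        pairs = [
            (crop(f1, top - 0.5 * dy, left - 0.5 * dx, h, w),
             crop(f3, top - 1.5 * dy, left - 1.5 * dx, h, w)),
            (crop(f2, top + 0.5 * dy, left + 0.5 * dx, h, w),
             crop(f4, top + 1.5 * dy, left + 1.5 * dx, h, w)),
        ]
        for a, b in pairs:
            sad = np.abs(a - b).sum()
            if sad < best_sad:
                best_mv, best_sad = (dy, dx), sad
    return best_mv, best_sad
```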
Figure 49 is a schematic diagram for explaining the process of assigning a motion vector to a motion-vector-undetected area. As shown in Figure 49, when the areas designated by the motion vectors extracted in step S802 are determined, only the low-correlation areas in the first reference frame 210 and the second reference frame 220 are considered. The number of areas that have to be extracted and subjected to the correlation calculation therefore becomes smaller.
For example, in the example shown in Figure 48, the areas in the second reference frame 220 designated by motion vector MV71 and motion vector MV72 both lie in a high-correlation area, so those areas do not have to be extracted.
In this way, when the areas designated by the motion vectors are extracted, the target areas are limited to the low-correlation areas. This limits the number of areas to be extracted, so the motion vector of the motion-vector-undetected area can be selected more easily and more accurately.
Next, based on the correlation value between the areas in the two frames designated by the motion vector selected in the motion vector selection process (step S805), it is determined whether the selected motion vector should be assigned to the motion-vector-undetected area.
Specifically, a threshold for the correlation value is set in advance. When the correlation value between the two areas designated by the motion vector selected in the motion vector selection process (step S805) is smaller than the threshold, i.e. when the images of the two areas are similar to each other (Yes in step S806), the motion vector selected in the motion vector selection process (step S805) is determined to be the motion vector to be assigned to the motion-vector-undetected area (step S809).
On the other hand, when the correlation value is not smaller than the threshold (No in step S806), the motion vector selected in the motion vector selection process (step S805) is not used, as shown in Figure 50. Instead, motion estimation is performed using the motion-vector-undetected area in the interpolation frame and the low-correlation areas of the first reference frame and the third reference frame (step S807), and motion estimation is performed using the motion-vector-undetected area in the interpolation frame and the low-correlation areas of the second reference frame and the fourth reference frame (step S808). The motion vector of the motion-vector-undetected area in the interpolation frame is then determined from these results (step S809).
Because only the low-correlation areas of the first reference frame and the third reference frame are used, the estimation range is also restricted, so the motion estimation is performed with less calculation and higher precision.
The motion compensation unit 114 performs motion compensation using the motion vectors obtained by the motion estimation unit 106 and the motion-vector-undetected area processing unit 512. In this embodiment, the target areas are pasted onto the interpolation frame according to these motion vectors.
Additional advantages and modifications will readily occur to those skilled in the art. Therefore, the invention in its broader aspects is not limited to the specific details and representative embodiments shown and described herein. Accordingly, various modifications may be made without departing from the spirit or scope of the general inventive concept as defined by the appended claims and their equivalents.

Claims (40)

1. A method for generating an interpolation frame between a first reference frame and a second reference frame, characterized by comprising:
dividing the interpolation frame into several interpolation areas each including several pixels;
detecting, for each interpolation area, a most correlated combination from among several combinations of a first reference area and a second reference area, the first reference area being in the first reference frame and having the same size and shape as the interpolation area, the second reference area being in the second reference frame and having the same size and shape as the interpolation area, and each interpolation area, the first reference area of each of the several combinations, and the second reference area of that combination being aligned on a straight line in the temporal direction;
obtaining a motion vector from the first reference area and the second reference area included in the detected combination;
determining whether the first reference area and the second reference area are in a high-correlation area or in a low-correlation area;
giving the motion vector to a motion-vector-detected area, the motion-vector-detected area corresponding to the interpolation area for which the first reference area and the second reference area are determined to be a high-correlation area;
determining a motion vector to be given to a motion-vector-undetected area by motion estimation using the motion-vector-undetected area, a first area, a third reference frame, a second area, and a fourth reference frame, the first area being in the first reference frame and being determined to be a low-correlation area, the third reference frame lying, with the interpolation frame as the reference, in the same temporal direction as the first reference frame, the second area being in the second reference frame and being determined to be a low-correlation area, the fourth reference frame lying, with the interpolation frame as the reference, in the same temporal direction as the second reference frame, and the motion-vector-undetected area corresponding to the interpolation area for which the first reference area and the second reference area are determined to be a low-correlation area; and
generating the interpolation frame based on the motion vector given to the motion-vector-detected area and the motion vector determined for the motion-vector-undetected area.
2. the method for claim 1 is characterized in that, further comprises:
Detect most correlated combination the several combinations between first reference zone and the 3rd reference zone, first reference zone is in first reference frame and be defined as in the zone of low relevant range and having with motion vector not detecting zone identical size and shape, and the 3rd reference zone is in the 3rd reference frame and have with motion vector and do not detect regional identical size and shape;
The most correlated combination of acquisition between first reference zone and the 3rd reference zone;
Determine motion vector the detected combination between first reference zone and the 3rd reference zone;
From the several combinations between second reference zone and the 4th reference zone, detect most correlated combination, second reference zone is in second reference frame and be defined as in the zone of low relevant range and having with motion vector not detecting zone identical size and shape, and the 4th reference zone is in the 4th reference frame and have with motion vector and do not detect regional identical size and shape;
The most correlated combination of acquisition between second reference zone and the 4th reference zone; With
Determine motion vector the detected combination between second reference zone and the 4th reference zone,
Wherein, determine to give the motion vector that motion vector does not detect the zone according to the motion vector of determining from the combination between first reference zone and the 3rd reference zone with from the motion vector that the combination between second reference zone and the 4th reference zone is determined.
3. method as claimed in claim 2 is characterized in that, the degree of correlation between the degree of correlation between first reference zone and the 3rd reference zone and second reference zone and the 4th reference zone compared, and
Motion vector corresponding to more relevant zone is confirmed as and will gives the motion vector that motion vector does not detect the zone.
4. method as claimed in claim 2, it is characterized in that, when the degree of correlation between the degree of correlation between first reference zone and the 3rd reference zone and second reference zone and the 4th reference zone during, there is not motion vector to be assumed that motion vector does not detect the motion vector in zone all less than pre-set threshold.
5. method as claimed in claim 4, it is characterized in that, when the motion vector that does not detect the zone for motion vector also was not determined, the mean value or the intermediate value that the motion vector that gives to arrange around motion vector does not detect the zone are detected the motion vector in zone were confirmed as the motion vector that motion vector does not detect the zone.
6. the method for claim 1 is characterized in that, further comprises: by the recursion processing motion vector is not detected the zone and further resolve into high relevant range and low relevant range,
Wherein, handle the zone of resolving into high relevant range by recursion and be assumed that motion vector detects the zone,
Give motion vector with the motion vector that is detected and detect the zone,
The zone of resolving into low relevant range is assumed that motion vector does not detect the zone, and,
Adopt motion vector not detect zone, the 3rd zone, the 3rd reference frame, the 4th zone and the 4th reference frame and determine to give the motion vector that motion vector does not detect the zone by estimation, the 3rd zone is in first reference frame and be confirmed as low relevant range; With interpolation frame as a reference, the 3rd reference frame be in the time go up with the identical direction of first reference frame on, the 4th zone is in second reference frame and be confirmed as low relevant range; With interpolation frame as a reference, the 4th reference frame be in the time go up with the identical direction of second reference frame on.
7. method as claimed in claim 6 is characterized in that,
The relevance degree of each corresponding region is handled by recursion by second reference zone and is calculated in first reference zone, and
By the relevance degree that will be calculated and preset threshold value comparison and motion vector is not detected Region Decomposition is high relevant range and low relevant range.
8. method as claimed in claim 6 is characterized in that, to described low relevant district, estimation, the degree of correlation determine and give that each processing that motion vector detects in the zone all carries out in the recursion mode.
9. the method for claim 1 is characterized in that, calculates the relevance degree of each corresponding region in first reference zone and second reference zone, and
Determine that by the relevance degree and the preset threshold value comparison of will be calculated described zone is high relevant range or low relevant range.
10. the method for claim 1 is characterized in that, calculates the determined relevance degree of number by absolute difference, absolute difference sum and the high related pixel of the absolute difference of monochrome information, colour difference information.
11. the method for claim 1, it is characterized in that, when several motion vectors being given one when regional, will be as a reference with first reference frame, the mean value by the image in other a plurality of reference frames of these several motion vector appointments is assigned to this interpolation frame respectively.
12. the method for claim 1, it is characterized in that, when several motion vectors being given one when regional, with first reference frame as a reference, the motion vector by the relevance degree minimum between each zone in several other reference frames of these several motion vector detection is confirmed as giving the motion vector that motion vector does not detect the zone.
13. a method that is used to be created in the interpolation frame between first reference frame and second reference frame is characterized in that, comprising:
Described first reference frame is decomposed into several each first reference zone that are made of several pixels;
Detection has size identical with first reference zone and shape and second reference zone maximally related with it in second reference frame,
Obtain the second detected reference zone and the motion vector of first reference zone;
Determine that first reference zone and second reference zone are in high relevant range or in low relevant range;
Give motion vector with motion vector and detect the zone, this motion vector detects the zone and is confirmed as high relevant range;
The employing motion vector does not detect the zone and the 3rd reference frame determines that by estimation motion vector does not detect the motion vector in zone, with first reference frame is reference, the 3rd reference frame is in the time to be gone up on the direction opposite with second reference frame, and this motion vector does not detect that the zone is confirmed as hanging down the relevant range and in first reference zone; With
Detect regional motion vector and produce interpolation frame according to giving motion vector for motion vector does not detect the motion vector of determining in the zone.
14. method as claimed in claim 13 is characterized in that, produces described interpolation frame according to the time location of interpolation frame between first reference frame and second reference frame.
15. method as claimed in claim 13 is characterized in that, further comprises: by the recursion processing motion vector is not detected the zone and further resolve into high relevant range and low relevant range,
The zone of wherein resolving into high relevant range is assumed that motion vector detects the zone,
Give motion vector with described motion vector and detect the zone,
The zone of resolving into low relevant range is assumed that motion vector does not detect the zone, and,
The employing motion vector does not detect the zone and the 3rd reference frame determines that by estimation motion vector does not detect the motion vector in zone.
16. method as claimed in claim 15 is characterized in that, calculates the relevance degree of each corresponding region in first reference zone and second reference zone, and
By the relevance degree that will be calculated and preset threshold value comparison and motion vector is not detected Region Decomposition is high relevant range and low relevant range.
17. method as claimed in claim 15 is characterized in that, for low relevant range, estimation, each processing that the degree of correlation is determined and motion vector detects in the regional movement vector are all carried out in the recursion mode.
18. a method that is used to be created in the interpolation frame between first reference frame and second reference frame is characterized in that, comprising:
Interpolation frame is decomposed into several each interpolation zone that are made of several pixels;
Detect most correlated combination for each interpolation zone the combination between first reference zone and second reference zone, first reference zone is in first reference frame and has and identical size and the shape in interpolation zone, second reference zone is in second reference frame and have size and the shape identical with interpolation zone, and second reference zone of first reference zone of each combination of each interpolation frame, described several combinations and each combination of described several combinations is arranged in direct time mode;
First reference zone and second reference zone that detect the combination from being included in obtain motion vector;
Determine that first reference zone and second reference zone are in high relevant range or in low relevant range;
Motion vector is given to detect the zone corresponding to the motion vector that is confirmed as the interpolation zone of high relevant range in first reference zone and second reference zone;
Motion vector is given not detect the zone corresponding to the motion vector in the interpolation zone that is confirmed as low relevant range in first reference zone and second reference zone; With
Detect the motion vector in zone and give the motion vector generation interpolation frame that motion vector does not detect the zone according to giving motion vector.
19. method as claimed in claim 18 is characterized in that, the motion vector that gives to detect adjacent to the motion vector that motion vector does not detect the zone zone is given described motion vector and do not detect the zone.
20. method as claimed in claim 18 is characterized in that, further comprises:
Obtain the relevance degree between first reference zone and the 3rd reference zone, wherein, first reference zone is in first reference frame, do not detect motion vector that the motion vector of arranging around the zone detects the zone and do not detect the zone with motion vector and determine as a reference by giving motion vector, the 3rd reference zone, with interpolation frame as a reference, in the 3rd reference frame, be on the time with the identical direction of first reference frame on and determine by described motion vector; With
Obtain the relevance degree between second reference zone and the 4th reference zone, wherein, second reference zone is in second reference frame, do not detect motion vector that the motion vector of arranging around the zone detects the zone and do not detect the zone with motion vector and determine as a reference by giving motion vector, the 4th reference zone, with interpolation frame as a reference, in the 4th reference frame, the time that is in upward and on the identical direction of second reference frame and by motion vector is determined
Wherein, determine to give the motion vector that motion vector does not detect the zone according to the relevance degree between the relevance degree between first reference zone and the 3rd reference zone and second reference zone and the 4th reference zone.
21. method as claimed in claim 20, it is characterized in that, when having motion vector when not detecting the motion vector of arranging around the zone and detecting several motion vector in zone, determine and to give the motion vector that motion vector does not detect the zone from described several motion vectors according to the relevance degree between the relevance degree between first reference zone and the 3rd reference zone and second reference zone and the 4th reference zone.
22. method as claimed in claim 21, it is characterized in that, will be confirmed as corresponding to the motion vector in maximally related zone and will give the motion vector that motion vector does not detect the zone according to the relevance degree between the relevance degree between first reference zone and the 3rd reference zone and second reference zone and the 4th reference zone.
23. method as claimed in claim 20, it is characterized in that, when the relevance degree between the relevance degree between first reference zone and the 3rd reference zone and second reference zone and the 4th reference zone during, be assumed that without any motion vector motion vector does not detect the motion vector in zone all less than preset threshold value.
24. method as claimed in claim 23 is characterized in that, when giving motion vector and not detecting motion vector that the motion vector of arranging around the zone detects the zone and be not assumed that motion vector does not detect the motion vector in zone,
Motion vector is given motion vector and does not detect the zone, adopt motion vector not detect the zone, the first area, the 3rd reference frame, second area and the 4th reference frame are determined described motion vector by estimation, the first area is in first reference frame and be confirmed as low relevant range, with interpolation frame as a reference, the 3rd reference frame be in the time go up with the identical direction of first reference frame on, second area is in second reference frame and be confirmed as low relevant range, with interpolation frame as a reference, the 4th reference frame be in the time go up with the identical direction of second reference frame on.
25. method as claimed in claim 18 is characterized in that,
When not detect first reference zone in first reference frame that motion vector that the motion vector of arranging around the zone detects the zone determines be when being confirmed as low relevant range regional by giving motion vector, obtain the relevance degree between first reference zone and the 3rd reference zone, and
In degree of correlation determining step when not detect second reference zone in second reference frame that motion vector that the motion vector of arranging around the zone detects the zone determines be when being confirmed as low relevant range regional, to obtain the relevance degree between second reference zone and the 4th reference zone by giving motion vector.
26. method as claimed in claim 18 is characterized in that, comprises that further by the recursion processing motion vector not being detected the zone further resolves into high relevant range and low relevant range,
The zone that wherein is confirmed as high relevant range is assumed that motion vector detects the zone,
Give motion vector with motion vector and detect the zone, and
The motion vector that employing is confirmed as low relevant range does not detect zone and the 3rd reference frame and is given motion vector by the definite motion vector of estimation and does not detect the zone.
27. method as claimed in claim 26 is characterized in that, carries out the relatedness computation for each corresponding region in first reference zone and second reference zone, and
By result of calculation and predetermined threshold are determined that relatively high relevant range still is low relevant range.
28. method as claimed in claim 26 is characterized in that, in low relevant range, estimation, each processing that the degree of correlation is determined and motion vector detects in the regional movement vector assignment are all carried out in the recursion mode.
29. a method that is used to be created in the interpolation frame between first reference frame and second reference frame is characterized in that, comprising:
First reference frame is decomposed into several each first reference zone that are made of several pixels;
Detect have in second reference frame size identical and shape with first reference zone and with maximally related second reference zone of first reference zone;
Obtain the second detected reference zone and the motion vector of first reference zone;
Determine that first reference zone and second reference zone are in high relevant range or in low relevant range;
Give motion vector with motion vector and detect the zone, motion vector detects the zone and is confirmed as high relevant range in first reference zone;
Do not detect motion vector that the motion vector of arranging around the zone detects the zone and give motion vector and do not detect the zone giving motion vector, described motion vector does not detect the zone and is confirmed as low relevant range in first reference zone; With
Detect the motion vector in zone and give the motion vector generation interpolation frame that motion vector does not detect the zone according to giving motion vector.
30. method as claimed in claim 29 is characterized in that, the motion vector that gives to detect adjacent to the motion vector that motion vector does not detect the zone zone is given described motion vector and do not detect the zone.
31. method as claimed in claim 29 is characterized in that, further comprises:
Not detecting the zone with motion vector is reference, obtain by give motion vector do not detect second reference zone in second reference frame that motion vector that the motion vector of arranging around the zone detects the zone determines and described motion vector do not detect between the zone relevance degree and
Not detecting the zone with motion vector is reference, obtain by giving motion vector and do not detect the 3rd reference zone in the 3rd reference frame that motion vector that the motion vector of arranging around the zone detects the zone determines and described motion vector and do not detect relevance degree between the zone
Wherein, not detecting the relevance degree that relevance degree between the zone and the 3rd reference zone and motion vector do not detect between the zone according to second reference zone and motion vector determines to give the motion vector that motion vector does not detect the zone.
32. method as claimed in claim 31, it is characterized in that, when having motion vector when not detecting the motion vector of arranging around the zone and detecting several motion vector in zone, do not detect the relevance degree that relevance degree between the zone and the 3rd reference zone and motion vector do not detect between the zone according to second reference zone and motion vector and determine and to give the motion vector that motion vector does not detect the zone from described several motion vectors.
33. method as claimed in claim 31, it is characterized in that, do not detect the relevance degree that relevance degree between the zone and the 3rd reference zone and motion vector do not detect between the zone according to second reference zone and motion vector and will be defined as corresponding to the motion vector in maximally related zone giving the motion vector that motion vector does not detect the zone.
34. The method according to claim 31, wherein, when both the correlation value between the second reference area and the motion vector undetected area and the correlation value between the third reference area and the motion vector undetected area are less than a predetermined threshold value, none of the motion vectors corresponding to those areas is adopted as the motion vector of the motion vector undetected area.
35. The method according to claim 34, wherein, when none of the motion vectors given to the motion vector detected areas arranged around the motion vector undetected area is adopted as the motion vector of the motion vector undetected area, a motion vector determined by motion estimation using the motion vector undetected area, a first area, a third reference frame, a second area, and a fourth reference frame is given to the motion vector undetected area, the first area being in the first reference frame and determined to be the low-correlated area, the third reference frame being temporally in the same direction as the first reference frame with respect to the interpolation frame, the second area being in the second reference frame and determined to be the low-correlated area, and the fourth reference frame being temporally in the same direction as the second reference frame with respect to the interpolation frame.
36. The method according to claim 29, wherein the interpolation frame is generated according to the temporal position of the interpolation frame between the first reference frame and the second reference frame.
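Claim 36 ties the compensation to where the interpolation frame sits between the reference frames. A minimal sketch (not part of the claims), assuming a normalised temporal position t with 0 < t < 1 and illustrative names:

```python
def split_vector(mv, t=0.5):
    """Split a frame1-to-frame2 motion vector according to the temporal position t of
    the interpolation frame: fetch from frame1 displaced by t*mv and from frame2
    displaced backwards by (1 - t)*mv, then blend the two fetched areas."""
    dy, dx = mv
    from_frame1 = (round(dy * t), round(dx * t))
    from_frame2 = (round(-dy * (1 - t)), round(-dx * (1 - t)))
    return from_frame1, from_frame2
```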
37. An apparatus for generating an interpolation frame between a first reference frame and a second reference frame, comprising:
an interpolation dividing unit configured to divide the interpolation frame into a plurality of interpolation areas each including a plurality of pixels;
a combination detecting unit configured to detect, for each interpolation area, a most correlated combination from a plurality of combinations of first reference areas and second reference areas, the first reference areas being in the first reference frame and having the same size and shape as the interpolation area, the second reference areas being in the second reference frame and having the same size and shape as the interpolation area, and each interpolation area, the first reference area of each of the combinations, and the second reference area of each of the combinations being arranged on a straight line in the temporal direction;
a motion estimation unit configured to obtain a motion vector from the first reference area and the second reference area included in the detected combination;
a correlation determining unit configured to determine whether the first reference area and the second reference area are a high-correlated area or a low-correlated area;
a giving unit configured to give the motion vector to a motion vector detected area, the motion vector detected area corresponding to the interpolation area for which the first reference area and the second reference area are determined to be the high-correlated area;
a motion vector determining unit configured to determine, by motion estimation using a motion vector undetected area, a first area, a third reference frame, a second area, and a fourth reference frame, the motion vector to be given to the motion vector undetected area, the first area being in the first reference frame and determined to be the low-correlated area, the third reference frame being temporally in the same direction as the first reference frame with respect to the interpolation frame, the second area being in the second reference frame and determined to be the low-correlated area, the fourth reference frame being temporally in the same direction as the second reference frame with respect to the interpolation frame, and the motion vector undetected area corresponding to the interpolation area for which the first reference area and the second reference area are determined to be the low-correlated area; and
a motion compensation unit configured to generate the interpolation frame based on the motion vector given to the motion vector detected area and the motion vector determined for the motion vector undetected area.
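The combination detecting unit of claim 37 differs from the one-sided search of claim 29 in that it matches the two reference frames against each other along lines passing through the interpolation area. The sketch below (not part of the claims) assumes the interpolation frame lies midway between the reference frames, so the two offsets are symmetric, and uses SAD as the correlation measure; names and parameters are illustrative.

```python
import numpy as np

def detect_combination(frame1: np.ndarray, frame2: np.ndarray,
                       top: int, left: int, block: int = 8, search: int = 8):
    """For the interpolation area whose top-left corner is (top, left), test pairs of
    first/second reference areas lying on a straight temporal line through that area
    and return the offset of the most correlated pair and its SAD (low SAD = high
    correlation; compare against a threshold to label the area high- or low-correlated)."""
    h, w = frame1.shape
    best_sad, best_v = np.inf, (0, 0)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            y1, x1 = top - dy, left - dx      # first reference area
            y2, x2 = top + dy, left + dx      # second reference area
            if (0 <= y1 <= h - block and 0 <= x1 <= w - block and
                    0 <= y2 <= h - block and 0 <= x2 <= w - block):
                a1 = frame1[y1:y1+block, x1:x1+block].astype(int)
                a2 = frame2[y2:y2+block, x2:x2+block].astype(int)
                sad = np.abs(a1 - a2).mean()
                if sad < best_sad:
                    best_sad, best_v = sad, (dy, dx)
    return best_v, best_sad
```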
38. An apparatus for generating an interpolation frame between a first reference frame and a second reference frame, comprising:
an area generating unit configured to divide the first reference frame into a plurality of first reference areas each composed of a plurality of pixels;
a second reference detecting unit configured to detect, in the second reference frame, a second reference area that has the same size and shape as the first reference area and is most correlated with the first reference area;
a motion estimation unit configured to obtain a motion vector from the detected second reference area and the first reference area;
a correlation determining unit configured to determine whether the first reference area and the second reference area are a high-correlated area or a low-correlated area;
a motion vector giving unit configured to give the motion vector to a motion vector detected area, the motion vector detected area being determined to be the high-correlated area;
a motion vector determining unit configured to determine the motion vector of a motion vector undetected area by motion estimation using the motion vector undetected area and a third reference frame, the third reference frame being temporally in the direction opposite to the second reference frame with respect to the first reference frame, and the motion vector undetected area being in the first reference area and determined to be the low-correlated area; and
a motion compensation unit configured to generate the interpolation frame based on the motion vector given to the motion vector detected area and the motion vector determined for the motion vector undetected area.
39. An apparatus for generating an interpolation frame between a first reference frame and a second reference frame, comprising:
an area generating unit configured to divide the interpolation frame into a plurality of interpolation areas each composed of a plurality of pixels;
a combination detecting unit configured to detect, for each interpolation area, a most correlated combination from a plurality of combinations of first reference areas and second reference areas, the first reference areas being in the first reference frame and having the same size and shape as the interpolation area, the second reference areas being in the second reference frame and having the same size and shape as the interpolation area, and each interpolation area, the first reference area of each of the combinations, and the second reference area of each of the combinations being arranged on a straight line in the temporal direction;
a motion estimation unit configured to obtain a motion vector from the first reference area and the second reference area included in the detected combination;
a correlation determining unit configured to determine whether the first reference area and the second reference area are a high-correlated area or a low-correlated area;
a first motion vector giving unit configured to give the motion vector to a motion vector detected area corresponding to the interpolation area for which the first reference area and the second reference area are determined to be the high-correlated area;
a second motion vector giving unit configured to give a motion vector to a motion vector undetected area corresponding to the interpolation area for which the first reference area and the second reference area are determined to be the low-correlated area; and
a motion compensation unit configured to generate the interpolation frame based on the motion vector given to the motion vector detected area and the motion vector given to the motion vector undetected area.
40. An apparatus for generating an interpolation frame between a first reference frame and a second reference frame, comprising:
an area generating unit configured to divide the first reference frame into a plurality of first reference areas each composed of a plurality of pixels;
a second reference area detecting unit configured to detect, in the second reference frame, a second reference area that has the same size and shape as the first reference area and is most correlated with the first reference area;
a motion estimation unit configured to obtain a motion vector from the detected second reference area and the first reference area;
a correlation determining unit configured to determine whether the first reference area and the second reference area are a high-correlated area or a low-correlated area;
a first motion vector giving unit configured to give the motion vector to a motion vector detected area, the motion vector detected area being determined to be the high-correlated area in the first reference area;
a second motion vector giving unit configured to give, to a motion vector undetected area, a motion vector that has been given to a motion vector detected area arranged around the motion vector undetected area, the motion vector undetected area being determined to be the low-correlated area in the first reference area; and
a motion compensation unit configured to generate the interpolation frame based on the motion vector given to the motion vector detected area and the motion vector given to the motion vector undetected area.
CN 200610071860 2005-03-31 2006-03-30 Method and apparatus for generating interpolation frame Pending CN1842165A (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JP2005100856 2005-03-31
JP2005100856 2005-03-31
JP2005271077 2005-09-16

Publications (1)

Publication Number Publication Date
CN1842165A 2006-10-04

Family

ID=37030985

Family Applications (1)

Application Number Title Priority Date Filing Date
CN 200610071860 Pending CN1842165A (en) 2005-03-31 2006-03-30 Method and apparatus for generating interpolation frame

Country Status (1)

Country Link
CN (1) CN1842165A (en)

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2009138037A1 (en) * 2008-05-13 2009-11-19 华为技术有限公司 Video service system, video service apparatus and extracting method of key frame thereof
CN101626508A (en) * 2008-07-11 2010-01-13 宝利微电子系统控股公司 Method for judging blockage area in frame rate promotion
CN101621693B (en) * 2009-07-31 2011-01-05 重庆大学 Frame frequency lifting method for combining target partition and irregular block compensation
CN101547368B (en) * 2009-04-24 2011-03-09 炬力集成电路设计有限公司 Device, method and decoder for processing reference frame in image
CN102123283A (en) * 2011-03-11 2011-07-13 杭州海康威视软件有限公司 Interpolated frame acquisition method and device in video frame rate conversion
CN101567963B (en) * 2008-04-22 2011-07-20 承景科技股份有限公司 Interpolated frame generating method

Similar Documents

Publication Publication Date Title
CN1110198C (en) Video format conversion apparatus and method
CN1070670C (en) efficient motion vector detection
CN1184796C (en) Image processing method and equipment, image processing system and storage medium
CN101079254A (en) High resolution enabling apparatus and method
CN1816154A (en) Method and apparatus for motion estimation
CN1168052A (en) Image data interpolating apparatus
CN1842165A (en) Method and apparatus for generating interpolation frame
CN1195374C (en) Imaging signal processing method and device
CN1897694A (en) Moving-object tracking control apparatus, moving-object tracking system, moving-object tracking control method, and program
CN1123497A (en) Apparatus for converting frame format of television signal
CN1671176A (en) Image processing apparatus for correcting distortion of image and image shooting apparatus for correcting distortion of shot image
CN1917578A (en) Data processing apparatus,data processing method and program
CN1913585A (en) Method and system for estimating motion and compensating for perceived motion blur in digital video
CN1945580A (en) Image processing apparatus
CN1627814A (en) Image processing method and apparatus and program
CN1595958A (en) Image quality correction apparatus and image quality correction method
CN1531711A (en) Method and system for calculating transformed image from digital image
CN1669052A (en) Image matching system using 3-dimensional object model, image matching method, and image matching program
CN101032159A (en) Image processing device, method, and image processing program
CN1460452A (en) 3D back projection method and X-ray computer laminated imaging device
CN1684492A (en) Image dictionary creating apparatus, coding apparatus, image dictionary creating method
CN1942899A (en) Face image creation device and method
CN1960431A (en) Image processing device, image processing method, program for the same, and memory medium for storing the program
CN1551017A (en) Image searching device, method, programmme and storage medium storing said programme
CN1311692C (en) Apparatus and method for checking dynamic vector

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C02 Deemed withdrawal of patent application after publication (patent law 2001)
WD01 Invention patent application deemed withdrawn after publication

Open date: 20061004