US20100079665A1 - Frame Interpolation Device - Google Patents

Frame Interpolation Device

Info

Publication number
US20100079665A1
US20100079665A1 (application US 12/475,265)
Authority
US
United States
Prior art keywords
input frame
interpolation
frame image
spatial frequency
current input
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US12/475,265
Inventor
Himio Yamauchi
Hiroyuki Michie
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Toshiba Corp
Original Assignee
Toshiba Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Toshiba Corp filed Critical Toshiba Corp
Assigned to KABUSHIKI KAISHA TOSHIBA reassignment KABUSHIKI KAISHA TOSHIBA ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: HICHIE, HIROYUKI, YAMAUCHI, HIMIO
Assigned to KABUSHIKI KAISHA TOSHIBA reassignment KABUSHIKI KAISHA TOSHIBA ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: MICHIE, HIROYUKI, YAMAUCHI, HIMIO
Publication of US20100079665A1

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/50 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding
    • H04N19/587 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding involving temporal sub-sampling or interpolation, e.g. decimation or subsequent interpolation of pictures in a video sequence
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/10 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
    • H04N19/102 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the element, parameter or selection affected or controlled by the adaptive coding
    • H04N19/117 Filters, e.g. for pre-processing or post-processing
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/10 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
    • H04N19/102 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the element, parameter or selection affected or controlled by the adaptive coding
    • H04N19/132 Sampling, masking or truncation of coding units, e.g. adaptive resampling, frame skipping, frame interpolation or high-frequency transform coefficient masking
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/10 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
    • H04N19/134 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the element, parameter or criterion affecting or controlling the adaptive coding
    • H04N19/136 Incoming video signal characteristics or properties
    • H04N19/14 Coding unit complexity, e.g. amount of activity or edge presence estimation
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/10 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
    • H04N19/169 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding
    • H04N19/17 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding the unit being an image region, e.g. an object
    • H04N19/172 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding the unit being an image region, e.g. an object the region being a picture, frame or field
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/50 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding
    • H04N19/503 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding involving temporal prediction
    • H04N19/51 Motion estimation or motion compensation
    • H04N19/57 Motion estimation characterised by a search window with variable size or shape

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Television Systems (AREA)
  • Control Of Indicators Other Than Cathode Ray Tubes (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

A frame interpolation device includes: a motion vector detecting module configured to divide each of input frame images into a plurality of blocks and detect motion vectors by performing block matching; a distribution detecting module configured to detect a distribution of spatial frequency components of a current input frame image; a feature analyzing module configured to analyze a feature of the current input frame image based on the distribution; a control module configured to control the motion vector detecting module according to the detected feature; an interpolation frame generating module configured to generate an interpolation frame image using the detected motion vectors according to the detected feature; and an output module configured to insert the interpolation frame image between the input frame images and output resulting output frame images.

Description

    CROSS REFERENCE TO RELATED APPLICATION(S)
  • The present disclosure relates to the subject matter contained in Japanese Patent Application No. 2008-248963 filed on Sep. 26, 2008, which is incorporated herein by reference in its entirety.
  • FIELD
  • The present invention relates to a frame interpolation device and method for performing frame interpolation.
  • BACKGROUND
  • In conventional frame interpolation devices, motion vectors obtained by block matching are compared with an average motion vector obtained over the entire image. If patterns of regions indicated by a motion vector are similar, it is determined that a repetitive pattern exists in those regions and the motion vector is changed. An example of such devices is disclosed in JP-A-2007-235403.
  • However, erroneous detection may occur with the technique described in JP-A-2007-235403 because a repetitive pattern is detected only from motion vectors.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • A general configuration that implements the various features of the invention will be described with reference to the drawings. The drawings and the associated descriptions are provided to illustrate embodiments of the invention and not to limit the scope of the invention.
  • FIG. 1 shows a configuration of a frame interpolation device according to the present invention.
  • FIG. 2 shows the configuration of a frame interpolation device according to the invention in which the frame interpolation processing method is changed from one divisional region to another.
  • FIG. 3 shows a more detailed configuration of an analyzing module.
  • FIG. 4 shows the configuration of a frame interpolation device for realizing a first example interpolation frame generation method.
  • FIG. 5 shows search ranges that are used in determining a motion vector.
  • FIG. 6 shows the configuration of a frame interpolation device capable of performing a third example method for preventing erroneous detection of a motion vector due to a repetitive pattern.
  • DETAILED DESCRIPTION OF THE EMBODIMENTS
  • Embodiments of the present invention will be hereinafter described with reference to the drawings.
  • A known method reduces the unsmoothness that occurs when a video image is displayed on a hold-type display device, or smooths the display of a video image, such as a movie, that has a small number of frames per second. In this method, an interpolation frame is generated from plural successive frames and inserted between the original frames. This method can improve the image quality of the video image.
  • However, where there is a large image motion between plural successive frames, or where it is difficult to determine an interframe motion due to a repetitive pattern or the like, a frame interpolation device may detect an erroneous motion from the original images. An interpolation frame generated based on such an erroneous motion differs from the image that should be generated. Inserting such a low-accuracy interpolation frame between the original frame images lowers the image quality, contrary to the intention.
  • In the device described below, a high-accuracy interpolation frame is generated by preventing the frame interpolation device from generating a low-accuracy interpolation frame in the above manner.
  • In order to generate the high-accuracy interpolation frame, the device described below implements a frame interpolation method having the following procedure.
  • First, features of patterns of an original frame image are detected based on the spatial frequency characteristic, that is, the degree of complexity of the patterns, of a frame image signal to be used for generating an interpolation frame. An interpolation frame generation method suitable for the image signal is selected according to the detected features of the patterns.
  • In the following, the configuration and operation of the device will be described in detail. In the embodiment, a description will be made of how the interpolation frame generation method is changed, depending on the features of an original frame image of a video image, to generate a high-accuracy interpolation frame. First, the configuration of the frame interpolation device according to the embodiment will be described.
  • FIG. 1 shows the configuration of a frame interpolation device according to the embodiment.
  • The frame interpolation device according to the embodiment includes an analyzing module 101 for analyzing features of patterns of an input image, a frame memory 103 for storing the input image, a motion vector detecting module 104 for detecting a motion between plural successive frames, an interpolation image generating module 105 for generating an interpolation frame based on the motion detected by the motion vector detecting module 104, a control module 102 for controlling the motion vector detecting module 104 and the interpolation image generating module 105, and a display unit 119 for displaying an image signal 118 that has been subjected to frame interpolation processing.
  • Next, the operation of the above-configured frame interpolation device according to the embodiment will be described.
  • First, an image signal 111 of each frame of a video image is input to the analyzing module 101. The analyzing module 101 extracts features of the image based on a spatial frequency characteristic, that is, the degree of complexity of patterns, of the image signal of the frame, and sends a control signal 112 indicating how to generate an interpolation frame to the control module 102.
  • On the other hand, while the same frame image signal 111 is recorded in the frame memory 103, the motion vector detecting module 104 performs a search for determining an image variation from an immediately preceding frame image signal 113 and sends a motion vector 115 between the frame images to the interpolation image generating module 105. At this time, block matching is performed in which each image used is divided into plural small blocks and a search is performed to determine a motion vector indicating an interframe motion of each block. The interpolation image generating module 105 generates an interpolation frame image signal 117 based on the motion vectors 115, the image signal 111 of the current frame, and the image signal 113 of the preceding frame according to an interpolation frame generation method control signal 116 which is sent from the control module 102. The interpolation frame is inserted between the two original frames and a resulting image signal 118 that has been subjected to the frame interpolation processing is output.
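  • As a concrete illustration of the block matching and frame insertion described above, the following Python sketch finds one motion vector per block with a sum-of-absolute-differences search and builds a frame halfway in time. It is a minimal sketch, not the patent's implementation: the 16-pixel block size, the +/-8-pixel search range, the SAD cost, and the simple half-way blending are assumptions chosen for illustration.

```python
import numpy as np

def block_matching(prev, curr, block=16, search=8):
    # Full-search block matching: for each block of `curr`, find the displacement
    # into `prev` with the smallest sum of absolute differences (SAD).
    h, w = curr.shape
    vectors = {}
    for by in range(0, h - block + 1, block):
        for bx in range(0, w - block + 1, block):
            target = curr[by:by + block, bx:bx + block].astype(np.int32)
            best_cost, best_v = None, (0, 0)
            for dy in range(-search, search + 1):
                for dx in range(-search, search + 1):
                    y, x = by + dy, bx + dx
                    if 0 <= y <= h - block and 0 <= x <= w - block:
                        cand = prev[y:y + block, x:x + block].astype(np.int32)
                        cost = int(np.abs(target - cand).sum())
                        if best_cost is None or cost < best_cost:
                            best_cost, best_v = cost, (dy, dx)
            vectors[(by, bx)] = best_v
    return vectors

def insert_midframe(prev, curr, vectors, block=16):
    # Build an interpolation frame halfway in time: each block is placed halfway
    # along its motion vector and averaged between the two original frames.
    h, w = curr.shape
    interp = (prev.astype(np.int32) + curr.astype(np.int32)) // 2
    for (by, bx), (dy, dx) in vectors.items():
        my, mx = by + dy // 2, bx + dx // 2          # block position at mid-time
        if 0 <= my <= h - block and 0 <= mx <= w - block:
            blend = (prev[by + dy:by + dy + block, bx + dx:bx + dx + block].astype(np.int32)
                     + curr[by:by + block, bx:bx + block].astype(np.int32)) // 2
            interp[my:my + block, mx:mx + block] = blend
    return interp.astype(curr.dtype)
```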
  • In the frame interpolation device of FIG. 1, each input frame image is divided into plural regions but the same frame interpolation processing method is employed in all the regions. However, another operation is possible in which each input frame image is divided into plural regions and the frame interpolation processing method is changed from one divisional region to another.
  • FIG. 2 shows the configuration of a frame interpolation device according to the embodiment in which the frame interpolation processing method is changed from one divisional region to another. The same modules as shown in FIG. 1 will not be described in detail.
  • Referring to FIG. 2, frame interpolation processing control signals 112 for respective regions of an input frame image are sent to a method storing module 201. In generating an interpolation frame, the control module 102 refers to control signals 211 stored in the method storing module 201.
  • A more detailed configuration and operation of the analyzing module 101 will be described with reference to FIG. 3.
  • FIG. 3 shows a more detailed configuration of the analyzing module 101.
  • The analyzing module 101 includes plural filters 301-303 for acquiring plural spatial frequency components, histogram acquiring modules 304-306 for generating histograms from the outputs of the filters 301-303, respectively, and a determining module 307 for analyzing features of the image signal based on the histograms obtained from the respective histogram acquiring modules 304-306.
  • Next, the operation of the above-configured analyzing module 101 will be described.
  • First, an input image signal 111 is input to the plural spatial frequency selecting modules (filters) 301-303 having different characteristics. Output signals 312-314 of the filters 301-303 are input to the histogram acquiring modules 304-306, respectively, which generate respective histograms 315-317. The histograms 315-317 reflect components in different frequency ranges of the input image signal 111, respectively.
  • Therefore, a distribution of spatial frequency components of the input image signal 111, that is, spatial variations of patterns of the image, can be found. The determining module 307 which obtains a distribution of spatial frequency components outputs a control signal 112 which indicates a frame interpolation processing method that is most suitable for the input image signal 111.
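  • As a rough sketch of the processing in FIG. 3, the code below separates low, mid, and high spatial frequency content with moving-average filters, takes histograms of the filter outputs, and maps the band-energy distribution to a label that plays the role of the control signal 112. The filter sizes, thresholds, and label names are assumptions for illustration, not values from the patent.

```python
import numpy as np

def lowpass(img, k):
    # Separable moving-average filter; a larger k keeps only lower spatial frequencies.
    kernel = np.ones(k) / k
    rows = np.apply_along_axis(np.convolve, 1, img.astype(np.float64), kernel, mode="same")
    return np.apply_along_axis(np.convolve, 0, rows, kernel, mode="same")

def analyze_region(region):
    f = region.astype(np.float64)
    low = lowpass(f, 9)
    mid = lowpass(f, 3) - low               # roughly a mid-frequency band
    high = f - lowpass(f, 3)                # remaining fine detail
    # Histograms of the filter outputs (the role of signals 315-317 in FIG. 3).
    hists = [np.histogram(np.abs(band), bins=16)[0] for band in (mid, high)]
    # Crude stand-in for the determining module 307: compare band amplitudes.
    e_low = np.abs(low - low.mean()).mean()
    e_mid, e_high = np.abs(mid).mean(), np.abs(high).mean()
    if e_mid + e_high < 0.1 * max(e_low, 1e-6):
        label = "low_frequency_only"        # only gentle pattern variations
    elif e_mid > 4 * e_high and e_mid > 0.5 * e_low:
        label = "repetitive_pattern"        # energy concentrated in one band
    else:
        label = "normal"
    return label, hists
```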
  • A description will be made below of how to detect features of patterns of an input image and generate an interpolation frame according to a detection result in the frame interpolation device according to the embodiment.
  • A first example of generating an interpolation frame will be described below.
  • Where spatial frequency components of an input image are concentrated only in a low-frequency range, patterns of the image have gentle variations. In this case, when high spatial frequency components are eliminated by reducing the image, the input image suffers only small reduction in image quality and fine variations of patterns are rarely lost. Therefore, if the above feature is detected in a certain region of the input image, the interpolation frame generation method is changed in that region so that a search for determining a motion between successive frames is performed by using a reduced image. An interpolation frame is generated based on resulting motion vectors and the original (i.e., non-reduced) frame image signal.
  • FIG. 4 shows the configuration of a frame interpolation device for realizing the first example interpolation frame generation method.
  • Referring to FIG. 4, an input frame image signal 111 is input to the analyzing module 101 and an image reducing module 401. The analyzing module 101 detects features of the input image 111 and sends control signals 112 indicating frame interpolation methods. The control signals 112 are stored in the frame interpolation control method storing module 201. The control signals 112 are sent from the storing module 201 to the control module 102. On the other hand, the image reducing module 401 reduces the input image 111. The original input image 111 and a reduced image 411 which is sent from the image reducing module 401 are stored in the frame memory 103.
  • In regions of the input frame image 111 where spatial frequency components are concentrated only in a low-frequency range, a search for determining a motion is performed based on the reduced input image 411 and a 1-frame-preceding reduced input image 412.
  • In the other regions, where spatial frequency components are not concentrated only in a low-frequency range, a search for determining a motion is performed based on the non-reduced input image 111 and a non-reduced image signal 113 of the preceding frame.
  • An interpolation image 117 is generated based on thus-obtained motion vectors 115, the frame image signal 111, and the non-reduced image signal 113 of the preceding frame. The interpolation image is inserted between the original frame images and an image signal 118 which has been subjected to the frame interpolation processing is output. This frame interpolation method makes it possible to find a motion in a wider region while preventing reduction in the accuracy of a generated interpolation frame.
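  • A minimal sketch of this first example, assuming a fixed 2x reduction and reusing block_matching() from the earlier sketch: motion is searched on half-size images, so the same search window covers twice the displacement, and the resulting vectors are scaled back to the full-resolution grid before the interpolation frame is built from the non-reduced frames.

```python
import numpy as np

def reduce_half(img):
    # 2x2 averaging; this discards high spatial frequencies, which the analysis
    # indicates are absent in these regions anyway.
    h, w = img.shape[0] // 2 * 2, img.shape[1] // 2 * 2
    f = img[:h, :w].astype(np.float64)
    return (f[0::2, 0::2] + f[1::2, 0::2] + f[0::2, 1::2] + f[1::2, 1::2]) / 4.0

def motion_vectors_for_region(prev, curr, low_freq_only, block=16, search=8):
    if low_freq_only:
        small = block_matching(reduce_half(prev), reduce_half(curr),
                               block=block // 2, search=search)
        # Map block positions and displacements back to the full-resolution grid;
        # the interpolation frame itself is built from the non-reduced images.
        return {(2 * by, 2 * bx): (2 * dy, 2 * dx)
                for (by, bx), (dy, dx) in small.items()}
    return block_matching(prev, curr, block=block, search=search)
```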
  • A second example of generating an interpolation frame will be described below.
  • Where spatial frequency components of an input image are concentrated only in a low-frequency range, patterns of the image have gentle variations. In this case, the input image has no fine motions that need to be detected. Therefore, if the above feature is detected in an input image, the group of candidate vectors from which the motion vector detecting module 104 shown in FIG. 1 can select the motion vector in a search for determining a motion between successive frames is changed. An interpolation frame is generated based on the resulting motion vectors and the original image signal. FIG. 5 shows an example in which the group of motion vectors that can be found is changed.
  • The search range shown in section (b) of FIG. 5, which is used in determining a motion vector, makes it possible to detect a larger motion vector than the search range shown in section (a) of FIG. 5 while the total number of candidate motion vectors is kept the same. Conversely, the search range shown in section (a) of FIG. 5 makes it possible to detect a finer motion than the search range shown in section (b) of FIG. 5 because the interval between adjacent candidate motion vectors is smaller in the former search range than in the latter one.
  • Therefore, in the second example interpolation frame generation method, a wider search range as shown in section (b) of FIG. 5 is used in regions where spatial frequency components of an input image are concentrated only in a low-frequency range and a smaller search range as shown in section (a) of FIG. 5 is used in other regions. This frame interpolation method makes it possible to find a motion in a wider region while preventing reduction in the accuracy of a generated interpolation frame.
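  • The two search ranges of FIG. 5 can be pictured as two candidate-vector grids with the same number of candidates but different spacing, as in the illustrative snippet below; the +/-4 extent and the 2-pixel step are arbitrary example values, not figures taken from the patent.

```python
def candidate_vectors(wide):
    # Same number of candidates either way: the wide grid spaces them 2 pixels
    # apart and reaches twice as far, the fine grid keeps 1-pixel precision.
    step = 2 if wide else 1
    extent = 4
    return [(dy * step, dx * step)
            for dy in range(-extent, extent + 1)
            for dx in range(-extent, extent + 1)]

fine_grid = candidate_vectors(wide=False)   # 81 candidates, reach +/-4, 1-pixel steps
wide_grid = candidate_vectors(wide=True)    # 81 candidates, reach +/-8, 2-pixel steps
```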
  • A third example of generating an interpolation frame will be described below. Where spatial frequency components of an input image are concentrated only in a low-frequency range, patterns of the image have gentle variations. In this case, similar patterns occupy a wide region of the input image. Therefore, due to noise or the like in the image, block matching is prone to select a motion vector that is different from the one that should be selected. In view of this, if the above feature is detected in an input image, the motion vector detecting module 104 shown in FIG. 1 increases the block size of the block matching in a search for determining a motion between successive frames, so that patterns are compared over a wider region. With this measure, a pattern variation is more likely to appear in each block of the block matching, and the motion vector that should be selected is more apt to be selected. An interpolation frame is generated based on the resulting motion vectors and the original frame image signal. This frame interpolation method makes it possible to generate a more accurate interpolation frame.
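  • The first three examples each change one aspect of the search when the low-frequency-only feature is detected. Purely for illustration, the hypothetical dispatcher below gathers those parameters in one place; combining them, as well as the particular values and names, are assumptions, not something the patent prescribes.

```python
def search_parameters(label):
    # `label` is the output of the analysis sketch shown earlier.
    if label == "low_frequency_only":
        return {"use_reduced_image": True,   # first example: search on reduced frames
                "wide_grid": True,           # second example: coarser, wider candidate grid
                "block": 32}                 # third example: larger matching block
    return {"use_reduced_image": False, "wide_grid": False, "block": 16}
```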
  • A fourth example of generating an interpolation frame will be described below. Where spatial frequency components of an input image are concentrated in a certain frequency range, the input image has a repetitive pattern. In view of this, frame interpolation is performed by changing the frame interpolation method so as to prevent erroneous detection of a motion between frames due to a repetitive pattern.
  • In connection with the above fourth example interpolation frame generation method, example methods for preventing erroneous detection of a motion vector due to a repetitive pattern will be described below sequentially.
  • The reason why erroneous detection of a motion vector tends to occur in an image having a repetitive pattern is that several kinds of motion can be found in a region having the repetitive pattern and hence it is difficult to determine a proper motion. In particular, an erroneous motion vector tends to be found when the position of a repetitive pattern slightly moves between two successive frames or a repetitive pattern varies slightly due to noise or the like.
  • In a first example method for preventing erroneous detection of a motion vector due to a repetitive pattern, the motion vector search range in the motion vector detecting module 104 shown in FIG. 1 is changed so as to be narrower than the region of the repetitive pattern. This is effective because refraining from referring to the next repetition cycle of a repetitive pattern in block matching reduces the number of candidate motion vectors in a motion vector search and hence lowers the probability of selection of an erroneous motion vector. This method makes it possible to prevent reduction in the accuracy of an interpolation frame due to a repetitive pattern.
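  • One way to picture this counter-measure: estimate the repetition period of the pattern and keep the motion search range below one period. The autocorrelation-based estimate in the sketch below is an illustrative stand-in (the patent derives the repetitive-pattern information from the spatial frequency analysis), and the 0.5 correlation threshold and other values are assumptions.

```python
import numpy as np

def repetition_period(region, max_lag=32):
    # Estimate the horizontal repetition period from the autocorrelation of the
    # row-averaged intensity profile; return None if no clear repetition is found.
    profile = region.astype(np.float64).mean(axis=0)
    profile -= profile.mean()
    ac = np.correlate(profile, profile, mode="full")[len(profile) - 1:]
    if ac[0] <= 0:
        return None
    ac = ac / ac[0]
    for lag in range(2, min(max_lag, len(ac) - 1)):
        if ac[lag] > 0.5 and ac[lag] >= ac[lag - 1] and ac[lag] >= ac[lag + 1]:
            return lag                      # first strong local maximum after lag 0
    return None

def clamped_search_range(region, default_search=16):
    # Keep the search range smaller than one repetition cycle of the pattern.
    period = repetition_period(region)
    if period is None:
        return default_search
    return min(default_search, max(1, period // 2))
```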
  • In a second example method for preventing erroneous detection of a motion vector due to a repetitive pattern, the range of motion vectors that can be used in the interpolation image generating module 105 shown in FIG. 1 is made narrower than usual.
  • To this end, the interpolation image generating module 105 is configured to be able to change the range of motion vectors that can be used, relative to the range of motion vectors detected by the motion vector detecting module 104. A technique disclosed in JP-A-2008-067205 may be employed as a technique for narrowing the range of motion vectors that can be used in the interpolation image generating module 105. The interpolation image generating module 105 narrows the range of motion vectors to be used relative to the range of detected motion vectors and uses the resulting motion vectors for generating an interpolation frame.
  • In this case, an interpolation frame is generated using motion vectors that are reduced from the original motion vectors, which weakens the effect of the frame interpolation. However, the overall loss of image quality is still smaller, because the reduction in interpolation-frame accuracy caused by motion vectors that have been detected erroneously due to a repetitive pattern is decreased.
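  • A minimal sketch of this idea, assuming the narrowing is done by clamping each detected vector component to a limit smaller than the detector's search range (the patent points to JP-A-2008-067205 for the actual technique):

```python
def clamp_vectors(vectors, limit):
    # Restrict each vector component to [-limit, limit]; `limit` would be chosen
    # smaller than the detector's search range when a repetitive pattern is flagged.
    return {pos: (max(-limit, min(limit, dy)), max(-limit, min(limit, dx)))
            for pos, (dy, dx) in vectors.items()}
```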
  • In a third example method for preventing erroneous detection of a motion vector due to a repetitive pattern, a large difference between the motion vector of a block and the motion vector of a spatially or temporally adjacent block is suppressed more strictly. A technique disclosed in JP-A-2008-067222 (counterpart U.S. publication: US 2008/0063289 A1) may be used as a technique for smoothing motion vectors. The publication JP-A-2008-067222 discloses a technique for controlling the filtering applied to motion vectors according to features of an input image signal.
  • FIG. 6 shows the configuration of a frame interpolation device capable of performing the third example method for preventing erroneous detection of a motion vector due to a repetitive pattern.
  • Referring to FIG. 6, motion vectors 115 of one frame that have been found by the motion vector detecting module 104 are stored in a motion vector memory 601. In a motion vector search for the next frame, motion vectors 611 of blocks around a motion search block and a motion vector 611, detected one frame before, of the block at the same position are sent from the motion vector memory 601 to the motion vector detecting module 104. A motion vector found is changed so as not to be much different from the neighboring motion vectors 611. In particular, when a repetitive pattern is detected in an input frame image, differences from neighboring motion vectors are corrected more strongly. With this method, a motion vector that has been detected erroneously due to a repetitive pattern is corrected by using correctly detected neighboring motion vectors, whereby reduction in the accuracy of a generated interpolation frame 117 can be prevented.
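  • The sketch below gives one possible reading of this correction: each block's vector is pulled toward the median of its spatially neighboring vectors and of the vector found for the same block one frame earlier, with a stronger pull when a repetitive pattern has been detected. The blending weights and the use of a median are assumptions; the patent refers to JP-A-2008-067222 for the actual smoothing technique.

```python
import numpy as np

def smooth_vector_field(vectors, prev_vectors, block, repetitive, weights=(0.3, 0.8)):
    # `vectors` and `prev_vectors` map block positions (by, bx) to (dy, dx), as
    # produced by the block-matching sketch; `prev_vectors` is the field stored
    # for the preceding frame (the role of the motion vector memory 601).
    w = weights[1] if repetitive else weights[0]
    smoothed = {}
    for (by, bx), (dy, dx) in vectors.items():
        neigh = [vectors[(by + oy * block, bx + ox * block)]
                 for oy in (-1, 0, 1) for ox in (-1, 0, 1)
                 if (oy, ox) != (0, 0)
                 and (by + oy * block, bx + ox * block) in vectors]
        if (by, bx) in prev_vectors:
            neigh.append(prev_vectors[(by, bx)])     # temporally neighboring vector
        if not neigh:
            smoothed[(by, bx)] = (dy, dx)
            continue
        ref_dy = float(np.median([v[0] for v in neigh]))
        ref_dx = float(np.median([v[1] for v in neigh]))
        smoothed[(by, bx)] = (round((1 - w) * dy + w * ref_dy),
                              round((1 - w) * dx + w * ref_dx))
    return smoothed
```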
  • As described above, the device according to the embodiment analyzes an input image signal based on a distribution of its spatial frequency components and an analysis result is reflected in the interpolation frame generation method. This makes it possible to provide a high-accuracy interpolation frame generation method and device which are lower in the probability of occurrence of failures.
  • In the device according to the embodiment, a repetitive pattern is detected based on a spatial frequency characteristic of an input image signal and a detection result is reflected in the interpolation frame generation method. This also contributes to providing a high-accuracy interpolation frame generation method and device which are lower in the probability of occurrence of failures.
  • Although the embodiments according to the present invention have been described above, the present invention is not limited to the above-mentioned embodiments but can be variously modified.
  • Additional advantages and modifications will readily occur to those skilled in the art. Therefore, the invention in its broader aspects is not limited to the specific details and representative embodiments shown and described herein. Accordingly, various modifications may be made without departing from the spirit or scope of the general inventive concept as defined by the appended claims and their equivalents.

Claims (7)

1. A frame interpolation device for generating a new interpolation frame image to be inserted between input frame images using two or more of the input frame images, the device comprising:
a motion vector detecting module configured to divide each of the input frame images into a plurality of blocks and detect motion vectors of objects in the input frame images for each of the blocks by performing block matching between the input frame images;
a distribution detecting module configured to detect a distribution of spatial frequency components of a current input frame image;
a feature analyzing module configured to analyze a feature of the current input frame image based on the distribution of the spatial frequency components detected by the distribution detecting module;
a control module configured to control the motion vector detecting module according to the feature detected by the feature analyzing module;
an interpolation frame generating module configured to generate an interpolation frame image using the detected motion vectors according to the feature detected by the feature analyzing module; and
an output module configured to insert the interpolation frame image generated by the interpolation frame generating module between the input frame images and output resulting output frame images.
2. The device of claim 1, wherein the feature analyzing module detects whether the spatial frequency components of the current input frame image are concentrated only in a low-frequency range, and
wherein the control module controls the motion vector detecting module to:
change images to be used for detection of the motion vectors from the input frame images to images obtained by reducing the input frame images when the spatial frequency components of the current input frame image are concentrated only in a low-frequency range; and
keep the images to be used for detection of the motion vectors unchanged when the spatial frequency components of the current input frame image are not concentrated only in a low-frequency range.
3. The device of claim 1, wherein the feature analyzing module detects whether the spatial frequency components of the current input frame image are concentrated only in a low-frequency range, and
wherein the control module controls the motion vector detecting module to:
change a group of vectors, which is selectable in a search for determining a motion between successive frames as the motion vector by the motion vector detecting module, when the spatial frequency components of the current input frame image are concentrated only in a low-frequency range; and
keep the group of vectors unchanged when the spatial frequency components of the current input frame image are not concentrated only in a low-frequency range.
4. The device of claim 1, wherein the feature analyzing module detects whether the spatial frequency components of the current input frame image are concentrated only in a low-frequency range, and
wherein the control module controls the motion vector detecting module to:
change a block size of the block matching when the spatial frequency components of the current input frame image are concentrated only in a low-frequency range; and
keep the block size of the block matching unchanged when the spatial frequency components of the current input frame image are not concentrated only in a low-frequency range.
5. The device of claim 1, wherein the feature analyzing module detects whether the current input frame image has a repetitive pattern by detecting whether the spatial frequency components of the current input frame image are concentrated in a particular frequency range, and
wherein, when the feature analyzing module analyzes that a repetitive pattern exists in the current input frame image, the control module controls the interpolation frame generating module to select a frame interpolation method, from a plurality of available methods, to be employed for generating the interpolation frame image while preventing erroneous interpolation due to the repetitive pattern.
6. The device of claim 1 further comprising a display unit configured to display the output frame images.
7. A method for performing frame interpolation for generating a new interpolation frame image to be inserted between input frame images using two or more of the input frame images, the method comprising:
dividing each of the input frame images into a plurality of blocks;
detecting motion vectors of objects in the input frame images for each of the blocks by performing block matching between the input frame images;
detecting a distribution of spatial frequency components of a current input frame image;
analyzing a feature of the current input frame image based on the distribution of the spatial frequency components;
controlling the motion vector detecting module according to the detected feature;
generating an interpolation frame image using the detected motion vectors according to the detected feature; and
inserting the interpolation frame image generated by the interpolation frame generating module between the input frame images.
US12/475,265 2008-09-26 2009-05-29 Frame Interpolation Device Abandoned US20100079665A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2008248963A JP2010081411A (en) 2008-09-26 2008-09-26 Frame interpolation device and frame interpolation method
JP2008-248963 2008-09-26

Publications (1)

Publication Number Publication Date
US20100079665A1 (en) 2010-04-01

Family

ID=42057061

Family Applications (1)

Application Number Title Priority Date Filing Date
US12/475,265 Abandoned US20100079665A1 (en) 2008-09-26 2009-05-29 Frame Interpolation Device

Country Status (2)

Country Link
US (1) US20100079665A1 (en)
JP (1) JP2010081411A (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101747717B1 (en) 2010-08-12 2017-06-15 엘지디스플레이 주식회사 Driving apparatus for image display device and method for driving the same

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2590999B2 (en) * 1987-12-29 1997-03-19 日本電気株式会社 Image signal motion vector detector
JPH0837664A (en) * 1994-07-26 1996-02-06 Toshiba Corp Moving picture encoding/decoding device
JPH11239354A (en) * 1998-02-23 1999-08-31 Mitsubishi Electric Corp Motion vector detector
JP2000023154A (en) * 1998-06-30 2000-01-21 Toshiba Corp Moving image encoder
JP2006178642A (en) * 2004-12-21 2006-07-06 Olympus Corp Apparatus for detecting motion vector
JP4641892B2 (en) * 2005-07-27 2011-03-02 パナソニック株式会社 Moving picture encoding apparatus, method, and program
JP2007259106A (en) * 2006-03-23 2007-10-04 Fujifilm Corp Method of detecting moving object in picked-up image and apparatus thereof

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080063289A1 (en) * 2006-09-08 2008-03-13 Kabushiki Kaisha Toshiba Frame interpolating circuit, frame interpolating method, and display apparatus

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100278433A1 (en) * 2009-05-01 2010-11-04 Makoto Ooishi Intermediate image generating apparatus and method of controlling operation of same
US8280170B2 (en) * 2009-05-01 2012-10-02 Fujifilm Corporation Intermediate image generating apparatus and method of controlling operation of same
US8471959B1 (en) * 2009-09-17 2013-06-25 Pixelworks, Inc. Multi-channel video frame interpolation
US20130235274A1 (en) * 2010-11-17 2013-09-12 Mitsubishi Electric Corporation Motion vector detection device, motion vector detection method, frame interpolation device, and frame interpolation method
US20130176488A1 (en) * 2012-01-11 2013-07-11 Panasonic Corporation Image processing apparatus, image capturing apparatus, and program
US8976258B2 (en) * 2012-01-11 2015-03-10 Panasonic Intellectual Property Management Co., Ltd. Image processing apparatus, image capturing apparatus, and program
US20170150095A1 (en) * 2015-11-25 2017-05-25 Samsung Electronics Co., Ltd. Apparatus and method for frame rate conversion
US10091455B2 (en) * 2015-11-25 2018-10-02 Samsung Electronics Co., Ltd. Apparatus and method for frame rate conversion
US20190068972A1 (en) * 2017-08-23 2019-02-28 Canon Kabushiki Kaisha Image processing apparatus, image pickup apparatus, and control method of image processing apparatus
US10805609B2 (en) * 2017-08-23 2020-10-13 Canon Kabushiki Kaisha Image processing apparatus to generate panoramic image, image pickup apparatus to generate panoramic image, control method of image processing apparatus to generate panoramic image, and non-transitory computer readable storage medium to generate panoramic image

Also Published As

Publication number Publication date
JP2010081411A (en) 2010-04-08

Similar Documents

Publication Publication Date Title
US20100079665A1 (en) Frame Interpolation Device
KR100670003B1 (en) The apparatus for detecting the homogeneous region in the image using the adaptive threshold value
US6061100A (en) Noise reduction for video signals
US8509481B2 (en) Image processing apparatus, image processing method, imaging apparatus
EP2413586B1 (en) Method and device for adaptive noise measurement of a video signal
US9591258B2 (en) Image processing apparatus, image processing method, program and storage medium
US20080118163A1 (en) Methods and apparatuses for motion detection
JPH04229795A (en) Video system converter correcting movement
US8401318B2 (en) Motion vector detecting apparatus, motion vector detecting method, and program
KR20010033552A (en) Detection of transitions in video sequences
US20060221253A1 (en) Noise reducing apparatus and noise reducing method
US7330592B2 (en) Method and apparatus for detecting the location and luminance transition range of slant image edges
EP2178289B1 (en) Method and unit for motion detection based on a difference histogram
US7999876B2 (en) Pull-down detection apparatus and pull-down detection method
US9147115B2 (en) Method and device for detecting an object in an image
US20040218787A1 (en) Motion detector, image processing system, motion detecting method, program, and recordig medium
US6784944B2 (en) Motion adaptive noise reduction method and system
JP2009533887A (en) Motion vector field correction apparatus and method
US8330858B2 (en) Pull-down detection apparatus and pull-down detection method
CN101141655A (en) Video signal picture element point chromatic value regulation means
US20060187301A1 (en) Pull-down detection apparatus and pull-down detection method
US7634132B2 (en) Method and apparatus of false color suppression
US20100027666A1 (en) Motion vector detecting apparatus, motion vector detecting method, and program
US9532053B2 (en) Method and apparatus for analysing an array of pixel-to-pixel dissimilarity values by combining outputs of partial filters in a non-linear operation
JP4608136B2 (en) Motion vector and parallax vector detection device

Legal Events

Date Code Title Description
AS Assignment

Owner name: KABUSHIKI KAISHA TOSHIBA,JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:YAMAUCHI, HIMIO;HICHIE, HIROYUKI;SIGNING DATES FROM 20090427 TO 20090507;REEL/FRAME:022756/0215

AS Assignment

Owner name: KABUSHIKI KAISHA TOSHIBA,JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:YAMAUCHI, HIMIO;MICHIE, HIROYUKI;SIGNING DATES FROM 20090427 TO 20090507;REEL/FRAME:022761/0090

STCB Information on status: application discontinuation

Free format text: EXPRESSLY ABANDONED -- DURING EXAMINATION