WO2009147614A1 - Artifact masking for temporal up-conversion


Info

Publication number
WO2009147614A1
WO2009147614A1 (PCT/IB2009/052304)
Authority
WO
WIPO (PCT)
Prior art keywords
video signal
mcfi
output
frame
fme
Application number
PCT/IB2009/052304
Other languages
English (en)
Inventor
Erwin B. Bellers
Jacobus Willem Van Gurp
Original Assignee
NXP B.V.
Application filed by NXP B.V.
Publication of WO2009147614A1

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 7/00 Television systems
    • H04N 7/01 Conversion of standards, e.g. involving analogue television standards or digital television standards processed at pixel level
    • H04N 7/0135 Conversion of standards involving interpolation processes
    • H04N 7/014 Conversion of standards involving interpolation processes involving the use of motion vectors
    • H04N 7/0127 Conversion of standards by changing the field or frame frequency of the incoming video signal, e.g. frame rate converter
    • H04N 7/0132 Conversion of standards with the field or frame frequency of the incoming video signal being multiplied by a positive integer, e.g. for flicker reduction

Definitions

  • LCDs: Liquid Crystal Displays
  • LCDs typically suffer from motion blur. This is caused by the limited response time of the LC material and the sample-and-hold principle of the LCD. The contribution of the sample-and-hold effect can be reduced by using higher refresh rates in the display. Higher display refresh rates in combination with fast LCD response times (8 ms or less) and frame rate conversion can dramatically improve the motion portrayal on LCD panels.
  • Frame rate conversion technology like Philips Natural Motion enables the conversion of an input video stream at refresh rate X into an output video stream with refresh rate Y. In general, Y>X.
  • the frame rate conversion technique may apply motion-compensated temporal interpolation.
  • DFI: Dynamic Frame Insertion
  • the unsharp picture is a low pass filtered (blurry) version of the source picture.
  • the sharp picture is such that the average of the combination of the sharp and blurred pictures corresponds to the source picture again.
  • the sharp and blurred pictures are switched fast enough, then the result is perceived with the correct intensity although the frame rate is double.
  • the details (in the sharp picture) are only "held" for half of the time, as they are not visible in the blurred picture. This causes motion blur reduction. Due to the high refresh rate, the human eye is not capable of detecting the individual pictures and, as a result, integrates the sharp and blurred pictures.
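The sharp/blurred decomposition described above can be sketched as a complementary pair: the blurred frame is a low-pass version of the source, and the sharp frame is chosen so that the pair averages back to the source. A minimal sketch, assuming NumPy arrays and a simple 1-D box blur as the low-pass filter (both are illustrative choices, not part of the disclosure):

```python
import numpy as np

def dfi_pair(frame, kernel_size=3):
    """Split a source frame into a blurred frame and its sharp
    complement so that their average reproduces the source."""
    kernel = np.ones(kernel_size) / kernel_size
    # Box blur along rows stands in for the low-pass filter.
    blurred = np.apply_along_axis(
        lambda row: np.convolve(row, kernel, mode="same"), 1, frame)
    # Choose the sharp frame so that (sharp + blurred) / 2 == frame.
    sharp = 2.0 * frame - blurred
    return sharp, blurred

frame = np.random.rand(4, 8)
sharp, blurred = dfi_pair(frame)
# Displayed alternately at double rate, the eye integrates the pair
# back to the original intensity.
assert np.allclose((sharp + blurred) / 2.0, frame)
```

Because the averaging identity holds per pixel, the perceived intensity is unchanged while details are held only half of the time.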
  • the up-conversion from 50 to 100 Hz requires doubling of the frame rate, i.e., the interpolation of one new frame for every original frame. For 24 Hz to 96 Hz, this means the interpolation of three frames for every original frame. The more frames interpolated with respect to the original frames, the relatively more time the interpolated frames are being displayed, i.e., 75% of the time in case of the conversion from 24 to 96 Hz. As a consequence, potential artifacts resulting from the temporal interpolation are likely to become more visible.
  • DFI would not make sense for temporal up-conversions below approximately 90 Hz, as the flicker becomes more noticeable at lower refresh rates. This problem does not exist for Natural Motion techniques.
  • the disadvantages of DFI compared to Natural Motion are the loss of sharpness in moving image parts, the potential visibility of some flicker, and a dependency on panel specifications.
  • the disadvantage of Natural Motion is the risk of introducing interpolation artifacts.
  • the apparatus is a frame rate converter (FRC).
  • the FRC includes a motion compensated frame interpolator (MCFI) and a frame modification engine (FME).
  • the MCFI performs motion compensated temporal frame interpolation on the input video signal with an input frame rate and generates from the input video signal a supplemental MCFI output video signal with an intermediate frame rate and an MCFI output video signal with an output frame rate.
  • the intermediate frame rate is greater than the input frame rate of the input video signal and less than the output frame rate of a final output video signal.
  • the FME generates an FME output video signal based on the MCFI video signal.
  • Each frame of the MCFI output video signal, supplemental MCFI output video signal, and the FME output video signal includes a pixel intensity value.
  • Other embodiments of the apparatus are also described.
  • the method is a method for converting a frame rate of an input video signal.
  • An embodiment of the method includes receiving the input video signal.
  • the input video signal includes an input frame rate.
  • the method also includes performing motion compensated temporal frame interpolation on the input video signal.
  • the method also includes generating an MCFI output video signal with an output frame rate from the input video signal.
  • Each frame of the MCFI output video signal includes a pixel intensity value.
  • the method also includes generating a supplemental MCFI output video signal with an intermediate frame rate from the input video signal. The intermediate frame rate is greater than the frame rate of the input video signal and less than the frame rate of a final output video signal.
  • Each frame of the supplemental MCFI output video signal includes a pixel intensity value.
  • the method also includes generating an FME output video signal based on the MCFI video signal.
  • Each frame of the FME output video signal includes a pixel intensity value.
  • Other embodiments of the method are also described.
  • Fig. 1 depicts a schematic block diagram of one embodiment of a video display system.
  • Fig. 2 depicts a schematic block diagram of one embodiment of the image controller of the video display system of Fig. 1.
  • Fig. 3 depicts a schematic block diagram of one embodiment of the frame rate converter of the image controller of Fig. 2.
  • Fig. 4 depicts a schematic block diagram of one embodiment of a mixed frame rate conversion process in conjunction with the frame rate converter of Fig. 3.
  • Fig. 5 depicts a schematic flow chart diagram of one embodiment of a mixed frame rate conversion method for mixing the output of the motion compensated frame interpolator with the output of the dynamic frame inserter in the frame rate converter of Fig. 3.
  • Fig. 6 depicts a schematic block diagram of one embodiment of a frame rate conversion process in conjunction with the frame rate converter of Fig. 3.
  • Fig. 7 depicts a schematic flow chart diagram of one embodiment of a frame rate conversion method for performing dynamic frame insertion on the output of the motion compensated frame interpolator in the frame rate converter of Fig. 3. Throughout the description, similar reference numbers may be used to identify similar elements.
  • frame rate conversion includes two conversion stages: a motion compensated (MC) frame interpolation stage and a frame modification stage.
  • MC: motion compensated
  • Embodiments of the motion compensated frame interpolation stage are performed based on MC temporal interpolation algorithms and/or MC temporal up-conversion algorithms. For example, some embodiments implement a conventional block matching algorithm in conjunction with the MC frame interpolation.
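The conventional block matching mentioned above can be sketched as a brute-force sum-of-absolute-differences (SAD) search. The block size, search range, and all names here are illustrative assumptions, not the patent's algorithm:

```python
import numpy as np

def block_match(prev, curr, block=4, search=2):
    """For each block of `curr`, find the (dy, dx) displacement into
    `prev` that minimizes the SAD (a backward motion vector)."""
    h, w = curr.shape
    vectors = {}
    for by in range(0, h - block + 1, block):
        for bx in range(0, w - block + 1, block):
            target = curr[by:by + block, bx:bx + block]
            best, best_sad = (0, 0), float("inf")
            for dy in range(-search, search + 1):
                for dx in range(-search, search + 1):
                    y, x = by + dy, bx + dx
                    if 0 <= y <= h - block and 0 <= x <= w - block:
                        sad = np.abs(target - prev[y:y + block, x:x + block]).sum()
                        if sad < best_sad:
                            best, best_sad = (dy, dx), sad
            vectors[(by, bx)] = best
    return vectors

prev = np.zeros((8, 8)); prev[2:6, 2:6] = 1.0   # a bright square
curr = np.zeros((8, 8)); curr[2:6, 3:7] = 1.0   # the square moved one pixel right
mv = block_match(prev, curr)
# Blocks containing the square report (0, -1): their content came
# from one pixel to the left in the previous frame.
```

A real estimator would use larger blocks, sub-pixel refinement, and smoothness constraints, but the SAD criterion is the same.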
  • the resultant temporal interpolated video may be used as a source to the frame modification stage to further refine and enhance the temporal interpolated video through frame modification techniques.
  • the frame modification stage include copying each frame of video generated by the MC frame interpolation stage and performing frame modification techniques on the original and copied frames from the MC frame interpolation stage.
  • the frame modification stage performs frame modification techniques only on the original frames generated by the MC frame interpolation stage, thus, maintaining the frame rate of the video generated by the MC frame interpolation stage.
  • the undesired side effects of MC temporal interpolation and frame modification are reduced, relative to conventional techniques, by implementing the multi-stage MC frame rate conversion scheme.
  • the undesired side effects of both the MC frame interpolation stage and the frame modification stage are reduced.
  • embodiments which implement the MC frame interpolation and frame modification stages, or the MC frame interpolation and DFI frame conversion stages incur relatively low conversion complexity while maintaining relatively high picture quality.
  • Fig. 1 depicts a schematic block diagram of one embodiment of a video display system 100.
  • the illustrated video display system 100 includes a video source 102, an image controller 104, and a display device 106. Although the depicted video display system 100 includes several functional blocks described herein, other embodiments of the video display system 100 may include fewer or more functional blocks to implement more or less functionality.
  • the video source 102 generates a video signal.
  • the video source 102 may include digital video from a digital video disk (DVD) playing on a DVD player, a digital video from a digital camera, a digital video file stored on a media player, or a digital video from another source.
  • the video source 102 may also include analog video such as a video home system (VHS) cassette playing on a VHS player, or analog video from an analog camera or another analog source.
  • VHS: video home system
  • the image controller 104 receives the video from the video source 102 and processes the video for display on the display device 106.
  • the display device 106 includes an LCD panel or another display device used to display video from the video source 102.
  • Fig. 2 depicts a schematic block diagram of one embodiment of the image controller 104 of the video display system 100 of Fig. 1.
  • the illustrated image controller 104 includes a frame rate converter (FRC) 108, a processor 110, a memory device 112, and at least one bus interface 114.
  • the bus interface 114 facilitates communications related to the image controller 104 and/or frame conversion algorithms executing on the image controller 104, including processing frame conversion commands, as well as storing, sending, and receiving data packets associated with the frame conversion operations of the image controller 104.
  • Although the depicted image controller 104 includes several functional blocks described herein, other embodiments of the image controller 104 may include fewer or more functional blocks to implement more or less functionality.
  • the FRC 108 performs a combination of motion compensated temporal frame interpolation, dynamic frame insertion, and/or peak/blur functions and related algorithms.
  • performing the motion compensated frame interpolation in conjunction with the dynamic frame insertion or the peak/blur works to improve the effects of motion compensated frame interpolation and dynamic frame insertion compared with using motion compensated frame interpolation alone or using dynamic frame insertion alone.
  • performing the motion compensated frame interpolation in conjunction with the dynamic frame insertion, or the peak/blur works to reduce the undesired side effects of performing motion compensated frame interpolation and/or dynamic frame insertion.
  • dynamic frame insertion is performed following motion compensated frame interpolation.
  • the DFI function of the frame modification stage masks at least some of the potential artifacts generated by the motion compensated frame interpolation, compared to performing motion compensated frame interpolation alone.
  • at least some of the flicker generated by the DFI function of the frame modification stage is minimized compared to performing dynamic frame insertion by itself. Details of the functions and operations of the FRC 108 are described in detail below with reference to Fig. 3.
  • the memory device 112 is utilized by the image controller 104 to perform frame rate conversions and motion compensation operations and other related functions.
  • a dedicated memory device may be embedded in the FRC 108 and/or other components of the image controller 104.
  • the memory device 112 is a random access memory (RAM) or another type of dynamic storage device.
  • the memory device 112 is a read-only memory (ROM) or another type of static storage device.
  • the illustrated memory device 112 is representative of both RAM and static storage memory within the image controller 104. Additionally, some embodiments store protocols and/or instructions related to frame rate conversion and motion compensation operations as firmware such as embedded code, basic input/output system (BIOS) code, and/or other similar code.
  • BIOS: basic input/output system
  • the processor 110 is utilized by the image controller 104 to perform frame rate conversion and motion compensation operations and other related functions.
  • a dedicated processor may be embedded in the FRC 108 and/or other components of the image controller 104.
  • the processor 110 is a central processing unit (CPU) with one or more processing cores.
  • the processor 110 is a general purpose processor, an application specific processor, a multi-core processor, or a microprocessor.
  • the processor 110 executes one or more instructions to provide operational functionality to the video display system 100. Protocols and/or instructions related to frame rate conversion and the modes of operation may be stored locally in the processor 110 or in the memory device 112. Alternatively, the instructions may be distributed across one or more devices such as the processor 110, the memory device 112, and/or another data storage device.
  • Fig. 3 depicts a schematic block diagram of one embodiment of the FRC 108 of the image controller 104 of Fig. 2.
  • the illustrated FRC 108 includes a motion compensated frame interpolator (MCFI) 116, a frame modification engine (FME) 118, and a frame rate conversion mixer 120. Additionally, the MCFI 116 includes a source detector 122, a motion estimator 124, and a temporal up-converter 126.
  • MCFI: motion compensated frame interpolator
  • FME: frame modification engine
  • Although the depicted FRC 108 includes several functional blocks described herein, other embodiments of the FRC 108 may include fewer or more functional blocks to implement more or less functionality.
  • the MCFI 116 performs up-conversion from an input signal of a certain input frame rate using temporal frame interpolation techniques. In some embodiments, the MCFI 116 generates an MCFI signal at the targeted output frame rate. In some embodiments, the MCFI 116 generates an MCFI signal at an intermediate frame rate, or mid-rate, that is between the input frame rate and the targeted output frame rate. In some embodiments, the MCFI 116 generates at least two signals where one of the signals generated by the MCFI 116 is at the targeted output frame rate and another signal generated by the MCFI 116 is at an intermediate frame rate.
  • the MCFI 116 may perform an up-conversion from 24 Hz source material to 48 Hz using temporal frame interpolation techniques.
  • the FME 118 would then perform dynamic frame insertion from 48 Hz to 96 Hz. This example is explained in more detail below with reference to Fig. 4.
  • the MCFI 116 may perform an up-conversion from 24 Hz source material to 96 Hz, where 96 Hz is again the targeted output frame rate.
  • the FME 118 then alternates between sharpening and blurring each of the frames from the MCFI 116 and outputs a signal at 96 Hz from a combination of motion compensated temporal frame interpolation and frame modification techniques.
  • the source detector 122 detects a cadence associated with a video signal from the video source 102.
  • the source detector 122 detects a 2:2 pull down often used with 50 Hz video signals.
  • the source detector 122 detects a 2:3 pull down often used with 60 Hz video signals. Other pull down ratios are also possible.
  • the source detector 122 detects a frame rate associated with the video signal from the video source 102.
  • the source detector 122 detects the type of video source 102 and the associated input frame rate.
  • the source detector 122 may detect whether the type of source provided as the video signal from the video source 102 is an over-the-air TV broadcast at 60 Hz for a national television systems committee (NTSC) system or 50 Hz for a phase alternating line (PAL) system.
  • NTSC national television systems committee
  • PAL phase alternating line
  • the source detector 122 may detect whether the video source 102 is a satellite broadcast with a specific input frame rate.
  • the source detector 122 may detect other features of the video signal from the video source 102 such as a source pixel intensity associated with the video signal from the video source 102.
  • the MCFI 116 may perform temporal frame interpolation techniques according to the features of the video signal from the video source 102 detected by the source detector 122. For example, the MCFI 116 may generate a signal with an intermediate frame rate when the source detector 122 detects a film source with an input frame rate of 24 Hz. Alternatively, when the source detector 122 detects a 60 Hz broadcast, the MCFI 116 may generate a signal with a targeted output frame rate of 120 Hz.
  • the motion estimator 124 detects the motion of moving objects in the video signal from the video source 102.
  • the motion estimator 124 generates motion vectors associated with the detected motion of the objects in the video signal from the video source 102.
  • the MCFI 116 then implements motion compensated temporal frame interpolation depending on the magnitude and direction of the motion vectors generated by the motion estimator 124.
  • the MCFI 116 implements one or more bi-directional motion compensated techniques according to the bi-directional nature of the motion vectors generated by the motion estimator 124.
  • the temporal up-converter 126 performs temporal frame interpolation algorithms to improve the temporal quality of the interpolated frames added to the video signal from the video source 102 to compensate for the loss of motion captured between the frames of the video signal from the video source 102. For example, the amount of motion captured in a 24 Hz film source is going to have gaps of motion compared to the motion that would be captured using a camera that implements a 100 Hz film source. Thus, the temporal up-converter 126 implements the temporal frame interpolation algorithms to fill in some of the missing motion when up-converting from, for example, a 24 Hz film source to a video signal with a targeted output frame rate of 96 Hz.
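One interpolated frame along a motion trajectory can be sketched as follows. This is a deliberately simplified model with a single global integer motion vector and zero-width borders; a real up-converter works per block with sub-pixel vectors, and all names here are assumptions:

```python
import numpy as np

def mc_interpolate(prev, nxt, mv, phase):
    """Motion-compensated interpolation of one in-between frame.
    `mv` is the (dy, dx) displacement from `prev` to `nxt`;
    `phase` in (0, 1) is the temporal position of the new frame."""
    dy, dx = mv
    # Fetch from both neighbours along the motion trajectory
    # (bi-directional motion compensation).
    fwd = np.roll(prev, (round(phase * dy), round(phase * dx)), axis=(0, 1))
    bwd = np.roll(nxt, (-round((1 - phase) * dy), -round((1 - phase) * dx)),
                  axis=(0, 1))
    return (1 - phase) * fwd + phase * bwd

prev = np.zeros((1, 8)); prev[0, 2] = 1.0   # object at column 2
nxt = np.zeros((1, 8)); nxt[0, 4] = 1.0     # object moved to column 4
mid = mc_interpolate(prev, nxt, (0, 2), 0.5)
# The interpolated frame places the object halfway, at column 3.
```

Non-compensated frame repetition would instead leave the object at column 2 or 4, which is exactly the judder that MC interpolation removes.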
  • the FME 118 includes a frame duplicator 128, a frame de-sharpener 130, and a frame sharpener 132.
  • the FME 118 performs dynamic frame insertion techniques on an input video signal.
  • the FME 118 performs the dynamic frame insertion techniques subsequent to the operations and motion compensated techniques of the MCFI 116.
  • relatively more interpolated frames from the MCFI video signal are blurred than sharpened, and as a result, the visibility of artifacts generated by the operation of the MCFI 116 is reduced relative to performing motion compensated temporal frame interpolation alone.
  • the strength of the dynamic frame insertion techniques applied by the FME 118 is modulated by the amount of motion detected by the motion estimator 124 and/or the source pixel intensity.
  • the dynamic frame insertion techniques include copying each of the frames of an input video signal, alternately sharpening the original/copied frames of the input video signal, and alternately blurring the copied/original frames of the input video signal.
  • the FME 118 sharpens either the original or copied set of frames of the input video signal, and blurs the other set of frames (copied or original) of the input video signal that is not sharpened. Due to the relatively long integration time of the human eye, the sharpened and blurred frames are not seen individually but are blended by the eye, and the net result is a normalized picture that is neither sharpened nor blurred.
  • the frame duplicator 128 copies each of the frames of the input video signal.
  • the frame de-sharpener 130 blurs the copied frames of the input video signal while the frame sharpener 132 sharpens the original frames of the input video signal.
  • the frame de-sharpener 130 blurs the original frames of the input video signal and the frame sharpener 132 sharpens the copied frames of the original input video signal.
  • the FME 118 receives a video signal from the MCFI 116 at the targeted output frame rate and alternately sharpens and blurs the inputted frames.
  • the FME 118 receives an input video signal and alternately applies a peak (sharpening) and a blur to the frames of the input video signal without copying each of the frames.
  • an example embodiment of the FME 118 receives an input video signal at a target output frame rate of 96 Hz, generates a frame modification signal of sharpened and blurred frames of the input video at the same 96 Hz target output frame rate, and, thus, outputs a frame modification signal at the 96 Hz target output frame rate.
  • the frame de-sharpener 130 blurs the odd frames of the input video signal and the frame sharpener 132 sharpens the even frames of the input video signal.
  • Alternatively, the frame de-sharpener 130 blurs the even frames of the input video signal and the frame sharpener 132 sharpens the odd frames of the input video signal.
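The copy-free variant, in which the FME alternately peaks and blurs the incoming frames while keeping the frame rate unchanged, can be sketched as follows. The 1-D box blur, unsharp-mask peaking, and even/odd assignment are illustrative assumptions:

```python
import numpy as np

def alternate_peak_blur(frames, kernel_size=3):
    """Alternately sharpen (peak) and blur frames without duplicating
    them, so the output frame rate equals the input frame rate."""
    kernel = np.ones(kernel_size) / kernel_size
    out = []
    for i, f in enumerate(frames):
        blurred = np.apply_along_axis(
            lambda r: np.convolve(r, kernel, mode="same"), 1, f)
        if i % 2 == 0:
            out.append(2.0 * f - blurred)   # even frames: peaked
        else:
            out.append(blurred)             # odd frames: blurred
    return out

frames = [np.random.rand(4, 8) for _ in range(4)]
out = alternate_peak_blur(frames)
assert len(out) == len(frames)   # frame rate is unchanged
```

Fed with a 96 Hz MCFI output, this produces the 96 Hz peaked/blurred stream described in the example above.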
  • the frame rate conversion (FRC) mixer 120 mixes the video signal generated by the MCFI 116 with the video signal generated by the FME 118.
  • the FRC mixer 120 is set to blend half of the pixel intensity of the output of the motion compensated temporal frame interpolation techniques of the MCFI 116 with half of the pixel intensity of the output of the dynamic frame insertion techniques of the FME 118.
  • the FRC mixer 120 may be dynamically adjusted to blend more of the pixel intensity of the signal generated by the MCFI 116 and less of the pixel intensity of the frames generated by the FME 118 into the video output signal generated by the FRC mixer 120, and vice versa.
  • the FRC mixer 120 blends more of the pixel intensity of the signal from the MCFI 116.
  • Alternatively, the FRC mixer 120 biases towards the output of the FME 118.
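The mixer's pixel-wise blend can be sketched in one line. The `alpha` parameter and function name are assumptions; `alpha = 0.5` corresponds to the half-and-half setting described above, and moving it toward 1.0 or 0.0 biases the mix toward the MCFI or FME output respectively:

```python
import numpy as np

def mix_outputs(mcfi_frame, fme_frame, alpha=0.5):
    """Blend corresponding MCFI and FME frames pixel-wise."""
    return alpha * mcfi_frame + (1.0 - alpha) * fme_frame

a = np.full((2, 2), 0.2)   # stand-in MCFI frame
b = np.full((2, 2), 0.6)   # stand-in FME frame
mixed = mix_outputs(a, b)  # halfway between the two inputs
assert np.allclose(mixed, 0.4)
```

Dynamic adjustment amounts to varying `alpha` per frame or per region, for example based on detected motion.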
  • Fig. 4 depicts a schematic block diagram of one embodiment of a mixed frame rate conversion process 134 in conjunction with the FRC 108 of Fig. 3.
  • the mixed frame rate conversion process 134 is described in conjunction with the video display system 100 of Fig. 1 and components thereof, other embodiments of the mixed frame rate conversion process 134 may be implemented with other video display systems and/or other components thereof. Additionally, embodiments of the mixed frame rate conversion process 134 are described in conjunction with the mixed frame rate conversion method 200 of Fig. 5.
  • Fig. 5 depicts a schematic flow chart diagram of one embodiment of a mixed frame rate conversion method 200 for mixing the output of the MCFI 116 with the output of the FME 118 in the FRC 108 of Fig. 3.
  • the mixed frame rate conversion method 200 is described in conjunction with the video display system 100 of Fig. 1 and components thereof, as well as the mixed frame rate conversion process 134 of Fig. 4, other embodiments of the mixed frame rate conversion method 200 may be implemented with other video display systems and/or other components thereof.
  • the depicted mixed frame rate conversion method 200 includes two general stages — motion compensated temporal frame interpolation and frame modification which may include dynamic frame insertion.
  • the operations corresponding to the motion compensated temporal frame interpolation and frame modification stages of the mixed frame rate conversion method 200 are indicated in Figs. 4 and 5.
  • the MCFI 116 receives an input video from the video signal from the video source 102.
  • the MCFI 116 performs motion compensated frame interpolation on the input video.
  • the source detector 122 detects features of the input video.
  • the motion estimator 124 generates motion vectors from the detected motion of moving objects in the input video.
  • the temporal up-converter 126 performs temporal frame interpolation algorithms on the input video to generate additional frames that are added in between the original frames of the input video and that interpolate the expected position of moving objects in the input video relative to the motion of the objects in the original frames.
  • the MCFI 116 generates an MCFI output video signal and sends the MCFI output video signal to the FRC mixer 120.
  • the MCFI 116 generates a supplemental MCFI output video signal.
  • the MCFI 116 generates an MCFI output video signal and a supplemental MCFI output video signal simultaneously.
  • the difference between the MCFI output video signal and the supplemental MCFI output video signal is that the frame rate of the MCFI output video signal is at the targeted output frame rate, whereas the frame rate of the supplemental MCFI output video signal is at an intermediate frame rate between the input frame rate (the frame rate of the input video) and the targeted output frame rate.
  • the intermediate frame rate of the supplemental MCFI output video signal may be 48 Hz, and the output frame rate may be 96 Hz.
  • the FME 118 performs dynamic frame insertion (DFI) on the supplemental MCFI output video signal from the MCFI 116.
  • DFI: dynamic frame insertion
  • the FME 118 receives the supplemental MCFI output video signal from the MCFI 116 and performs DFI on the supplemental MCFI output video signal, doubling the intermediate frame rate.
  • the frame duplicator 128 duplicates each frame of the supplemental MCFI output video signal.
  • the frame de-sharpener 130 then blurs each of the original frames of the supplemental MCFI output video signal while the frame sharpener 132 sharpens each of the copied frames of the supplemental MCFI output video signal.
  • the frame de-sharpener 130 blurs each of the copied frames of the supplemental MCFI output video signal, while the frame sharpener 132 sharpens each of the original frames of the supplemental MCFI output video signal.
  • the FME 118 generates an FME output video signal and sends the FME output video signal to the FRC mixer 120.
  • the FRC mixer 120 mixes the MCFI output video signal with the FME output video signal.
  • the FRC mixer 120 is configured to generate an output video signal and to send the generated output video signal to the display device 106.
  • the FME output video signal and the MCFI output video signal are then received by the FRC mixer 120.
  • the FRC mixer 120 at block 214, then mixes the output-rate FME output video signal and the MCFI video output to generate the final output video.
  • the MCFI 116 receives an input video of 24 Hz and generates an MCFI output video signal of 96 Hz as well as a supplemental MCFI output video signal of 48 Hz through temporal frame interpolation techniques.
  • the FME 118 then receives the 48 Hz supplemental MCFI output video signal and implements DFI techniques that increase the frame rate from the intermediate frame rate of 48 Hz to an output frame rate of 96 Hz to generate an FME output video signal.
  • the FRC mixer 120 then mixes the MCFI output video signal of 96 Hz with the FME output video signal of 96 Hz to generate the output video signal of 96 Hz.
  • each frame of the MCFI output video signal and each frame of the FME output video signal include pixel intensity values.
  • the FRC mixer 120 mixes a pixel intensity value of each frame of the MCFI output video signal with a pixel intensity value of each corresponding frame of the FME output video signal and generates the output video signal at the final output frame rate from the mixture of the pixel intensity values of corresponding MCFI and FME output video signal frames.
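The frame rates flowing through this mixed pipeline can be tallied in a few lines. This sketch hard-codes the halving rule for the intermediate rate, which is an assumption taken from the 24/48/96 Hz example above:

```python
def mixed_frc_rates(input_hz, output_hz):
    """Frame rates through the mixed conversion path: the MCFI emits a
    full-rate signal plus a supplemental intermediate-rate signal, the
    FME's DFI doubles the intermediate rate, and the mixer blends the
    two output-rate streams."""
    intermediate_hz = output_hz // 2   # assumed: half the output rate
    return {
        "mcfi_output": output_hz,
        "supplemental_mcfi": intermediate_hz,
        "fme_output": intermediate_hz * 2,   # DFI doubles the frame rate
        "mixer_output": output_hz,
    }

rates = mixed_frc_rates(24, 96)
assert rates == {"mcfi_output": 96, "supplemental_mcfi": 48,
                 "fme_output": 96, "mixer_output": 96}
```

Both streams arriving at the mixer run at the same 96 Hz, which is what makes the per-frame pixel-intensity blend well defined.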
  • Fig. 6 depicts a schematic block diagram of one embodiment of a frame rate conversion process 136 in conjunction with the frame rate converter of Fig. 3.
  • the frame rate conversion process 136 is described in conjunction with the video display system 100 of Fig. 1 and other components thereof, other embodiments of the frame rate conversion process 136 may be implemented with other video display systems and/or video display system components. Additionally, embodiments of the frame rate conversion process 136 are described in conjunction with the frame rate conversion method 250 of Fig. 7.
  • Fig. 7 depicts a schematic flow chart diagram of one embodiment of a frame rate conversion method 250 for performing dynamic frame insertion on the output of the MCFI 116 in the FRC 108 of Fig. 3.
  • the frame rate conversion method 250 is described in conjunction with the video display system 100 of Fig. 1 and components thereof, as well as the frame rate conversion process 136 of Fig. 6, other embodiments of the frame rate conversion method 250 may be implemented with other video display systems and/or other video display system components.
  • the depicted frame rate conversion method 250 includes two general stages — motion compensated temporal frame interpolation and frame modification.
  • the operations corresponding to the motion compensated temporal frame interpolation and frame modification stages of the frame rate conversion method 250 are indicated in Figs. 6 and 7.
  • the MCFI 116 receives an input video from the video signal from the video source 102.
  • the MCFI 116 performs motion compensated frame interpolation on the received input video, as described above.
  • the source detector 122 detects features of the input video.
  • the motion estimator 124 generates motion vectors from the detected motion of moving objects in the input video.
  • the temporal up-converter 126 performs temporal frame interpolation algorithms on the input video to generate additional frames that are added in between the original frames of the input video and that interpolate the expected position of moving objects in the input video relative to the motion of the objects in the original frames. Alternatively, at least some of the original frames are added in between the additional frames of the input video. In other words, in some embodiments, a portion of the original frames are removed. As one example, for a conversion of a 24 Hz video signal to a 60 Hz video signal, only 12 out of the 24 original frames are included in the 60 Hz output video signal because the other 12 are not properly located temporally for a 60 Hz video stream.
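The 12-of-24 count in the example above follows from checking which input frame instants coincide exactly with output frame instants. A sketch using exact rational arithmetic (the function name is an assumption):

```python
from fractions import Fraction

def originals_kept(input_hz, output_hz):
    """Indices of the input frames in one second whose display instants
    land exactly on an output frame instant."""
    output_times = {Fraction(m, output_hz) for m in range(output_hz)}
    return [k for k in range(input_hz)
            if Fraction(k, input_hz) in output_times]

kept = originals_kept(24, 60)
assert len(kept) == 12                  # matches the 12-of-24 example
assert kept == list(range(0, 24, 2))    # exactly the even-numbered frames
# For 24 -> 96 Hz, 96 is an integer multiple of 24, so every original
# frame is properly located temporally.
assert len(originals_kept(24, 96)) == 24
```

Since k/24 = m/60 requires 5k = 2m, only even-numbered originals align, so the other 12 must be replaced by interpolated frames.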
  • the MCFI 116 generates an MCFI output video signal and sends the MCFI output video signal to the FME 118. In one embodiment, the MCFI output video signal is generated at the full output frame rate.
  • the FME 118 performs a frame modification technique on the MCFI output video signal from the MCFI 116. For every other frame of the MCFI output video signal, the FME 118 generates a peaked, or sharpened, frame; for each frame of the MCFI output video signal that is not peaked, the FME 118 generates a blurred frame. In other words, in some embodiments, the frame de-sharpener 130 blurs all the odd frames of the MCFI output video signal and the frame sharpener 132 sharpens all the even frames of the MCFI output video signal.
  • In another embodiment, the frame de-sharpener 130 blurs all the even frames of the MCFI output video signal and the frame sharpener 132 sharpens all the odd frames of the MCFI output video signal.
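One way to picture the de-sharpener/sharpener pair is unsharp masking against a low-pass copy of each frame. This is an illustrative sketch only, not the patent's filters: the box blur, the kernel size, and the gain parameter are all assumptions. A useful property of this construction is that, with a gain of 1, the average of a blurred frame and its peaked neighbour reproduces the original frame, which is the blending the human eye is relied upon to perform:

```python
import numpy as np

def box_blur(frame, k=3):
    # Separable box blur with edge padding (stand-in for the de-sharpener).
    pad = k // 2
    p = np.pad(frame.astype(float), pad, mode="edge")
    p = np.mean([np.roll(p, s, axis=0) for s in range(-pad, pad + 1)], axis=0)
    p = np.mean([np.roll(p, s, axis=1) for s in range(-pad, pad + 1)], axis=0)
    return p[pad:-pad, pad:-pad]

def modify_frames(frames, gain=1.0):
    """Blur the odd frames; peak the even frames by unsharp masking,
    i.e. add back the detail removed by the low-pass filter."""
    out = []
    for i, f in enumerate(frames):
        low = box_blur(f)
        out.append(low if i % 2 else f + gain * (f - low))
    return out
```

With gain 1, `(peaked + blurred) / 2` equals the original frame exactly, since the added detail `f - low` on the even frame cancels the detail removed from the odd frame.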
  • the FME 118 outputs the output video signal at the same frame rate as the MCFI output video signal.
  • the FME 118 generates an FME output video signal and sends the FME output video signal to the display device 106 for viewing.
  • the peaked and blurred frames of the MCFI output video signal are displayed alternately on the display device 106, and as explained above, the peaked and blurred images are blended by the human eye.
  • the MCFI 116 receives an input video of 24 Hz and generates an MCFI output video signal of 96 Hz through temporal frame interpolation techniques that increase the frame rate from the input frame rate of 24 Hz to an output frame rate of 96 Hz.
  • the FME 118 then receives the 96 Hz MCFI output video signal and implements frame modification techniques that alternately sharpen/blur each of the frames of the MCFI output video signal, while maintaining the output frame rate of 96 Hz.
  • the FME output video signal from the FME 118 is the final output video which is sent to the display device 106 for viewing.
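As an illustration of the first stage of this 24 Hz to 96 Hz example, motion-compensated interpolation can be approximated by plain linear blending between neighbouring frames. This is a simplifying assumption: real MCFI displaces pixels along the estimated motion vectors rather than cross-fading, which is precisely why blending produces ghosting that MCFI avoids. Each original pair contributes three new in-between frames at the 4x rate:

```python
def upconvert(frames, factor=4):
    """Insert factor-1 interpolated frames between each neighbouring pair.
    Linear blending stands in for motion-compensated interpolation."""
    out = []
    for a, b in zip(frames, frames[1:]):
        for k in range(factor):
            t = k / factor  # temporal position between a (t=0) and b (t=1)
            out.append([(1 - t) * x + t * y for x, y in zip(a, b)])
    out.append(list(frames[-1]))  # keep the final original frame
    return out
```

On a continuous stream each input frame contributes `factor` output frames, giving the fourfold rate increase; the FME stage then relabels alternate frames as peaked or blurred without changing the 96 Hz rate.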
  • Some embodiments of the mixed frame rate conversion method 200 and/or the frame rate conversion method 250 may implement fewer or more operations.
  • some embodiments of the mixed frame rate conversion method 200 and/or the frame rate conversion method 250 facilitate implementation of any of the functions described in relation to the video display system 100 or any of the components thereof.
  • the frame modification process masks potential artifacts introduced by the temporal frame interpolation.
  • the joint MCFI and frame modification scheme combines the benefits of motion-compensated temporal up-conversion with the masking principles of dynamic frame insertion, while reducing the disadvantages of frame modification such as flicker and the loss of sharpness in moving objects relative to the video signal from the video source 102.
  • the FRC mixer 120 masks potential errors in motion compensated temporal frame interpolation and substantially reduces the flicker resulting from DFI.
  • the frame rate conversion is separated into two successive stages — a motion compensated frame interpolation stage and a frame modification stage, as described above.
  • Embodiments of the frame conversion techniques described herein are capable of masking MCFI generated artifacts as well as limiting peak/blur generated flicker and maintaining sharpness in the edges of moving images. It should also be noted that the embodiments described herein may be applied to any general video display system with an LCD screen.
  • At least some of the operations for the mixed frame rate conversion method 200, the frame rate conversion method 250, and the video display system 100 may be implemented using software instructions stored on a computer-usable storage medium for execution by a computer.
  • an embodiment of a computer program product includes a computer-usable storage medium to store a computer readable program that, when executed on a computer, causes the computer to perform operations, as described above.
  • Embodiments of the invention can take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment containing both hardware and software elements.
  • the invention is implemented in software, which includes but is not limited to firmware, resident software, microcode, etc.
  • embodiments of the invention can take the form of a computer program product accessible from a computer-usable or computer-readable storage medium providing program code for use by or in connection with a computer or any instruction execution system.
  • a computer-usable or computer-readable storage medium can be any apparatus that can store the program for use by or in connection with the instruction execution system, apparatus, or device.
  • the computer-usable or computer-readable storage medium can be an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system (or apparatus or device), or a propagation medium.
  • Examples of a computer-readable storage medium include a semiconductor or solid state memory, magnetic tape, a removable computer diskette, a random access memory (RAM), a read-only memory (ROM), a rigid magnetic disk, and an optical disk.
  • Current examples of optical disks include a compact disk with read only memory (CD-ROM), a compact disk with read/write (CD-R/W), a digital video disk (DVD), and high-definition (HD) disks such as Blu-Ray and HD-DVD.
  • An embodiment of a data processing system suitable for storing and/or executing program code includes at least one processor coupled directly or indirectly to memory elements through a system bus such as a data, address, and/or control bus.
  • the memory elements can include local memory employed during actual execution of the program code, bulk storage, and cache memories which provide temporary storage of at least some program code in order to reduce the number of times code must be retrieved from bulk storage during execution.


Abstract

The invention relates to a frame rate converter for converting the frame rate of an input video signal. The converter includes a motion compensated frame interpolator (MCFI) and a frame modification engine (FME). The interpolator performs motion compensated temporal frame interpolation on an input video signal having an input frame rate and generates, from the input video signal, an interpolator output video signal having an interpolator output frame rate. The frame modification engine generates an engine output video signal based on the interpolator output signal.
PCT/IB2009/052304 2008-06-02 2009-06-02 Artifact masking for temporal up-conversion WO2009147614A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US5792508P 2008-06-02 2008-06-02
US61/057,925 2008-06-02

Publications (1)

Publication Number Publication Date
WO2009147614A1 true WO2009147614A1 (fr) 2009-12-10

Family

ID=41077600

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2009/052304 WO2009147614A1 (fr) Artifact masking for temporal up-conversion

Country Status (1)

Country Link
WO (1) WO2009147614A1 (fr)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1551181A2 * 2003-12-23 2005-07-06 Genesis Microchip, Inc. Adaptive display controller
WO2007116370A1 * 2006-04-11 2007-10-18 Koninklijke Philips Electronics N.V. Motion compensated dynamic frame insertion with one-dimensional filtering
WO2008018006A2 * 2006-08-09 2008-02-14 Koninklijke Philips Electronics N.V. Image rate up-conversion
WO2008018015A1 * 2006-08-09 2008-02-14 Koninklijke Philips Electronics N.V. Image rate up-conversion device

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
CHEN HAI-FENG; LEE SUNG-HEE; KWON OH-JAE; KIM SUNG-SOO; SUNG JUN-HO; PARK YUNG-JUN: "Smooth frame insertion method for motion-blur reduction in LCDs", EURODISPLAY, 1 January 2005 (2005-01-01), pages 359 - 361, XP009087929 *

Similar Documents

Publication Publication Date Title
RU2419243C1 Image processing apparatus and method and image display apparatus and method
US8077258B2 Image display apparatus, signal processing apparatus, image processing method, and computer program product
US7817127B2 Image display apparatus, signal processing apparatus, image processing method, and computer program product
JP5187531B2 Image display device
US7400321B2 Image display unit
US8508659B2 System and method for frame rate conversion using multi-resolution temporal interpolation
JP2005043875A Method for processing a video image sequence in a liquid crystal display panel
JP2007121704A Moving image display device and moving image display method
US8531610B2 Arrangement and approach for image data processing
US8155476B2 Image processing apparatus, image processing method, and program
US8098333B2 Phase shift insertion method for reducing motion artifacts on hold-type displays
US8687123B2 Video signal processing
Klompenhouwer et al. 48.1: LCD Motion Blur Reduction with Motion Compensated Inverse Filtering
JP2011124893A Image processing apparatus, control method therefor, and program
JP2003069859A Motion-adaptive moving image processing
WO2009147614A1 Artifact masking for temporal up-conversion
Chen et al. Nonlinearity compensated smooth frame insertion for motion-blur reduction in LCD
WO2007116370A1 Motion compensated dynamic frame insertion with one-dimensional filtering
JP5114274B2 Television receiver and frame rate conversion method therefor
US20110134316A1 Image display apparatus and method
US6980599B2 Video decoding system and method having post-processing to reduce sharpness prediction drift
JP2009027608A Noise reduction device and noise reduction method
JP2007010699A Liquid crystal display device
JP5911210B2 Image display device
KR100759224B1 Image display method for a display device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 09757964

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 09757964

Country of ref document: EP

Kind code of ref document: A1