CN101322400A - Method and apparatus for spatio-temporal deinterlacing aided by motion compensation for field-based video - Google Patents


Info

Publication number
CN101322400A
Authority
CN
China
Prior art keywords
frame
deinterlacing
progressive
generate
spatio-temporal
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CNA2006800455528A
Other languages
Chinese (zh)
Inventor
Tao Tian
Fang Shi
Vijayalakshmi R. Raveendran
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Qualcomm Inc
Original Assignee
Qualcomm Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Qualcomm Inc
Publication of CN101322400A
Legal status: Pending

Abstract

The invention comprises devices and methods for processing multimedia data to generate progressive frame data from interlaced frame data. In one aspect, a method of processing multimedia data includes generating spatio-temporal information for a selected frame of interlaced multimedia data, generating motion information for the selected frame, and deinterlacing fields of the selected frame based on the spatio-temporal information and the motion information to form a progressive frame associated with the selected frame. In another aspect an apparatus for processing multimedia data can include a spatial filter module configured to generate spatio-temporal information of a selected frame of interlaced multimedia data, a motion estimator configured to generate motion information for the selected frame, and a deinterlacer configured to deinterlace fields of the selected frame and form a progressive frame corresponding to the selected frame based on the spatio-temporal information and the motion information.

Description

Method and apparatus for spatio-temporal deinterlacing aided by motion compensation for field-based video
Claim of priority under 35 U.S.C. § 119
The present application for patent claims priority to (1) Provisional Application No. 60/727,643, entitled "METHOD AND APPARATUS FOR SPATIO-TEMPORAL DEINTERLACING AIDED BY MOTION COMPENSATION FOR FIELD-BASED VIDEO," filed October 17, 2005, and (2) Provisional Application No. 60/789,048, entitled "SPATIO-TEMPORAL DEINTERLACING AIDED BY MOTION COMPENSATION FOR FIELD-BASED MULTIMEDIA DATA," filed April 3, 2006. Both provisional patent applications are assigned to the assignee hereof and are hereby expressly incorporated by reference.
Technical field
The present invention relates generally to the processing of multimedia data, and more particularly to deinterlacing multimedia data based on spatio-temporal and motion compensation processing.
Background technology
Deinterlacing refers to the process of converting interlaced video (a sequence of fields) into non-interlaced progressive frames (a sequence of frames). Deinterlacing multimedia data (sometimes abbreviated herein simply as "deinterlacing") produces at least some image degradation, because it requires interpolating, spatially between the lines of a field and/or temporally between neighboring fields, to generate the "missing" data needed to form a progressive frame. Typically, deinterlacing processes use various linear interpolation techniques and are designed to be computationally simple to achieve fast processing speeds.
The growing demand for delivering interlaced multimedia data to progressive-frame display devices (for example, mobile phones, computers, personal digital assistants (PDAs)) has increased the importance of deinterlacing. A major difficulty of deinterlacing is that field-based video signals typically do not meet the requirements of the sampling theorem. The theorem states that a continuous-time baseband signal can be exactly reconstructed from its samples if the signal is band-limited and the sampling frequency is greater than twice the signal bandwidth. If the sampling condition is not satisfied, frequencies overlap and the resulting distortion is called aliasing. In some television broadcast systems, the pre-filtering before sampling, which could remove the aliasing condition, is absent. Conventional deinterlacing techniques, including "Bob" (intra-field vertical interpolation), "Weave" (inter-field temporal interpolation), and linear vertical-temporal (VT) filtering, also fail to overcome aliasing effects. Spatially, these linear filters treat image edges the same way they treat smooth regions; the resulting edges are therefore soft. Temporally, these linear filters do not exploit motion information, and the resulting images suffer from high levels of aliasing due to non-smooth transitions between original and recovered fields. Although linear filters perform poorly, they remain widely used because of their low computational complexity. Accordingly, improved deinterlacing methods and systems are needed.
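The conventional techniques named above can be sketched in a few lines. The following is a minimal illustrative sketch, not part of the described invention; the function names, field layout, and border handling are assumptions chosen for brevity. It shows "Bob" intra-field interpolation and "Weave" inter-field interleaving on tiny grayscale fields:

```python
def bob(field, top):
    """'Bob': intra-field vertical interpolation. Each missing line is
    the average of the lines above and below it (line doubling at the
    borders)."""
    h = 2 * len(field)
    w = len(field[0])
    out = [[0] * w for _ in range(h)]
    for i, line in enumerate(field):
        out[2 * i + (0 if top else 1)] = list(line)
    for y in range(h):
        if (y % 2 == 0) != top:  # this line was not transmitted
            above = out[y - 1] if y > 0 else out[y + 1]
            below = out[y + 1] if y < h - 1 else out[y - 1]
            out[y] = [(a + b) // 2 for a, b in zip(above, below)]
    return out


def weave(top_field, bottom_field):
    """'Weave': inter-field temporal interpolation. The two fields are
    simply interleaved; perfect for static content, combs on motion."""
    out = []
    for t, b in zip(top_field, bottom_field):
        out.append(list(t))
        out.append(list(b))
    return out
```

Bob halves the vertical detail (soft edges), while Weave reproduces full detail on static content but produces combing artifacts wherever there is motion between the two fields, which is why neither overcomes aliasing.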
Summary of the invention
Each of the inventive apparatuses and methods described herein has several aspects, no single one of which is solely responsible for its desirable attributes. Without limiting the scope of the invention, its more prominent features will now be discussed briefly. After considering this discussion, and particularly after reading the section entitled "Detailed Description," one will understand how the features of this invention provide improved multimedia data processing apparatuses and methods.
In one aspect, a method of processing multimedia data includes: generating spatio-temporal information for a selected frame of interlaced multimedia data; generating motion compensation information for the selected frame; and deinterlacing fields of the selected frame based on the spatio-temporal information and the motion compensation information to form a progressive frame associated with the selected frame. Generating the spatio-temporal information can include processing the interlaced multimedia data with a weighted median filter and generating a spatio-temporal provisional deinterlaced frame. Deinterlacing the fields of the selected frame can further include combining the spatio-temporal provisional deinterlaced frame and a motion-compensated provisional deinterlaced frame to form the progressive frame. The motion compensation information can be generated using candidate motion vectors (generated, for example, by a motion estimator). The motion compensation information can be bidirectional motion information. In some aspects, candidate motion vectors are received and used to generate the motion compensation information. In some aspects, the candidate motion vectors for a block in the frame are determined from the candidate motion vectors of neighboring blocks. Generating the spatio-temporal information can include generating at least one motion intensity map. In some aspects, the motion intensity map classifies regions into three or more different motion levels. The motion intensity map can be used to classify regions of the selected frame into the different motion levels. The provisional deinterlaced frame can be generated based on the motion intensity map, where multiple criteria of Wmed filtering can be used to generate the provisional deinterlaced frame based on the motion intensity map. In some aspects, a denoising filter (for example, a wavelet shrinkage filter or a Wiener filter) is used to remove noise from the provisional frame.
In another aspect, an apparatus for processing multimedia data includes: a filter module configured to generate spatio-temporal information for a selected frame of interlaced multimedia data; a motion estimator configured to generate bidirectional motion information for the selected frame; and a combiner configured to form a progressive frame corresponding to the selected frame using the spatio-temporal information and the motion information. The spatio-temporal information can include a spatio-temporal provisional deinterlaced frame, the motion information can include a motion-compensated provisional deinterlaced frame, and the combiner can be configured to form the progressive frame by combining the spatio-temporal provisional deinterlaced frame and the motion-compensated provisional deinterlaced frame.
In another aspect, an apparatus for processing multimedia data includes: means for generating spatio-temporal information for a selected frame of interlaced multimedia data; means for generating motion information for the selected frame; and means for deinterlacing fields of the selected frame based on the spatio-temporal information and the motion information to form a progressive frame associated with the selected frame. The deinterlacing means can include means for combining a spatio-temporal provisional deinterlaced frame and a motion-compensated provisional deinterlaced frame to form the progressive frame. More generally, the combining means can be configured to form the progressive frame by combining the spatio-temporal information and the motion information. The means for generating spatio-temporal information can be configured to generate a motion intensity map of the selected frame, and to use the motion intensity map to generate the spatio-temporal provisional deinterlaced frame. In some aspects, the means for generating spatio-temporal information is configured to generate at least one motion intensity map and to generate a provisional deinterlaced frame based on the motion intensity map.
In another aspect, a machine-readable medium includes instructions that upon execution cause a machine to: generate spatio-temporal information for a selected frame of interlaced multimedia data; generate bidirectional motion information for the selected frame; and deinterlace fields of the frame based on the spatio-temporal information and the motion information to form a progressive frame corresponding to the selected frame. As used herein, "machine-readable medium" can refer to one or more devices for storing data, including read-only memory (ROM), random access memory (RAM), magnetic disk storage media, optical storage media, flash memory devices, and/or other machine-readable media for storing information. The term "machine-readable medium" includes, but is not limited to, portable or fixed storage devices, optical storage devices, wireless channels, and various other media capable of storing, containing, or carrying instructions and/or data.
In another aspect, a processor for processing multimedia data is disclosed, the processor comprising a configuration to: generate spatio-temporal information for a selected frame of interlaced multimedia data; generate motion information for the selected frame; and deinterlace fields of the selected frame based on the spatio-temporal information and the motion information to form a progressive frame associated with the selected frame.
Description of drawings
Fig. 1 is a block diagram of a multimedia communication system for delivering streaming multimedia;
Fig. 2 is a block diagram of certain components of a multimedia communication system for delivering streaming multimedia;
Fig. 3A is a block diagram illustrating a deinterlacer device;
Fig. 3B is a block diagram illustrating another deinterlacer device;
Fig. 3C is a block diagram illustrating another deinterlacing apparatus;
Fig. 4 is a drawing of the sub-sampling pattern of an interlaced picture;
Fig. 5 is a block diagram illustrating a deinterlacer device that uses Wmed filtering and motion estimation to generate a deinterlaced frame;
Fig. 6 illustrates one aspect of an aperture used to determine static regions of multimedia data;
Fig. 7 is a drawing illustrating one aspect of an aperture used to determine slow-motion regions of multimedia data;
Fig. 8 is a drawing illustrating an aspect of motion estimation;
Fig. 9 illustrates two motion vector maps used in determining motion compensation;
Fig. 10 is a flowchart illustrating a method of deinterlacing multimedia data;
Fig. 11 is a flowchart illustrating a method of generating a deinterlaced frame using spatio-temporal information;
Fig. 12 is a flowchart illustrating performing motion compensation for use in deinterlacing;
Fig. 13 is an image illustrating a selected original "football" frame;
Fig. 14 is an image illustrating an interlaced frame of the image shown in Fig. 13;
Fig. 15 is an image illustrating a deinterlaced Wmed frame of the original football frame shown in Fig. 13; and
Fig. 16 is an image illustrating the deinterlaced frame obtained by combining the Wmed frame of Fig. 15 with motion compensation information.
Embodiment
In the following description, specific details are given to provide a thorough understanding of the described aspects. However, it will be understood by one of ordinary skill in the art that the aspects may be practiced without these specific details. For example, circuits may be shown in block diagrams so as not to obscure the aspects with unnecessary detail. In other instances, well-known circuits, structures, and techniques may not be shown in detail in order not to obscure the aspects.
This description discloses inventive aspects of systems and methods that can be used, alone or in combination, to improve the performance of deinterlacing. These aspects can include: deinterlacing a selected frame using spatio-temporal filtering to determine a first provisional deinterlaced frame; using bidirectional motion estimation and motion compensation to determine a second provisional deinterlaced frame from the selected frame; and then combining the first and second provisional frames to form a final progressive frame. The spatio-temporal filtering can use a weighted median filter ("Wmed"), which can include a horizontal edge detector that prevents blurring horizontal or near-horizontal edges. Spatio-temporal filtering of previous and subsequent neighboring fields relative to the "current" field produces a motion-level intensity map that categorizes portions of the selected frame into different motion levels (for example, static, slow motion, and fast motion).
In some aspects, the intensity map is generated by Wmed filtering using a filter aperture that includes pixels from five neighboring fields (two previous fields, the current field, and two subsequent fields). The Wmed filter can perform forward, backward, and bidirectional static-region detection, which can effectively handle scene changes and the appearance and disappearance of objects. In various aspects, the Wmed filter can be used across fields of the same parity in an inter-field filtering mode, and switched to an intra-field filtering mode by adjusting threshold criteria. In some aspects, motion estimation uses luma (the intensity or brightness of pixels) and chroma data (the color information of pixels) to improve accuracy in regions of the selected frame where the luma level is nearly uniform but the color differs. A denoising filter can be used to increase the accuracy of motion estimation. The denoising filter can be applied to the Wmed provisional deinterlaced frame to remove aliasing artifacts produced by the Wmed filtering. The deinterlacing methods and systems described herein produce good deinterlacing results with relatively low computational complexity, allowing fast deinterlacing implementations, which makes these implementations suitable for a wide variety of deinterlacing applications, including systems that provide data to mobile phones, computers, or other types of electronic or communication devices that use displays.
" aspect " that this paper quoted, " on the one hand ", " some aspects " or " some aspect " mean in conjunction with one or more being included at least one aspect in the described special characteristic in described aspect, structure or the characteristic.These phrases appear at different places in the specification may not perfect representation with on the one hand, independent aspect or the alternative aspect that neither repel mutually with others.In addition, description can be by some aspects but not the various features that represented by others.Similarly, describe and can be for some aspects but not the multiple requirement of the requirement of others.
"Deinterlacer," as used herein, is a broad term that can be used to describe a deinterlacing system, device, or process (including, for example, software, firmware, or hardware configured to perform a process) that processes, in whole or in significant part, interlaced multimedia data to form progressive multimedia data.
"Multimedia data," as used herein, is a broad term that includes video data (which can include audio data), audio data, or both video data and audio data. "Video data" or "video," as used herein as a broad term, refers to an image sequence containing text or image information and/or audio data, and can be used to refer to multimedia data; the terms may be used interchangeably unless otherwise specified.
Fig. 1 is a block diagram of a multimedia communication system 10 for delivering streaming multimedia or multimedia of other types. This technique finds application in the digital transmission of compressed video to a variety of terminals, as shown in Fig. 1. A digital video source can be, for example, a digital cable feed or a digitized analog high signal-to-noise-ratio source. The video source is processed in a transmission facility 12 and modulated onto a carrier for transmission through a network 14 to terminals 16. The network 14 can be any type of network, wired or wireless, suitable for the transmission of data. For example, the network can be a cell phone network, a local or wide area network (wired or wireless), or the Internet. The terminals 16 can be any type of communication device, including, but not limited to, mobile phones, PDAs, and personal computers.
Broadcast video produced by conventional means (with a video camera, in a broadcast studio, etc.) conforms in the United States to the NTSC (National Television System Committee) standard. A common way of compressing video is to interlace it. In interlaced data, each frame is made up of one of two fields. One field consists of the odd lines of the frame, the other of the even lines. While the frames are generated at approximately 30 frames per second, the fields are records of the television camera's image that are 1/60 of a second apart. Each frame of an interlaced video signal shows every other horizontal line of the image. As the frames are projected on the screen, the video signal alternates between showing the even lines and the odd lines. When this alternation occurs fast enough (for example, around 60 fields per second), the video image appears smooth to the human eye.
Interlacing has been used for decades in analog television broadcasts based on the NTSC (United States) and PAL (phase alternating line, Europe) formats. Because only half the image is sent with each frame, interlaced video uses approximately half the bandwidth that sending the entire picture would require. The eventual display format of the video internal to the terminals 16 is not necessarily NTSC-compatible and cannot readily display interlaced data. Instead, modern pixel-based displays (for example, liquid crystal display (LCD), digital light processing (DLP), liquid crystal on silicon (LCOS), plasma, etc.) are progressive-scan and require progressive-scan video sources (whereas many older video devices use the older interlaced-scan technology). Examples of some commonly used deinterlacing algorithms are described in "Scan rate up-conversion using adaptive weighted median filtering" by P. Haavisto, J. Juhola, and Y. Neuvo, Signal Processing of HDTV II, pp. 703-710, 1990, and in "Deinterlacing of HDTV Images for Multimedia Applications" by R. Simonetti, S. Carrato, G. Ramponi, and A. Polo Filisan, Signal Processing of HDTV IV, pp. 765-772, 1993.
Fig. 2 illustrates certain components of the digital transmission facility 12 used to deinterlace multimedia data. The transmission facility 12 includes a receiver 20 in communication with a source of interlaced multimedia data. The source can be external (as shown) or can reside within the transmission facility 12. The receiver 20 can be configured to receive interlaced multimedia data in a transmission format and convert it into a format that is more readily processed further. The receiver 20 provides the interlaced multimedia data to a deinterlacer 22, which interpolates the interlaced data and generates progressive video frames. Aspects of deinterlacers and deinterlacing methods are described herein with reference to various components, modules, and/or steps used to deinterlace multimedia data.
Fig. 3A is a block diagram illustrating one aspect of the deinterlacer 22. The deinterlacer 22 includes a spatial filter 30 that spatially and temporally ("spatio-temporally") filters at least a portion of the interlaced data and generates spatio-temporal information. For example, a Wmed filter can be used in the spatial filter 30. In some aspects, the deinterlacer 22 also includes a denoising filter (not shown), for example, a Wiener filter or a wavelet shrinkage filter. The deinterlacer 22 also includes a motion estimator 32 that provides motion estimation and compensation for a selected frame of the interlaced data and generates motion information. A combiner 34 in the deinterlacer 22 receives and combines the spatio-temporal information and the motion information to form a progressive frame.
Fig. 3B is another block diagram of the deinterlacer 22. A processor 36 in the deinterlacer 22 includes a spatial filter module 38, a motion estimator module 40, and a combiner module 42. Interlaced multimedia data from an external source 48 can be provided to a communication module 44 in the deinterlacer 22. The deinterlacer, and components or steps thereof, can be implemented by hardware, software, firmware, middleware, microcode, or any combination thereof. For example, the deinterlacer can be a standalone component, incorporated as hardware, firmware, or middleware in a component of another device, or implemented in microcode or software executed on a processor, or a combination thereof. When implemented in software, firmware, middleware, or microcode, the program code or code segments that perform the deinterlacer tasks can be stored in a machine-readable medium such as a storage medium. A code segment can represent a procedure, a function, a subprogram, a program, a routine, a subroutine, a module, a software package, a class, or any combination of instructions, data structures, or program statements. A code segment can be coupled to another code segment or a hardware circuit by passing and/or receiving information, data, arguments, parameters, or memory contents.
The received interlaced data can be stored in the deinterlacer 22 in a storage medium 46, which can include, for example, a chip-based storage medium (e.g., ROM, RAM) connected to the processor 36 or a disc-type storage medium (e.g., magnetic or optical). In some aspects, the processor 36 can contain part or all of the storage medium. The processor 36 is configured to process the interlaced multimedia data to form progressive frames that are then provided to another device or process.
Fig. 3C is a block diagram illustrating another deinterlacing apparatus 31. The deinterlacing apparatus 31 includes means for generating spatio-temporal information, for example, a module 33 for generating spatio-temporal information. The deinterlacing apparatus also includes means for generating motion information, for example, a module 35 for generating motion information. In some aspects, the motion information is bidirectional motion information. The deinterlacing apparatus 31 also includes means for deinterlacing, for example, a module 37 for deinterlacing fields of a selected frame, which generates a progressive frame associated with the selected frame being processed based on the spatio-temporal information and the motion information. Processes that can be incorporated into the configured modules illustrated in Fig. 3C, including, for example, the process of Fig. 5, are described throughout this application.
Illustrative aspects of a spatio-temporal deinterlacer
As described above, conventional analog video devices such as televisions render video in an interlaced manner, i.e., such devices transmit even-numbered scan lines (the even field) and odd-numbered scan lines (the odd field). From the standpoint of signal sampling, this is equivalent to a spatio-temporal sub-sampling in the pattern described by equation (1):
F(x, y, n) = Θ(x, y, n),  if (y mod 2) = (n mod 2); otherwise the pixel is dropped        (1)
where Θ denotes the original full-frame picture, F denotes the interlaced field, and (x, y, n) denote the horizontal, vertical, and temporal position of a pixel, respectively.
Without loss of generality, it can be assumed throughout this disclosure that n = 0 is an even field, so that equation (1) above reduces to:
F(x, y, 0) = Θ(x, y, 0),  if y mod 2 = 0; otherwise the pixel is dropped        (2)
Since no decimation is performed in the horizontal dimension, the sub-sampling pattern can be depicted in the n-y coordinate plane. In Fig. 4, both the circles and the stars represent positions where the original full frame has a sampled pixel. The interlacing process decimates the star pixels, while the circle pixels remain intact. Note that vertical positions are indexed starting from zero, so the even field is the top field and the odd field is the bottom field.
The goal of a deinterlacer is to transform interlaced video (a sequence of fields) into non-interlaced progressive frames (a sequence of frames). In other words, the even and odd fields are interpolated to "recover" or generate full-frame pictures. This can be represented by equation (3):
F_o(x, y, n) = F(x, y, n),    if (y mod 2) = (n mod 2)
F_o(x, y, n) = F_i(x, y, n),  otherwise        (3)
where F_i denotes the result of the deinterlacing interpolation for the missing pixels.
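Equations (1) and (3) can be made concrete with a small sketch. The following illustrative Python (zero-based indexing as in the description; None marks a dropped pixel; all names are assumptions for this example) interlaces a frame sequence per equation (1) and recombines sampled and interpolated lines per equation (3):

```python
def interlace(frames):
    """Equation (1): keep Theta(x, y, n) where y % 2 == n % 2; mark the
    complementary (dropped) lines with None."""
    fields = []
    for n, frame in enumerate(frames):
        field = []
        for y, row in enumerate(frame):
            field.append(list(row) if y % 2 == n % 2 else [None] * len(row))
        fields.append(field)
    return fields


def recombine(field, interp, n):
    """Equation (3): F_o = F on the sampled lines, F_i (the interpolated
    values) on the missing lines."""
    return [list(row) if y % 2 == n % 2 else list(interp[y])
            for y, row in enumerate(field)]
```

Any deinterlacing method (Bob, Weave, Wmed, motion compensation) is then simply a particular way of producing the F_i values passed to the recombination step.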
Fig. 5 is a block diagram illustrating certain aspects of one aspect of a deinterlacer 22 that uses Wmed filtering and motion estimation to generate a progressive frame from interlaced multimedia data. The upper part of Fig. 5 shows a motion intensity map 52 that can be generated using information from the current field, two previous fields (the PP field and the P field), and two subsequent fields (the Next field and the Next Next field). The motion intensity map 52 categorizes, or partitions, the current frame into two or more different motion levels, and can be generated by spatio-temporal filtering, described in further detail below. In some aspects, the motion intensity map 52 is generated to identify static regions, slow-motion regions, and fast-motion regions, as described in reference to equations 4-8 below. A spatio-temporal filter, e.g., a Wmed filter 54, filters the interlaced multimedia data using criteria based on the motion intensity map, and produces a spatio-temporal provisional deinterlaced frame. In some aspects, the Wmed filtering process involves a horizontal neighborhood of [1, 1], a vertical neighborhood of [3, 3], and a temporal neighborhood of five neighboring fields, which are represented by the five fields illustrated in Fig. 5 (PP field, P field, current field, Next field, Next Next field), with Z-1 denoting a delay of one field. Relative to the current field, the Next field and the P field are non-parity fields, and the PP field and the Next Next field are parity fields. The "neighborhood" used for spatio-temporal filtering refers to the spatial and temporal locations of the fields and pixels actually used during the filtering operation, and can be illustrated as an "aperture," as shown, for example, in Fig. 6 and Fig. 7.
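The core Wmed operation is a weighted median. A minimal sketch follows; the replication-based definition of a weighted median is standard, but the aperture pixels and weights shown here are illustrative assumptions, not the patent's actual coefficients:

```python
def weighted_median(samples, weights):
    """Replicate each sample by its (integer) weight and take the
    ordinary median of the expanded list."""
    expanded = []
    for s, w in zip(samples, weights):
        expanded.extend([s] * w)
    expanded.sort()
    m = len(expanded)
    if m % 2:
        return expanded[m // 2]
    return (expanded[m // 2 - 1] + expanded[m // 2]) / 2


def wmed_missing_pixel(above, below, prev, nxt, weights=(2, 2, 1, 1)):
    """Estimate one missing pixel from its vertical neighbors in the
    current field and co-located pixels in adjacent fields (a toy
    aperture; the real aperture spans five fields)."""
    return weighted_median([above, below, prev, nxt], list(weights))
```

Because a median selects rather than averages, giving larger weights to the spatial neighbors preserves edges better than the linear filters discussed in the background.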
The deinterlacer 22 can also include a denoiser (denoising filter) 56. The denoiser 56 is configured to filter the spatio-temporal provisional deinterlaced frame generated by the Wmed filter 54. Denoising the spatio-temporal provisional deinterlaced frame makes the subsequent motion search process more accurate, especially if the source interlaced multimedia data sequence is contaminated by white noise. It can also at least partly remove the aliasing between the even and odd lines in the Wmed picture. The denoiser 56 can be implemented as a variety of filters, described further below, including denoisers based on a wavelet shrinkage filter and a wavelet Wiener filter.
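As one concrete instance of the denoising family mentioned above, the following sketch applies one level of Haar wavelet decomposition with soft thresholding to a 1-D signal; the transform depth, the threshold, and applying it line-by-line to a provisional frame are assumptions made for brevity, not the patent's specific design:

```python
def haar_shrink(signal, thresh):
    """One Haar decomposition level: split into averages and details,
    soft-threshold the detail coefficients, then reconstruct.
    `signal` is assumed to have even length."""
    avg = [(a + b) / 2 for a, b in zip(signal[0::2], signal[1::2])]
    det = [(a - b) / 2 for a, b in zip(signal[0::2], signal[1::2])]
    # Soft thresholding: shrink small (noise-like) details to zero,
    # keep large (edge-like) details mostly intact.
    det = [max(abs(d) - thresh, 0.0) * (1.0 if d >= 0 else -1.0) for d in det]
    out = []
    for a, d in zip(avg, det):
        out.extend([a + d, a - d])
    return out
```

Small line-to-line oscillations (such as residual interline aliasing) fall below the threshold and are smoothed away, while strong transitions survive the shrinkage.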
The lower part of Fig. 5 illustrates aspects of determining the motion information (for example, candidate motion vectors, motion estimation, motion compensation) of the interlaced multimedia data. In particular, Fig. 5 illustrates a motion estimation and motion compensation scheme that is used to generate a motion-compensated provisional progressive frame for the selected frame, which is then combined with the Wmed provisional frame to form the resulting "final" progressive frame, shown as the deinterlaced current frame 64. In some aspects, candidate motion vectors ("MVs") for the interlaced multimedia data are provided to the deinterlacer from an external motion estimator and used as a starting point for a bidirectional motion estimator and motion compensator ("ME/MC") 68. In some aspects, a candidate MV selector 72 uses previously determined MVs of neighboring blocks as the candidate MVs for the block being processed (for example, the MVs of previously processed blocks, such as blocks in a deinterlaced previous frame 70). The motion compensation can be performed bidirectionally, based on the deinterlaced previous frame 70 and a Wmed next (e.g., future) frame 58. A Wmed current frame 60 and a motion-compensated ("MC") current frame 66 are merged, or combined, by a combiner 62. The resulting deinterlaced current frame 64, now a progressive frame, is provided back to the ME/MC 68 for use as the deinterlaced previous frame 70, and is also communicated outside the deinterlacer for further processing (for example, compression and transmission to a display terminal). The various aspects shown in Fig. 5 are described in more detail below.
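The motion-search step can be illustrated with a toy block matcher. The sketch below scores displacements by sum of absolute differences (SAD) and refines each candidate motion vector over a small window, mirroring the use of neighboring-block MVs as starting points; the block size, search range, and all names are assumptions:

```python
def sad(ref, cur, bx, by, dx, dy, bs):
    """Sum of absolute differences between the bs x bs block of `cur`
    at (bx, by) and the block of `ref` displaced by (dx, dy)."""
    total = 0
    for y in range(bs):
        for x in range(bs):
            total += abs(cur[by + y][bx + x] - ref[by + y + dy][bx + x + dx])
    return total


def best_mv(ref, cur, bx, by, bs, candidates, search=1):
    """Refine each candidate motion vector over a +/- `search` window
    and return the displacement with the lowest SAD."""
    best = None
    for cdx, cdy in candidates:
        for dy in range(cdy - search, cdy + search + 1):
            for dx in range(cdx - search, cdx + search + 1):
                if not (0 <= by + dy and by + dy + bs <= len(ref)):
                    continue  # displaced block falls outside the frame
                if not (0 <= bx + dx and bx + dx + bs <= len(ref[0])):
                    continue
                cost = sad(ref, cur, bx, by, dx, dy, bs)
                if best is None or cost < best[0]:
                    best = (cost, (dx, dy))
    return best[1]
```

Seeding the search with neighboring-block MVs keeps the refinement window small, which is one reason candidate-based estimation is cheaper than an unconstrained full search.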
Fig. 10 illustrates a process 80 for processing multimedia data to generate a sequence of progressive frames from a sequence of interlaced frames. In one aspect, the progressive frames are generated by the deinterlacer illustrated in Fig. 5. At block 82, process 80 (process "A") generates spatio-temporal information for a selected frame. The spatio-temporal information can include information used to classify the motion levels of the multimedia data and to generate a motion intensity map, and includes the Wmed provisional deinterlaced frame and the information used to generate it (for example, the information used in equations 4-11). This process can be performed by the Wmed filter 54 (illustrated in the upper part of Fig. 5) and its associated processing, described in more detail below. As further described with reference to process A (illustrated in Fig. 11), at block 92 the fields are classified into regions of different motion levels.
Next, at block 84 (process "B"), process 80 generates motion compensation information for the selected frame. In one aspect, this can be performed by the bidirectional motion estimator/motion compensator 68 (illustrated in the lower part of Fig. 5). Process 80 then proceeds to block 86, where it deinterlaces the fields of the selected frame based on the spatio-temporal information and the motion compensation information, to form a progressive frame associated with the selected frame. This can be performed by the combiner 62 illustrated in the lower part of Fig. 5.
Motion intensity map

For each frame, a motion intensity map 52 can be determined by processing pixels in the current field to identify regions of different "motion." Illustrative aspects of determining a three-category motion intensity map are described below with reference to Figs. 6-9. Based on comparisons of pixels in same-parity and opposite-parity fields, the motion intensity map designates regions of each frame as static regions, slow-motion regions, and fast-motion regions.

Static regions

Determining the static regions of the motion map can comprise processing pixels in an aperture of adjacent fields to determine whether the luminance differences of certain pixels satisfy certain criteria. In some aspects, determining the static regions of the motion map comprises processing pixels in the aperture of five adjacent fields (the current field (C), the two fields temporally before the current field, and the two fields temporally after the current field) to determine whether the luminance differences of certain pixels satisfy certain thresholds. These five fields are illustrated in Fig. 5, where Z^-1 represents a delay of one field. In other words, the five adjacent fields would typically be displayed in that sequence, each separated by a Z^-1 time delay.

Fig. 6 illustrates an aperture identifying certain pixels of each of the five fields that can be used for spatio-temporal filtering, according to some aspects. The aperture comprises, from left to right, 3x3 groups of pixels of the previous-previous field (PP), the previous field (P), the current field (C), the next field (N), and the next-next field (NN). In some aspects, a region of the current field is considered static in the motion map if it satisfies the criteria of equations 4-6, where the pixel locations and corresponding fields are illustrated in Fig. 6:
|L_P − L_N| < T_1     (4)

and

(|L_BPP − L_B| + |L_EPP − L_E|) / 2 < T_1     (forward static)     (5)

or

(|L_BNN − L_B| + |L_ENN − L_E|) / 2 < T_1     (backward static)     (6)
where T_1 is a threshold,
L_P is the luminance of pixel P located in the P field,
L_N is the luminance of pixel N located in the N field,
L_B is the luminance of pixel B located in the current field,
L_E is the luminance of pixel E located in the current field,
L_BPP is the luminance of pixel B_PP located in the PP field,
L_EPP is the luminance of pixel E_PP located in the PP field,
L_BNN is the luminance of pixel B_NN located in the NN field, and
L_ENN is the luminance of pixel E_NN located in the NN field.
The threshold T_1 can be predetermined and set to a particular value, determined by a process other than deinterlacing and provided to the deinterlacer (for example, as metadata for the video being deinterlaced), or it can be determined dynamically during deinterlacing.
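For illustration only, the static-region test of equations 4-6 can be sketched as a small Python function. The dictionary keys, the flat/moved test apertures, and the threshold value below are hypothetical conventions for this sketch, not part of the described apparatus:

```python
def classify_static(L, T1):
    """Classify a current-field pixel as static per equations 4-6.

    L maps the Fig. 6 aperture pixel names to luminance values:
    P, N (previous/next opposite-parity fields), B, E (current field),
    BPP, EPP (previous-previous field), BNN, ENN (next-next field).
    Returns the set of static directions satisfied ('forward', 'backward');
    both present means bidirectionally static, empty means not static.
    """
    result = set()
    if abs(L["P"] - L["N"]) < T1:  # equation 4 (opposite-parity check)
        if (abs(L["BPP"] - L["B"]) + abs(L["EPP"] - L["E"])) / 2 < T1:
            result.add("forward")   # equation 5 (static w.r.t. the past)
        if (abs(L["BNN"] - L["B"]) + abs(L["ENN"] - L["E"])) / 2 < T1:
            result.add("backward")  # equation 6 (static w.r.t. the future)
    return result

# An aperture unchanged across all five fields is bidirectionally static;
# a large change in the NN field removes only the backward classification.
flat = {k: 128 for k in ("P", "N", "B", "E", "BPP", "EPP", "BNN", "ENN")}
assert classify_static(flat, T1=10) == {"forward", "backward"}
assert classify_static(dict(flat, BNN=200, ENN=200), T1=10) == {"forward"}
```

The three-way split (forward, backward, bidirectional) mirrors the finer static-region classification the text describes for scene changes and object appearance/disappearance.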
The static-region criteria described in equations 4, 5 and 6 above use more fields than conventional deinterlacing techniques, for at least two reasons. First, comparisons between same-parity fields have lower aliasing and phase mismatch than comparisons between opposite-parity fields. However, the minimum temporal difference (and hence correlation) between the field being processed and its nearest same-parity fields is two fields, which is greater than the minimum temporal difference of one field from its opposite-parity fields. Combining the more reliable opposite-parity fields with the lower-aliasing same-parity fields can improve the accuracy of static-region detection.

In addition, as shown in Fig. 6, the five fields can be distributed symmetrically in the past and the future with respect to pixel X in the current frame C. The static region can be subdivided into three categories: forward static (static with respect to the previous frame), backward static (static with respect to the next frame), or bidirectional (if both the forward and backward criteria are satisfied). This finer classification of static regions can improve performance, especially at scene changes and at object appearance/disappearance.
Slow-motion regions

A region of the motion map can be considered a slow-motion region if the luminance values of certain pixels do not satisfy the criteria for designating a static region but do satisfy criteria for designating a slow-motion region. Equation 7 below defines a criterion that can be used to determine slow-motion regions. Referring to Fig. 7, the locations of the pixels Ia, Ic, Ja, Jc, Ka, Kc, La, Lc, P and N identified in equation 7 are shown in an aperture centered on pixel X. The aperture comprises a 3x7 pixel neighborhood of the current field (C) and 3x5 pixel neighborhoods of the next field (N) and the previous field (P). Pixel X is considered part of a slow-motion region if it does not satisfy the static-region criteria listed above and if the pixels in the aperture satisfy the criterion shown in equation 7:
(|L_Ia − L_Ic| + |L_Ja − L_Jc| + |L_Ka − L_Kc| + |L_La − L_Lc| + |L_P − L_N|) / 5 < T_2     (7)

where T_2 is a threshold, and L_Ia, L_Ic, L_Ja, L_Jc, L_Ka, L_Kc, L_La, L_Lc, L_P and L_N are the luminance values of pixels Ia, Ic, Ja, Jc, Ka, Kc, La, Lc, P and N, respectively.

The threshold T_2 can likewise be predetermined and set to a particular value, determined by a process other than deinterlacing and provided to the deinterlacer (for example, as metadata for the video being deinterlaced), or it can be determined dynamically during deinterlacing.
It should be noted that, because of the angle of its edge-detection capability, the filter can blur edges that are more horizontal than vertical (for example, edges at an angle greater than 45°). For instance, the angle formed by pixels "A" and "F," or by "C" and "D," affects the edge-detection capability of the aperture (filter) illustrated in Fig. 7. Any edge more horizontal than that angle will not be optimally interpolated, and staircase artifacts can therefore appear along such edges. In some aspects, the slow-motion category can be divided into two subcategories, "horizontal edge" and "other," to account for this edge-detection effect. A slow-motion pixel can be classified as a horizontal edge if it satisfies the criterion of equation 8 shown below, and into the so-called "other" category if it does not:

|(L_A + L_B + L_C) − (L_D + L_E + L_F)| < T_3     (8)

where T_3 is a threshold, and L_A, L_B, L_C, L_D, L_E and L_F are the luminance values of pixels A, B, C, D, E and F.

Different interpolation methods can be used for each of the horizontal-edge and "other" categories.
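As a sketch only, the slow-motion test of equation 7 and the horizontal-edge subdivision of equation 8 could look like the following; the dictionary-based pixel naming and the threshold values are illustrative assumptions:

```python
def is_slow_motion(L, T2):
    """Equation 7: average of five luminance differences over the Fig. 7
    aperture pairs, compared against threshold T2."""
    pairs = [("Ia", "Ic"), ("Ja", "Jc"), ("Ka", "Kc"), ("La", "Lc"), ("P", "N")]
    return sum(abs(L[a] - L[b]) for a, b in pairs) / 5 < T2

def is_horizontal_edge(L, T3):
    """Equation 8 as stated in the text: a slow-motion pixel is classified
    as a horizontal edge when |(LA+LB+LC) - (LD+LE+LF)| < T3."""
    return abs((L["A"] + L["B"] + L["C"]) - (L["D"] + L["E"] + L["F"])) < T3

# Small uniform differences across the aperture qualify as slow motion;
# a large P/N difference pushes the average over the threshold.
slow = {"Ia": 0, "Ic": 4, "Ja": 0, "Jc": 4, "Ka": 0, "Kc": 4,
        "La": 0, "Lc": 4, "P": 0, "N": 4}
assert is_slow_motion(slow, T2=5)
assert not is_slow_motion(dict(slow, N=100), T2=5)
```

A pixel failing the static test, passing `is_slow_motion`, would then be routed to the horizontal-edge or "other" interpolation path by `is_horizontal_edge`.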
Fast-motion regions

If neither the static-region criteria nor the slow-motion criteria are satisfied, the pixel can be considered to be in a fast-motion region.

After classifying the pixels in the selected frame, process A (Fig. 11) proceeds to block 94 and generates a provisional deinterlaced frame based on the motion intensity map. In this aspect, the Wmed filter 54 (Fig. 5) filters the selected field and the appropriate adjacent fields to provide a candidate full-frame image F_0, which can be defined as follows:
[Equations 9 and 10, defining the Wmed candidate full-frame image F_0 for each motion category, appear as an image in the original document.]
where α_i (i = 0, 1, 2, 3) are integer weights calculated as follows:

β_0 = (A + F) / |A − F|,  β_1 = (B + E) / |B − E|,  β_2 = (C + D) / |C − D|,  β_3 = (G + H) / |G − H|     (11)
The Wmed-filtered provisional deinterlaced frame is provided for further processing in conjunction with the motion estimation and motion compensation processes, as illustrated in the lower part of Fig. 5.

As described above and shown in equation 9, the static interpolation method comprises inter-field interpolation, and the slow-motion and fast-motion interpolation methods comprise intra-field interpolation. In aspects that do not require temporal (that is, inter-field) interpolation of same-parity fields, the temporal interpolation can be "disabled" by setting the threshold T_1 (equations 4-6) to zero (T_1 = 0). With temporal interpolation disabled, processing of the current field results in no region of the motion-level map being classified as static, and the Wmed filter 54 (Fig. 5) operates on the three fields illustrated in the aperture of Fig. 7: the current field and the two adjacent opposite-parity fields.
Denoising

In some aspects, a denoising filter can be used to remove noise from the candidate Wmed frame before it is further processed using the motion compensation information. A denoising filter can remove noise that is present in the Wmed frame while preserving the signal, regardless of the signal's frequency content. Various types of denoising filters can be used, including wavelet filters. Wavelets are a class of functions used to localize a given signal in both the spatial and scale domains. The fundamental idea behind wavelets is to analyze the signal at different scales or resolutions, such that small changes in the wavelet representation produce correspondingly small changes in the original signal.

In some aspects, the denoising filter is based on an aspect of a (4, 2) biorthogonal cubic B-spline wavelet filter. Such a filter can be defined by the following forward and inverse transforms:
h(z) = 3/4 + (1/2)(z + z^-1) + (1/8)(z^2 + z^-2)     (forward transform)     (12)

and

g(z) = (5/4)z^-1 − (5/32)(1 + z^-2) − (3/8)(z + z^-3) − (3/32)(z^2 + z^-4)     (inverse transform)     (13)
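To make the transfer functions concrete, the polynomials of equations 12 and 13 correspond to the tap sequences below. Treating the low-pass analysis filter as a plain convolution is a sketch only: a full wavelet transform would also need subsampling and boundary handling, which are omitted here:

```python
import numpy as np

# Taps read off equation 12, h(z) = 3/4 + 1/2(z + z^-1) + 1/8(z^2 + z^-2),
# ordered from the z^2 coefficient down to the z^-2 coefficient.
h = np.array([1/8, 1/2, 3/4, 1/2, 1/8])

# Taps read off equation 13,
# g(z) = 5/4 z^-1 - 5/32(1 + z^-2) - 3/8(z + z^-3) - 3/32(z^2 + z^-4),
# ordered from the z^2 coefficient down to the z^-4 coefficient.
g = np.array([-3/32, -3/8, -5/32, 5/4, -5/32, -3/8, -3/32])

def analyze(signal):
    """Low-pass a 1-D signal with the cubic B-spline analysis taps."""
    return np.convolve(signal, h, mode="same")

# Sanity checks: h has DC gain 2 (unnormalized biorthogonal convention),
# and the g taps sum to zero.
assert abs(h.sum() - 2.0) < 1e-12
assert abs(g.sum()) < 1e-12
```

Note that the h taps (1, 4, 6, 4, 1)/8 are exactly the cubic B-spline binomial weights, which is consistent with the filter family named in the text.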
In a noisy environment, applying a denoising filter can increase the accuracy of motion compensation. Noise in the video sequence is assumed to be additive white Gaussian noise. The estimated variance of the noise is denoted by σ̂. It can be estimated as the median absolute deviation of the highest-frequency subband coefficients divided by 0.6745. Implementations of such filters are further described in D. L. Donoho and I. M. Johnstone, "Ideal spatial adaptation by wavelet shrinkage," Biometrika, vol. 81, pp. 425-455, 1994, which is incorporated herein by reference in its entirety.
Wavelet shrinkage filters and wavelet Wiener filters are also applicable as denoising filters. Wavelet shrinkage denoising can involve shrinking in the wavelet transform domain and typically comprises three steps: a linear forward wavelet transform, nonlinear shrinkage denoising, and a linear inverse wavelet transform. The Wiener filter is the minimum mean-squared-error (MSE) linear filter, and can be used to improve images degraded by additive noise and blurring. Such filters are generally known in the art and are described in, for example, "Ideal spatial adaptation by wavelet shrinkage," referenced above, and in S. P. Ghael, A. M. Sayeed and R. G. Baraniuk, "Improved wavelet denoising via empirical Wiener filtering," Proceedings of SPIE, vol. 3169, pp. 389-399, San Diego, July 1997.
Motion compensation
Referring to Fig. 12, at block 102 process B performs bidirectional motion estimation, and then at block 104 uses the motion estimates to perform motion compensation, which is illustrated in Fig. 5 and described in an illustrative aspect below. There is a one-field "lag" between the Wmed filter and the motion-compensation-based deinterlacer. The motion compensation information for the "missing" data (the non-original rows of pixel data) of the current field "C" is predicted from information in both the previous frame "P" and the next frame "N," as shown in Fig. 8. In the current field (Fig. 8), solid lines represent rows where original pixel data exists, and dashed lines represent rows where Wmed-interpolated pixel data exists. In some aspects, motion compensation is performed in a 4-row by 8-column pixel neighborhood. However, this pixel neighborhood is an example for purposes of explanation, and it will be understood by those skilled in the art that, in other aspects, motion compensation can be performed based on pixel neighborhoods comprising different numbers of rows and columns; the choice of the numbers of rows and columns can be based on many factors including, for example, computational speed, available processing power, or the characteristics of the multimedia data being deinterlaced. Because the current field has only half of the rows, the four rows to be matched actually correspond to an area of 8 pixels by 8 pixels.
Referring to Fig. 5, the bidirectional ME/MC 68 can use the sum of squared errors (SSE) to measure the similarity between the block being predicted in the Wmed current frame 60 and prediction blocks in the Wmed next frame 58 and the deinterlaced previous frame 70. Pixel information from the most similar matching blocks is then used to fill in the missing data between the original pixel lines, producing the motion-compensated current frame 66. In some aspects, the pixel information is biased toward, or given more weight from, the deinterlaced previous frame 70, because it was produced from motion compensation information as well as Wmed information, whereas the Wmed next frame 58 has been deinterlaced only by spatio-temporal filtering.

In some aspects, to improve the matching performance in regions of the fields that have similar luma but different chroma, an SSE metric can be used that includes the pixel values of one or more luma pixel groups (for example, one 4-row by 8-column luma block) and one or more chroma pixel groups (for example, two 2-row by 4-column chroma blocks U and V). Such approaches effectively reduce mismatches in color-sensitive regions.
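A joint luma-plus-chroma SSE of the kind described, with one 4x8 luma block and two 2x4 chroma blocks, might be sketched as follows; the dictionary layout of the blocks is an illustrative assumption:

```python
import numpy as np

def joint_sse(cand, ref):
    """Sum of squared errors over luma (Y) and chroma (U, V) blocks.

    cand and ref are dicts holding a 4x8 'Y' block and 2x4 'U' and 'V'
    blocks, so that chroma mismatches penalize candidate blocks whose
    luma alone would otherwise look like a good match.
    """
    return sum(
        float(np.sum((cand[k].astype(float) - ref[k].astype(float)) ** 2))
        for k in ("Y", "U", "V")
    )

def blk(shape, value):
    """Helper: a constant uint8 block of the given shape."""
    return np.full(shape, value, dtype=np.uint8)

a = {"Y": blk((4, 8), 100), "U": blk((2, 4), 100), "V": blk((2, 4), 100)}
b = {"Y": blk((4, 8), 100), "U": blk((2, 4), 101), "V": blk((2, 4), 100)}
assert joint_sse(a, a) == 0.0   # identical blocks match perfectly
assert joint_sse(a, b) == 8.0   # same luma, chroma off by 1 in 8 samples
```

The `astype(float)` casts matter: squaring uint8 differences directly would wrap around rather than accumulate the true error.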
Motion vectors (MVs) have a granularity of 1/2 pixel in the vertical dimension and 1/2 or 1/4 pixel in the horizontal dimension. Interpolation filters can be used to obtain the fractional-pixel samples. For instance, some filters that can be used to obtain half-pixel samples include the bilinear filter (1, 1), the interpolation filter recommended by H.263/AVC, (1, −5, 20, 20, −5, 1), and a six-tap Hamming-windowed sinc function filter, (3, −21, 147, 147, −21, 3). 1/4-pixel samples can be generated from the full-pixel and half-pixel samples by applying a bilinear filter.
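For example, half-pixel interpolation with the six-tap (1, −5, 20, 20, −5, 1) filter mentioned above can be sketched as follows; the normalization by 32 (the sum of the taps) is assumed here, as is the list-based signal layout:

```python
def half_pel(samples, i):
    """Interpolate the half-pixel position between samples[i] and
    samples[i + 1] using the six-tap filter (1, -5, 20, 20, -5, 1) / 32.
    Requires two full-pixel samples on each side of the pair."""
    taps = (1, -5, 20, 20, -5, 1)
    window = samples[i - 2 : i + 4]
    return sum(t * s for t, s in zip(taps, window)) / 32.0

# On a linear ramp, the interpolated half-pel value is exactly the
# midpoint between the two surrounding full-pel samples.
ramp = [0, 1, 2, 3, 4, 5]
assert half_pel(ramp, 2) == 2.5
```

A quarter-pixel sample would then be the bilinear average of a full-pixel sample and an adjacent half-pixel sample, per the text above.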
In some aspects, motion compensation can use various types of search processes to match data at a certain position of the current frame (for example, depicting an object) with corresponding data at a different position in another frame (for example, the next frame or the previous frame), the difference in position within the respective frames indicating the object's motion. For example, the search process can use a full motion search, which can cover a larger search area, or a fast motion search, which can use fewer pixels and/or whose selected search-pattern pixels can have a particular shape (for example, a diamond shape). For a fast motion search, the search area can be centered on a motion estimate, or motion candidate, that can be used as a starting point for searching the adjacent frames. In some aspects, candidate MVs can be generated by an external motion estimator and provided to the deinterlacer. Motion vectors of macroblocks from a corresponding neighborhood in a previously motion-compensated adjacent frame can also be used as motion estimates. In some aspects, candidate MVs can be generated by searching a neighborhood of macroblocks (for example, 3 macroblocks by 3 macroblocks) of the corresponding previous and next frames.
Fig. 9 illustrates an example of two MV maps, MV_P and MV_N, that can be generated during motion estimation/compensation by searching a neighborhood of the previous frame and of the next frame, as shown in Fig. 8. In both MV_P and MV_N, the block to be processed to determine motion information is the center block, denoted "X." In both MV_P and MV_N, there are nine candidate MVs that can be used during motion estimation of the current block X being processed. In this example, four of the candidate MVs exist in the same field, from motion searches performed earlier, and are depicted by the lighter-colored blocks in MV_P and MV_N (Fig. 9). The other five candidate MVs, depicted by the darker blocks, are copied from the motion information (or maps) of the previously processed frame.
After motion estimation/compensation is completed, two interpolation results are available for the missing rows (represented by the dashed lines in Fig. 8): one interpolation result generated by the Wmed filter (the Wmed current frame 60 (Fig. 5)), and one generated by the motion estimation process of the motion compensator (the MC current frame 66). The combiner 62 typically merges the Wmed current frame 60 and the MC current frame 66 by using at least a portion of each to generate the current deinterlaced frame 64. However, under certain conditions, the combiner 62 can generate the current deinterlaced frame using only one of the Wmed current frame 60 or the MC current frame 66. In one example, the combiner 62 merges the Wmed current frame 60 and the MC current frame 66 to generate a deinterlaced output signal as shown in equation 14:
[Equation 14, the deinterlaced output combining the Wmed current frame and the MC current frame, appears as an image in the original document.]
where F_o(x, n) is the luminance value at position x = (x, y)^T (where ^T denotes transpose) in field n. A clipping function is used, defined as:

clip(0, 1, a) = 0 if a < 0;  1 if a > 1;  a otherwise     (15)

k_1 can be calculated as:

k_1 = clip(0, 1, C_1 · Diff)     (16)

where C_1 is a robustness parameter, and Diff is the luma difference between the predicted-frame pixel and the available pixel in the predicting frame (taken from the existing field). With a suitable choice of C_1, the relative importance of the mean squared error can be tuned. k_2 can be calculated as shown in equation 17:
k_2 = 1 − clip(0, 1, (1 − k_1) · (|F_Wmed(x − y_u, n) − F_MC(x − y_u − D, n−1)| + δ) / (|F_Wmed(x, n) − F_MC(x − D, n−1)| + δ))     (17)
where x = (x, y) is the pixel position, y_u = (0, 1) is the vertical unit vector, D is the motion vector, and δ is a small constant to prevent division by zero. Deinterlacing that filters using a clipping function is further described in G. de Haan and E. B. Bellers, "De-interlacing of video data," IEEE Transactions on Consumer Electronics, vol. 43, no. 3, pp. 819-825, 1997, which is incorporated herein by reference in its entirety.
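Under the reading of equations 15-17 given above, the clipping function and the k_2 weight can be sketched as follows. Because the final mixing formula of equation 14 survives only as an image in the original, only the weights are shown, and the scalar difference inputs are hypothetical stand-ins for the Wmed-vs-MC luma differences:

```python
def clip(lo, hi, a):
    """Clipping function of equation 15."""
    return lo if a < lo else hi if a > hi else a

def k2_weight(k1, num_diff, den_diff, delta=1e-3):
    """Equation 17: k2 = 1 - clip(0, 1, (1 - k1) * (num + d) / (den + d)).

    num_diff is the |Wmed - MC| luma difference at the pixel above
    (position x - y_u), den_diff is the same difference at the pixel
    itself, and delta guards against division by zero.
    """
    return 1 - clip(0, 1, (1 - k1) * (num_diff + delta) / (den_diff + delta))

assert clip(0, 1, -0.5) == 0 and clip(0, 1, 2.0) == 1 and clip(0, 1, 0.3) == 0.3
# When the Wmed and MC frames agree at both positions, the ratio is 1
# and k2 collapses to k1.
assert abs(k2_weight(0.25, 0.0, 0.0) - 0.25) < 1e-12
```

This sketch makes visible the behavior implied by equation 17: a large Wmed/MC disagreement at the neighboring pixel (num_diff) drives k_2 toward 0, pulling the output toward one of the two frames.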
In some aspects, the combiner 62 can be configured to try to maintain the following equation, to achieve a high peak signal-to-noise ratio (PSNR) and robust results:

|F_o(x, n) − F_Wmed(x, n)| = |F_o(x − y_u, n) − F_Wmed(x − y_u, n)|     (18)
The deinterlacing prediction scheme comprising inter-field interpolation can be decoupled from the intra-field interpolation by the Wmed+MC deinterlacing scheme. In other words, the spatio-temporal Wmed filtering can be used mainly for intra-field interpolation purposes, while the inter-field interpolation can be performed during motion compensation. This reduces the PSNR of the Wmed result, but the visual quality after motion compensation is applied is more pleasing, because bad pixels resulting from inaccurate inter-field prediction-mode decisions are removed by the Wmed filtering process.

Chroma handling should be consistent with the collocated luma handling. In terms of motion-map generation, the motion level of a chroma pixel is obtained by observing the motion levels of its four collocated luma pixels. The operation can be based on voting (the chroma motion level takes the dominant luma motion level). However, the following conservative approach is suggested: if any of the four luma pixels has a fast-motion level, the chroma motion level is fast motion; otherwise, if any of the four luma pixels has a slow-motion level, the chroma motion level is slow motion; otherwise, the chroma motion level is static. The conservative approach may not achieve the highest PSNR, but it avoids the risk of blurring wherever INTER (inter-field) prediction would be used on an uncertain chroma motion level.
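The conservative rule just described can be sketched directly; the string motion-level labels are hypothetical names for this sketch:

```python
def chroma_motion_level(luma_levels):
    """Conservative chroma motion level from the four collocated luma
    pixels: fast if any luma pixel is fast, else slow if any is slow,
    else static."""
    if "fast" in luma_levels:
        return "fast"
    if "slow" in luma_levels:
        return "slow"
    return "static"

assert chroma_motion_level(["static", "static", "slow", "fast"]) == "fast"
assert chroma_motion_level(["static", "slow", "static", "static"]) == "slow"
assert chroma_motion_level(["static"] * 4) == "static"
```

A single fast-motion luma pixel is enough to force the chroma pixel into the fast-motion (intra-field) path, which is exactly the blur-avoiding bias the text argues for.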
Multimedia data sequences were deinterlaced using the Wmed algorithm alone, as described above, and using the combined Wmed and motion-compensation algorithm described herein. The same multimedia data sequences were also deinterlaced using a pixel blending (or averaging) algorithm, and using a "no deinterlacing" case in which the fields were merely combined without any interpolation or blending. The resulting frames were analyzed to determine the PSNR, shown in the following table:
PSNR (dB)

Sequence    No deinterlacing    Blending    Wmed        Wmed+MC
Football    8.955194            11.38215    19.26221    19.50528
City        11.64183            12.93981    15.03303    15.09859
Crew        13.32435            15.66387    22.36501    22.58777
Although deinterlacing using MC in addition to Wmed yields only a small PSNR improvement, the visual quality of the deinterlaced images produced by combining the Wmed and MC interpolation results is more pleasing because, as discussed above, combining the Wmed and MC results suppresses aliasing and noise between the even and odd fields.
Figs. 13-16 illustrate an example of the performance of the described deinterlacer. Fig. 13 shows the original frame #109 of "Football." Fig. 14 shows the same frame #109 as interlaced data. Fig. 15 shows frame #109 as a Wmed frame, in other words, the Wmed frame resulting from processing by the Wmed filter 54 (Fig. 5). Fig. 16 shows frame #109 obtained by combining the Wmed interpolation and the motion-compensated interpolation.
It should be noted that the aspects described herein may be described as a process that is depicted as a flowchart, a flow diagram, a structure diagram, or a block diagram. Although a flowchart may describe the operations as a sequential process, many of the operations can be performed in parallel or concurrently. In addition, the order of the operations may be rearranged. A process terminates when its operations are completed. A process may correspond to a method, a function, a procedure, a subroutine, a subprogram, etc. When a process corresponds to a function, its termination corresponds to a return of the function to the calling function or the main function.

It should also be apparent to those skilled in the art that one or more elements of a device disclosed herein may be rearranged without affecting the operation of the device. Similarly, one or more elements of a device disclosed herein may be combined without affecting the operation of the device. Those of ordinary skill in the art will understand that information and signals may be represented using any of a variety of different processes and techniques. Those skilled in the art will further appreciate that the various illustrative logical blocks, modules, and algorithm steps described in connection with the examples disclosed herein may be implemented as electronic hardware, firmware, computer software, middleware, microcode, or combinations thereof. To clearly illustrate this interchangeability of hardware and software, various illustrative components, blocks, modules, circuits, and steps have been described above generally in terms of their functionality. Whether such functionality is implemented as hardware or software depends upon the particular application and the design constraints imposed on the overall system. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the disclosed methods.

The steps of a method or algorithm described in connection with the examples disclosed herein may be embodied directly in hardware, in a software module executed by a processor, or in a combination of the two. A software module may reside in RAM memory, flash memory, ROM memory, EPROM memory, EEPROM memory, registers, a hard disk, a removable disk, a CD-ROM, or any other form of storage medium known in the art. An exemplary storage medium is coupled to the processor such that the processor can read information from, and write information to, the storage medium. In the alternative, the storage medium may be integral to the processor. The processor and the storage medium may reside in an application-specific integrated circuit (ASIC). The ASIC may reside in a wireless modem. In the alternative, the processor and the storage medium may reside as discrete components in a wireless modem.

In addition, the various illustrative logical blocks, components, modules, and circuits described in connection with the examples disclosed herein may be implemented or performed with a general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, discrete gate or transistor logic, discrete hardware components, or any combination thereof designed to perform the functions described herein. A general-purpose processor may be a microprocessor, but in the alternative, the processor may be any conventional processor, controller, microcontroller, or state machine. A processor may also be implemented as a combination of computing devices, e.g., a combination of a DSP and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a DSP core, or any other such configuration.

The previous description of the disclosed examples is provided to enable any person of ordinary skill in the art to make or use the disclosed methods and apparatus. Various modifications to these examples will be readily apparent to those skilled in the art, and the principles defined herein may be applied to other examples, and additional elements may be added, without departing from the spirit or scope of the disclosed methods and apparatus. The description of the aspects is intended to be illustrative and not to limit the scope of the claims.

Claims (41)

1. A method of processing multimedia data, the method comprising:
generating spatio-temporal information for a selected frame of interlaced multimedia data;
generating motion compensation information for the selected frame; and
deinterlacing fields of the selected frame based on the spatio-temporal information and the motion compensation information, to form a progressive frame associated with the selected frame.

2. The method of claim 1, wherein generating spatio-temporal information comprises generating a spatio-temporal provisional deinterlaced frame, wherein generating motion information comprises generating a motion-compensated provisional deinterlaced frame, and wherein deinterlacing fields of the selected frame further comprises combining the spatio-temporal provisional deinterlaced frame and the motion-compensated provisional deinterlaced frame to form the progressive frame.

3. The method of claim 1, further comprising generating the motion compensation information using candidate motion vectors.

4. The method of claim 1, further comprising:
receiving candidate motion vectors;
determining a motion vector based on the candidate motion vectors; and
using the motion vector to generate the motion compensation information.

5. The method of claim 1, further comprising:
determining candidate motion vectors for a block of video data in the selected frame from motion vector estimates of neighboring blocks; and
using the candidate motion vectors to generate the motion compensation information.
6. method according to claim 1 wherein produces space time information and comprises:
Produce at least one exercise intensity mapping graph; And
Produce interim release of an interleave frame based on described exercise intensity mapping graph, wherein said release of an interleave comprises the described interim release of an interleave frame of use and described movable information produces described frame in proper order.
7. method according to claim 6 wherein produces interim release of an interleave frame and comprises: if described at least one exercise intensity mapping graph indication selected condition, the described staggered multi-medium data of spatial filtering so.
8. The method of claim 6, wherein generating the at least one motion intensity map comprises classifying regions of the selected frame into different motion levels.
9. The method of claim 8, wherein generating the at least one motion intensity map comprises spatially filtering the interlaced multimedia data based on the different motion levels.
10. The method of claim 1, wherein spatial filtering comprises processing the interlaced multimedia data using a weighted median filter.
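Claim 10's weighted median filter can be sketched as follows: each sample in the filter window is counted according to its weight and the ordinary median of the expanded list is returned. The weights a deinterlacer would actually use are not specified in the claims.

```python
import numpy as np

def weighted_median(values, weights):
    """Weighted median: replicate each sample `weight` times, then take
    the ordinary median of the expanded list.  The weight vector is an
    illustrative assumption."""
    expanded = np.repeat(np.asarray(values, dtype=float),
                         np.asarray(weights, dtype=int))
    return float(np.median(expanded))
```

Giving the spatially nearest samples larger weights biases the filter toward them while still rejecting outliers, which is why weighted medians are popular for edge-preserving interpolation.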
11. The method of claim 6, wherein generating the provisional deinterlaced frame comprises performing spatial filtering across a plurality of fields of the interlaced multimedia data based on the motion intensity map.
12. The method of claim 1, wherein generating the spatio-temporal information comprises performing spatio-temporal filtering of a current field across a temporal neighborhood of selected fields.
13. The method of claim 12, wherein the temporal neighborhood comprises a preceding field temporally located before the current field and a subsequent field temporally located after the current field.
14. The method of claim 12, wherein the temporal neighborhood comprises a plurality of preceding fields temporally located before the current field and a plurality of subsequent fields temporally located after the current field.
15. The method of claim 1, wherein generating the spatio-temporal information comprises generating a provisional deinterlaced frame based on spatio-temporal filtering and filtering the provisional deinterlaced frame using a denoising filter.
16. The method of claim 15, wherein deinterlacing the fields of the selected frame comprises combining the denoised provisional deinterlaced frame and the motion information to form the progressive frame.
17. The method of claim 15, wherein the denoising filter comprises a wavelet shrinkage filter.
18. The method of claim 15, wherein the denoising filter comprises a Wiener filter.
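Claims 15–18 denoise the provisional deinterlaced frame with a wavelet shrinkage or Wiener filter. The following is a minimal one-level Haar soft-thresholding sketch of wavelet shrinkage on a 1-D signal; the choice of wavelet, decomposition depth, and threshold value are all assumptions, since the claims specify none of them.

```python
import numpy as np

def haar_shrink(signal, threshold):
    """One-level Haar wavelet soft-thresholding on a 1-D signal of even
    length: small detail coefficients (mostly noise) are shrunk toward
    zero before reconstruction."""
    s = np.asarray(signal, dtype=float)
    approx = (s[0::2] + s[1::2]) / np.sqrt(2.0)   # low-pass coefficients
    detail = (s[0::2] - s[1::2]) / np.sqrt(2.0)   # high-pass coefficients
    detail = np.sign(detail) * np.maximum(np.abs(detail) - threshold, 0.0)
    out = np.empty_like(s)
    out[0::2] = (approx + detail) / np.sqrt(2.0)  # inverse Haar transform
    out[1::2] = (approx - detail) / np.sqrt(2.0)
    return out
```

A smooth signal passes through unchanged, while isolated small-amplitude fluctuations are flattened; a production deinterlacer would apply this in 2-D over several decomposition levels.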
19. The method of claim 1, wherein generating the motion information comprises performing bi-directional motion estimation on the selected frame to generate motion vectors, and performing motion compensation using the motion vectors.
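Claim 19's bi-directional motion estimation can be sketched with exhaustive block matching against both the previous and the next frame. Block size, search range, and the SAD cost function below are illustrative choices, not details from the claims.

```python
import numpy as np

def block_motion_search(cur, ref, by, bx, bs=4, rng=2):
    """Exhaustive block matching: find the (dy, dx) displacement in
    `ref` minimizing the sum of absolute differences (SAD) against the
    bs x bs block of `cur` anchored at (by, bx)."""
    block = cur[by:by + bs, bx:bx + bs].astype(int)
    best_sad, best_mv = None, (0, 0)
    for dy in range(-rng, rng + 1):
        for dx in range(-rng, rng + 1):
            y, x = by + dy, bx + dx
            if y < 0 or x < 0 or y + bs > ref.shape[0] or x + bs > ref.shape[1]:
                continue  # candidate falls outside the reference frame
            sad = np.abs(block - ref[y:y + bs, x:x + bs].astype(int)).sum()
            if best_sad is None or sad < best_sad:
                best_sad, best_mv = sad, (dy, dx)
    return best_mv

def bidirectional_mv(cur, prev_frame, next_frame, by, bx):
    """Estimate a backward (toward the previous frame) and a forward
    (toward the next frame) motion vector for one block."""
    return (block_motion_search(cur, prev_frame, by, bx),
            block_motion_search(cur, next_frame, by, bx))
```

For an object moving steadily to the right, the backward and forward vectors come out roughly opposite, which is what lets the compensator interpolate the missing lines from either temporal direction.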
20. The method of claim 1, further comprising:
generating a provisional deinterlaced frame associated with the selected frame based on the spatio-temporal information;
obtaining motion vectors for the provisional deinterlaced frame; and
performing motion compensation using the motion vectors to generate the motion information, wherein the motion information comprises a motion-compensated frame, and
wherein the deinterlacing comprises combining the motion-compensated frame and the provisional deinterlaced frame.
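The final combining step of claim 20 merges the motion-compensated frame with the spatio-temporal provisional frame. One way to sketch it is a per-pixel blend whose weight decays with the disagreement between the two estimates; this particular weighting rule is an assumption for illustration, not taken from the claims.

```python
import numpy as np

def combine_frames(spatio_temporal, motion_compensated, k=10.0):
    """Per-pixel blend of the two provisional frames.  Where the two
    estimates agree, the motion-compensated value dominates; where they
    disagree strongly (motion estimation likely failed), the result
    falls back toward the spatio-temporal estimate."""
    st = np.asarray(spatio_temporal, dtype=float)
    mc = np.asarray(motion_compensated, dtype=float)
    w = 1.0 / (1.0 + np.abs(st - mc) / k)  # 1 on agreement, -> 0 on conflict
    return w * mc + (1.0 - w) * st
```

The scale `k` controls how much disagreement is tolerated before the combiner distrusts the motion-compensated estimate.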
21. The method of claim 20, further comprising:
generating a sequence of provisional deinterlaced frames in a temporal neighborhood around the selected frame based on the spatio-temporal information; and
generating the motion vectors using the sequence of provisional deinterlaced frames.
22. The method of claim 20, wherein performing motion compensation comprises performing bi-directional motion compensation.
23. The method of claim 21, further comprising denoise filtering the provisional deinterlaced frames.
24. The method of claim 21, wherein the sequence of provisional deinterlaced frames comprises a provisional deinterlaced frame of the multimedia data preceding the provisional deinterlaced frame of the selected frame and a provisional deinterlaced frame of the multimedia data following the provisional deinterlaced frame of the selected frame.
25. An apparatus for processing multimedia data, comprising:
a filter module configured to generate spatio-temporal information of a selected frame of interlaced multimedia data;
a motion estimator configured to generate bi-directional motion information for the selected frame; and
a combiner configured to form a progressive frame associated with the selected frame using the spatio-temporal information and the motion information.
26. The apparatus of claim 25, further comprising a denoiser configured to remove noise from the spatio-temporal information.
27. The apparatus of claim 25, wherein the spatio-temporal information comprises a spatio-temporal provisional deinterlaced frame, wherein the motion information comprises a motion-compensated provisional deinterlaced frame, and wherein the combiner is further configured to form the progressive frame by combining the spatio-temporal provisional deinterlaced frame and the motion-compensated provisional deinterlaced frame.
28. The apparatus of claim 25, wherein the motion information is bi-directional motion information.
29. The apparatus of claim 26, wherein the filter module is further configured to determine a motion intensity map of the selected frame and to use the motion intensity map to generate a spatio-temporal provisional deinterlaced frame, and the combiner is configured to form the progressive frame by combining the motion information and the spatio-temporal provisional deinterlaced frame.
30. The apparatus of claim 25, wherein the motion estimator is configured to use a previously generated progressive frame to generate at least a portion of the motion information.
31. An apparatus for processing multimedia data, comprising:
means for generating spatio-temporal information of a selected frame of interlaced multimedia data;
means for generating motion information for the selected frame; and
means for deinterlacing fields of the selected frame based on the spatio-temporal information and the motion information to form a progressive frame associated with the selected frame.
32. The apparatus of claim 31, wherein the spatio-temporal information comprises a spatio-temporal provisional deinterlaced frame, wherein the motion information comprises a motion-compensated provisional deinterlaced frame, and wherein the means for deinterlacing comprises means for combining the spatio-temporal provisional deinterlaced frame and the motion-compensated provisional deinterlaced frame to form the progressive frame.
33. The apparatus of claim 31, wherein the means for deinterlacing comprises a combiner configured to form the progressive frame by combining the spatio-temporal information and the motion information.
34. The apparatus of claim 31, wherein the motion information comprises bi-directional motion information.
35. The apparatus of claim 32, wherein the means for generating spatio-temporal information is configured to generate a motion intensity map of the selected frame and to use the motion intensity map to generate the spatio-temporal provisional deinterlaced frame, and wherein the means for combining is configured to form the progressive frame by combining the motion information and the spatio-temporal provisional deinterlaced frame.
36. The apparatus of claim 31, wherein the means for generating spatio-temporal information is configured to:
generate at least one motion intensity map; and
generate a provisional deinterlaced frame based on the motion intensity map,
wherein the means for deinterlacing is configured to generate the progressive frame using the provisional deinterlaced frame and the motion information.
37. The apparatus of claim 36, wherein generating the provisional deinterlaced frame comprises spatially filtering the interlaced multimedia data if the at least one motion intensity map indicates a selected condition.
38. The apparatus of claim 36, wherein generating the at least one motion intensity map comprises classifying regions of the selected frame into different motion levels.
39. The apparatus of claim 38, wherein generating the at least one motion intensity map comprises spatially filtering the interlaced multimedia data based on the different motion levels.
40. A machine-readable medium comprising instructions for processing multimedia data, wherein the instructions, when executed, cause a machine to:
generate spatio-temporal information of a selected frame of interlaced multimedia data;
generate bi-directional motion information for the selected frame; and
deinterlace fields of the selected frame based on the spatio-temporal information and the motion information to form a progressive frame corresponding to the selected frame.
41. A processor for processing multimedia data, the processor being configured to:
generate spatio-temporal information of a selected frame of interlaced multimedia data;
generate motion information for the selected frame; and
deinterlace fields of the selected frame based on the spatio-temporal information and the motion information to form a progressive frame associated with the selected frame.
CNA2006800455528A 2005-10-17 2006-10-17 Method and apparatus for spatio-temporal deinterlacing aided by motion compensation for field-based video Pending CN101322400A (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US72764305P 2005-10-17 2005-10-17
US60/727,643 2005-10-17
US60/789,048 2006-04-03
US11/536,894 2006-09-29

Publications (1)

Publication Number Publication Date
CN101322400A true CN101322400A (en) 2008-12-10

Family

ID=40181303

Family Applications (1)

Application Number Title Priority Date Filing Date
CNA2006800455528A Pending CN101322400A (en) 2005-10-17 2006-10-17 Method and apparatus for spatio-temporal deinterlacing aided by motion compensation for field-based video

Country Status (1)

Country Link
CN (1) CN101322400A (en)


Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103826082A (en) * 2014-01-21 2014-05-28 华为技术有限公司 Video processing method and device
US9516260B2 (en) 2014-01-21 2016-12-06 Huawei Technologies Co., Ltd. Video processing method and apparatus
CN103826082B (en) * 2014-01-21 2017-07-14 华为技术有限公司 A kind of method for processing video frequency and device
CN111445504A (en) * 2020-03-25 2020-07-24 哈尔滨工程大学 Water-to-air distortion correction algorithm based on image sequence

Similar Documents

Publication Publication Date Title
US20070206117A1 (en) Motion and apparatus for spatio-temporal deinterlacing aided by motion compensation for field-based video
CN100518243C (en) De-interlacing apparatus using motion detection and adaptive weighted filter
KR101536794B1 (en) Image interpolation with halo reduction
US9247250B2 (en) Method and system for motion compensated picture rate up-conversion of digital video using picture boundary processing
CN100479495C (en) De-interlacing method with the motive detection and self-adaptation weight filtering
US20020171759A1 (en) Adaptive interlace-to-progressive scan conversion algorithm
US8582032B2 (en) Motion detection for interlaced video
US20100177239A1 (en) Method of and apparatus for frame rate conversion
US8411751B2 (en) Reducing and correcting motion estimation artifacts during video frame rate conversion
US8102914B2 (en) Image decoder
Jeon et al. Specification of the geometric regularity model for fuzzy if-then rule-based deinterlacing
JP2004515980A (en) High-quality, cost-effective film-to-video converter for high-definition television
US7302003B2 (en) Method and device for image interpolation with motion compensation
CN100518288C (en) Adaptive vertical temporal flitering method of de-interlacing
Yu et al. Motion adaptive deinterlacing with accurate motion detection and anti-aliasing interpolation filter
US6909752B2 (en) Circuit and method for generating filler pixels from the original pixels in a video stream
CN101322400A (en) Method and apparatus for spatio-temporal deinterlacing aided by motion compensation for field-based video
Zhang et al. A polynomial approximation motion estimation model for motion-compensated frame interpolation
Sugiyama et al. Motion compensated frame rate conversion using normalized motion estimation
US6760376B1 (en) Motion compensated upconversion for video scan rate conversion
Wang et al. A block-wise autoregression-based deinterlacing algorithm
JP4681341B2 (en) Sequential scan converter
Biswas Content adaptive video processing algorithms for digital TV
JP4102620B2 (en) Video format converter
Hong et al. Edge-preserving spatial deinterlacing for still images using block-based region classification

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C02 Deemed withdrawal of patent application after publication (patent law 2001)
WD01 Invention patent application deemed withdrawn after publication

Open date: 20081210