CN101483746A - Deinterlacing method based on movement detection - Google Patents

Deinterlacing method based on movement detection

Info

Publication number
CN101483746A
CN101483746A
Authority
CN
China
Prior art keywords
pixel value
absolute difference
field
interpolation
directional dependency
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CNA2008103064495A
Other languages
Chinese (zh)
Other versions
CN101483746B (en)
Inventor
徐慧博
刘强
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sichuan Hongwei Technology Co Ltd
Original Assignee
Sichuan Hongwei Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sichuan Hongwei Technology Co Ltd filed Critical Sichuan Hongwei Technology Co Ltd
Priority to CN2008103064495A priority Critical patent/CN101483746B/en
Publication of CN101483746A publication Critical patent/CN101483746A/en
Application granted granted Critical
Publication of CN101483746B publication Critical patent/CN101483746B/en
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Landscapes

  • Television Systems (AREA)

Abstract

The invention relates to a spatio-temporal, multi-field-correlated de-interlacing technique that is easy to implement in hardware, and provides a de-interlacing method that adds spatio-temporal information to a motion-adaptive algorithm. Unlike an ordinary motion-adaptive algorithm, which switches to pure intra-field interpolation as soon as motion is detected, the method still searches a small neighbourhood of the adjacent fields for correlated points when the pixel is judged to be moving, and decides the concrete interpolation method of the next step by comparison with a preset threshold. The threshold guarantees that inter-field pixels with large directional correlation take part in the interpolation only when the inter-field directional correlation of the current field is sufficiently large. Compared with the various search algorithms of motion compensation the method consumes fewer hardware resources, and compared with plain motion adaptation it adds temporal information, so the interpolation result under motion keeps temporal continuity. Because it calculates with information from surrounding pixels, it overcomes the effect of noise on single-pixel operations and improves the accuracy of motion interpolation.

Description

Deinterlacing method based on motion detection
Technical field
The invention belongs to the field of digital image processing and video display technology, and specifically relates to a spatio-temporal, multi-field-correlated de-interlacing method that is easy to implement in hardware.
Technical background
Video de-interlacing is one of the indispensable image-processing techniques in present-day television applications.
The most widely used broadcast television standards are the NTSC standard (525 lines, 60 Hz field rate, used mainly in North America and Japan) and the PAL standard (625 lines, 50 Hz field rate, used mainly in Europe and China). Because interlaced pictures lose vertical resolution and suffer from defects such as line flicker and line crawl, an interlaced-to-progressive conversion, i.e. de-interlacing, is unavoidable. The essence of a de-interlacing algorithm is to interpolate the existing interlaced signal so as to recover the odd (or even) lines of the actual scene, restoring each field to a full frame and thereby improving the vertical definition of the image.
De-interlacing algorithms can be divided, according to the kind of filter they use, into linear algorithms, non-linear algorithms, motion-adaptive algorithms and motion-compensated algorithms. A motion-adaptive algorithm selects the interpolation method according to the motion parameters obtained from motion detection; the most common practice at present is to use field copy or field averaging when the detection parameter indicates a completely static pixel, and pure intra-field interpolation when it indicates full motion. Motion detection is usually based on the field difference: the pixel values of two adjacent fields are compared with a threshold, and the pixel is judged to be moving if the difference exceeds the threshold, otherwise it is regarded as static. Field copy assigns the pixel value at the same coordinates in the adjacent previous or next field to the pixel to be interpolated; field averaging takes the mean of the pixel values at the same coordinates in the previous and next fields; pure intra-field interpolation computes the value of the pixel to be interpolated from the data present in the current field only. A motion-adaptive algorithm can remove motion jaggies and improve the vertical definition of the image, but it needs pixel-by-pixel motion detection, is strongly affected by noise, and cannot use temporal information when interpolating in the motion state; the weighting it applies can also make the interpolation result inaccurate and degrade the de-interlacing effect.
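For orientation, the following is a minimal Python sketch (not from the patent) of the three basic operations referred to above: field copy, field averaging and pure intra-field (line-average) interpolation. The names prev_field, next_field and cur_field are assumptions of this sketch; they stand for 2D arrays of luma values, and border handling is omitted.

def field_copy(prev_field, i, j):
    # Inter-field copy: reuse the co-located pixel from the adjacent field.
    return prev_field[i][j]

def field_average(prev_field, next_field, i, j):
    # Inter-field average: mean of the co-located pixels in the previous and next fields.
    return (prev_field[i][j] + next_field[i][j]) / 2.0

def line_average(cur_field, i, j):
    # Pure intra-field interpolation: mean of the pixels directly above and below.
    return (cur_field[i - 1][j] + cur_field[i + 1][j]) / 2.0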
A motion-compensated algorithm (block matching, optical flow or another motion-estimation technique applied to the image) computes the motion displacement vector and interpolates along the motion trajectory. Images restored in this way retain their vertical definition well, but the estimated motion vectors must be accurate and reliable, which inevitably means a large amount of computation, heavy consumption of hardware resources and high hardware implementation complexity.
Summary of the invention
The technical problem to be solved by the invention is to provide a de-interlacing method that adds temporal information on the basis of a motion-adaptive algorithm.
The technical solution adopted by the invention to solve this problem is a de-interlacing method based on motion detection, comprising the following steps:
a. Calculate the maximum intra-field directional correlation of the pixel to be interpolated;
b. Judge whether the motion detection result of the pixel is the static state; if so, use field copy or field averaging; if not, the pixel is in the motion state, go to step c;
c. Calculate the maximum inter-field directional correlation of the pixel;
d. Judge whether the maximum inter-field directional correlation is greater than a preset threshold, and whether it is greater than the maximum intra-field directional correlation; if both hold, take as the interpolation result the average of the two pixel values adjacent to the pixel along the direction of maximum intra-field correlation and the two pixel values adjacent to it along the direction of maximum inter-field correlation; otherwise, take as the interpolation result the average of the two pixel values adjacent to the pixel along the direction of maximum intra-field correlation.
Unlike the ordinary motion-adaptive method, which switches straight to intra-field interpolation as soon as motion is detected, the present invention still searches a small neighbourhood of the adjacent fields for correlated points when the pixel is judged to be moving, and decides the concrete interpolation method of the next step by comparison with a preset threshold. The threshold guarantees that inter-field pixels take part in the interpolation only when the inter-field directional correlation of the current field is sufficiently large. Without this restriction, the inter-field and intra-field directional correlations might both be very small; even if the inter-field correlation then looked numerically better than the intra-field one, the situation would not actually be caused by motion, so the intra-field difference should still be considered. Compared with the various search algorithms of motion compensation the method consumes few hardware resources; compared with plain motion adaptation it adds temporal information, so the interpolation result under motion also has temporal continuity; and because it calculates with information from surrounding pixels it overcomes the influence of noise on single-pixel operations and improves the accuracy of motion interpolation.
Concretely, the invention reflects the intra-field/inter-field directional correlation by the absolute difference of the pixel values of the corresponding pixels along each intra-field/inter-field direction: the larger the absolute difference of the corresponding pixels, the smaller the directional correlation. Suppose the coordinates of the pixel to be interpolated in the current field fn are (i, j), and that a range of j±k is chosen on the two lines adjacent to the line to be interpolated. In the spatial domain, the absolute difference of the two pixels that are centrally symmetric about (i, j) is computed for each direction, and the two pixels along the direction of maximum intra-field correlation are denoted Xup and Xdown. For example, for direction a (a = 0, ..., k) the absolute difference Ia of the two pixels centrally symmetric about (i, j) is:
Ia=|P(i-1,j-a)-P(i+1,j+a)|
The larger this absolute difference, the larger the grey-level gap between the pixels at the two ends of this direction through the interpolation point, and the smaller the correlation; conversely, the smaller the difference, the larger the directional correlation. The absolute difference corresponding to the maximum directional correlation is therefore min(I0, I1, ..., Ik);
If min(I0, I1, ..., Ik) = Ia, then Xup = P(i-1, j-a) and Xdown = P(i+1, j+a);
The average of the two pixel values adjacent to the interpolation point along the direction of maximum intra-field correlation is X = (Xup + Xdown)/2.
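As a concrete illustration of the single-pair search just described, a minimal Python sketch follows (not part of the patent). P is assumed to be the current field as a 2D array of luma values, (i, j) the pixel to be interpolated, and border handling is omitted.

def intra_direction_single(P, i, j, k):
    # Absolute difference of the pixel pair centred on (i, j) for each direction a = 0..k.
    diffs = [abs(P[i - 1][j - a] - P[i + 1][j + a]) for a in range(k + 1)]
    a_best = min(range(k + 1), key=lambda a: diffs[a])   # smallest difference = largest correlation
    x_up, x_down = P[i - 1][j - a_best], P[i + 1][j + a_best]
    return (x_up + x_down) / 2.0, diffs[a_best]          # X and the minimum absolute difference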
To avoid the influence of single-pixel noise, the pixels immediately to the left and right of the two correlation pixels can also be included, taking their absolute differences along the same direction as the correlation calculation.
That is, the intra-field/inter-field directional correlation is reflected by the sum of the absolute difference of the corresponding pixels along the intra-field/inter-field direction, the absolute difference of their left-hand neighbours, and the absolute difference of their right-hand neighbours: the larger this sum, the smaller the directional correlation. Again suppose the coordinates of the pixel to be interpolated in the current field fn are (i, j), that a range of j±k is chosen on the two adjacent lines, that absolute differences are taken between pixels centrally symmetric about (i, j) in the spatial domain, and that the two pixels along the direction of maximum intra-field correlation are denoted Xup and Xdown.
For example, to obtain the intra-field absolute-difference sum Ib for direction b (b = 0, ..., k), the pixels P(i-1, j-b) and P(i+1, j+b) of direction b and the surrounding pairs P(i-1, j-b-1) and P(i+1, j+b-1), P(i-1, j-b+1) and P(i+1, j+b+1) are used; then:
Ib=|P(i-1,j-b)-P(i+1,j+b)|+|P(i-1,j-b-1)-P(i+1,j+b-1)|+|P(i-1,j-b+1)-P(i+1,j+b+1)|;
The intra-field absolute-difference sum corresponding to the maximum directional correlation is min(I0, I1, ..., Ik);
If min(I0, I1, ..., Ik) = Ib, then Xup = P(i-1, j-b) and Xdown = P(i+1, j+b);
The average of the two pixel values adjacent to the interpolation point along the direction of maximum intra-field correlation is X = (Xup + Xdown)/2.
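A sketch of this three-pair variant, under the same assumptions as the previous snippet (illustrative only); the extra left and right neighbour pairs are what suppress single-pixel noise.

def intra_direction_three(P, i, j, k):
    # Sum of the absolute differences of the centre pair and its left/right
    # neighbour pairs for each direction b = 0..k.
    def pair_sum(b):
        return (abs(P[i - 1][j - b]     - P[i + 1][j + b]) +
                abs(P[i - 1][j - b - 1] - P[i + 1][j + b - 1]) +
                abs(P[i - 1][j - b + 1] - P[i + 1][j + b + 1]))
    sums = [pair_sum(b) for b in range(k + 1)]
    b_best = min(range(k + 1), key=lambda b: sums[b])
    x_up, x_down = P[i - 1][j - b_best], P[i + 1][j + b_best]
    return (x_up + x_down) / 2.0, sums[b_best]           # X and the minimum sum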
Further, the invention provides a new motion detection method that obtains the field differences and the frame difference from the information of several pixels, which makes the motion detection result accurate. Judging the motion detection result of the pixel to be interpolated in step b specifically comprises the following steps (a sketch of this test follows the list):
b-1. Set a human-eye sensitivity threshold; to make the motion detection result better match the way the human eye watches, the threshold is selected according to the sensitivity of the human eye to luminance;
b-2. Compute the absolute difference between the average X of the two pixel values adjacent to the interpolation point along the direction of maximum intra-field correlation and the pixel value at the corresponding position of the interpolation point in the previous field; this is the first field difference fL. Compute the absolute difference between the same average and the pixel value at the corresponding position of the interpolation point in the next field; this is the second field difference fR. Compute the absolute difference between the pixel values at the corresponding positions of the interpolation point in the previous and next fields; this is the frame difference fLR. That is: fL=|fn-1(i,j)-X|; fR=|fn+1(i,j)-X|; fLR=|fn-1(i,j)-fn+1(i,j)|;
b-3. Take the maximum of the first field difference, the second field difference and the frame difference as the motion parameter MAXf, MAXf=max(fL, fR, fLR);
b-4. Judge whether the motion parameter is greater than the human-eye sensitivity threshold; if so, the motion detection result of the interpolation point is the motion state; if not, it is the static state.
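A sketch of the motion test in steps b-1 to b-4 (illustrative, not the patent's reference code). X is the intra-field candidate value from above; prev_field and next_field are names assumed here for the co-sited previous and next fields f(n-1) and f(n+1); TM is the human-eye sensitivity threshold.

def is_moving(prev_field, next_field, X, i, j, TM):
    fL = abs(prev_field[i][j] - X)                   # first field difference (step b-2)
    fR = abs(next_field[i][j] - X)                   # second field difference
    fLR = abs(prev_field[i][j] - next_field[i][j])   # frame difference
    MAXf = max(fL, fR, fLR)                          # motion parameter (step b-3)
    return MAXf > TM                                 # True = motion state, False = static (step b-4)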
The beneficial effect of the invention is that it uses field copy or field averaging when the motion detection result is static, while under motion it further judges whether the inter-field information correlation is large. In situations such as scene changes or large motion it adopts intra-field interpolation, so that inaccurate inter-frame information does not affect the interpolation result; otherwise the interpolation result still refers to inter-field information, which both avoids the adverse effect of inaccurate motion matching and simplifies the computation. Experiments show that good de-interlacing results are obtained.
Embodiment
1. Find the maximum intra-field correlation. Suppose the chosen range is j±2 and that one pixel on each side of the correlation pixels is used to reduce noise. The intra-field absolute-difference sums for the respective directions are:
I0=|P(i-1,j)-P(i+1,j)|+|P(i+1,j-1)-P(i-1,j-1)|+|P(i-1,j+1)-P(i+1,j+1)|;
I1=|P(i-1,j-1)-P(i+1,j+1)|+|P(i-1,j-2)-P(i+1,j)|+|P(i-1,j)-P(i+1,j+2)|;
I2=|P(i-1,j-2)-P(i+1,j+2)|+|P(i-1,j-3)-P(i+1,j+1)|+|P(i-1,j-1)-P(i+1,j+3)|;
The intra-field absolute-difference sum corresponding to the maximum directional correlation is min(I0, I1, I2);
If min(I0, I1, I2) = I0, then the two pixels along the direction of maximum intra-field correlation are Xup and Xdown:
Xup=P(i-1,j);
Xdown=P(i+1,j);
The average of the two pixel values adjacent to the interpolation point along the direction of maximum intra-field correlation is X, X = (Xup + Xdown)/2;
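Step 1 spelled out as code (an illustrative sketch only, not the patent's implementation). P stands for the current field fn as a 2D array of luma values; border handling at the picture edges is omitted.

def intra_best_j2(P, i, j):
    # The three sums I0, I1, I2 exactly as in step 1 (range j±2, one neighbour
    # pair on each side for noise suppression).
    I0 = abs(P[i-1][j]   - P[i+1][j])   + abs(P[i+1][j-1] - P[i-1][j-1]) + abs(P[i-1][j+1] - P[i+1][j+1])
    I1 = abs(P[i-1][j-1] - P[i+1][j+1]) + abs(P[i-1][j-2] - P[i+1][j])   + abs(P[i-1][j]   - P[i+1][j+2])
    I2 = abs(P[i-1][j-2] - P[i+1][j+2]) + abs(P[i-1][j-3] - P[i+1][j+1]) + abs(P[i-1][j-1] - P[i+1][j+3])
    sums = {0: I0, 1: I1, 2: I2}
    a = min(sums, key=sums.get)                      # direction of maximum intra-field correlation
    Xup, Xdown = P[i - 1][j - a], P[i + 1][j + a]
    return (Xup + Xdown) / 2.0, sums[a]              # X and min(I0, I1, I2)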
2. Select the human-eye sensitivity threshold TM on the corresponding human-eye sensitivity curve according to X. The value of TM may vary with the observer's visual system, so a suitable value should be chosen when it is fixed, otherwise motion may be wrongly judged as static and cause serious ghosting; a TM value between 12 and 24 is advisable;
3. Obtain the field differences fL, fR and the frame difference fLR from the information of several pixels:
f L=|f n-1(i,j)-X|;
f R=|f n+1(i,j)-X|;
f LR=|f n-1(i,j)-f n+1(i,j)|;
4. Compute the motion parameter MAXf, MAXf = max(fL, fR, fLR), and compare it with TM to determine whether the motion detection result of the interpolation point is the motion state or the static state:
if MAXf > TM
MS = 1; the motion detection result is the motion state, go to step 6;
else
MS = 0; the motion detection result is the static state, go to step 5;
5. The interpolation is then the field copy fn(i, j) = fn-1(i, j).
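Steps 4 and 5 amount to a threshold comparison followed by a field copy; a minimal sketch (illustrative only). MAXf, TM and the previous-field pixel prev_pixel are assumed to be available from the earlier steps.

def static_or_motion(MAXf, TM, prev_pixel):
    # Steps 4-5: compare the motion parameter with TM; when static, fn(i, j) = fn-1(i, j).
    MS = 1 if MAXf > TM else 0
    value = prev_pixel if MS == 0 else None          # None: value decided by steps 6-7
    return MS, value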
6. Further judge the inter-field correlation. Suppose a range of (i±2, j±2) is chosen in the corresponding regions of the previous and next fields. To obtain the inter-field absolute-difference sum I0 for direction 0 in the time domain, the pixels fn-1(i-2, j) and fn+1(i+2, j) of the (i±2, j±0) direction and the surrounding pixels fn-1(i-2, j-1) and fn+1(i+2, j-1), fn-1(i-2, j+1) and fn+1(i+2, j+1) are used;
Since the assumed range is (i±2, j±2), the inter-field absolute-difference sums I0, I1, I2 for directions 0, 1 and 2 are required:
I0=|fn-1(i-2,j)-fn+1(i+2,j)|+|fn-1(i-2,j-1)-fn+1(i+2,j-1)|+|fn-1(i-2,j+1)-fn+1(i+2,j+1)|;
I1=|fn-1(i-2,j-1)-fn+1(i+2,j+1)|+|fn-1(i-2,j-2)-fn+1(i+2,j)|+|fn-1(i-2,j)-fn+1(i+2,j+2)|;
I2=|fn-1(i-2,j-2)-fn+1(i+2,j+2)|+|fn-1(i-2,j-3)-fn+1(i+2,j+1)|+|fn-1(i-2,j-1)-fn+1(i+2,j+3)|;
The inter-field absolute-difference sum corresponding to the maximum directional correlation is min(I0, I1, I2);
If min(I0, I1, I2) = I0, then XL = fn-1(i-2, j) and XR = fn+1(i+2, j);
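Step 6 mirrors the intra-field search but pairs pixels across the previous and next fields; a sketch under the same assumptions as the earlier snippets (illustrative, borders ignored). prev_field and next_field are names assumed here for fn-1 and fn+1.

def inter_best_j2(prev_field, next_field, i, j):
    # The three inter-field sums I0, I1, I2 for the (i±2, j±2) range, pairing pixels
    # of the previous field fn-1 with pixels of the next field fn+1.
    A, B = prev_field, next_field
    I0 = abs(A[i-2][j]   - B[i+2][j])   + abs(A[i-2][j-1] - B[i+2][j-1]) + abs(A[i-2][j+1] - B[i+2][j+1])
    I1 = abs(A[i-2][j-1] - B[i+2][j+1]) + abs(A[i-2][j-2] - B[i+2][j])   + abs(A[i-2][j]   - B[i+2][j+2])
    I2 = abs(A[i-2][j-2] - B[i+2][j+2]) + abs(A[i-2][j-3] - B[i+2][j+1]) + abs(A[i-2][j-1] - B[i+2][j+3])
    sums = {0: I0, 1: I1, 2: I2}
    d = min(sums, key=sums.get)                      # direction of maximum inter-field correlation
    return A[i - 2][j - d], B[i + 2][j + d], sums[d] # XL, XR and min(I0, I1, I2)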
7. Motion-state interpolation: if (XL + XR)/2 < threshold TX and (XL + XR)/2 < X, the value of the pixel to be interpolated is computed as fn(i, j) = mid(XL, XR, X), i.e. the average of XL, XR and X; if these conditions are not satisfied, the interpolation is simply fn(i, j) = X.
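Step 7 written out as a sketch (illustrative). XL and XR come from step 6, X from step 1, and TX is the empirical threshold mentioned below; mid(XL, XR, X) is read here, as stated above, as the mean of the three values.

def motion_interpolate(XL, XR, X, TX):
    inter_avg = (XL + XR) / 2.0
    if inter_avg < TX and inter_avg < X:
        return (XL + XR + X) / 3.0                   # fn(i, j) = mid(XL, XR, X)
    return X                                         # otherwise fall back to the intra-field value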
The threshold TX and the human-eye sensitivity threshold TM used in this embodiment can both be obtained by those skilled in the art as empirical values through routine experiments, and adjusted according to concrete requirements.

Claims (4)

  1. A de-interlacing method based on motion detection, characterized by comprising the following steps:
    a. calculating the maximum intra-field directional correlation of the pixel to be interpolated;
    b. judging whether the motion detection result of the pixel to be interpolated is the static state; if so, using field copy or field averaging; if not, the pixel is in the motion state, and proceeding to step c;
    c. calculating the maximum inter-field directional correlation of the pixel to be interpolated;
    d. judging whether the maximum inter-field directional correlation is greater than a preset threshold, and whether it is greater than the maximum intra-field directional correlation; if both hold, taking as the interpolation result the average of the two pixel values adjacent to the pixel to be interpolated along the direction of maximum intra-field correlation and the two pixel values adjacent to it along the direction of maximum inter-field correlation; otherwise, taking as the interpolation result the average of the two pixel values adjacent to the pixel to be interpolated along the direction of maximum intra-field correlation.
  2. The de-interlacing method based on motion detection according to claim 1, characterized in that the intra-field directional correlation is reflected by the absolute difference of the pixel values of the corresponding pixels along the intra-field direction, the larger the absolute difference the smaller the intra-field directional correlation; and the inter-field directional correlation is reflected by the absolute difference of the pixel values of the corresponding pixels along the inter-field direction, the larger the absolute difference the smaller the inter-field directional correlation.
  3. The de-interlacing method based on motion detection according to claim 1, characterized in that the intra-field directional correlation is reflected by the sum of the absolute difference of the corresponding pixels along the intra-field direction, the absolute difference of their left-hand neighbours and the absolute difference of their right-hand neighbours, the larger the sum the smaller the intra-field directional correlation; and the inter-field directional correlation is reflected by the sum of the absolute difference of the corresponding pixels along the inter-field direction, the absolute difference of their left-hand neighbours and the absolute difference of their right-hand neighbours, the larger the sum the smaller the inter-field directional correlation.
  4. The de-interlacing method based on motion detection according to claim 1, 2 or 3, characterized in that judging the motion detection result of the pixel to be interpolated in step b specifically comprises the following steps:
    b-1. setting a human-eye sensitivity threshold;
    b-2. computing the absolute difference between the average of the two pixel values adjacent to the pixel to be interpolated along the direction of maximum intra-field correlation and the pixel value at the corresponding position in the previous field, the result being the first field difference; computing the absolute difference between the same average and the pixel value at the corresponding position in the next field, the result being the second field difference; and computing the absolute difference between the pixel values at the corresponding positions of the pixel to be interpolated in the previous and next fields, the result being the frame difference;
    b-3. taking the maximum of the first field difference, the second field difference and the frame difference as the motion parameter;
    b-4. judging whether the motion parameter is greater than the human-eye sensitivity threshold; if so, the motion detection result of the pixel to be interpolated is the motion state; if not, it is the static state.
CN2008103064495A 2008-12-22 2008-12-22 Deinterlacing method based on movement detection Expired - Fee Related CN101483746B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN2008103064495A CN101483746B (en) 2008-12-22 2008-12-22 Deinterlacing method based on movement detection

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN2008103064495A CN101483746B (en) 2008-12-22 2008-12-22 Deinterlacing method based on movement detection

Publications (2)

Publication Number Publication Date
CN101483746A (en) 2009-07-15
CN101483746B CN101483746B (en) 2011-04-20

Family

ID=40880656

Family Applications (1)

Application Number Title Priority Date Filing Date
CN2008103064495A Expired - Fee Related CN101483746B (en) 2008-12-22 2008-12-22 Deinterlacing method based on movement detection

Country Status (1)

Country Link
CN (1) CN101483746B (en)

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101895674A (en) * 2010-07-02 2010-11-24 浙江红苹果电子有限公司 De-interlacing method and device for monitoring video
CN102045531A (en) * 2010-12-31 2011-05-04 福州瑞芯微电子有限公司 Realization method for deinterlaced video display of mobile terminal
CN102496165A (en) * 2011-12-07 2012-06-13 四川九洲电器集团有限责任公司 Method for comprehensively processing video based on motion detection and feature extraction
CN103024332A (en) * 2012-12-26 2013-04-03 电子科技大学 Video de-interlacing method based on edge and motion detection
CN103200381A (en) * 2013-04-07 2013-07-10 成都博盛信息技术有限公司 Hybrid interlacing removal processing method based on motion compensation reliability
CN104349105A (en) * 2014-10-09 2015-02-11 深圳市云宙多媒体技术有限公司 Deinterlacing method and system aiming at coding video source
CN104702877A (en) * 2014-12-02 2015-06-10 深圳市云宙多媒体技术有限公司 Video de-interlacing method and device
CN107071326A (en) * 2017-04-26 2017-08-18 西安诺瓦电子科技有限公司 Method for processing video frequency and device
CN107135367A (en) * 2017-04-26 2017-09-05 西安诺瓦电子科技有限公司 Video interlace-removing method and device, method for processing video frequency and device
CN107333174A (en) * 2017-07-19 2017-11-07 河海大学 A kind of method for processing video frequency based on scene change detecte
CN110248132A (en) * 2019-05-31 2019-09-17 成都东方盛行电子有限责任公司 A kind of video frame rate interpolation method
CN116016831A (en) * 2022-12-13 2023-04-25 湖南快乐阳光互动娱乐传媒有限公司 Low-time-delay image de-interlacing method and device

Cited By (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101895674B (en) * 2010-07-02 2011-11-09 浙江红苹果电子有限公司 De-interlacing method and device for monitoring video
CN101895674A (en) * 2010-07-02 2010-11-24 浙江红苹果电子有限公司 De-interlacing method and device for monitoring video
CN102045531A (en) * 2010-12-31 2011-05-04 福州瑞芯微电子有限公司 Realization method for deinterlaced video display of mobile terminal
CN102496165A (en) * 2011-12-07 2012-06-13 四川九洲电器集团有限责任公司 Method for comprehensively processing video based on motion detection and feature extraction
CN103024332A (en) * 2012-12-26 2013-04-03 电子科技大学 Video de-interlacing method based on edge and motion detection
CN103024332B (en) * 2012-12-26 2015-05-27 电子科技大学 Video de-interlacing method based on edge and motion detection
CN103200381A (en) * 2013-04-07 2013-07-10 成都博盛信息技术有限公司 Hybrid interlacing removal processing method based on motion compensation reliability
CN103200381B (en) * 2013-04-07 2016-06-29 成都博盛信息技术有限公司 Based on the mixing de-interlacing processing method that motion compensation is credible
CN104349105B (en) * 2014-10-09 2017-11-17 深圳市云宙多媒体技术有限公司 A kind of interlace-removing method and system for encoded video source
CN104349105A (en) * 2014-10-09 2015-02-11 深圳市云宙多媒体技术有限公司 Deinterlacing method and system aiming at coding video source
CN104702877A (en) * 2014-12-02 2015-06-10 深圳市云宙多媒体技术有限公司 Video de-interlacing method and device
CN107071326A (en) * 2017-04-26 2017-08-18 西安诺瓦电子科技有限公司 Method for processing video frequency and device
CN107135367A (en) * 2017-04-26 2017-09-05 西安诺瓦电子科技有限公司 Video interlace-removing method and device, method for processing video frequency and device
CN107135367B (en) * 2017-04-26 2019-10-29 西安诺瓦星云科技股份有限公司 Video interlace-removing method and device, method for processing video frequency and device
CN107071326B (en) * 2017-04-26 2020-01-17 西安诺瓦星云科技股份有限公司 Video processing method and device
CN107333174A (en) * 2017-07-19 2017-11-07 河海大学 A kind of method for processing video frequency based on scene change detecte
CN107333174B (en) * 2017-07-19 2020-05-05 河海大学 Video processing method based on scene shear detection
CN110248132A (en) * 2019-05-31 2019-09-17 成都东方盛行电子有限责任公司 A kind of video frame rate interpolation method
CN116016831A (en) * 2022-12-13 2023-04-25 湖南快乐阳光互动娱乐传媒有限公司 Low-time-delay image de-interlacing method and device
CN116016831B (en) * 2022-12-13 2023-12-05 湖南快乐阳光互动娱乐传媒有限公司 Low-time-delay image de-interlacing method and device

Also Published As

Publication number Publication date
CN101483746B (en) 2011-04-20

Similar Documents

Publication Publication Date Title
CN101483746B (en) Deinterlacing method based on movement detection
CN101416523B (en) Motion compensated frame rate conversion with protection against compensation artifacts
KR20060135770A (en) Reducing artefacts in scan-rate conversion of image signals by combining interpolation and extrapolation of images
CN101536507B (en) Image display device and method, and image processing device and method
EP1723786B1 (en) Motion compensation deinterlacer protection
CN101895674B (en) De-interlacing method and device for monitoring video
GB2450121A (en) Frame rate conversion using either interpolation or frame repetition
CN102025960A (en) Motion compensation de-interlacing method based on adaptive interpolation
CN101510985B (en) Self-adapting de-interleave method for movement compensation accessory movement
US8345148B2 (en) Method and system for inverse telecine and scene change detection of progressive video
US20080165278A1 (en) Human visual system based motion detection/estimation for video deinterlacing
CN102364933A (en) Motion-classification-based adaptive de-interlacing method
CN101699856B (en) De-interlacing method with self-adapting motion
JP2004007568A (en) Image converting device and image converting method
CN100425054C (en) Film mode extrapolation
US7616693B2 (en) Method and system for detecting motion between video field of same and opposite parity from an interlaced video source
US20140023141A1 (en) Image deblurring method using motion compensation
JPS5940772A (en) Double scanning television receiver
CN101483747B (en) Movement detection method suitable for deinterlacing technique
JP2004320278A (en) Dynamic image time axis interpolation method and dynamic image time axis interpolation apparatus
US7796189B2 (en) 2-2 pulldown signal detection device and a 2-2 pulldown signal detection method
CN101370145A (en) Shielding method and apparatus for image frame
CN102497492B (en) Detection method for subtitle moving in screen
US20060181642A1 (en) Apparatus for interpolating scanning lines
KR100587263B1 (en) Method for frame rate conversing of video signal

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20110420

Termination date: 20161222

CF01 Termination of patent right due to non-payment of annual fee