CN1622613A - Texture information based video image motion detecting method

Texture information based video image motion detecting method

Info

Publication number
CN1622613A
Authority
CN
China
Prior art keywords
pixel
motion
texture information
diff
cff
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN 200410093098
Other languages
Chinese (zh)
Other versions
CN1312924C (en)
Inventor
董云朝
邹琪
郑世宝
戈迪
王峰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Jiaotong University
Original Assignee
Shanghai Jiaotong University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Jiaotong University filed Critical Shanghai Jiaotong University
Priority to CNB2004100930986A
Publication of CN1622613A
Application granted
Publication of CN1312924C
Expired - Fee Related
Anticipated expiration

Links

Abstract

The video image motion detection method based on texture information belongs to the field of electronic information technology. The present invention uses two features, the frame difference of the luminance signal and the matching of texture information of pixels in its neighborhood, to judge the motion state of the pixel to be interpolated. Based on the input image, the frame differences between pixels of two fields separated by one frame period are calculated, the texture information of these pixels is computed, the matching state of the texture information is judged, and the motion state of the pixel to be interpolated is then determined, finally yielding the motion detection result for that pixel. The present invention greatly improves the ability to distinguish fast motion from stillness and the accuracy of the motion state judgment, and may be used in the control devices of progressively scanned digital displays such as LCOS, LCD and DLP.

Description

Video image motion detecting method based on texture information
Technical field
The present invention relates to a method of video image motion detection, and in particular to a video image motion detection method based on texture information, belonging to the field of electronic information technology.
Background technology
With the continuous development of display technology and the gradual popularization of digital television broadcasting, people demand ever higher definition and fidelity of the picture. Owing to bandwidth limitations, conventional television broadcasting adopts interlaced scanning: each picture carries only half of the original image information, forming the odd field and the even field respectively. Interlacing not only reduces the vertical resolution of the image, it also introduces defects such as inter-line flicker and large-area flicker. As large-size, high-brightness LCD and plasma television sets enter the home, the distortion caused by interlacing becomes increasingly intolerable, so the chipsets of high-end television sets are now generally equipped with a chip that performs de-interlacing. The simplest de-interlacing algorithm is to weave the odd and even fields belonging to the same frame directly into one output frame. When the image is static this is exactly correct, but when there is motion between the odd and even fields, especially horizontal motion, direct weaving produces obvious sawtooth artifacts and erroneous output images that seriously degrade the visual effect. Another common de-interlacing algorithm is intra-field interpolation filtering. This method handles moving pictures better than direct weaving, but its handling of still pictures is hard to accept. Researchers have therefore begun to combine the two algorithms in the de-interlacing process, choosing the method according to the motion of the image so as to obtain a better de-interlacing result.
A literature search found the Chinese patent entitled "Method of detecting motion in an interlaced video sequence and motion detection apparatus", application number 01143659.X. That patent proposes a method of computing a motion signal in a video signal processing system: frame differences are computed from three consecutive fields of image data to form point-wise motion detection signals; a region-wise motion detection signal is computed from two adjacent point-wise motion detection signals; and the motion signal is formed from the region-wise motion detection signal for further processing in the interlaced video signal processing system. Because only one frame-difference signal is used when computing the frame difference, the motion state can be misjudged when the image contains fast motion, i.e. fast motion may be classified as static.
Summary of the invention
The objective of the present invention is to overcome the deficiencies of the prior art by providing a video image motion detection method based on texture information matching, for use in de-interlacing modules or devices, so as to improve the accuracy of motion detection, improve the visual effect of the de-interlacing conversion, and reproduce the original picture faithfully.
The present invention is achieved by the following technical solution. The present invention uses two features, the frame difference of the luminance signal and the matching of texture information of pixels within the neighborhood, to judge the motion of the pixel to be interpolated. Frame differences are computed between pixels at the same position in two input fields separated by one frame period; the texture information of these pixels is then computed and it is judged whether the texture information matches; according to the frame differences and the texture matching results, the motion of the pixel to be interpolated is judged and its motion detection result is finally determined.
The present invention is further described below:
(1) Calculation of the frame differences
The present invention calculates two sets of frame differences, Diff_CFF and Diff_BF, according to formula (1):
$$\mathrm{Diff\_CFF}[m]=\frac{1}{3}\sum_{k=-1}^{1}\Bigl|I(i,\,j+m+k,\,n)-I(i,\,j-m+k,\,n-2)\Bigr|$$
$$\mathrm{Diff\_BF}[m]=\frac{1}{3}\sum_{k=-1}^{1}\Bigl|I(i+1,\,j+m+k,\,n+1)-I(i+1,\,j-m+k,\,n-1)\Bigr|\qquad(1)$$
where I(i, j, n) denotes the gray level of the pixel at row i, column j of field n; I(i, j, n-2) the gray level of the pixel at row i, column j of field n-2; I(i+1, j, n-1) the gray level of the pixel at row i+1, column j of field n-1; and I(i+1, j, n+1) the gray level of the pixel at row i+1, column j of field n+1. m denotes the horizontal displacement of pixel I(i, j, n), m ∈ [-N, N], where N is the horizontal range of motion that can be detected for pixel I(i, j, n). Diff_CFF[m] is the difference between the corresponding pixels of field n and field n-2 along direction m, and Diff_BF[m] is the difference between the corresponding pixels of field n+1 and field n-1 along direction m.
Because the input fields involved in these two sets of differences are one frame period apart in time, the differences between them are called frame differences. The range of motion detected by the present invention is from (0, -N) to (0, N), i.e. the motion of the current pixel I(i, j, n) within [-N, N] in the horizontal direction.
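For illustration, a minimal Python sketch of formula (1) follows; the field buffer layout, dictionary keys and variable names are assumptions made for this example and are not part of the patented method.

```python
def frame_differences(fields, i, j, N):
    """Frame differences Diff_CFF[m] and Diff_BF[m] for m in [-N, N].

    `fields` is assumed to map 'n-2', 'n-1', 'n', 'n+1' to 2-D arrays (e.g.
    numpy) of luma samples; i is the image line of field n used in formula (1)
    and j the column of the current pixel. Border handling is omitted.
    """
    diff_cff, diff_bf = {}, {}
    for m in range(-N, N + 1):
        # mean absolute difference over the pixel and its left/right neighbours
        cff = [abs(int(fields['n'][i, j + m + k]) - int(fields['n-2'][i, j - m + k]))
               for k in (-1, 0, 1)]
        bf = [abs(int(fields['n+1'][i + 1, j + m + k]) - int(fields['n-1'][i + 1, j - m + k]))
              for k in (-1, 0, 1)]
        diff_cff[m] = sum(cff) / 3.0
        diff_bf[m] = sum(bf) / 3.0
    return diff_cff, diff_bf
```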
(2) Judgment of texture information matching
The texture matching judgment comprises the calculation of the texture information and the judgment of whether it matches. Corresponding to the two frame differences, texture matching is likewise divided into matching between field n-2 and field n, and matching between field n-1 and field n+1, denoted Accordance_CFF[m] and Accordance_BF[m] respectively. Taking Accordance_BF[m] as an example: if pixel I(i, j-m, n-1) of field n-1 moves along direction m, passing through the pixel to be interpolated in field n, to pixel I(i, j+m, n+1) of field n+1, then it must be considered whether the pixel variation, i.e. the texture characteristics, is consistent over the following two sets: {I(i, j+m+2(k-2), n+1), k ∈ [0, 4]} and {I(i, j-m+2(k-2), n-1), k ∈ [0, 4]}. The texture information in this region is collected as follows:
$$\mathrm{GradBw}(m,k,n+1)=I(i,\,j+m+2(k-1),\,n+1)-I(i,\,j+m+2k,\,n+1),\quad k\in[0,1]$$
$$\mathrm{GradFw}(m,k,n-1)=I(i,\,j-m+2(k-1),\,n-1)-I(i,\,j-m+2k,\,n-1),\quad k\in[0,1]\qquad(2)$$

$$\mathrm{GradBw\_Border}(m,0,n+1)=I(i,\,j+m-4,\,n+1)-I(i,\,j+m-2,\,n+1)$$
$$\mathrm{GradBw\_Border}(m,1,n+1)=I(i,\,j+m+4,\,n+1)-I(i,\,j+m+2,\,n+1)$$
$$\mathrm{GradFw\_Border}(m,0,n-1)=I(i,\,j-m-4,\,n-1)-I(i,\,j-m-2,\,n-1)$$
$$\mathrm{GradFw\_Border}(m,1,n-1)=I(i,\,j-m+4,\,n-1)-I(i,\,j-m+2,\,n-1)\qquad(3)$$
where GradBw and GradFw represent the texture information of the central region, and GradBw_Border and GradFw_Border represent the texture information of the border region.
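A corresponding sketch of formulas (2) and (3) is given below; the two image lines are passed in directly, and the function name and argument layout are assumptions of the example rather than the patent's notation.

```python
def texture_info(line_fw, line_bw, j, m):
    """Centre and border gradients along direction m, per formulas (2) and (3).

    `line_fw` is the relevant image line of the earlier field of the pair
    (field n-1 in the Accordance_BF case) and `line_bw` that of the later
    field (n+1); samples are taken every other pixel, borders not handled.
    """
    grad_bw = [int(line_bw[j + m + 2 * (k - 1)]) - int(line_bw[j + m + 2 * k]) for k in (0, 1)]
    grad_fw = [int(line_fw[j - m + 2 * (k - 1)]) - int(line_fw[j - m + 2 * k]) for k in (0, 1)]
    grad_bw_border = [int(line_bw[j + m - 4]) - int(line_bw[j + m - 2]),
                      int(line_bw[j + m + 4]) - int(line_bw[j + m + 2])]
    grad_fw_border = [int(line_fw[j - m - 4]) - int(line_fw[j - m - 2]),
                      int(line_fw[j - m + 4]) - int(line_fw[j - m + 2])]
    return grad_bw, grad_fw, grad_bw_border, grad_fw_border
```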
The texture information matching judgment is as follows (a code sketch of these steps is given after the list):
1. If every element of GradBw and GradFw is less than a certain threshold, go to step 3; otherwise proceed to the next step.
2. If the signs of the corresponding elements of GradBw and GradFw are consistent, go to step 3; otherwise set the match flag Accordance_BF[m] = 0 and end the texture matching judgment.
3. If every element of GradBw_Border and GradFw_Border is less than a certain threshold, set the match flag Accordance_BF[m] = 1 and end the texture matching judgment; otherwise proceed to the next step.
4. If the signs of the corresponding elements of GradBw_Border and GradFw_Border are consistent, set the match flag Accordance_BF[m] = 1 and end the texture matching judgment; otherwise proceed to the next step.
5. Set Accordance_BF[m] = 0 and end the texture matching judgment.
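The sketch below implements steps 1 to 5 for one direction m; treating "less than a certain threshold" as a comparison of absolute values, and treating a zero gradient as matching either sign, are interpretations made for the example.

```python
def _signs_match(a, b):
    """True when corresponding gradients agree in sign (zero matches either sign)."""
    return all(x * y >= 0 for x, y in zip(a, b))

def texture_match(grad_bw, grad_fw, grad_bw_border, grad_fw_border, thr=10):
    """Return the match flag Accordance_BF[m] (1 = match, 0 = no match)."""
    def flat(a, b):                       # all gradients below the threshold
        return all(abs(x) < thr for x in a + b)

    # Steps 1 and 2: the centre region must be flat or agree in gradient sign
    if not flat(grad_bw, grad_fw) and not _signs_match(grad_bw, grad_fw):
        return 0
    # Step 3: a flat border region counts as a match
    if flat(grad_bw_border, grad_fw_border):
        return 1
    # Step 4: otherwise the border gradients must agree in sign
    if _signs_match(grad_bw_border, grad_fw_border):
        return 1
    # Step 5: no match
    return 0
```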
(3) Determination of the motion detection result
Determining the motion detection result comprises determining the motion coefficient and the direction of motion; the motion coefficient of the pixel is denoted Motion and the direction of motion is denoted Dir. The procedure is as follows (see the code sketch after the list):
1. Let Diff_CFF[m1] = min(Diff_CFF[m], m ∈ [-N, N]) and Diff_BF[m2] = min(Diff_BF[m], m ∈ [-N, N]); m1 and m2 are called the motion directions associated with Diff_CFF and Diff_BF respectively.
2. If Diff_CFF[m1] and Diff_BF[m1] are both less than the specified threshold, and Accordance_CFF[m1] and Accordance_BF[m1] are both 1, then Motion = 0 and Dir = m1.
3. If Diff_CFF[m2] and Diff_BF[m2] are both less than the specified threshold, and Accordance_CFF[m2] and Accordance_BF[m2] are both 1, then Motion = 0 and Dir = m2.
4. Otherwise Motion = 1 and Dir = 0.
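A sketch of steps 1 to 4 follows, combining the two frame-difference sets with the two accordance sets (all indexed by m); the dictionary layout is an assumption of the example, and the default threshold follows the embodiment described later.

```python
def motion_decision(diff_cff, diff_bf, acc_cff, acc_bf, diff_thr=15):
    """Return (Motion, Dir) for the pixel to be interpolated."""
    m1 = min(diff_cff, key=diff_cff.get)   # direction minimising Diff_CFF
    m2 = min(diff_bf, key=diff_bf.get)     # direction minimising Diff_BF

    for m in (m1, m2):                     # steps 2 and 3
        if (diff_cff[m] < diff_thr and diff_bf[m] < diff_thr
                and acc_cff[m] == 1 and acc_bf[m] == 1):
            return 0, m                    # static along direction m
    return 1, 0                            # step 4: treat the pixel as moving
```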
The present invention has substantive features and represents a marked improvement. It performs motion detection on interlaced video images and can be applied in a de-interlacing module or device. The present invention first calculates two sets of frame differences from four adjacent input fields and judges the matching of the pixel texture characteristics within the motion detection region; the motion detection result is then determined from the two sets of frame differences and the texture matching results. Besides improving the ability to distinguish between the two motion states of fast motion and stillness, it also greatly improves the accuracy of the motion state judgment, including the judgment of the direction of motion. It can be used in the control devices of various progressively scanned digital displays such as LCOS, LCD and DLP.
Description of drawings
Fig. 1 is a block diagram of the motion state judgment of the present invention.
Fig. 2 is a schematic diagram of the frame difference calculation of the present invention.
Fig. 3 is a schematic diagram of the texture information calculation of the present invention.
Embodiment
An embodiment is given below in conjunction with the content of the present invention:
The present invention uses the frame difference of the luminance signal and the difference of the pixel texture information within its neighborhood as the basis for motion detection. Motion of the current pixel within the horizontal range [-N, +N] is detected, and the coefficient for inter-frame motion interpolation is provided.
For an interlaced input sequence, suppose field n contains rows i, i+2, i+4, and so on of the original picture; the purpose of de-interlacing is to reconstruct the missing rows i+1, i+3, i+5, and so on of field n as faithfully as possible. If there is no motion between field n and field n-1, the information missing from field n is supplied directly from field n-1; otherwise it is obtained as a weighted sum of intra-field and inter-frame interpolation according to the motion between the two fields. Accurately judging the motion between field n and field n-1 is therefore the key to obtaining a correct de-interlacing result.
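For context only, the sketch below shows one simple way a de-interlacer could consume the detector's output, switching between field insertion and intra-field averaging; the patent claims only the motion detection, and the weighted combination mentioned above is deliberately simplified here.

```python
def interpolate_missing_pixel(prev_field, cur_field, i, j, motion):
    """Fill pixel (i, j) of the missing line using the Motion flag.

    Static pixels are woven in from the previous field; moving pixels fall
    back to averaging the lines above and below in the current field (a
    simplification of the weighted intra-field/inter-frame combination).
    """
    if motion == 0:
        return int(prev_field[i, j])                                 # field insertion (weave)
    return (int(cur_field[i - 1, j]) + int(cur_field[i + 1, j])) // 2  # intra-field (bob)
```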
Fig. 1 is the block diagram of the present invention. The motion detection process is as follows. The interlaced video image passes in sequence through three field delay modules 10, 11 and 12, yielding the pixel set {I(i, j+m, n+1), m ∈ [-N, N]} 21 of field n+1, the pixel set {I(i, j+m, n), m ∈ [-N, N]} 22 of field n, the pixel set {I(i, j+m, n-1), m ∈ [-N, N]} 23 of field n-1, and the pixel set {I(i, j+m, n-2), m ∈ [-N, N]} 24 of field n-2. Pixel sets 22 and 24 enter frame difference calculation module I 13, giving the frame difference set {Diff_CFF[m], m ∈ [-N, N]}, denoted signal 31; sets 22 and 24 also enter texture matching judgment module I 14, giving the texture matching result set {Accordance_CFF[m], m ∈ [-N, N]}, denoted signal 32. Pixel sets 21 and 23 enter frame difference calculation module II 15, giving the frame difference set {Diff_BF[m], m ∈ [-N, N]}, denoted signal 33; sets 21 and 23 also enter texture matching judgment module II 16, giving the texture matching result set {Accordance_BF[m], m ∈ [-N, N]}, denoted signal 34. Signals 31, 32, 33 and 34 are fed into motion detection module 17, which completes the final motion state detection and outputs the motion detection coefficient Motion, denoted signal 41, and the direction of motion Dir, denoted signal 42.
Fig. 2 shows an example of the frame difference calculation of the present invention. Suppose pixel I(i+1, j+1, n-1) of field n-1 moves horizontally by -1 pixel to position (i+1, j) in field n, and continues along this direction to position (i+1, j-1) in field n+1. Thus when m = -1, I(i+1, j+1, n-1) and I(i+1, j-1, n+1) are two pixels on the same motion trajectory. Considering the neighborhood correlation of the image, the frame difference calculation takes into account not only the pixels on the motion trajectory but also their left and right neighboring pixels. Their frame difference is calculated with formula (1).
Fig. 3 shows an example of the texture information calculation of the present invention. Suppose pixel I(i+1, j+1, n-1) of field n-1 moves horizontally by -1 pixel to position (i+1, j) in field n, and continues along this direction to position (i+1, j-1) in field n+1. Thus when m = -1, I(i+1, j+1, n-1) and I(i+1, j-1, n+1) are two pixels on the same motion trajectory. Considering the neighborhood correlation of the image, and in order to improve the noise immunity of the texture judgment module, the texture information calculation uses not only the pixels on the motion trajectory but also the pixels spaced one pixel apart to their left and right. The texture information is calculated with formulas (2) and (3).
First the frame differences and the texture information are calculated. In this embodiment the detected motion range N is 2, i.e. the motion of the pixel to be interpolated, I(i+1, j, n), is detected within [-2, 2] in the horizontal direction. The frame differences {Diff_CFF[m], m ∈ [-2, 2]} and {Diff_BF[m], m ∈ [-2, 2]} are calculated according to formula (1), with 15 used as the frame difference threshold in this embodiment. The texture information is calculated according to formulas (2) and (3); with 10 used as the threshold of the texture matching judgment, whether the texture information matches is judged, giving the sets {Accordance_CFF[m], m ∈ [-2, 2]} and {Accordance_BF[m], m ∈ [-2, 2]}. Finally, the motion detection coefficient and direction of motion of the pixel to be interpolated I(i+1, j, n) are determined from the frame differences and the texture matching results.
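Tying the earlier sketches together with the embodiment's parameters (N = 2, frame-difference threshold 15, texture threshold 10): the snippet below is an illustration, not the patented implementation; `fields`, `i` and `j` follow the earlier sketches, and the choice of image lines for the two texture pairs follows the rows used in formula (1), which is an interpretation since the texture formulas are written only for the n-1/n+1 pair.

```python
# Embodiment parameters: detection range N = 2, frame-difference threshold 15,
# texture matching threshold 10; the pixel to be interpolated is I(i+1, j, n).
N, DIFF_THR, TEX_THR = 2, 15, 10

diff_cff, diff_bf = frame_differences(fields, i, j, N)
acc_cff, acc_bf = {}, {}
for m in range(-N, N + 1):
    # texture matching between fields n-2 and n (CFF) and fields n-1 and n+1 (BF)
    acc_cff[m] = texture_match(*texture_info(fields['n-2'][i], fields['n'][i], j, m),
                               thr=TEX_THR)
    acc_bf[m] = texture_match(*texture_info(fields['n-1'][i + 1], fields['n+1'][i + 1], j, m),
                              thr=TEX_THR)

motion, direction = motion_decision(diff_cff, diff_bf, acc_cff, acc_bf, diff_thr=DIFF_THR)
```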
Motion detection tests were carried out with the above method on three groups of video sequences (10 frames each): fast horizontal motion (horizontal speed 6 pixels/field), still images, and slow horizontal motion (horizontal speed 2 pixels/field). The results show that the accuracy of the judgment of all three motion states exceeds 95%, and the judgment of the motion state of still images reaches 100%.

Claims (4)

1. A video image motion detection method based on texture information, characterized in that two features, the frame difference of the luminance signal and the matching of texture information of pixels within the neighborhood, are used to judge the motion of the pixel to be interpolated; frame differences are computed between pixels at the same position in two input fields separated by one frame period; the texture information of these pixels is then computed and it is judged whether the texture information matches; according to the frame differences and the texture matching results, the motion of the pixel to be interpolated is judged and its motion detection result is finally determined.
2. The video image motion detection method based on texture information according to claim 1, characterized in that the frame differences are calculated with the following formulas:
$$\mathrm{Diff\_CFF}[m]=\frac{1}{3}\sum_{k=-1}^{1}\Bigl|I(i,\,j+m+k,\,n)-I(i,\,j-m+k,\,n-2)\Bigr|$$
$$\mathrm{Diff\_BF}[m]=\frac{1}{3}\sum_{k=-1}^{1}\Bigl|I(i+1,\,j+m+k,\,n+1)-I(i+1,\,j-m+k,\,n-1)\Bigr|$$
where Diff_CFF[m] denotes the frame difference between pixel I(i, j+m, n) of field n together with the pixels in its horizontal neighborhood [-1, 1] and pixel I(i, j-m, n-2) of field n-2 together with the pixels in its horizontal neighborhood [-1, 1]; and Diff_BF[m] denotes, when the motion vector is (0, 2m), the frame difference between pixel I(i+1, j+m, n+1) of field n+1 together with the pixels in its horizontal neighborhood [-1, 1] and pixel I(i+1, j-m, n-1) of field n-1 together with the pixels in its horizontal neighborhood [-1, 1].
3. The video image motion detection method based on texture information according to claim 1, characterized in that the texture information matching comprises the calculation of the texture information and the judgment of the texture information matching:
The calculation of the texture information uses the following formulas:
$$\mathrm{GradBw}(m,k,n+1)=I(i,\,j+m+2(k-1),\,n+1)-I(i,\,j+m+2k,\,n+1),\quad k\in[0,1]$$
$$\mathrm{GradFw}(m,k,n-1)=I(i,\,j-m+2(k-1),\,n-1)-I(i,\,j-m+2k,\,n-1),\quad k\in[0,1]$$
$$\mathrm{GradBw\_Border}(m,0,n+1)=I(i,\,j+m-4,\,n+1)-I(i,\,j+m-2,\,n+1)$$
$$\mathrm{GradBw\_Border}(m,1,n+1)=I(i,\,j+m+4,\,n+1)-I(i,\,j+m+2,\,n+1)$$
$$\mathrm{GradFw\_Border}(m,0,n-1)=I(i,\,j-m-4,\,n-1)-I(i,\,j-m-2,\,n-1)$$
$$\mathrm{GradFw\_Border}(m,1,n-1)=I(i,\,j-m+4,\,n-1)-I(i,\,j-m+2,\,n-1)$$
where GradBw and GradFw represent the texture information of the central region, and GradBw_Border and GradFw_Border represent the texture information of the border region;
The texture matching judgment is as follows:
1. If every element of GradBw and GradFw is less than a certain threshold, go to step 3; otherwise proceed to the next step;
2. If the signs of the corresponding elements of GradBw and GradFw are consistent, go to step 3; otherwise set the match flag Accordance_BF[m] = 0 and end the texture matching judgment;
3. If every element of GradBw_Border and GradFw_Border is less than a certain threshold, set the match flag Accordance_BF[m] = 1 and end the texture matching judgment; otherwise proceed to the next step;
4. If the signs of the corresponding elements of GradBw_Border and GradFw_Border are consistent, set the match flag Accordance_BF[m] = 1 and end the texture matching judgment; otherwise proceed to the next step;
5. Set Accordance_BF[m] = 0 and end the texture matching judgment.
4. The video image motion detection method based on texture information according to claim 1, characterized in that determining the motion detection result comprises determining the motion coefficient and determining the direction of motion; the motion coefficient of the pixel is denoted Motion and the direction of motion is denoted Dir; the procedure is as follows:
1. Let Diff_CFF[m1] = min(Diff_CFF[m], m ∈ [-N, N]) and Diff_BF[m2] = min(Diff_BF[m], m ∈ [-N, N]); m1 and m2 are called the motion directions associated with Diff_CFF and Diff_BF respectively;
2. If Diff_CFF[m1] and Diff_BF[m1] are both less than the specified threshold, and Accordance_CFF[m1] and Accordance_BF[m1] are both 1, then Motion = 0 and Dir = m1;
3. If Diff_CFF[m2] and Diff_BF[m2] are both less than the specified threshold, and Accordance_CFF[m2] and Accordance_BF[m2] are both 1, then Motion = 0 and Dir = m2;
4. Otherwise Motion = 1 and Dir = 0.
CNB2004100930986A 2004-12-16 2004-12-16 Texture information based video image motion detecting method Expired - Fee Related CN1312924C (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CNB2004100930986A CN1312924C (en) 2004-12-16 2004-12-16 Texture information based video image motion detecting method

Publications (2)

Publication Number Publication Date
CN1622613A 2005-06-01
CN1312924C CN1312924C (en) 2007-04-25

Family

ID=34766369

Family Applications (1)

Application Number Title Priority Date Filing Date
CNB2004100930986A Expired - Fee Related CN1312924C (en) 2004-12-16 2004-12-16 Texture information based video image motion detecting method

Country Status (1)

Country Link
CN (1) CN1312924C (en)

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
AU5457598A (en) * 1996-11-27 1998-06-22 Princeton Video Image, Inc. Image insertion in video streams using a combination of physical sensors and pattern recognition

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101030295B (en) * 2006-02-27 2010-05-26 精工爱普生株式会社 Data conversion method for processor, texture producing method, program, recording medium, and projector
CN101523475B (en) * 2006-11-24 2011-09-28 夏普株式会社 Image display apparatus
CN101533516B (en) * 2008-03-10 2012-05-02 索尼株式会社 Information processing device and method
CN101631246B (en) * 2008-07-17 2011-07-06 联发科技股份有限公司 Image processing apparatus and method
CN102215321A (en) * 2010-04-08 2011-10-12 联咏科技股份有限公司 Mobile detection method and device
CN102215321B (en) * 2010-04-08 2013-07-24 联咏科技股份有限公司 Mobile detection method and device

Also Published As

Publication number Publication date
CN1312924C (en) 2007-04-25

Similar Documents

Publication Publication Date Title
CN100348035C (en) Method and equipment for calculating kinematical vector
CN1265633C (en) Interlacing-removing device and method
CN1270526C (en) Device and method for using adaptive moving compensation conversion frame and/or semi-frame speed
US6262773B1 (en) System for conversion of interlaced video to progressive video using edge correlation
CN1087892C (en) Motion adaptive scan-rate conversion using directional edge interpolation
JPH0951509A (en) Video converting method
CN1460368A (en) Scalable resolution enhancement of video image
CN1922873A (en) Reducing artefacts in scan-rate conversion of image signals by combining interpolation and extrapolation of images
WO2008002491A2 (en) Systems and methods for a motion compensated picture rate converter
CN101043609A (en) Subtitle detection apparatus, subtitle detection method and pull-down signal detection apparatus
KR970068682A (en) Three-dimensional luminance / color signal separator for composite video signal
CN103051857B (en) Motion compensation-based 1/4 pixel precision video image deinterlacing method
CN1507265A (en) Apparatus and method for non-interlace scanning
CN1130455A (en) Method and apparatus for motion estimation using block matching
CN1509065A (en) Image signal format detection device and method thereof
CN103369208A (en) Self-adaptive de-interlacing method and device
US8345148B2 (en) Method and system for inverse telecine and scene change detection of progressive video
WO2009140916A1 (en) Deinterlacing method, deinterlacing device and video process system for video data
US8471962B2 (en) Apparatus and method for local video detector for mixed cadence sequence
CN1258288C (en) Motion tester and testing method
CN1312924C (en) Texture information based video image motion detecting method
CN102447870A (en) Detection method for static objects and motion compensation device
CN102509311A (en) Motion detection method and device
CN102364933A (en) Motion-classification-based adaptive de-interlacing method
CN1894957A (en) Image format conversion

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
C17 Cessation of patent right
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20070425

Termination date: 20111216