CN102946504A - Self-adaptive moving detection method based on edge detection - Google Patents
- Publication number
- CN102946504A, CN102946504B, CN201210476628A
- Authority
- CN
- China
- Prior art keywords
- mov
- mode
- field
- state
- pixel
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Landscapes
- Control Of Indicators Other Than Cathode Ray Tubes (AREA)
- Processing Of Color Television Signals (AREA)
Abstract
The invention discloses a self-adaptive motion detection method based on edge detection. The method computes four inter-field difference values of the luminance component and two inter-field difference values each of the U and V color-difference components at the current pixel, compares the four luminance differences with four thresholds to obtain a variable mov_mode, then combines the value of mov_mode with the motion states of the corresponding pixels in the two preceding fields and with the edge-detection result to reach a motion/static decision, and finally corrects that decision according to the U and V inter-field differences. By letting the edge-detection result and the color-difference components participate in the motion-state decision, the method improves the robustness of motion detection and lays a good foundation for de-interlacing interlaced video.
Description
Technical field
The invention belongs to the technical field of video processing, and more specifically relates to an adaptive motion detection method based on edge detection, used in the interlaced-to-progressive conversion of interlaced video signals during video post-processing.
Background art
Because of transmission-bandwidth limitations, the TV signal broadcast by a station is not a complete image: each scene is divided into an odd field and an even field that are transmitted at different times, which produces an interlaced video signal. With the development of digital signal processing and the appearance of wide-screen flat-panel TVs, however, viewers' expectations of display quality keep rising, and the interlaced signals broadcast today exhibit serious problems such as screen flicker and jagged edges when shown on high-end flat-panel TVs. De-interlacing (interlaced-to-progressive conversion) technology was developed precisely to resolve this conflict between interlaced video signals and high-end TVs.
De-interlacing algorithms are classified by the kind of filter they use into linear algorithms, nonlinear algorithms, motion-compensated algorithms and motion-adaptive algorithms. Linear algorithms, which include line repetition and line averaging, blur the de-interlaced image and produce severe jagged artifacts. Nonlinear algorithms include median filtering and some fitting algorithms. Motion compensation gives the best results in theory, but because of its high algorithmic complexity and high hardware implementation cost it has not become the mainstream scheme in current products. A motion-adaptive de-interlacing algorithm selects between intra-field interpolation and inter-field copying according to motion parameters obtained from motion detection, and is the de-interlacing scheme chosen by most display terminals today.
The core of a motion-adaptive de-interlacing algorithm is judging whether the current pixel is moving or static. This judgment is usually obtained by comparing an inter-field or inter-frame difference with a single threshold, but the robustness of motion detection performed in this way is poor.
Summary of the invention
The object of the present invention is to overcome the deficiencies of the prior art by providing an adaptive motion detection method based on edge detection, so as to improve the robustness of motion detection and lay a good foundation for de-interlacing interlaced video.
To achieve the above object, the adaptive motion detection method based on edge detection of the present invention is characterized by comprising the following steps:
(1) Inter-field difference calculation:
(1.1) Inter-field differences of the luminance component:
field_diff12 = |YF_n(i,j) - YF_{n+1}(i,j)| + |YF_n(i,j-1) - YF_{n+1}(i,j-1)| + |YF_n(i,j+1) - YF_{n+1}(i,j+1)|
field_diff23 = |YF_n(i,j) - YF_{n-1}(i,j)| + |YF_n(i,j-1) - YF_{n-1}(i,j-1)| + |YF_n(i,j+1) - YF_{n-1}(i,j+1)|
frm_diff02 = |YF_n(i,j) - YF_{n-2}(i,j)| + |YF_n(i,j-1) - YF_{n-2}(i,j-1)| + |YF_n(i,j+1) - YF_{n-2}(i,j+1)|
frm_diff13 = |YF_{n+1}(i,j) - YF_{n-1}(i,j)| + |YF_{n+1}(i,j-1) - YF_{n-1}(i,j-1)| + |YF_{n+1}(i,j+1) - YF_{n-1}(i,j+1)|
(1.2) Inter-field differences of the U color-difference component:
Ufield_diff12 = |UF_n(i,j) - UF_{n+1}(i,j)| + |UF_n(i,j-1) - UF_{n+1}(i,j-1)| + |UF_n(i,j+1) - UF_{n+1}(i,j+1)|
Ufield_diff23 = |UF_n(i,j) - UF_{n-1}(i,j)| + |UF_n(i,j-1) - UF_{n-1}(i,j-1)| + |UF_n(i,j+1) - UF_{n-1}(i,j+1)|
(1.3) Inter-field differences of the V color-difference component:
Vfield_diff12 = |VF_n(i,j) - VF_{n+1}(i,j)| + |VF_n(i,j-1) - VF_{n+1}(i,j-1)| + |VF_n(i,j+1) - VF_{n+1}(i,j+1)|
Vfield_diff23 = |VF_n(i,j) - VF_{n-1}(i,j)| + |VF_n(i,j-1) - VF_{n-1}(i,j-1)| + |VF_n(i,j+1) - VF_{n-1}(i,j+1)|
where YF_n, YF_{n+1}, YF_{n-1} and YF_{n-2} denote the luminance (Y) component of the pixels of the current field, the field after the current field, the field before the current field, and the second field before the current field, respectively;
UF_n, UF_{n+1}, UF_{n-1} and UF_{n-2} denote the U color-difference component of the pixels of those same four fields;
VF_n, VF_{n+1}, VF_{n-1} and VF_{n-2} denote the V color-difference component of the pixels of those same four fields;
and where (i,j) denotes the spatial position of the current pixel in the image, (i,j-1) the position of the pixel vertically above the current pixel, and (i,j+1) the position of the pixel vertically below it;
(2) Comparison of the luminance differences with thresholds
Each luminance difference is compared with its own threshold to obtain the motion pattern of the current pixel, represented by the variable mov_mode, whose bits are computed as follows:
If field_diff23 > threshold thread1, then bit 0 of mov_mode is mov_mode[0] = 1; otherwise mov_mode[0] = 0.
If field_diff12 > threshold thread2, then bit 1 of mov_mode is mov_mode[1] = 1; otherwise mov_mode[1] = 0.
If frm_diff13 > threshold thread3, then bit 2 of mov_mode is mov_mode[2] = 1; otherwise mov_mode[2] = 0.
If frm_diff02 > threshold thread4, then bit 3 of mov_mode is mov_mode[3] = 1; otherwise mov_mode[3] = 0.
The thresholds thread1 to thread4 are determined empirically.
(3) Edge detection
(3.1) Vertical-gradient calculation:
The vertical gradient at the current pixel of the current field is computed as:
grad_ver_n(i,j) = |YF_n(i,j) - YF_n(i+1,j)| + |YF_n(i,j-1) - YF_n(i+1,j-1)| + |YF_n(i,j+1) - YF_n(i+1,j+1)|
The vertical gradient at the corresponding pixel of the previous field is computed as:
grad_ver_{n-1}(i,j) = |YF_{n-1}(i,j) - YF_{n-1}(i+1,j)| + |YF_{n-1}(i,j-1) - YF_{n-1}(i+1,j-1)| + |YF_{n-1}(i,j+1) - YF_{n-1}(i+1,j+1)|
The vertical gradient at the corresponding pixel of the second field before the current field is computed as:
grad_ver_{n-2}(i,j) = |YF_{n-2}(i,j) - YF_{n-2}(i+1,j)| + |YF_{n-2}(i,j-1) - YF_{n-2}(i+1,j-1)| + |YF_{n-2}(i,j+1) - YF_{n-2}(i+1,j+1)|
The vertical gradient at the corresponding pixel of the field after the current field is computed as:
grad_ver_{n+1}(i,j) = |YF_{n+1}(i,j) - YF_{n+1}(i+1,j)| + |YF_{n+1}(i,j-1) - YF_{n+1}(i+1,j-1)| + |YF_{n+1}(i,j+1) - YF_{n+1}(i+1,j+1)|
(3.2) Edge decision:
If all of the following conditions hold simultaneously:
grad_ver_n(i,j) > edge_thread
grad_ver_{n-1}(i,j) > edge_thread
grad_ver_{n-2}(i,j) > edge_thread
grad_ver_{n+1}(i,j) > edge_thread
then the edge flag edge_flag(i,j) = 1; otherwise edge_flag(i,j) = 0;
(4) Motion/static decision
Let mov_state_n(i,j), mov_state_{n-1}(i,j) and mov_state_{n-2}(i,j) denote the motion states of the current pixel of the current field and of the corresponding pixels of the two preceding fields, respectively; a value of 0 means static and a value of 1 means moving.
(4.1) If mov_mode = 0 or mov_mode = 8, then the motion state of the current pixel is mov_state_n(i,j) = 0; otherwise:
(4.2) If mov_mode > 3 and mov_mode ≠ 8, then mov_state_n(i,j) = 1; otherwise:
(4.3) If mov_mode > 0 and mov_mode < 4:
(4.3.1) If the corresponding pixels of the two preceding fields are both moving, then mov_state_n(i,j) = 1; otherwise:
(4.3.2) If the corresponding pixels of the two preceding fields are both static, then mov_state_n(i,j) = 0; otherwise:
(4.3.3) If the edge flag edge_flag(i,j) = 1, then mov_state_n(i,j) = 0; otherwise:
(4.3.4) If mov_mode = 3, then mov_state_n(i,j) = 1; otherwise:
(4.3.5) mov_state_n(i,j) = 0;
(5) Correction of the motion-state decision
If mov_mode < 9 and mov_mode ≠ 3 and mov_mode ≠ 7, the motion state of the current pixel is corrected according to the inter-field differences of the color-difference components:
(5.1) If the following conditions hold simultaneously:
Ufield_diff12 > UVThread
Ufield_diff23 > UVThread
Vfield_diff12 > UVThread
Vfield_diff23 > UVThread
then mov_state_n(i,j) = 1; otherwise the motion state of the current pixel keeps the result of steps (4.1) to (4.3) above;
(5.2) If the condition "mov_mode < 9 and mov_mode ≠ 3 and mov_mode ≠ 7" is not satisfied, the motion state of the current pixel keeps the result of steps (4.1) to (4.3) above.
The object of the invention is achieved as follows:
In the adaptive motion detection method based on edge detection of the present invention, four inter-field differences of the luminance component and two inter-field differences each of the U and V color-difference components are computed for the current pixel; the four luminance differences are compared with four thresholds to obtain the variable mov_mode; the value of mov_mode is then combined with the motion states of the corresponding pixels of the two preceding fields and with the edge-detection result to reach a decision; finally, the decision is corrected according to the U and V inter-field differences. By letting the edge-detection result and the color-difference components participate in the motion-state decision, the invention improves the robustness of motion detection and lays a good foundation for de-interlacing interlaced video.
Description of drawings
Fig. 1 is a flow chart of one embodiment of the adaptive motion detection method based on edge detection of the present invention.
Embodiment
Specific embodiments of the present invention are described below in conjunction with the accompanying drawing, so that those skilled in the art can better understand the invention. It should be noted in particular that, in the following description, detailed descriptions of well-known functions and designs are omitted where they would dilute the main content of the invention.
Fig. 1 is a flow chart of one embodiment of the adaptive motion detection method based on edge detection of the present invention.
As shown in Fig. 1, in the present embodiment, the adaptive motion detection method based on edge detection of the present invention comprises the following steps:
1. Inter-field difference calculation
In the present invention the inter-field differences are computed from four input field images, namely from the luminance (Y) components YF_n, YF_{n+1}, YF_{n-1}, YF_{n-2}, the U color-difference components UF_n, UF_{n+1}, UF_{n-1}, UF_{n-2}, and the V color-difference components VF_n, VF_{n+1}, VF_{n-1}, VF_{n-2} of the pixels of the current field, the field after the current field, the field before the current field, and the second field before the current field. Specifically:
Inter-field differences of the luminance component:
field_diff12 = |YF_n(i,j) - YF_{n+1}(i,j)| + |YF_n(i,j-1) - YF_{n+1}(i,j-1)| + |YF_n(i,j+1) - YF_{n+1}(i,j+1)|
field_diff23 = |YF_n(i,j) - YF_{n-1}(i,j)| + |YF_n(i,j-1) - YF_{n-1}(i,j-1)| + |YF_n(i,j+1) - YF_{n-1}(i,j+1)|
frm_diff02 = |YF_n(i,j) - YF_{n-2}(i,j)| + |YF_n(i,j-1) - YF_{n-2}(i,j-1)| + |YF_n(i,j+1) - YF_{n-2}(i,j+1)|
frm_diff13 = |YF_{n+1}(i,j) - YF_{n-1}(i,j)| + |YF_{n+1}(i,j-1) - YF_{n-1}(i,j-1)| + |YF_{n+1}(i,j+1) - YF_{n-1}(i,j+1)|
Inter-field differences of the U color-difference component:
Ufield_diff12 = |UF_n(i,j) - UF_{n+1}(i,j)| + |UF_n(i,j-1) - UF_{n+1}(i,j-1)| + |UF_n(i,j+1) - UF_{n+1}(i,j+1)|
Ufield_diff23 = |UF_n(i,j) - UF_{n-1}(i,j)| + |UF_n(i,j-1) - UF_{n-1}(i,j-1)| + |UF_n(i,j+1) - UF_{n-1}(i,j+1)|
Inter-field differences of the V color-difference component:
Vfield_diff12 = |VF_n(i,j) - VF_{n+1}(i,j)| + |VF_n(i,j-1) - VF_{n+1}(i,j-1)| + |VF_n(i,j+1) - VF_{n+1}(i,j+1)|
Vfield_diff23 = |VF_n(i,j) - VF_{n-1}(i,j)| + |VF_n(i,j-1) - VF_{n-1}(i,j-1)| + |VF_n(i,j+1) - VF_{n-1}(i,j+1)|
The four luminance differences above, field_diff12, field_diff23, frm_diff02 and frm_diff13, are the inter-field luminance differences of the four input fields at the current pixel position (i,j): field_diff12 is the luminance difference between the current pixel of the current field n and the corresponding pixel of the following field n+1; field_diff23 is the luminance difference between the current pixel of field n and the corresponding pixel of the previous field n-1; frm_diff02 is the luminance difference between the current pixel of field n and the corresponding pixel of field n-2; and frm_diff13 is the luminance difference between the corresponding pixels of fields n+1 and n-1.
The two U differences Ufield_diff12 and Ufield_diff23 are the U color-difference values of the input fields at the current pixel position (i,j): Ufield_diff12 is taken between the current pixel of field n and the corresponding pixel of the following field n+1, and Ufield_diff23 between the current pixel of field n and the corresponding pixel of the previous field n-1.
The two V differences Vfield_diff12 and Vfield_diff23 are defined in the same way for the V color-difference component.
After this step, the current pixel position (i,j) has four luminance differences field_diff12, field_diff23, frm_diff02 and frm_diff13, two U color-difference values Ufield_diff12 and Ufield_diff23, and two V color-difference values Vfield_diff12 and Vfield_diff23.
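As an illustration of this step, the following is a minimal Python sketch, not the patent's own implementation. The dict-of-arrays data layout (mapping a field offset to a 2-D luminance array) and the function name `field_diffs` are assumptions introduced here for clarity.

```python
import numpy as np

def field_diffs(Y, i, j):
    """Inter-field luminance differences at pixel (i, j).

    Y is assumed to map a field offset to a 2-D numpy array of luminance
    values: Y[0] is the current field n, Y[1] the next field n+1,
    Y[-1] the previous field n-1, Y[-2] the second field before n.
    Each difference sums absolute luminance differences over the pixel
    and its two vertical neighbours (i, j-1) and (i, j+1).
    """
    def d(a, b):
        # sum of absolute differences over the three vertically adjacent pixels
        return sum(abs(int(Y[a][i, j + k]) - int(Y[b][i, j + k]))
                   for k in (-1, 0, 1))

    return {
        "field_diff12": d(0, 1),    # current field vs. next field
        "field_diff23": d(0, -1),   # current field vs. previous field
        "frm_diff02":   d(0, -2),   # current field vs. second previous field
        "frm_diff13":   d(1, -1),   # next field vs. previous field
    }
```

The U and V differences Ufield_diff12/23 and Vfield_diff12/23 would be obtained by calling `d(0, 1)` and `d(0, -1)` on the corresponding chroma planes.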
2. Comparison of the luminance differences with thresholds
Each luminance difference is compared with its own threshold to obtain the motion pattern of the current pixel, represented by the variable mov_mode, whose bits are computed as follows:
If field_diff23 > threshold thread1, then bit 0 of mov_mode is mov_mode[0] = 1; otherwise mov_mode[0] = 0.
If field_diff12 > threshold thread2, then bit 1 of mov_mode is mov_mode[1] = 1; otherwise mov_mode[1] = 0.
If frm_diff13 > threshold thread3, then bit 2 of mov_mode is mov_mode[2] = 1; otherwise mov_mode[2] = 0.
If frm_diff02 > threshold thread4, then bit 3 of mov_mode is mov_mode[3] = 1; otherwise mov_mode[3] = 0.
The purpose of these comparisons is to obtain the value of mov_mode, which is subsequently combined with other conditions to make the motion/static decision.
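The bit-packing described above can be sketched in Python as follows. The function name `mov_mode_bits` is introduced here for illustration; the threshold values thread1 to thread4 are empirical, as the patent states.

```python
def mov_mode_bits(field_diff23, field_diff12, frm_diff13, frm_diff02,
                  thread1, thread2, thread3, thread4):
    """Pack the four threshold comparisons into the 4-bit variable mov_mode.

    Bit 0 <- field_diff23 > thread1, bit 1 <- field_diff12 > thread2,
    bit 2 <- frm_diff13 > thread3,   bit 3 <- frm_diff02 > thread4.
    """
    mov_mode = 0
    if field_diff23 > thread1:
        mov_mode |= 1 << 0
    if field_diff12 > thread2:
        mov_mode |= 1 << 1
    if frm_diff13 > thread3:
        mov_mode |= 1 << 2
    if frm_diff02 > thread4:
        mov_mode |= 1 << 3
    return mov_mode
```

Packing the comparisons into a single integer lets the later decision rules (mov_mode = 0, 3, 7, 8, ranges such as 0 < mov_mode < 4) be tested directly on one value.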
3. Edge detection
3.1 Vertical-gradient calculation:
The vertical gradient at the current pixel of the current field is computed as:
grad_ver_n(i,j) = |YF_n(i,j) - YF_n(i+1,j)| + |YF_n(i,j-1) - YF_n(i+1,j-1)| + |YF_n(i,j+1) - YF_n(i+1,j+1)|
The vertical gradient at the corresponding pixel of the previous field is computed as:
grad_ver_{n-1}(i,j) = |YF_{n-1}(i,j) - YF_{n-1}(i+1,j)| + |YF_{n-1}(i,j-1) - YF_{n-1}(i+1,j-1)| + |YF_{n-1}(i,j+1) - YF_{n-1}(i+1,j+1)|
The vertical gradient at the corresponding pixel of the second field before the current field is computed as:
grad_ver_{n-2}(i,j) = |YF_{n-2}(i,j) - YF_{n-2}(i+1,j)| + |YF_{n-2}(i,j-1) - YF_{n-2}(i+1,j-1)| + |YF_{n-2}(i,j+1) - YF_{n-2}(i+1,j+1)|
The vertical gradient at the corresponding pixel of the field after the current field is computed as:
grad_ver_{n+1}(i,j) = |YF_{n+1}(i,j) - YF_{n+1}(i+1,j)| + |YF_{n+1}(i,j-1) - YF_{n+1}(i+1,j-1)| + |YF_{n+1}(i,j+1) - YF_{n+1}(i+1,j+1)|
3.2 Edge decision:
If all of the following conditions hold simultaneously:
grad_ver_n(i,j) > edge_thread
grad_ver_{n-1}(i,j) > edge_thread
grad_ver_{n-2}(i,j) > edge_thread
grad_ver_{n+1}(i,j) > edge_thread
then the edge flag edge_flag(i,j) = 1; otherwise edge_flag(i,j) = 0;
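A minimal sketch of the edge decision, using the same assumed dict-of-arrays layout as before (field offset to 2-D luminance array); the names `edge_flag` and `edge_thread` follow the patent text, while the data layout is hypothetical.

```python
def edge_flag(Y, i, j, edge_thread):
    """Edge decision: the pixel is flagged as an edge only if the vertical
    gradient exceeds edge_thread in all four input fields.

    Y maps field offsets {0, 1, -1, -2} to 2-D luminance arrays;
    edge_thread is an empirical threshold.
    """
    def grad_ver(f):
        # sum of absolute differences between row i and row i+1
        # over columns j-1, j, j+1 of field f
        return sum(abs(int(Y[f][i, j + k]) - int(Y[f][i + 1, j + k]))
                   for k in (-1, 0, 1))

    return int(all(grad_ver(f) > edge_thread for f in (0, 1, -1, -2)))
```

Requiring the gradient test to pass in all four fields guards against treating a transient, single-field gradient as a genuine spatial edge.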
4. Motion/static decision
Let mov_state_n(i,j), mov_state_{n-1}(i,j) and mov_state_{n-2}(i,j) denote the motion states of the current pixel of the current field and of the corresponding pixels of the two preceding fields, respectively; a value of 0 means static and a value of 1 means moving.
4.1 If mov_mode = 0 or mov_mode = 8, then the motion state of the current pixel is mov_state_n(i,j) = 0; otherwise:
4.2 If mov_mode > 3 and mov_mode ≠ 8, then mov_state_n(i,j) = 1; otherwise:
4.3 If mov_mode > 0 and mov_mode < 4, the decision additionally needs the motion-detection results of the two preceding fields, as follows:
4.3.1 If the corresponding pixels of the two preceding fields are both moving, then mov_state_n(i,j) = 1; otherwise:
4.3.2 If the corresponding pixels of the two preceding fields are both static, then mov_state_n(i,j) = 0; otherwise (i.e. neither 4.3.1 nor 4.3.2 is satisfied):
4.3.3 If the edge flag edge_flag(i,j) = 1, then mov_state_n(i,j) = 0; otherwise:
4.3.4 If mov_mode = 3, then mov_state_n(i,j) = 1; otherwise:
4.3.5 mov_state_n(i,j) = 0;
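The cascade of rules 4.1 to 4.3 can be transcribed directly as a Python sketch. The function name `motion_state` and the argument names are introduced here; the rule order matches the patent text.

```python
def motion_state(mov_mode, prev_state, prev2_state, edge):
    """Motion/static decision: returns 1 (moving) or 0 (static).

    prev_state / prev2_state are the states of the corresponding pixels
    in the two preceding fields; edge is the edge flag of step 3.
    """
    if mov_mode == 0 or mov_mode == 8:        # 4.1: clearly static patterns
        return 0
    if mov_mode > 3 and mov_mode != 8:        # 4.2: clearly moving patterns
        return 1
    # 4.3: 0 < mov_mode < 4, consult the history and the edge flag
    if prev_state == 1 and prev2_state == 1:  # 4.3.1: both predecessors moving
        return 1
    if prev_state == 0 and prev2_state == 0:  # 4.3.2: both predecessors static
        return 0
    if edge == 1:                             # 4.3.3: on an edge, treat as static
        return 0
    if mov_mode == 3:                         # 4.3.4
        return 1
    return 0                                  # 4.3.5: default to static
```

Note that the edge test only breaks ties: it is consulted exactly when the two preceding fields disagree and the luminance evidence is weak.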
5, the correction of motion state judged result
In the step in front, tentatively the motion state of current pixel being judged, next is exactly to utilize two color difference components that motion detection result is proofreaied and correct.It is very little because of difference between the field of luminance component (Y component) in the scene of some motion adopting color difference components that motion detection result is proofreaied and correct, if it is static that the judgement of only carrying out motion state according to difference between the field of Y component can be judged as, so at this moment just consider two color difference components of use the judged result of front is proofreaied and correct.
If kinematic variables mov_mode<9 and mov_mode ≠ 3 and mov_mode ≠ 7 are then revised the motion state when the front court pixel according to difference between the field of chromatic component:
If 5.1 satisfy simultaneously:
Ufield_diff12>UVThread
Ufield_diff23>UVThread
Vfield_diff12>UVThread
Vfield_diff23>UVThread
Mov_state then
n(i, j)=1, otherwise the motion state that obtains was judged in (4.1) ~ (4.3) above the motion state of current pixel point kept;
If 5.2 the discontented mov_mode<9﹠amp that is enough to; Mov_mode unequal to 3﹠amp; Mov_mode unequal to 7,4.1 ~ 4.3 judged the state that obtains above the motion state of current pixel point kept.
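The chroma-based correction can be sketched as follows; `correct_motion_state` is an illustrative name, and UVThread is the empirical chroma threshold named in the patent.

```python
def correct_motion_state(state, mov_mode,
                         Ufield_diff12, Ufield_diff23,
                         Vfield_diff12, Vfield_diff23, UVThread):
    """Step-5 correction: when the luminance-only decision may be unreliable
    (mov_mode < 9 and mov_mode not in {3, 7}), a pixel whose U and V
    inter-field differences all exceed UVThread is forced to 'moving'.
    Otherwise the result of step 4 is kept unchanged.
    """
    if mov_mode < 9 and mov_mode != 3 and mov_mode != 7:
        if (Ufield_diff12 > UVThread and Ufield_diff23 > UVThread and
                Vfield_diff12 > UVThread and Vfield_diff23 > UVThread):
            return 1
    return state  # keep the preliminary decision of step 4
```

The correction is deliberately one-sided: strong chroma motion can promote a pixel from static to moving, but weak chroma differences never demote a pixel already judged to be moving.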
Through the above steps, the edge-detection result and the color-difference components participate in the motion-state decision, which improves the robustness of motion detection and lays a good foundation for de-interlacing interlaced video.
Although illustrative embodiments of the present invention have been described above so that those skilled in the art may understand the invention, it should be clear that the invention is not limited in scope to these embodiments. To those skilled in the art, as long as the various changes fall within the spirit and scope of the present invention as defined and determined by the appended claims, these changes are obvious, and all inventions and creations that make use of the inventive concept are under protection.
Claims (1)
1. An adaptive motion detection method based on edge detection, characterized by comprising the following steps:
(1) Inter-field difference calculation:
(1.1) Inter-field differences of the luminance component:
field_diff12 = |YF_n(i,j) - YF_{n+1}(i,j)| + |YF_n(i,j-1) - YF_{n+1}(i,j-1)| + |YF_n(i,j+1) - YF_{n+1}(i,j+1)|
field_diff23 = |YF_n(i,j) - YF_{n-1}(i,j)| + |YF_n(i,j-1) - YF_{n-1}(i,j-1)| + |YF_n(i,j+1) - YF_{n-1}(i,j+1)|
frm_diff02 = |YF_n(i,j) - YF_{n-2}(i,j)| + |YF_n(i,j-1) - YF_{n-2}(i,j-1)| + |YF_n(i,j+1) - YF_{n-2}(i,j+1)|
frm_diff13 = |YF_{n+1}(i,j) - YF_{n-1}(i,j)| + |YF_{n+1}(i,j-1) - YF_{n-1}(i,j-1)| + |YF_{n+1}(i,j+1) - YF_{n-1}(i,j+1)|
(1.2) Inter-field differences of the U color-difference component:
Ufield_diff12 = |UF_n(i,j) - UF_{n+1}(i,j)| + |UF_n(i,j-1) - UF_{n+1}(i,j-1)| + |UF_n(i,j+1) - UF_{n+1}(i,j+1)|
Ufield_diff23 = |UF_n(i,j) - UF_{n-1}(i,j)| + |UF_n(i,j-1) - UF_{n-1}(i,j-1)| + |UF_n(i,j+1) - UF_{n-1}(i,j+1)|
(1.3) Inter-field differences of the V color-difference component:
Vfield_diff12 = |VF_n(i,j) - VF_{n+1}(i,j)| + |VF_n(i,j-1) - VF_{n+1}(i,j-1)| + |VF_n(i,j+1) - VF_{n+1}(i,j+1)|
Vfield_diff23 = |VF_n(i,j) - VF_{n-1}(i,j)| + |VF_n(i,j-1) - VF_{n-1}(i,j-1)| + |VF_n(i,j+1) - VF_{n-1}(i,j+1)|
where YF_n, YF_{n+1}, YF_{n-1} and YF_{n-2} denote the luminance (Y) component of the pixels of the current field, the field after the current field, the field before the current field, and the second field before the current field, respectively;
UF_n, UF_{n+1}, UF_{n-1} and UF_{n-2} denote the U color-difference component of the pixels of those same four fields;
VF_n, VF_{n+1}, VF_{n-1} and VF_{n-2} denote the V color-difference component of the pixels of those same four fields;
and where (i,j) denotes the spatial position of the current pixel in the image, (i,j-1) the position of the pixel vertically above the current pixel, and (i,j+1) the position of the pixel vertically below it;
(2) Comparison of the luminance differences with thresholds
Each luminance difference is compared with its own threshold to obtain the motion pattern of the current pixel, represented by the variable mov_mode, whose bits are computed as follows:
If field_diff23 > threshold thread1, then bit 0 of mov_mode is mov_mode[0] = 1; otherwise mov_mode[0] = 0.
If field_diff12 > threshold thread2, then bit 1 of mov_mode is mov_mode[1] = 1; otherwise mov_mode[1] = 0.
If frm_diff13 > threshold thread3, then bit 2 of mov_mode is mov_mode[2] = 1; otherwise mov_mode[2] = 0.
If frm_diff02 > threshold thread4, then bit 3 of mov_mode is mov_mode[3] = 1; otherwise mov_mode[3] = 0.
The thresholds thread1 to thread4 are determined empirically.
(3) Edge detection
(3.1) Vertical-gradient calculation:
The vertical gradient at the current pixel of the current field is computed as:
grad_ver_n(i,j) = |YF_n(i,j) - YF_n(i+1,j)| + |YF_n(i,j-1) - YF_n(i+1,j-1)| + |YF_n(i,j+1) - YF_n(i+1,j+1)|
The vertical gradient at the corresponding pixel of the previous field is computed as:
grad_ver_{n-1}(i,j) = |YF_{n-1}(i,j) - YF_{n-1}(i+1,j)| + |YF_{n-1}(i,j-1) - YF_{n-1}(i+1,j-1)| + |YF_{n-1}(i,j+1) - YF_{n-1}(i+1,j+1)|
The vertical gradient at the corresponding pixel of the second field before the current field is computed as:
grad_ver_{n-2}(i,j) = |YF_{n-2}(i,j) - YF_{n-2}(i+1,j)| + |YF_{n-2}(i,j-1) - YF_{n-2}(i+1,j-1)| + |YF_{n-2}(i,j+1) - YF_{n-2}(i+1,j+1)|
The vertical gradient at the corresponding pixel of the field after the current field is computed as:
grad_ver_{n+1}(i,j) = |YF_{n+1}(i,j) - YF_{n+1}(i+1,j)| + |YF_{n+1}(i,j-1) - YF_{n+1}(i+1,j-1)| + |YF_{n+1}(i,j+1) - YF_{n+1}(i+1,j+1)|
(3.2) Edge decision:
If all of the following conditions hold simultaneously:
grad_ver_n(i,j) > edge_thread
grad_ver_{n-1}(i,j) > edge_thread
grad_ver_{n-2}(i,j) > edge_thread
grad_ver_{n+1}(i,j) > edge_thread
then the edge flag edge_flag(i,j) = 1; otherwise edge_flag(i,j) = 0;
(4) Motion/static decision
Let mov_state_n(i,j), mov_state_{n-1}(i,j) and mov_state_{n-2}(i,j) denote the motion states of the current pixel of the current field and of the corresponding pixels of the two preceding fields, respectively; a value of 0 means static and a value of 1 means moving.
(4.1) If mov_mode = 0 or mov_mode = 8, then the motion state of the current pixel is mov_state_n(i,j) = 0; otherwise:
(4.2) If mov_mode > 3 and mov_mode ≠ 8, then mov_state_n(i,j) = 1; otherwise:
(4.3) If mov_mode > 0 and mov_mode < 4:
(4.3.1) If the corresponding pixels of the two preceding fields are both moving, then mov_state_n(i,j) = 1; otherwise:
(4.3.2) If the corresponding pixels of the two preceding fields are both static, then mov_state_n(i,j) = 0; otherwise:
(4.3.3) If the edge flag edge_flag(i,j) = 1, then mov_state_n(i,j) = 0; otherwise:
(4.3.4) If mov_mode = 3, then mov_state_n(i,j) = 1; otherwise:
(4.3.5) mov_state_n(i,j) = 0;
(5) Correction of the motion-state decision
If mov_mode < 9 and mov_mode ≠ 3 and mov_mode ≠ 7, the motion state of the current pixel is corrected according to the inter-field differences of the color-difference components:
(5.1) If the following conditions hold simultaneously:
Ufield_diff12 > UVThread
Ufield_diff23 > UVThread
Vfield_diff12 > UVThread
Vfield_diff23 > UVThread
then mov_state_n(i,j) = 1; otherwise the motion state of the current pixel keeps the result of steps (4.1) to (4.3) above;
(5.2) If the condition "mov_mode < 9 and mov_mode ≠ 3 and mov_mode ≠ 7" is not satisfied, the motion state of the current pixel keeps the result of steps (4.1) to (4.3) above.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201210476628.XA CN102946504B (en) | 2012-11-22 | 2012-11-22 | Self-adaptive moving detection method based on edge detection |
Publications (2)
Publication Number | Publication Date |
---|---|
CN102946504A true CN102946504A (en) | 2013-02-27 |
CN102946504B CN102946504B (en) | 2015-02-18 |
Family
ID=47729408
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201210476628.XA Active CN102946504B (en) | 2012-11-22 | 2012-11-22 | Self-adaptive moving detection method based on edge detection |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN102946504B (en) |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6597737B1 (en) * | 1998-09-03 | 2003-07-22 | Sony Corporation | Motion determining apparatus, method thereof, and picture information converting apparatus |
CN1479517A (en) * | 2003-07-07 | 2004-03-03 | 西安交通大学 | Realizing method of digitized processing TV interlaced scanning format conversion |
CN1846437A (en) * | 2003-09-07 | 2006-10-11 | 微软公司 | Innovations in coding and decoding macroblock and motion information for interlaced and progressive scan video |
US20070263905A1 (en) * | 2006-05-10 | 2007-11-15 | Ching-Hua Chang | Motion detection method and apparatus |
CN102045530A (en) * | 2010-12-30 | 2011-05-04 | 北京中科大洋科技发展股份有限公司 | Motion adaptive deinterleaving method based on edge detection |
Non-Patent Citations (4)
Title |
---|
GWANGGIL JEON: "Fuzzy rough sets hybrid scheme for motion and scene complexity adaptive deinterlacing", 《IMAGE AND VISION COMPUTING》 * |
伍刘: "Adaptive de-interlacing algorithm with motion detection and its GPU implementation", 《电视技术》 * |
李火生: "A new motion-adaptive de-interlacing algorithm", 《电视技术》 * |
王翠喜: "Research and simulation of de-interlacing algorithms in digital HD flat-panel televisions", 《信息科技辑》 * |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110910429A (en) * | 2019-11-19 | 2020-03-24 | 普联技术有限公司 | Moving target detection method and device, storage medium and terminal equipment |
CN110910429B (en) * | 2019-11-19 | 2023-03-17 | 成都市联洲国际技术有限公司 | Moving target detection method and device, storage medium and terminal equipment |
Also Published As
Publication number | Publication date |
---|---|
CN102946504B (en) | 2015-02-18 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN101600061B (en) | Video motion-adaptive de-interlacing method and device therefor | |
CN100375503C (en) | Noise reduction apparatus | |
US7787048B1 (en) | Motion-adaptive video de-interlacer | |
US9013584B2 (en) | Border handling for motion compensated temporal interpolator using camera model | |
US8289446B2 (en) | Intermediate frame occlusion estimation system using projected vectors | |
US20110211128A1 (en) | Occlusion adaptive motion compensated interpolator | |
US20100310191A1 (en) | Image processing apparatus and image processing method | |
CN101867759A (en) | Self-adaptive motion compensation frame frequency promoting method based on scene detection | |
US8542322B2 (en) | Motion compensated interpolation system using combination of full and intermediate frame occlusion | |
US20170249724A1 (en) | Object speed weighted motion compensated interpolation | |
CN102946505A (en) | Self-adaptive motion detection method based on image block statistics | |
CN102918831A (en) | Resolution evaluation device, image processing apparatus, and image display apparatus | |
CN102215368A (en) | Motion self-adaptive de-interlacing method based on visual characteristics | |
CN102447870A (en) | Detection method for static objects and motion compensation device | |
US8345148B2 (en) | Method and system for inverse telecine and scene change detection of progressive video | |
CN101510985A (en) | Self-adapting de-interleave method for movement compensation accessory movement | |
CN102497524B (en) | Edge adaptive de-interlacing interpolation method | |
CN102364933A (en) | Motion-classification-based adaptive de-interlacing method | |
US8391372B2 (en) | Method of doubling frame rate of video signals | |
CN102946504B (en) | Self-adaptive moving detection method based on edge detection | |
Jeon et al. | Fuzzy rule-based edge-restoration algorithm in HDTV interlaced sequences | |
CN1199449C (en) | Method for removing intersection by adopting error protection motion compensation and its equipment | |
CN101577805B (en) | Method for converting an image and image conversion unit | |
CN201222771Y (en) | High speed edge self-adapting de-interlaced interpolation device | |
CN102497492B (en) | Detection method for subtitle moving in screen |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
C14 | Grant of patent or utility model | ||
GR01 | Patent grant |