WO2017101348A1 - Deinterlacing method and device for interlaced video - Google Patents

Deinterlacing method and device for interlaced video

Info

Publication number
WO2017101348A1
WO2017101348A1 (PCT/CN2016/088690)
Authority
WO
WIPO (PCT)
Prior art keywords
video
field
pixel
video frame
field effect
Prior art date
Application number
PCT/CN2016/088690
Other languages
English (en)
French (fr)
Inventor
白茂生
Original Assignee
乐视控股(北京)有限公司
乐视云计算有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 乐视控股(北京)有限公司, 乐视云计算有限公司
Priority to US15/247,660 priority Critical patent/US20170171501A1/en
Publication of WO2017101348A1 publication Critical patent/WO2017101348A1/zh

Links

Images

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/01Conversion of standards, e.g. involving analogue television standards or digital television standards processed at pixel level
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/01Conversion of standards, e.g. involving analogue television standards or digital television standards processed at pixel level
    • H04N7/0117Conversion of standards, e.g. involving analogue television standards or digital television standards processed at pixel level involving conversion of the spatial resolution of the incoming video signal
    • H04N7/012Conversion between an interlaced and a progressive signal

Definitions

  • the present invention relates to the field of video signal processing technologies, and in particular, to a deinterlacing method and apparatus for interlaced video.
  • the PAL TV standard is 25 frames per second, that is, the video we usually watch changes 25 images per second, and the human eye does not feel flicker due to the visual persistence effect.
  • Each frame of image is divided into two fields for scanning.
  • scanning here means that the electron beam in the picture tube sweeps horizontally, line by line, from top to bottom.
  • the first field scans odd lines first
  • the second field scans the even lines; this is what is commonly called interlaced scanning, in which one frame is complete after both fields have been scanned.
  • the field frequency is 50 Hz and the frame rate is 25 Hz
  • the odd field and the even field scan the same frame image; unless the image is completely still, two adjacent frames differ.
  • field video: to accommodate interlaced-scan devices, field video has existed for many years. As technology has developed, field video shows a pronounced field effect on progressive-scan devices (such as LCD displays): the more severe the motion in the picture, the worse the combing artifacts, which seriously degrades the viewing experience.
  • progressive scan devices such as LCD display devices
  • the invention provides a deinterlacing method and device for interlaced video, which overcomes the prior-art defect that field video exhibits an obvious field effect on progressive-scan devices, and achieves good picture quality during video playback.
  • the invention provides a deinterlacing method for interlaced video, comprising:
  • the pending video is detected as a field video
  • Deinterlace processing is performed on all pixels in each video frame that needs to be deinterlaced.
  • the invention also provides an interlaced device for interlaced video, comprising:
  • a detecting module configured to detect that the to-be-processed video is a field video
  • a determining module configured to determine a video frame in the field video that needs to be deinterlaced
  • a processing module for deinterlacing all pixels in a video frame that needs to be deinterlaced.
  • the present invention also provides a deinterlacing device for interlaced video, comprising: a memory, a processor, wherein
  • the memory is configured to store one or more instructions, wherein the one or more instructions are for execution by the processor;
  • the processor is configured to detect that the to-be-processed video is a field video
  • the present application determines a video frame in the field video that needs to be deinterlaced; and performs deinterlacing processing on all pixels in each video frame that needs to be deinterlaced. A low-cost, high-efficiency deinterlacing process is achieved, while improving the quality of the processed video.
  • FIG. 1 is a technical flowchart of Embodiment 1 of the present invention.
  • FIG. 2 is a schematic diagram of field effect point detection according to the present invention.
  • FIG. 3 is a technical flowchart of Embodiment 2 of the present invention.
  • FIG. 4 is a schematic structural diagram of a device according to Embodiment 3 of the present invention.
  • FIG. 5 is a schematic diagram of device connection according to Embodiment 4 of the present application.
  • FIG. 1 is a technical flowchart of Embodiment 1 of the present invention.
  • Embodiment 1 of the present invention is a deinterlacing method based on local information, implemented in two main steps:
  • Step 110 Detect pixel points in the video frame one by one, and determine whether the pixel point is a field effect point
  • Interlaced and Progressive are methods in which a display device represents a moving image.
  • the interlaced scanning method is that each frame is divided into two fields and alternately displayed.
  • the progressive scanning method displays the entire content of each frame at once.
  • the usual LCD TV display screen scanning method is from left to right, from top to bottom, scanning a fixed number of frames per second.
  • each frame of image is continuously scanned by electron beams sequentially one after another, and this scanning method is called progressive scanning.
  • this scanning method is called progressive scanning.
  • each frame of the image must be scanned for an integer number of lines.
  • Interlaced scanning is to divide each frame into two fields. Each field contains all odd or even scan lines in one frame. Usually, the odd field is scanned first to get the first field, and then the even line is scanned to get the second field. Due to the persistence of vision, the human eye will see a smooth motion rather than a flashing half-frame half-frame image.
  • the line scan frequency of interlaced scanning is half that of progressive scanning, so the spectrum of the television signal and the channel bandwidth for transmitting the signal are also half of the progressive scanning. In this way, after interlaced scanning, the channel utilization rate is doubled in the case where the image quality is not degraded. Due to the reduction in channel bandwidth, the complexity and cost of the system and equipment are also reduced.
  • in a field image, the pixel-value differences between adjacent rows are large, while the differences between rows two lines apart are small.
  • in a frame image, the pixel values of adjacent rows and of rows two lines apart differ little. This feature is therefore used to detect field-effect points.
  • Step 110 is further implemented by steps 111 to 112.
  • Step 111: Acquire a first pixel difference between the pixel and the pixel at the same position in the adjacent row, and a second pixel difference between the pixel and the pixel at the same position in the row two lines away;
  • the pixel value p(i, j) of the i-th pixel of row j, the pixel value p(i, j+1) of the i-th pixel of row j+1, and the pixel value p(i, j+2) of the i-th pixel of row j+2 of the video frame are respectively acquired, and d1 = |p(i,j) − p(i,j+1)| and d2 = |p(i,j) − p(i,j+2)| are computed.
  • d1 is the first pixel difference value
  • d2 is the second pixel difference value
  • Step 112: Determine whether the pixel is the field-effect point according to a preset similarity threshold and a preset difference threshold.
  • the pixel is judged to be the field-effect point when the first and second pixel differences satisfy d1 > diff_thd && d2 < simi_thd, where simi_thd is the similarity threshold, diff_thd is the difference threshold, and && denotes the logical AND operation.
  • step 110 is performed for each pixel; if a pixel is judged to be a field-effect point, it together with the pixels immediately before and after it (three pixels in total) are all marked as field-effect points.
  • the coordinate origin of the video frame is at the upper left corner of the image
  • the frame width is width and the height is height.
  • an array mask[height][width] of the same size as the image is allocated and cleared to 0 before each frame is processed. If p[y][x] is a field-effect point, then mask[y][x-1], mask[y][x], and mask[y][x+1] are set to 1.
  • Step 120 When it is determined that the pixel point is a field effect point, the pixel point is deinterlaced.
  • deinterlacing is the conversion of interlaced video to progressive video. Usually this is a process in which the amount of data is doubled and the amount of information is constant.
  • step 110 After the detection of step 110 is completed for each frame, a mask array mask that marks all field effect points of the current frame is obtained. After traversing the array mask, if the value of the mask array corresponding to the current pixel is 1, it indicates that the current point is a field effect point, and then the current point is deinterlaced. Otherwise skip the processing of the current point.
  • the de-interlacing algorithm used in the embodiment of the present invention is a YADIF (Yet Another DeInterlacing Filter) algorithm.
  • YADIF Yet Another DeInterlacing Filter
  • the field-effect points in the image are detected in advance and deinterlaced, converting the field video into frame video; this realizes low-cost, high-efficiency deinterlacing, mitigates the obvious field effect when field video is displayed on progressive-scan devices, and improves the quality of the processed video.
  • FIG. 3 is a technical flowchart of Embodiment 2 of the present invention
  • FIG. 3 is a de-interlacing method for interlaced video according to an embodiment of the present invention.
  • the video frame is determined to be a frame that requires deinterlacing.
  • the YADIF algorithm is used to deinterlace all pixels in a video frame that needs to be deinterlaced.
  • the detection of all field-effect points in the frame is performed for each video frame: allocate an array mask[height][width] of the same size as the image, cleared to 0 before each frame is processed; if p[y][x] is a field-effect point, mask[y][x-1], mask[y][x], and mask[y][x+1] are all set to 1.
  • field-video determination is performed first, i.e. it is decided whether the current pending video is field video; if it is not, deinterlacing is skipped entirely, saving time while guaranteeing quality.
  • the current video to be processed is a field video
  • YADIF Yet Another DeInterlacing Filter
  • by deinterlacing locally first, the invention improves processing speed; the subsequent complete deinterlacing pass then improves processing quality.
  • an interlaced video-based deinterlacing device mainly includes:
  • the detecting module 41 is configured to detect that the to-be-processed video is a field video
  • a determining module 42 configured to determine a video frame in the field video that needs to be deinterlaced
  • the processing module 43 is configured to perform deinterlacing processing on all pixels in each video frame that needs to be deinterlaced.
  • the detecting module 41 is further configured to detect pixel points in each video frame in the to-be-processed video one by one, and determine whether the pixel point is a field effect point;
  • the processing module 43 is further configured to perform deinterlacing processing on the field effect point when determining that the pixel point is a field effect point.
  • the detecting module 41 is specifically configured to:
  • the determining module 42 is specifically configured to:
  • the processing module 43 is configured to:
  • the YADIF algorithm is used to deinterlace all pixels in a video frame that needs to be deinterlaced.
  • the apparatus shown in FIG. 4 can perform the method of the embodiment shown in FIG. 1 or FIG. 3, and the implementation principle and technical effects are not described again.
  • an interlaced video-based deinterlacing device mainly includes: a memory 501 and a processor 502, where
  • the memory 501 is configured to store one or more instructions, where the one or more instructions are used by the processor 502 to invoke execution;
  • the processor 502 is configured to detect that the to-be-processed video is a field video
  • the processor 502 is further configured to detect pixel points in each video frame in the to-be-processed video one by one, and determine whether the pixel point is a field effect point;
  • the processor 502 is further configured to determine, according to the detected number of field-effect points in each video frame, that the video frame is an obvious-field image frame if the number of field-effect points is greater than a preset obvious-field threshold;
  • the processor 502 is further configured to determine, according to the detected number of field-effect points in each video frame, that the video frame requires deinterlacing if the number of field-effect points is greater than a preset single-frame processing threshold.
  • the processor 502 is further configured to perform deinterlace processing on all pixels in a video frame that needs to be deinterlaced by using a YADIF algorithm.
  • the device embodiments described above are merely illustrative: units described as separate components may or may not be physically separate, and components shown as units may or may not be physical units, i.e. they may be located in one place or distributed over multiple network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the embodiment's solution, which those of ordinary skill in the art can understand and implement without creative effort.

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Graphics (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Television Systems (AREA)

Abstract

A deinterlacing method and device for interlaced video. The pending video is detected to be field video; the video frames in the field video that require deinterlacing are determined; deinterlacing is performed on all pixels in each such frame. Both the speed of deinterlacing and the quality of the processed image are improved.

Description

Deinterlacing method and device for interlaced video
Cross-reference
This application claims the benefit of Chinese patent application No. 201510927358.3, filed on December 14, 2015 and entitled "Deinterlacing method and device for interlaced video", which is incorporated herein by reference in its entirety.
Technical field
The present invention relates to the field of video signal processing, and in particular to a deinterlacing method and device for interlaced video.
Background
The PAL television standard runs at 25 frames per second: the video we normally watch changes its image 25 times per second, and thanks to the persistence of vision the human eye perceives no flicker. Each frame is scanned as two fields, where scanning means that the electron beam in the picture tube sweeps horizontally, line by line, from top to bottom; the first field scans the odd lines and the second field scans the even lines. This is what is commonly called interlaced scanning: one frame is complete after both fields have been scanned. With a field rate of 50 Hz and a frame rate of 25 Hz, the odd field and the even field scan the same frame, so unless the image is completely still, two adjacent frames differ.
Field video has existed for many years to accommodate interlaced-scan devices. As technology has advanced, field video exhibits a pronounced field effect on progressive-scan devices (such as LCD displays): the more severe the motion in the picture, the worse the combing artifacts, which seriously degrades the viewing experience.
Therefore, a deinterlacing method is urgently needed.
Summary of the invention
The present invention provides a deinterlacing method and device for interlaced video, to overcome the prior-art defect that field video exhibits an obvious field effect on progressive-scan devices, and to achieve good picture quality during video playback.
The present invention provides a deinterlacing method for interlaced video, comprising:
detecting that the pending video is field video;
determining the video frames in the field video that require deinterlacing;
performing deinterlacing on all pixels in each video frame that requires it.
The present invention further provides a deinterlacing device for interlaced video, comprising:
a detection module, configured to detect that the pending video is field video;
a determination module, configured to determine the video frames in the field video that require deinterlacing;
a processing module, configured to perform deinterlacing on all pixels in each video frame that requires it.
The present invention further provides deinterlacing equipment for interlaced video, comprising a memory and a processor, wherein:
the memory is configured to store one or more instructions for the processor to invoke and execute;
the processor is configured to detect that the pending video is field video;
to determine the video frames in the field video that require deinterlacing;
and to perform deinterlacing on all pixels in each video frame that requires it.
Compared with the prior art, the present application achieves the following technical effects:
upon detecting that the pending video is field video, the application determines the video frames in the field video that require deinterlacing, and deinterlaces all pixels in each such frame. This realizes low-cost, high-efficiency deinterlacing while improving the quality of the processed video.
Brief description of the drawings
The drawings described here provide a further understanding of the invention and form part of this application; the illustrative embodiments of the invention and their description explain the invention and do not unduly limit it. In the drawings:
Fig. 1 is a technical flowchart of Embodiment 1 of the present invention;
Fig. 2 is a schematic diagram of field-effect point detection according to the present invention;
Fig. 3 is a technical flowchart of Embodiment 2 of the present invention;
Fig. 4 is a schematic structural diagram of the device of Embodiment 3 of the present invention;
Fig. 5 is a schematic connection diagram of the equipment of Embodiment 4 of the present application.
Detailed description
To make the objectives, technical solutions, and advantages of the embodiments of the invention clearer, the technical solutions in the embodiments are described below clearly and completely with reference to the accompanying drawings. The described embodiments are only some, not all, of the embodiments of the invention. All other embodiments obtained by a person of ordinary skill in the art without creative effort, based on the embodiments of the invention, fall within the scope of protection of the invention.
Embodiment 1
Fig. 1 is the technical flowchart of Embodiment 1 of the present invention. With reference to Fig. 1, this embodiment is a deinterlacing method based on local information, implemented in two main steps:
Step 110: examine the pixels within a video frame one by one and determine whether each pixel is a field-effect point.
Interlaced and progressive scanning are both ways a display device renders a moving image: in interlaced scanning each frame is split into two fields that are displayed alternately, whereas in progressive scanning the entire frame is displayed at once. A typical LCD television scans the picture from left to right and top to bottom, at a fixed number of frames per second.
In progressive scanning, each frame is traced by the electron beam continuously, line after line in sequence. To obtain a stable progressive image, each frame must be scanned for an integer number of lines.
In interlaced scanning, each frame is split into two fields, each containing all the odd or all the even scan lines of the frame; usually the odd lines are scanned first to produce the first field, then the even lines to produce the second. Because of the persistence of vision, the human eye perceives smooth motion rather than flickering half-frame images. The line-scan frequency of interlaced scanning is half that of progressive scanning, so the spectrum of the television signal and the channel bandwidth needed to transmit it are also halved. Interlacing thus doubles channel utilization with little loss of image quality, and the reduced channel bandwidth also lowers the complexity and cost of the system and equipment.
However, when a field image is displayed on a progressive-scan device, the field effect is very obvious.
Extensive experiments and analysis show that in a field image the pixel values of adjacent rows differ greatly while those of rows two lines apart differ little, whereas in a frame image neither differs much. This feature is therefore used to detect field-effect points.
Step 110 is further implemented by steps 111 and 112.
Step 111: obtain the first pixel difference between the pixel and the pixel at the same position in the adjacent row, and the second pixel difference between the pixel and the pixel at the same position in the row two lines away.
As shown in Fig. 2, obtain the pixel value p(i,j) of the i-th pixel of row j, the pixel value p(i,j+1) of the i-th pixel of row j+1, and the pixel value p(i,j+2) of the i-th pixel of row j+2 of the video frame.
Compute the pixel difference between the same-position pixels of adjacent rows and between the same-position pixels of rows two lines apart:
d1 = |p(i,j) − p(i,j+1)|
d2 = |p(i,j) − p(i,j+2)|
where d1 is the first pixel difference and d2 is the second pixel difference.
Step 112: determine, according to a preset similarity threshold and a preset difference threshold, whether the pixel is a field-effect point.
The pixel is judged to be a field-effect point when the first and second pixel differences satisfy:
d1 > diff_thd && d2 < simi_thd
where simi_thd is the similarity threshold, diff_thd is the difference threshold, and && denotes the logical AND operation.
Both thresholds are empirical values, typically preset as simi_thd = 10 and diff_thd = 30.
Note that step 110 is executed for every pixel of a frame; if a pixel is judged to be a field-effect point, it and its immediate left and right neighbours (three pixels in total) are all marked as field-effect points.
In a concrete implementation, let the coordinate origin of the video frame be the top-left corner of the image, with frame width width and height height. Allocate for each frame an array mask[height][width] of the same size as the image, cleared to 0 before processing; if p[y][x] is a field-effect point, set mask[y][x-1], mask[y][x], and mask[y][x+1] to 1.
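The detection and mask-marking steps above can be sketched in a few lines of Python. This is an illustrative sketch, not the patent's implementation: the function names and the representation of a frame as a list of rows of luma values are assumptions, while the d1/d2 test, the thresholds, and the mask-filling rule follow the text.

```python
SIMI_THD = 10   # similarity threshold simi_thd (empirical value from the text)
DIFF_THD = 30   # difference threshold diff_thd (empirical value from the text)

def is_field_effect_point(frame, x, y):
    """A pixel is a field-effect point when it differs strongly from the
    adjacent row (d1) but little from the row two lines away (d2)."""
    height = len(frame)
    if y + 2 >= height:
        return False  # rows y+1 and y+2 must exist
    d1 = abs(frame[y][x] - frame[y + 1][x])  # first pixel difference (adjacent row)
    d2 = abs(frame[y][x] - frame[y + 2][x])  # second pixel difference (two rows apart)
    return d1 > DIFF_THD and d2 < SIMI_THD

def build_mask(frame):
    """Return mask[height][width]: 1 marks each field-effect point plus its
    left and right neighbours, as the text describes."""
    height, width = len(frame), len(frame[0])
    mask = [[0] * width for _ in range(height)]
    for y in range(height):
        for x in range(width):
            if is_field_effect_point(frame, x, y):
                for nx in (x - 1, x, x + 1):
                    if 0 <= nx < width:
                        mask[y][nx] = 1
    return mask
```

On a synthetic "combed" frame whose rows alternate between two values, every testable pixel is flagged, while a flat (progressive-like) frame yields an all-zero mask.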
Step 120: when the pixel is judged to be a field-effect point, deinterlace it.
Deinterlacing is the conversion of interlaced video into progressive video; in simple terms, it is a process in which the amount of data doubles while the amount of information stays constant.
After the detection of step 110 is completed for a frame, a mask array marking all of the frame's field-effect points is available. The array is then traversed: if the mask value for the current pixel is 1, the pixel is a field-effect point and is deinterlaced; otherwise the pixel is skipped.
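The mask traversal and selective processing can be sketched as follows. This is a hedged simplification: the patent applies YADIF to the flagged points, whereas this sketch substitutes plain averaging of the lines above and below (a far cruder interpolation) purely to show the selective control flow; the function name and the parity convention are assumptions.

```python
def deinterlace_marked(frame, mask, parity=0):
    """Selectively deinterlace: only pixels flagged in `mask`, and only on
    lines of the non-reference field, are replaced by the average of the
    lines above and below. Stand-in for the YADIF step in the text."""
    height, width = len(frame), len(frame[0])
    out = [row[:] for row in frame]          # copy; untouched pixels pass through
    for y in range(height):
        if y % 2 == parity:                  # keep the reference field's lines
            continue
        for x in range(width):
            if mask[y][x] and 0 < y < height - 1:
                out[y][x] = (frame[y - 1][x] + frame[y + 1][x]) // 2
    return out
```

Unflagged pixels and border lines are passed through unchanged, matching the "otherwise skip the current point" behaviour in the text.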
The deinterlacing algorithm used in this embodiment of the invention is the YADIF (Yet Another DeInterlacing Filter) algorithm; for deinterlacing algorithms, refer to the relevant material in the prior art, which is not repeated here.
By detecting the field-effect points in the image in advance and deinterlacing those points, this embodiment converts field video into frame video, achieving low-cost, high-efficiency deinterlacing, mitigating the obvious field effect when field video is displayed on progressive-scan devices, and improving the quality of the processed video.
Embodiment 2
Building on the local-information-based deinterlacing method of the embodiment of Fig. 1, Fig. 3 is the technical flowchart of Embodiment 2. With reference to Fig. 3, a deinterlacing method for interlaced video comprises the following steps:
301. Detect that the pending video is field video.
Based on the embodiment of Fig. 1, a concrete implementation includes:
examining the pixels in each video frame of the pending video one by one and determining whether each pixel is a field-effect point;
deinterlacing a pixel when it is judged to be a field-effect point;
according to the number of field-effect points detected in each frame, determining the frame to be an obvious-field image frame if the count exceeds a preset obvious-field threshold;
according to the number of obvious-field image frames detected, determining the pending video to be field video if that number exceeds a preset frame-count threshold.
302. Determine the video frames in the field video that require deinterlacing.
In a concrete implementation, for example, according to the number of field-effect points detected in each frame, a frame is determined to require deinterlacing if its count exceeds a preset single-frame processing threshold.
303. Deinterlace all pixels in each video frame that requires it.
For example, the YADIF algorithm is applied to all pixels of each video frame that requires deinterlacing.
The technical solution of the invention is described in detail below through a concrete implementation.
First, all field-effect points within each video frame are detected. Allocate an array mask[height][width] of the same size as the image, cleared to 0 before each frame is processed; if p[y][x] is a field-effect point, set mask[y][x-1], mask[y][x], and mask[y][x+1] to 1.
Then, from the mask array, count the field-effect points of each frame and record the count as comb_cc. Let the number of obvious-field image frames be comb_fn and the obvious-field threshold be abs_comb_thd, where abs_comb_thd = width * 8. If a frame's count comb_cc satisfies formula 1, the frame is an obvious-field image and comb_fn is incremented by 1.
comb_cc > abs_comb_thd    (formula 1)
To improve processing speed and precision, field-video determination is performed first: decide whether the current pending video is field video, and if it is not, skip deinterlacing entirely, saving time while guaranteeing quality.
Set a detection period in frames: unit = fps * 60 * 2, where fps is the frame rate of the pending video, and let total_fn be the total number of frames of the pending video. If the detected count of obvious-field image frames comb_fn satisfies formula 2, the pending video is judged to be field video and subsequent deinterlacing proceeds; otherwise the current sequence is a frame sequence and deinterlacing is skipped.
comb_fn > total_fn / unit    (formula 2)
Further, if the pending video is field video, it is deinterlaced frame by frame as follows. First set the single-frame processing threshold frame_comb_thd = 272; then, for each video frame, use its detected field-effect point count comb_cc to decide whether the frame needs deinterlacing. If comb_cc satisfies formula 3, the frame requires deinterlacing and all of its pixels are deinterlaced using the YADIF (Yet Another DeInterlacing Filter) algorithm; otherwise the frame is skipped.
comb_cc > frame_comb_thd    (formula 3)
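Formulas 1 to 3 combine into a small classification routine. A minimal sketch under the stated thresholds (abs_comb_thd = width * 8, frame_comb_thd = 272, unit = fps * 60 * 2); the function name and the input representation (a list of per-frame comb_cc counts) are illustrative assumptions, not from the patent.

```python
def classify(frames_comb_cc, width, fps, total_fn):
    """frames_comb_cc: per-frame field-effect point counts (comb_cc).
    Returns (is_field_video, indices of frames that need deinterlacing)."""
    abs_comb_thd = width * 8      # obvious-field frame threshold (formula 1)
    frame_comb_thd = 272          # single-frame processing threshold (formula 3)
    unit = fps * 60 * 2           # detection period, in frames

    # formula 1: count obvious-field image frames
    comb_fn = sum(1 for cc in frames_comb_cc if cc > abs_comb_thd)
    # formula 2: video-level decision
    is_field_video = comb_fn > total_fn / unit
    # formula 3: per-frame decision
    frames_to_process = [i for i, cc in enumerate(frames_comb_cc)
                         if cc > frame_comb_thd]
    return is_field_video, frames_to_process
```

If the video-level test fails, a caller would skip deinterlacing altogether, matching the "judge first, then process" order described above.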
By deinterlacing locally first, the invention improves processing speed; the subsequent complete deinterlacing pass then improves processing quality.
Embodiment 3
Fig. 4 is the schematic structural diagram of the device of Embodiment 3. With reference to Fig. 4, a deinterlacing device for interlaced video according to this embodiment mainly comprises:
a detection module 41, configured to detect that the pending video is field video;
a determination module 42, configured to determine the video frames in the field video that require deinterlacing;
a processing module 43, configured to deinterlace all pixels in each video frame that requires it.
Wherein:
the detection module 41 is further configured to examine the pixels in each video frame of the pending video one by one and determine whether each pixel is a field-effect point;
the processing module 43 is further configured to deinterlace a field-effect point when the pixel is judged to be one.
The detection module 41 is specifically configured to:
according to the number of field-effect points detected in each frame, determine the frame to be an obvious-field image frame if the count exceeds the preset obvious-field threshold;
according to the number of obvious-field image frames detected, determine the pending video to be field video if that number exceeds the preset frame-count threshold.
The determination module 42 is specifically configured to:
according to the number of field-effect points detected in each frame, determine the frame to require deinterlacing if the count exceeds the preset single-frame processing threshold.
The processing module 43 is configured to:
apply the YADIF algorithm to all pixels of each video frame that requires deinterlacing.
The device of Fig. 4 can carry out the methods of the embodiments of Fig. 1 or Fig. 3; the implementation principle and technical effects are not repeated here.
Embodiment 4
Fig. 5 is the schematic structural diagram of the equipment of Embodiment 4. With reference to Fig. 5, deinterlacing equipment for interlaced video according to this embodiment mainly comprises a memory 501 and a processor 502, wherein:
the memory 501 is configured to store one or more instructions for the processor 502 to invoke and execute;
the processor 502 is configured to detect that the pending video is field video;
to determine the video frames in the field video that require deinterlacing;
and to deinterlace all pixels in each video frame that requires it.
The processor 502 is further configured to examine the pixels in each video frame of the pending video one by one and determine whether each pixel is a field-effect point;
and to deinterlace a field-effect point when the pixel is judged to be one.
The processor 502 is further configured to: according to the number of field-effect points detected in each frame, determine the frame to be an obvious-field image frame if the count exceeds the preset obvious-field threshold;
and, according to the number of obvious-field image frames detected, determine the pending video to be field video if that number exceeds the preset frame-count threshold.
The processor 502 is further configured to: according to the number of field-effect points detected in each frame, determine the frame to require deinterlacing if the count exceeds the preset single-frame processing threshold.
The processor 502 is further configured to apply the YADIF algorithm to all pixels of each video frame that requires deinterlacing.
The technical solution of this equipment and the functional features and connections of its modules correspond to the features and solutions described in the embodiments of Figs. 1 to 5; for anything not covered here, refer to those embodiments.
The device embodiments described above are merely illustrative: units described as separate components may or may not be physically separate, and components shown as units may or may not be physical units, i.e. they may be located in one place or distributed over multiple network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the embodiment's solution, which a person of ordinary skill in the art can understand and implement without creative effort.
From the description of the implementations above, those skilled in the art will clearly understand that the implementations can be realized by software plus the necessary general-purpose hardware platform, or of course by hardware. On this understanding, the essence of the technical solution, or the part contributing to the prior art, can be embodied as a software product stored on a computer-readable storage medium such as ROM/RAM, a magnetic disk, or an optical disc, including a number of instructions that cause a computer device (a personal computer, a server, a network device, etc.) to execute the methods described in the embodiments or in parts thereof.
Finally, it should be noted that the above embodiments merely illustrate, rather than limit, the technical solutions of the invention. Although the invention has been described in detail with reference to the foregoing embodiments, a person of ordinary skill in the art should understand that the technical solutions recorded in those embodiments may still be modified, or some of their technical features replaced by equivalents, without such modifications or replacements departing in essence from the spirit and scope of the technical solutions of the embodiments of the invention.

Claims (14)

  1. A deinterlacing method for interlaced video, characterized by comprising:
    detecting that the pending video is field video;
    determining the video frames in the field video that require deinterlacing;
    performing deinterlacing on all pixels in each video frame that requires it.
  2. The method of claim 1, characterized in that, before detecting that the pending video is field video, it comprises:
    examining the pixels in each video frame of the pending video one by one and determining whether each pixel is a field-effect point;
    deinterlacing a field-effect point when the pixel is judged to be one.
  3. The method of claim 1 or 2, characterized in that detecting that the pending video is field video comprises:
    according to the number of field-effect points detected in each video frame, determining the frame to be an obvious-field image frame if the count exceeds a preset obvious-field threshold;
    according to the number of obvious-field image frames detected, determining the pending video to be field video if that number exceeds a preset frame-count threshold.
  4. The method of claim 1 or 2, characterized in that determining the video frames in the field video that require deinterlacing comprises:
    according to the number of field-effect points detected in each video frame, determining the frame to require deinterlacing if the count exceeds a preset single-frame processing threshold.
  5. The method of claim 1, characterized in that performing deinterlacing on all pixels in each video frame that requires it comprises:
    applying the YADIF algorithm to all pixels of each video frame that requires deinterlacing.
  6. The method of claim 2, characterized in that examining the pixels in each video frame one by one and determining whether each pixel is a field-effect point comprises:
    obtaining the first pixel difference between the pixel and the pixel at the same position in the adjacent row, and the second pixel difference between the pixel and the pixel at the same position in the row two lines away;
    determining, according to a preset similarity threshold and a preset difference threshold, whether the pixel is the field-effect point.
  7. The method of claim 6, characterized in that the pixel is judged to be the field-effect point when the first pixel difference and the second pixel difference satisfy:
    d1 > diff_thd && d2 < simi_thd
    where d1 is the first pixel difference, d2 is the second pixel difference, simi_thd is the similarity threshold, diff_thd is the difference threshold, && denotes the logical AND operation, and both thresholds are empirical values.
  8. A deinterlacing device for interlaced video, characterized by comprising:
    a detection module, configured to detect that the pending video is field video;
    a determination module, configured to determine the video frames in the field video that require deinterlacing;
    a processing module, configured to deinterlace all pixels in each video frame that requires it.
  9. The device of claim 8, characterized in that:
    the detection module is further configured to examine the pixels in each video frame of the pending video one by one and determine whether each pixel is a field-effect point;
    the processing module is further configured to deinterlace a field-effect point when the pixel is judged to be one.
  10. The device of claim 8 or 9, characterized in that the detection module is specifically configured to:
    according to the number of field-effect points detected in each video frame, determine the frame to be an obvious-field image frame if the count exceeds a preset obvious-field threshold;
    according to the number of obvious-field image frames detected, determine the pending video to be field video if that number exceeds a preset frame-count threshold.
  11. The device of claim 8 or 9, characterized in that the determination module is specifically configured to:
    according to the number of field-effect points detected in each video frame, determine the frame to require deinterlacing if the count exceeds a preset single-frame processing threshold.
  12. The device of claim 8, characterized in that the processing module is configured to:
    apply the YADIF algorithm to all pixels of each video frame that requires deinterlacing.
  13. The device of claim 9, characterized in that the detection module is further configured to:
    obtain the first pixel difference between the pixel and the pixel at the same position in the adjacent row, and the second pixel difference between the pixel and the pixel at the same position in the row two lines away;
    determine, according to a preset similarity threshold and a preset difference threshold, whether the pixel is the field-effect point.
  14. The device of claim 13, characterized in that the detection module is specifically configured to:
    judge the pixel to be the field-effect point when the first pixel difference and the second pixel difference satisfy:
    d1 > diff_thd && d2 < simi_thd
    where d1 is the first pixel difference, d2 is the second pixel difference, simi_thd is the similarity threshold, diff_thd is the difference threshold, && denotes the logical AND operation, and both thresholds are empirical values.
PCT/CN2016/088690 2015-12-14 2016-07-05 Deinterlacing method and device for interlaced video WO2017101348A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US15/247,660 US20170171501A1 (en) 2015-12-14 2016-08-25 Deinterlacing method for interlaced video and electronic apparatus

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201510927358.3A CN105898179A (zh) 2015-12-14 2015-12-14 Deinterlacing method and device for interlaced video
CN201510927358.3 2015-12-14

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US15/247,660 Continuation US20170171501A1 (en) 2015-12-14 2016-08-25 Deinterlacing method for interlaced video and electronic apparatus

Publications (1)

Publication Number Publication Date
WO2017101348A1 (zh)

Family

ID=57002998

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2016/088690 WO2017101348A1 (zh) 2015-12-14 2016-07-05 一种隔行视频的去隔行方法及装置

Country Status (2)

Country Link
CN (1) CN105898179A (zh)
WO (1) WO2017101348A1 (zh)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108495073B (zh) * 2018-03-29 2020-11-06 瑞芯微电子股份有限公司 Video image frame/field detection method, storage medium, and computer
CN111476803B (zh) * 2020-04-14 2022-08-26 展讯通信(上海)有限公司 Video processing method and related equipment

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1984305A (zh) * 2005-12-12 2007-06-20 深圳艾科创新微电子有限公司 Three-dimensional adaptive motion detection method for deinterlacing
CN101552867A (zh) * 2009-05-06 2009-10-07 凌阳科技股份有限公司 De-interlacing processing system
CN101699856A (zh) * 2009-10-30 2010-04-28 北京中科大洋科技发展股份有限公司 Motion-adaptive de-interlacing method
US20100177241A1 (en) * 2009-01-12 2010-07-15 Himax Technologies Limited Apparatus and method for motion adaptive de-interlacing with chroma up-sampling error remover
CN102186045A (zh) * 2011-04-20 2011-09-14 广东威创视讯科技股份有限公司 Three-field motion detection method and device for deinterlacing, and deinterlacing system
CN104580978A (zh) * 2015-02-11 2015-04-29 北京海尔集成电路设计有限公司 Video detection and processing method and device

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7116372B2 (en) * 2000-10-20 2006-10-03 Matsushita Electric Industrial Co., Ltd. Method and apparatus for deinterlacing
CN101106685B (zh) * 2007-08-31 2010-06-02 湖北科创高新网络视频股份有限公司 Deinterlacing method and device based on motion detection
CN101588444B (zh) * 2008-05-20 2011-07-20 华为技术有限公司 Deinterlacing method and device for video data, and video processing system
CN101309385B (zh) * 2008-07-09 2010-09-08 北京航空航天大学 Deinterlacing processing method based on motion detection
US8902358B1 (en) * 2012-07-26 2014-12-02 Altera Corporation Method and apparatus for performing robust cadence detection in a video deinterlacer
CN104202555B (zh) * 2014-09-29 2017-10-20 建荣集成电路科技(珠海)有限公司 Deinterlacing method and device

Also Published As

Publication number Publication date
CN105898179A (zh) 2016-08-24

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16874410

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16874410

Country of ref document: EP

Kind code of ref document: A1