CN112261406B - Filtering-based ultra-high definition video color bar anomaly real-time detection method - Google Patents

Filtering-based ultra-high definition video color bar anomaly real-time detection method

Info

Publication number
CN112261406B
CN112261406B (application CN202010989802.5A; published as CN112261406A)
Authority
CN
China
Prior art keywords
color bar
candidate
color
video image
area
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010989802.5A
Other languages
Chinese (zh)
Other versions
CN112261406A (en)
Inventor
崔进
孙剑
赵松
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
China Television Information Technology Beijing Co ltd
Original Assignee
China Television Information Technology Beijing Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by China Television Information Technology Beijing Co ltd filed Critical China Television Information Technology Beijing Co ltd
Priority to CN202010989802.5A priority Critical patent/CN112261406B/en
Publication of CN112261406A publication Critical patent/CN112261406A/en
Application granted granted Critical
Publication of CN112261406B publication Critical patent/CN112261406B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N17/00Diagnosis, testing or measuring for television systems or their details
    • H04N17/02Diagnosis, testing or measuring for television systems or their details for colour television signals
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/442Monitoring of processes or resources, e.g. detecting the failure of a recording device, monitoring the downstream bandwidth, the number of times a movie has been viewed, the storage space available from the internal hard disk
    • H04N21/44204Monitoring of content usage, e.g. the number of times a movie has been viewed, copied or the amount which has been watched

Abstract

The invention provides a filtering-based method for real-time detection of color bar anomalies in ultra-high definition video, comprising the following steps: scanning the video image frame once every preset n1 rows from top to bottom and identifying candidate color bar areas; identifying a color bar coarse detection area; finely extracting the color bar area to obtain a color bar identification frame area; calculating the color bar confidence within the color bar identification frame area; judging, according to the color bar confidence, whether the currently detected video image frame is a color bar abnormal frame; and making a final judgment based on multi-frame color bar association. The invention has the following advantages: (1) detection of standard color bars, regional color bars, vertically staggered color bars, and four- or eight-bar patterns is supported, giving a wide application range and improving the comprehensiveness and accuracy of color bar detection; (2) interference resistance is strong, improving detection precision; (3) color bar anomaly detection does not require scanning every pixel of the whole video image frame, which improves the real-time performance of the detection.

Description

Filtering-based ultra-high definition video color bar anomaly real-time detection method
Technical Field
The invention belongs to the technical field of color bar anomaly real-time detection, and particularly relates to a filtering-based ultra-high definition video color bar anomaly real-time detection method.
Background
With the development of digital television and the growing number of channels, monitoring broadcast video in real time for color bar anomalies and raising a timely alarm when an anomaly appears has become an important task in guaranteeing the safe broadcast of television programs.
Color bars mainly arise from frozen (stuck) frames in material created in the production domain, or from a camera outputting color bars while material is being shot. Existing video color bar anomaly detection techniques generally suffer from limited interference resistance, a tendency toward false detections, and limited real-time performance, which constrains the practical effectiveness of color bar anomaly detection.
Disclosure of Invention
Aiming at the defects existing in the prior art, the invention provides a filtering-based ultra-high definition video color bar anomaly real-time detection method which can effectively solve the problems.
The technical scheme adopted by the invention is as follows:
the invention provides a filtering-based ultra-high definition video color bar anomaly real-time detection method, which comprises the following steps of:
step 1, a first video image frame A1 to be detected is obtained, filtering processing is performed on the first video image frame A1, and whether the first video image frame A1 is a color bar abnormal frame is identified, which specifically includes the following steps:
step 101, starting from the top line of the image, scanning the first video image frame A1 once every n1 preset lines from top to bottom, and performing the following filtering processing on each scanning line every time a scanning line is scanned:
each scanning line consists of a plurality of pixel points from left to right; identifying a pixel value for each pixel point; presetting a first pixel number threshold value confthresh_1 and a second pixel number threshold value confthresh_2; according to the left-to-right direction, when a plurality of continuous pixel points with the same pixel value are identified, the number of the continuous pixel points with the same pixel value identified at this time is set as confThresh, and whether the following conditions are met or not is judged: confthresh_1< confThresh < confthresh_2; if not, continuing to identify other pixel points to the right; if the pixel values are satisfied, the region formed by the continuous pixel points with the same pixel values identified at this time is called a candidate small region; then, identifying other pixel points to the right until all pixel points of the scanning line are identified; thus, for one of the scan lines, a number of candidate small regions are identified;
step 102, for a plurality of identified candidate small areas belonging to the same scan line, assuming that there are n candidate small areas, the n candidate small areas are expressed as: the 1 st candidate small region, the 2 nd candidate small region, …, the n-th candidate small region, performs the following processing:
step 1021, identifying that the pixel value of the 1 st candidate small area is V1, the pixel value of the 2 nd candidate small area is V2, …, and the pixel value of the n th candidate small area is Vn;
judging whether the pixel values V1, V2, …, Vn conform to a jump rule; if so, the area formed by the n candidate small areas is called an initial color bar area, and step 1022 is then performed;
wherein, the jump rule is: pixel value V1 transitions up to pixel value V2, pixel value V2 transitions down to pixel value V3, pixel value V3 transitions up to pixel value V4, and so on; and the absolute value of the difference between any two adjacent pixel values is larger than a jump threshold pixThresh;
step 1022, assuming that the identified initial color stripe region is composed of n2 candidate small regions, determining whether the n2 candidate small regions meet the following requirements:
for n2 candidate small areas, the distance between any two adjacent candidate small areas is smaller than the preset distance disThresh of the candidate small areas, and n2 is larger than 4;
if the color bar area meets the requirement, the initial color bar area is a candidate color bar area; so far, the filtering processing of the scanned scanning line is completed;
step 103, sequentially performing filtering treatment on each scanned scanning line according to the direction from top to bottom, and simultaneously judging whether the following requirements are met in real time:
presetting the number of consecutive candidate color bar areas as n3; when n3 candidate color bar areas appear consecutively from top to bottom and satisfy the following requirement: the distance between the left boundaries of any two adjacent candidate color bar areas is smaller than a set threshold, and the distance between their right boundaries is also smaller than the set threshold, then stop scanning further lines downward and execute step 104;
step 104, identifying a color bar rough detection area, wherein the method comprises the following steps:
determining circumscribed rectangles of n3 candidate color bar areas; then, keeping the upper boundary L1 of the circumscribed rectangle unchanged; extending the left boundary of the circumscribed rectangle downwards until the left boundary reaches the bottommost boundary of the first video image frame A1, thereby obtaining a left boundary extension edge L2; likewise, the right boundary is extended downward until the right boundary reaches the bottommost boundary of the first video image frame A1, thereby obtaining a right boundary extension side L3; connecting the bottom corner point of the left boundary extension side L2 with the bottom corner point of the right boundary extension side L3 to obtain a lower boundary L4; a rectangle surrounded by the lower boundary L4, the right boundary extension edge L3, the upper boundary L1 and the left boundary extension edge L2 is a color bar coarse detection area;
step 105, in the color bar rough detection area, n3 candidate color bar areas are identified; in the color bar rough detection area, and below n3 candidate color bar areas, continuously scanning n1 lines at intervals, and filtering the scanned scanning lines in real time, thereby identifying the candidate color bar areas of the scanning lines, and simultaneously, in the process of scanning and filtering the n1 lines at intervals, judging whether the following requirements are met in real time:
in the color bar rough detection area, the number of continuous candidate color bar areas exceeds a threshold value dthresh from top to bottom;
if the requirement is not met, continue scanning and filtering downward at intervals of n1 lines; if the requirement is met, stop scanning and filtering downward, let the line currently being scanned and filtered be line C1, and then execute step 106;
step 106, in the color bar rough detection area, the area above the C1 row is called a color bar identification frame area, thereby finishing the fine extraction of the color bar area;
step 107, scanning the total number of lines to be N1 in the color bar identification frame area; if the number of lines of the candidate color bar area is identified as N2, the color bar confidence e is calculated by adopting the following formula:
e=N2/N1
step 108, if the color bar confidence e is greater than the color bar confidence threshold barThresh, indicating that the currently detected first video image frame A1 is a color bar abnormal frame, wherein the currently detected first video image frame A1 includes a color bar; otherwise, the first video image frame A1 which is detected currently does not contain color bars;
step 2, a multi-frame-based color bar association judging method specifically comprises the following steps:
reading a next adjacent second video image frame A2; aligning the second video image frame A2 with the first video image frame A1 so as to locate the color bar identification frame region B2 at the same position of the second video image frame A2 in accordance with the color bar identification frame region B1 located in the first video image frame A1;
in the color bar identification frame area B2, performing n 1-line filtering processing, finally calculating color bar confidence coefficient e, and judging whether the second video image frame A2 is a color bar abnormal frame or not according to the calculated color bar confidence coefficient e;
using the same method, with the preset number of consecutive frames set to N3, color bar anomaly identification is carried out in turn on the remaining N3-1 video image frames following the first video image frame A1; when all N3 consecutive video image frames are color bar abnormal frames, the N3 consecutive frames are finally determined to be color bar abnormal frames and an alarm prompt is issued.
The ultra-high definition video color bar anomaly real-time detection method based on filtering provided by the invention has the following advantages:
(1) The method supports not only the detection of 100% and 75% standard color bars, but also regional color bars, SMPTE color bars, vertically staggered color bars, and four- and eight-bar patterns, giving a wide application range and improving the comprehensiveness and accuracy of color bar detection;
(2) interference resistance is strong, improving detection precision;
(3) color bar anomaly detection does not require scanning every pixel of the whole video image frame, which improves the real-time performance of the detection.
Drawings
FIG. 1 is a flow chart of a method for detecting abnormal color bars of ultra-high definition video in real time based on filtering;
FIG. 2 is a schematic diagram of candidate small regions and initial color bar regions;
FIG. 3 is a schematic diagram of a color bar coarse detection area;
FIG. 4 is a schematic view of a color bar identification frame area;
fig. 5 is a specific application scene diagram of the color bar identification frame area display.
Detailed Description
In order to make the technical problems, technical schemes and beneficial effects solved by the invention more clear, the invention is further described in detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the scope of the invention.
The invention provides a filtering-based ultra-high definition video color bar anomaly real-time detection method, which has the following advantages:
(1) The method supports not only the detection of 100% and 75% standard color bars, but also regional color bars, SMPTE color bars, vertically staggered color bars, and four- and eight-bar patterns, giving a wide application range and improving the comprehensiveness and accuracy of color bar detection;
(2) Interference resistance is strong, and detection precision is improved;
The invention can detect various standard and non-standard color bars containing a center band, text, images and the like, solving the technical problem that existing color bar detection techniques can only detect a single standard color bar pattern; at the same time, the invention can exclude channel packaging content such as station logos, corner marks, clocks, and bottom scrolling captions, as well as scene interference that resembles color bars, improving the accuracy of color bar anomaly detection.
(3) Color bar anomaly detection does not require scanning every pixel of the whole video image frame, which improves the real-time performance of the detection.
The method provided by the invention is applied to color bar anomaly detection of high/standard definition and 4K ultra-high definition SDI signals, can rapidly locate fault points on a broadcasting link in real time, and provides a reliable basis for intelligent emergency switching.
The invention provides a filtering-based ultra-high definition video color bar anomaly real-time detection method, which mainly comprises the following steps:
Instead of judging color bars from single colors or from features such as standard color bar values, histograms, or template matching, the color bar detection method determines the region in which the abnormal color bar lies by filtering line by line, and then detects color bars by combining multiple features, such as the equal width of the 4 or 8 hue bands, the number of pixel-value jumps between bands being greater than 3, and the vertical distribution of the bands, thereby preventing erroneous judgments. After the video image frame is filtered by rows and color bars are identified, candidate color bar areas are obtained preliminarily; however, these candidate areas are often mixed with lines on the video image frame that have similar characteristics (such as color bars in special-effect scenes, or striped or latticed backgrounds). To effectively remove such noise scenes, multi-frame association is combined to improve the accuracy of color bar detection for the video image frame.
Referring to fig. 1, the method for detecting the color bar anomaly of the ultra-high definition video in real time based on filtering comprises the following steps:
step 1, a first video image frame A1 to be detected is obtained, filtering processing is performed on the first video image frame A1, and whether the first video image frame A1 is a color bar abnormal frame is identified, which specifically includes the following steps:
step 101, starting from the topmost line of the image, scanning the first video image frame A1 once every other preset n1 lines in the top-down direction; the specific number of n1 rows may be flexibly set according to the actual detection requirement, and may be 2 rows, 4 rows, etc., which is not limited in the present invention.
Each time a scan line is scanned, the row height of each scan line is equal, and the following filtering process is performed on the scan line:
each scanning line consists of a plurality of pixel points from left to right; the pixel value of each pixel point is identified; a first pixel number threshold confthresh_1 and a second pixel number threshold confthresh_2 are preset; the first pixel number threshold confthresh_1 is the minimum width of a color band appearing in the video image, and the second pixel number threshold confthresh_2 is the maximum width of a color band appearing in the video image. Proceeding from left to right, whenever a run of consecutive pixel points with the same pixel value is identified, the number of consecutive pixel points in that run is denoted confThresh, and it is judged whether the following condition is satisfied: confthresh_1 < confThresh < confthresh_2; as a specific implementation, assume that the width of the scan line (i.e., the image width) is width; then confthresh_1 may be set to width/8 and confthresh_2 may be set to width/4;
if the condition is not satisfied, continue identifying further pixel points to the right; if it is satisfied, the area formed by the consecutive pixel points with the same pixel value identified this time is called a candidate small area, and this candidate small area is a uniform color band; then continue identifying further pixel points to the right until all pixel points of the scan line have been processed; in this way, for one scan line, a number of candidate small areas are identified;
referring to fig. 2, h1 represents the 1 st scan line; h2 represents the 2 nd scan line; h3 represents the 3 rd scan line; when the 1 st scanning line is scanned and filtered, 5 uniform color bands are identified, namely a color band f1, a color band f2, a color band f3, a color band f4 and a color band k1; that is, the pixel values of the pixel points in each color band are the same. The widths of the color bars f1, f2, f3 and f4 meet the requirements; while the width of the color bar k1 does not meet the requirements. Therefore, the bands f1, f2, f3, f4 are candidate small areas, and the band k1 is not a candidate small area. Therefore, the color bar f1 is the candidate small region f1, the color bar f2 is the candidate small region f2, the color bar f3 is the candidate small region f3, and the color bar f4 is the candidate small region f4.
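To make step 101 concrete, the run-length scan over a single line can be sketched as follows. This is an illustrative Python sketch only, not code from the patent: it assumes a single-channel (e.g., luminance) scan line, exact pixel equality within a band, and the width/8 and width/4 defaults mentioned above; the function and parameter names are hypothetical.

```python
import numpy as np

def find_candidate_small_regions(scan_line, conf_thresh_1=None, conf_thresh_2=None):
    """Return (start_col, end_col, value) for each run of equal pixels whose
    length lies strictly between conf_thresh_1 and conf_thresh_2 (step 101)."""
    width = len(scan_line)
    if conf_thresh_1 is None:
        conf_thresh_1 = width // 8   # minimum expected color band width
    if conf_thresh_2 is None:
        conf_thresh_2 = width // 4   # maximum expected color band width

    regions = []
    run_start = 0
    for i in range(1, width + 1):
        # Close the current run when the pixel value changes or the line ends.
        if i == width or scan_line[i] != scan_line[run_start]:
            run_len = i - run_start
            if conf_thresh_1 < run_len < conf_thresh_2:
                regions.append((run_start, i - 1, int(scan_line[run_start])))
            run_start = i
    return regions

# Example: a synthetic 1920-pixel line with eight equal-width uniform bands.
# The thresholds are loosened slightly here because bands of exactly width/8
# would fail the strict inequality written in the text above.
line = np.repeat(np.array([235, 180, 160, 130, 110, 85, 70, 16]), 240)
print(find_candidate_small_regions(line, conf_thresh_1=200, conf_thresh_2=300))
```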
Step 102, for a plurality of identified candidate small areas belonging to the same scan line, assuming that there are n candidate small areas, the n candidate small areas are expressed as: the 1 st candidate small region, the 2 nd candidate small region, …, the n-th candidate small region, performs the following processing:
step 1021, identifying that the pixel value of the 1 st candidate small area is V1, the pixel value of the 2 nd candidate small area is V2, …, and the pixel value of the n th candidate small area is Vn;
judging whether the pixel values V1, V2, …, Vn conform to a jump rule; if so, the area formed by the n candidate small areas is called an initial color bar area, and step 1022 is then performed;
wherein, the jump rule is: pixel value V1 transitions up to pixel value V2, pixel value V2 transitions down to pixel value V3, pixel value V3 transitions up to pixel value V4, and so on; and the absolute value of the difference between any two adjacent pixel values is larger than a jump threshold pixThresh;
still taking fig. 2 as an example, when scanning and filtering the 1 st scan line, there are 4 candidate small areas, in order from left to right: candidate small region f1, candidate small region f2, candidate small region f3, candidate small region f4.
In this step, if the pixel values of the candidate small region f1, the candidate small region f2, the candidate small region f3, and the candidate small region f4 are respectively: 0. 100, 50, 100, the 4 candidate small areas satisfy the jump rule, so the area formed by the 4 candidate small areas is the initial color stripe area, namely, the rectangular area surrounded by the vertexes r1, r2, r4 and r3 in fig. 2.
That is, for a television color bar, it is composed of a plurality of color bands; the above jump rule is satisfied between the pixel values of adjacent color bands, and the number of color bands is within a specific range. For non-television color bars, the pixel values of adjacent color bars generally do not meet the above jump rule, or the number of color bars does not meet the above requirement.
By the method, the color bar area can be primarily identified.
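The jump rule of step 1021 can be illustrated with a short sketch. This is an interpretation rather than the patent's code: the text states that V1 jumps up to V2 and V2 jumps down to V3, and the sketch below assumes any strictly alternating sequence of jumps qualifies; the pix_thresh default of 30 stands in for the jump threshold pixThresh and is an assumed value.

```python
def satisfies_jump_rule(region_values, pix_thresh=30):
    """Step 1021: adjacent candidate small regions must alternate up/down jumps,
    and every jump must exceed the jump threshold pixThresh in magnitude."""
    if len(region_values) < 2:
        return False
    expect_up = region_values[1] > region_values[0]   # direction of the first jump
    for prev, curr in zip(region_values, region_values[1:]):
        diff = curr - prev
        if abs(diff) <= pix_thresh:       # jump too small
            return False
        if (diff > 0) != expect_up:       # jumps must strictly alternate
            return False
        expect_up = not expect_up
    return True

print(satisfies_jump_rule([0, 100, 50, 100]))    # True  (the example from fig. 2)
print(satisfies_jump_rule([0, 100, 150, 100]))   # False (second jump does not reverse)
```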
Step 1022, assuming that the identified initial color stripe region is composed of n2 candidate small regions, determining whether the n2 candidate small regions meet the following requirements:
for n2 candidate small areas, the distance between any two adjacent candidate small areas is smaller than the preset distance disThresh of the candidate small areas, and n2 is larger than 4;
if the color bar area meets the requirement, the initial color bar area is a candidate color bar area; so far, the filtering processing of the scanned scanning line is completed;
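Step 1022 then filters the initial color bar area by the spacing and number of its candidate small regions. A minimal sketch follows, reusing the (start, end, value) tuples from the earlier sketch; the default disThresh value of 5 pixels is an assumption.

```python
def is_candidate_color_bar_area(regions, dis_thresh=5, min_regions=4):
    """Step 1022: the gap between any two adjacent candidate small regions must be
    smaller than dis_thresh, and there must be more than min_regions regions."""
    if len(regions) <= min_regions:
        return False
    for (_, end_left, _), (start_right, _, _) in zip(regions, regions[1:]):
        if start_right - end_left - 1 >= dis_thresh:   # gap between adjacent regions
            return False
    return True
```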
step 103, sequentially performing filtering treatment on each scanned scanning line according to the direction from top to bottom, and simultaneously judging whether the following requirements are met in real time:
presetting the number of consecutive candidate color bar areas as n3; when n3 candidate color bar areas appear consecutively from top to bottom and satisfy the following requirement: the distance between the left boundaries of any two adjacent candidate color bar areas is smaller than a set threshold, and the distance between their right boundaries is also smaller than the set threshold, then stop scanning further lines downward and execute step 104;
Referring to fig. 3, assume that n3 is 20; when candidate color bar area ST20 is scanned and identified, the 20 candidate color bar areas ST1 to ST20 satisfy the above requirement, at which point downward scanning is not continued and step 104 is performed instead.
Step 104, identifying a color bar rough detection area, wherein the method comprises the following steps:
determining circumscribed rectangles of n3 candidate color bar areas; then, keeping the upper boundary L1 of the circumscribed rectangle unchanged; extending the left boundary of the circumscribed rectangle downwards until the left boundary reaches the bottommost boundary of the first video image frame A1, thereby obtaining a left boundary extension edge L2; likewise, the right boundary is extended downward until the right boundary reaches the bottommost boundary of the first video image frame A1, thereby obtaining a right boundary extension side L3; connecting the bottom corner point of the left boundary extension side L2 with the bottom corner point of the right boundary extension side L3 to obtain a lower boundary L4; a rectangle surrounded by the lower boundary L4, the right boundary extension edge L3, the upper boundary L1 and the left boundary extension edge L2 is a color bar coarse detection area;
referring to fig. 3, the rectangle enclosed by the vertices S1, S2, S3, and S4 is the color stripe rough detection area.
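Steps 103 and 104 can be sketched together: once n3 vertically consecutive candidate color bar areas have aligned left and right boundaries, their bounding rectangle is extended down to the bottom of the frame. Representing each area as a (row, left, right) tuple and the align_thresh default are assumptions made here for illustration.

```python
def boundaries_aligned(areas, align_thresh=10):
    """Step 103: adjacent candidate color bar areas (row, left, right), listed top to
    bottom, must have left boundaries and right boundaries within align_thresh."""
    return all(abs(a[1] - b[1]) < align_thresh and abs(a[2] - b[2]) < align_thresh
               for a, b in zip(areas, areas[1:]))

def coarse_detection_area(areas, frame_height):
    """Step 104: keep the upper boundary L1 of the bounding rectangle, extend the left
    and right boundaries (L2, L3) down to the frame bottom, which becomes L4."""
    top = min(a[0] for a in areas)        # upper boundary L1
    left = min(a[1] for a in areas)       # left boundary extension edge L2
    right = max(a[2] for a in areas)      # right boundary extension edge L3
    return top, frame_height - 1, left, right   # (top, bottom L4, left, right)
```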
Step 105, in the color bar rough detection area, n3 candidate color bar areas are identified; in the color bar rough detection area, and below n3 candidate color bar areas, continuously scanning n1 lines at intervals, and filtering the scanned scanning lines in real time, thereby identifying the candidate color bar areas of the scanning lines, and simultaneously, in the process of scanning and filtering the n1 lines at intervals, judging whether the following requirements are met in real time:
in the color bar rough detection area, the number of continuous candidate color bar areas exceeds a threshold value dthresh from top to bottom;
if the requirement is not met, continue scanning and filtering downward at intervals of n1 lines; if the requirement is met, stop scanning and filtering downward, let the line currently being scanned and filtered be line C1, and then execute step 106;
step 106, in the color bar rough detection area, the area above the C1 row is called a color bar identification frame area, thereby finishing the fine extraction of the color bar area;
referring to fig. 4, the rectangle enclosed by the vertices S1, S5, S6, and S4 is the color bar identification frame area.
Referring to fig. 5, a specific application scenario is illustrated. In fig. 5, the rectangle enclosed by the vertices T1, T2, T3, and T4 is the color bar identification frame area.
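For steps 105 and 106, the sketch below continues scanning every n1-th line inside the coarse detection area and stops once color bar lines are no longer found. The stop condition in the text ("the number of continuous candidate color bar areas exceeds the threshold dthresh") is read here as a run of scanned lines that are not candidate color bar areas marking the lower edge of the bars; this is one possible interpretation, not a statement of the patented method. The is_color_bar_line callback is assumed to chain the earlier sketches (run extraction, jump rule, step 1022), and the n1 and dthresh defaults are assumptions.

```python
def fine_extract_identification_frame(frame, coarse, is_color_bar_line, n1=4, dthresh=3):
    """Steps 105-106 (one possible reading): scan every n1-th line inside the coarse
    detection area; stop once more than dthresh consecutive scanned lines are no
    longer candidate color bar areas; the identification frame is everything above."""
    top, bottom, left, right = coarse
    run_without_bars = 0
    last_bar_row = top                      # will play the role of the C1 row
    for row in range(top, bottom + 1, n1):
        line = frame[row, left:right + 1]   # assumes a single-channel 2-D frame array
        if is_color_bar_line(line):
            run_without_bars = 0
            last_bar_row = row
        else:
            run_without_bars += 1
            if run_without_bars > dthresh:
                break
    return top, last_bar_row, left, right   # color bar identification frame area
```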
Step 107, scanning the total number of lines to be N1 in the color bar identification frame area; if the number of lines of the candidate color bar area is identified as N2, the color bar confidence e is calculated by adopting the following formula:
e=N2/N1
step 108, if the color bar confidence e is greater than the color bar confidence threshold barThresh, indicating that the currently detected first video image frame A1 is a color bar abnormal frame, wherein the currently detected first video image frame A1 includes a color bar; otherwise, the first video image frame A1 which is detected currently does not contain color bars;
Through the above steps, the coarse extraction of the color bar area removes image data without color bars from the video image and retains the image data containing color bars, which simplifies data processing, avoids unnecessary processing and computation, and improves data processing efficiency.
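Steps 107 and 108 reduce to a ratio and a threshold test; a minimal sketch is shown below, with the barThresh default of 0.8 an assumed value.

```python
def color_bar_confidence(total_scanned_lines, candidate_color_bar_lines):
    """Step 107: e = N2 / N1 within the color bar identification frame area."""
    if total_scanned_lines == 0:
        return 0.0
    return candidate_color_bar_lines / total_scanned_lines

def is_color_bar_abnormal_frame(e, bar_thresh=0.8):
    """Step 108: the frame is a color bar abnormal frame when e exceeds barThresh."""
    return e > bar_thresh

print(is_color_bar_abnormal_frame(color_bar_confidence(50, 46)))   # True: e = 0.92
```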
Step 2, a multi-frame-based color bar association judging method specifically comprises the following steps:
reading a next adjacent second video image frame A2; aligning the second video image frame A2 with the first video image frame A1 so as to locate the color bar identification frame region B2 at the same position of the second video image frame A2 in accordance with the color bar identification frame region B1 located in the first video image frame A1;
in the color bar identification frame area B2, performing n 1-line filtering processing, finally calculating color bar confidence coefficient e, and judging whether the second video image frame A2 is a color bar abnormal frame or not according to the calculated color bar confidence coefficient e;
using the same method, with the preset number of consecutive frames set to N3, color bar anomaly identification is carried out in turn on the remaining N3-1 video image frames following the first video image frame A1; when all N3 consecutive video image frames are color bar abnormal frames, the N3 consecutive frames are finally determined to be color bar abnormal frames and an alarm prompt is issued.
That is, successive video image frames are compared within the color bar identification frame region, and an alarm is raised only when the color bar persists over the preset number of consecutive frames.
Therefore, after fine extraction of the color bar area, a color bar identification frame area is obtained; the color bar confidence is calculated within this identification frame area over multiple frames, and when the confidence value remains above the color bar confidence threshold barThresh for consecutive frames, a color bar abnormal frame is determined. In this embodiment, the color bar abnormal state is determined by the color bar confidence, which not only allows the sensitivity of color bar detection to be adjusted, but also avoids the false detections that can arise when image-comparison approaches mistake an actually static frame for a non-static one.
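The multi-frame association of step 2 can be sketched as a simple run counter over consecutive frames. The detect_single_frame callback (which would run steps 101-108 on the first frame and reuse the located identification frame region on later frames) and the n3 default of 5 frames are assumptions made for illustration.

```python
def multi_frame_color_bar_alarm(frames, detect_single_frame, n3=5):
    """Step 2: raise the alarm only when n3 consecutive frames are all color bar
    abnormal frames, evaluated within the same color bar identification frame region."""
    consecutive = 0
    region = None
    for frame in frames:
        is_abnormal, region = detect_single_frame(frame, region)
        if is_abnormal:
            consecutive += 1
            if consecutive >= n3:
                return True          # N3 consecutive color bar abnormal frames: alarm
        else:
            consecutive, region = 0, None   # run broken: re-locate the region next time
    return False
```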
In summary, the filtering-based ultra-high definition video color bar anomaly real-time detection method provided by the invention first acquires a video image frame to be detected, such as a 4K ultra-high definition image; it then generates a color bar filter map row by row from top to bottom, and at the same time determines a color bar detection area in the filter map from the position information and continuity of the color bars found in each row, completing the coarse extraction of the color bar area; next, it continues generating the color bar filter map row by row from top to bottom within the coarsely extracted area, determines the color bar identification frame from the vertical distribution characteristics of the color bars, completes the fine extraction of the color bar area, and obtains the color bar confidence. Finally, image consistency is judged across multiple frames within the color bar identification frame according to the multi-frame association, and the color bar abnormal state is determined.
Compared with the prior art, the method improves color bar detection precision by generating a color bar filter map, performing coarse extraction and then fine extraction, and then confirming the color bar area through multi-frame association, thereby alleviating the technical problems that existing color bar detection techniques identify color bars with poor accuracy and are limited to recognizing full-screen 100% and 75% color bars.
As can be seen from the above description, in the present invention, the image to be detected may be acquired by a 4K ultra-high definition signal, and the image may be sent to a terminal device for identification, or sent to a server for identification, which is not particularly limited in the present invention.
The invention provides a filtering-based ultra-high definition video color bar anomaly real-time detection method, which has the following advantages:
1. The invention creatively uses a filtering method to detect color bars; no prior information such as standard color bar values needs to be known.
2. The invention creatively uses the method of the color stripe confidence to determine the confidence of the color stripe region.
3. The method creatively combines coarse extraction and fine extraction, improving the accuracy and efficiency of color bar detection and achieving real-time detection of 4K ultra-high definition video.
4. The invention provides a universal method for detecting the abnormal states of various standard and non-standard color bars. It can detect full-screen eight-bar patterns, and also supports the detection of non-standard abnormal color bars such as regional color bars, SMPTE color bars, and four-bar patterns.
5. The invention does not need to adjust the threshold value for different videos and has good adaptability.
6. The invention excludes channel packaging content such as station logos, corner marks, clocks, and bottom scrolling captions, as well as interference from special-effect scenes, preventing missed detections.
It will be appreciated by those skilled in the art that all or part of the methods of the above embodiments may be implemented by instructing the relevant hardware with a computer program, and that the program may be stored on a computer-readable storage medium; when executed, the program may include the steps of the method embodiments described above. The storage medium may be a magnetic disk, an optical disc, a Read-Only Memory (ROM), a Random Access Memory (RAM), or the like.
The foregoing is merely a preferred embodiment of the present invention; it should be noted that those skilled in the art may make modifications and adaptations without departing from the principles of the present invention, and such modifications and adaptations are also intended to fall within the scope of protection of the present invention.

Claims (1)

1. The method for detecting the color bar anomaly of the ultra-high definition video in real time based on filtering is characterized by comprising the following steps of:
step 1, a first video image frame A1 to be detected is obtained, filtering processing is performed on the first video image frame A1, and whether the first video image frame A1 is a color bar abnormal frame is identified, which specifically includes the following steps:
step 101, starting from the top line of the image, scanning the first video image frame A1 once every n1 preset lines from top to bottom, and performing the following filtering processing on each scanning line every time a scanning line is scanned:
each scanning line consists of a plurality of pixel points from left to right; identifying a pixel value for each pixel point; presetting a first pixel number threshold value confthresh_1 and a second pixel number threshold value confthresh_2; according to the left-to-right direction, when a plurality of continuous pixel points with the same pixel value are identified, the number of the continuous pixel points with the same pixel value identified at this time is set as confThresh, and whether the following conditions are met or not is judged: confthresh_1< confThresh < confthresh_2; if not, continuing to identify other pixel points to the right; if the pixel values are satisfied, the region formed by the continuous pixel points with the same pixel values identified at this time is called a candidate small region; then, identifying other pixel points to the right until all pixel points of the scanning line are identified; thus, for one of the scan lines, a number of candidate small regions are identified;
step 102, for a plurality of identified candidate small areas belonging to the same scan line, assuming that there are n candidate small areas, the n candidate small areas are expressed as: the 1 st candidate small region, the 2 nd candidate small region, …, the n-th candidate small region, performs the following processing:
step 1021, identifying that the pixel value of the 1 st candidate small area is V1, the pixel value of the 2 nd candidate small area is V2, …, and the pixel value of the n th candidate small area is Vn;
judging whether the pixel values V1, V2, …, Vn conform to a jump rule; if so, the area formed by the n candidate small areas is called an initial color bar area, and step 1022 is then performed;
wherein, the jump rule is: pixel value V1 transitions up to pixel value V2, pixel value V2 transitions down to pixel value V3, pixel value V3 transitions up to pixel value V4, and so on; and the absolute value of the difference between any two adjacent pixel values is larger than a jump threshold pixThresh;
step 1022, assuming that the identified initial color stripe region is composed of n2 candidate small regions, determining whether the n2 candidate small regions meet the following requirements:
for n2 candidate small areas, the distance between any two adjacent candidate small areas is smaller than the preset distance disThresh of the candidate small areas, and n2 is larger than 4;
if the color bar area meets the requirement, the initial color bar area is a candidate color bar area; so far, the filtering processing of the scanned scanning line is completed;
step 103, sequentially performing filtering treatment on each scanned scanning line according to the direction from top to bottom, and simultaneously judging whether the following requirements are met in real time:
presetting the number of consecutive candidate color bar areas as n3; when n3 candidate color bar areas appear consecutively from top to bottom and satisfy the following requirement: the distance between the left boundaries of any two adjacent candidate color bar areas is smaller than a set threshold, and the distance between their right boundaries is also smaller than the set threshold, then stop scanning further lines downward and execute step 104;
step 104, identifying a color bar rough detection area, wherein the method comprises the following steps:
determining circumscribed rectangles of n3 candidate color bar areas; then, keeping the upper boundary L1 of the circumscribed rectangle unchanged; extending the left boundary of the circumscribed rectangle downwards until the left boundary reaches the bottommost boundary of the first video image frame A1, thereby obtaining a left boundary extension edge L2; likewise, the right boundary is extended downward until the right boundary reaches the bottommost boundary of the first video image frame A1, thereby obtaining a right boundary extension side L3; connecting the bottom corner point of the left boundary extension side L2 with the bottom corner point of the right boundary extension side L3 to obtain a lower boundary L4; a rectangle surrounded by the lower boundary L4, the right boundary extension edge L3, the upper boundary L1 and the left boundary extension edge L2 is a color bar coarse detection area;
step 105, in the color bar rough detection area, n3 candidate color bar areas are identified; in the color bar rough detection area, and below n3 candidate color bar areas, continuously scanning n1 lines at intervals, and filtering the scanned scanning lines in real time, thereby identifying the candidate color bar areas of the scanning lines, and simultaneously, in the process of scanning and filtering the n1 lines at intervals, judging whether the following requirements are met in real time:
in the color bar rough detection area, the number of continuous candidate color bar areas exceeds a threshold value dthresh from top to bottom;
if the requirement is not met, continue scanning and filtering downward at intervals of n1 lines; if the requirement is met, stop scanning and filtering downward, let the line currently being scanned and filtered be line C1, and then execute step 106;
step 106, in the color bar rough detection area, the area above the C1 row is called a color bar identification frame area, thereby finishing the fine extraction of the color bar area;
step 107, scanning the total number of lines to be N1 in the color bar identification frame area; if the number of lines of the candidate color bar area is identified as N2, the color bar confidence e is calculated by adopting the following formula:
e=N2/N1
step 108, if the color bar confidence e is greater than the color bar confidence threshold barThresh, indicating that the currently detected first video image frame A1 is a color bar abnormal frame, wherein the currently detected first video image frame A1 includes a color bar; otherwise, the first video image frame A1 which is detected currently does not contain color bars;
step 2, a multi-frame-based color bar association judging method specifically comprises the following steps:
reading a next adjacent second video image frame A2; aligning the second video image frame A2 with the first video image frame A1 so as to locate the color bar identification frame region B2 at the same position of the second video image frame A2 in accordance with the color bar identification frame region B1 located in the first video image frame A1;
in the color bar identification frame area B2, performing n 1-line filtering processing, finally calculating color bar confidence coefficient e, and judging whether the second video image frame A2 is a color bar abnormal frame or not according to the calculated color bar confidence coefficient e;
using the same method, with the preset number of consecutive frames set to N3, color bar anomaly identification is carried out in turn on the remaining N3-1 video image frames following the first video image frame A1; when all N3 consecutive video image frames are color bar abnormal frames, the N3 consecutive frames are finally determined to be color bar abnormal frames and an alarm prompt is issued.
CN202010989802.5A 2020-09-18 2020-09-18 Filtering-based ultra-high definition video color bar anomaly real-time detection method Active CN112261406B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010989802.5A CN112261406B (en) 2020-09-18 2020-09-18 Filtering-based ultra-high definition video color bar anomaly real-time detection method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010989802.5A CN112261406B (en) 2020-09-18 2020-09-18 Filtering-based ultra-high definition video color bar anomaly real-time detection method

Publications (2)

Publication Number Publication Date
CN112261406A CN112261406A (en) 2021-01-22
CN112261406B true CN112261406B (en) 2023-08-22

Family

ID=74232318

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010989802.5A Active CN112261406B (en) 2020-09-18 2020-09-18 Filtering-based ultra-high definition video color bar anomaly real-time detection method

Country Status (1)

Country Link
CN (1) CN112261406B (en)


Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1452331A (en) * 2003-04-24 2003-10-29 北京永新同方信息工程有限公司 Digitized real time multi-channel video and audio differential mode detecting method
CN101815222A (en) * 2009-02-25 2010-08-25 北大方正集团有限公司 Video color bar detecting method and device
KR101617428B1 (en) * 2014-11-24 2016-05-12 대한민국(국가기록원) Method and apparatus for degraded region detection in digital video file

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
电影频道SDI信号检测方法最新探索和实现 [Latest exploration and implementation of SDI signal detection methods for the Movie Channel]; 李嘉林; 现代电视技术 (01); full text *

Also Published As

Publication number Publication date
CN112261406A (en) 2021-01-22

Similar Documents

Publication Publication Date Title
CN110136071B (en) Image processing method and device, electronic equipment and storage medium
US5546131A (en) Television receiver having an arrangement for vertically shifting subtitles
US8363132B2 (en) Apparatus for demosaicing colors and method thereof
US20030016864A1 (en) Methods of and system for detecting a cartoon in a video data stream
US9916645B2 (en) Chroma subsampling
US7271850B2 (en) Method and apparatus for cross color/cross luminance suppression
US8077774B1 (en) Automated monitoring of digital video image quality
US7822271B2 (en) Method and apparatus of false color suppression
US7280159B2 (en) Method and apparatus for cross color and/or cross luminance suppression
CN112261406B (en) Filtering-based ultra-high definition video color bar anomaly real-time detection method
US8891609B2 (en) System and method for measuring blockiness level in compressed digital video
US6031581A (en) System for removing color bleed in a television image adapted for digital printing
US6865337B1 (en) System and method for detecting modifications of video signals designed to prevent copying by traditional video tape recorders
CN107666560B (en) Video de-interlacing method and device
US20200396440A1 (en) Method for video quality detection and image processing circuit using the same
WO2020259041A1 (en) Image processing method and device
US20060218619A1 (en) Block artifacts detection
US8908093B2 (en) Determining aspect ratio for display of video
CN111414877B (en) Table cutting method for removing color frame, image processing apparatus and storage medium
CN114170153A (en) Wafer defect detection method and device, electronic equipment and storage medium
CN113313707A (en) Original image processing method, device, equipment and readable storage medium
US9883162B2 (en) Stereoscopic image inspection device, stereoscopic image processing device, and stereoscopic image inspection method
US7092575B2 (en) Moving image encoding apparatus and moving image encoding method
CN114666649B (en) Identification method and device of subtitle cut video, electronic equipment and storage medium
JP5067044B2 (en) Image processing apparatus and image processing method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant