CN112261406A - Ultrahigh-definition video color bar anomaly real-time detection method based on filtering

Ultrahigh-definition video color bar anomaly real-time detection method based on filtering

Info

Publication number
CN112261406A
Authority
CN
China
Prior art keywords
color bar
color
candidate
video image
area
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202010989802.5A
Other languages
Chinese (zh)
Other versions
CN112261406B (en)
Inventor
崔进
孙剑
赵松
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
China Television Information Technology Beijing Co ltd
Original Assignee
China Television Information Technology Beijing Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by China Television Information Technology Beijing Co ltd filed Critical China Television Information Technology Beijing Co ltd
Priority to CN202010989802.5A priority Critical patent/CN112261406B/en
Publication of CN112261406A publication Critical patent/CN112261406A/en
Application granted granted Critical
Publication of CN112261406B publication Critical patent/CN112261406B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N17/00 Diagnosis, testing or measuring for television systems or their details
    • H04N17/02 Diagnosis, testing or measuring for television systems or their details for colour television signals
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/442 Monitoring of processes or resources, e.g. detecting the failure of a recording device, monitoring the downstream bandwidth, the number of times a movie has been viewed, the storage space available from the internal hard disk
    • H04N21/44204 Monitoring of content usage, e.g. the number of times a movie has been viewed, copied or the amount which has been watched

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Databases & Information Systems (AREA)
  • Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • General Health & Medical Sciences (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Image Analysis (AREA)

Abstract

The invention provides a filtering-based ultrahigh-definition video color bar anomaly real-time detection method, which comprises the following steps: scanning the video image frame once every preset n1 lines in the top-down direction and identifying candidate color bar areas; identifying a color bar coarse detection area; performing fine extraction on the color bar area to obtain a color bar identification frame area; calculating the color bar confidence within the color bar identification frame area; judging, according to the color bar confidence, whether the currently detected video image frame is a color bar abnormal frame; and making a judgment based on multi-frame color bar association. The invention has the following advantages: (1) it supports the detection of standard color bars, regional color bars, vertically staggered color bars, and four- or eight-color bars, so the application range is wide and the comprehensiveness and accuracy of color bar detection are improved; (2) it is highly resistant to interference, which improves detection precision; (3) when detecting abnormal color bars, it does not need to scan every pixel of the whole video image frame, which improves the real-time performance of abnormal color bar detection.

Description

Ultrahigh-definition video color bar anomaly real-time detection method based on filtering
Technical Field
The invention belongs to the technical field of color bar abnormal state real-time detection, and particularly relates to a filtering-based ultrahigh-definition video color bar abnormal state real-time detection method.
Background
With the development of digital television, the number of channels keeps growing. Monitoring broadcast video pictures for abnormal color bars in real time, and raising a timely alarm when an abnormal color bar state is detected, has therefore become an important task for guaranteeing the safe broadcasting of television programs.
Color bars mainly arise from frames stuck during material production in the production domain, or from the color bar signal output by a camera during shooting. Existing video color bar abnormal state detection technologies generally suffer from limited resistance to interference, frequent false detections, and limited real-time performance, which restricts the practical effectiveness of color bar anomaly detection.
Disclosure of Invention
Aiming at the defects in the prior art, the invention provides a filtering-based ultrahigh-definition video color bar anomaly real-time detection method, which can effectively solve the problems.
The technical scheme adopted by the invention is as follows:
The invention provides a filtering-based ultrahigh-definition video color bar anomaly real-time detection method, which comprises the following steps:
step 1, acquiring a first video image frame a1 to be detected, filtering the first video image frame a1, and identifying whether the first video image frame a1 is a color bar abnormal frame, specifically comprising the following steps:
step 101, scanning the first video image frame a1 once every preset n1 lines, starting from the top line of the image and proceeding downwards, and performing the following filtering processing on each scanned line:
each scanning line consists of a plurality of pixel points from left to right; the pixel value of each pixel point is identified; a first pixel point quantity threshold confThresh_1 and a second pixel point quantity threshold confThresh_2 are preset; in the left-to-right direction, whenever a run of consecutive pixel points with the same pixel value is identified, the number of pixel points in that run is denoted confThresh, and it is judged whether the following condition is met: confThresh_1 < confThresh < confThresh_2; if not, identification continues rightwards over the remaining pixel points; if so, the area formed by this run of consecutive pixel points with the same pixel value is called a candidate small area; identification then continues rightwards until all the pixel points of the scanning line have been examined; thus, for one scanning line, several candidate small regions are identified;
step 102, assuming that n candidate small regions belonging to the same scan line have been identified, denoted the 1st candidate small region, the 2nd candidate small region, …, the nth candidate small region, performing the following processing:
step 1021, identifying the pixel value V1 of the 1st candidate small region, the pixel value V2 of the 2nd candidate small region, …, and the pixel value Vn of the nth candidate small region;
judging whether the pixel values V1, V2, …, Vn conform to the jump rule; if so, the area formed by the n candidate small areas is called an initial color bar area; then, go to step 1022;
wherein the jump rule is as follows: the pixel value rises from V1 to V2, falls from V2 to V3, rises from V3 to V4, and so on alternately; and the absolute value of the difference between any two adjacent pixel values is greater than the jump threshold pixThresh;
step 1022, assuming that the identified initial color bar area consists of n2 candidate small areas, determining whether the n2 candidate small areas meet the following requirements:
for the n2 candidate small regions, the distance between any two adjacent candidate small regions is less than the preset candidate small region spacing threshold disThresh, and n2 is greater than 4;
if these requirements are met, the initial color bar area is a candidate color bar area; at this point, the filtering processing of the scanned line is complete;
step 103, sequentially filtering each scanned line from top to bottom, while judging in real time whether the following requirement is met:
the continuous number of candidate color bar areas is preset as n3; if, going from top to bottom, n3 consecutive candidate color bar areas appear, and for each pair of vertically adjacent candidate color bar areas both the distance between their left boundaries and the distance between their right boundaries are smaller than the set threshold, then scanning further lines downwards is stopped and step 104 is executed;
step 104, identifying a color bar coarse detection area, wherein the method comprises the following steps:
determining the circumscribed rectangle of the n3 candidate color bar regions; then, the upper boundary L1 of the circumscribed rectangle is kept unchanged; the left boundary of the circumscribed rectangle is extended downward until it reaches the bottommost boundary of the first video image frame a1, giving the left boundary extension side L2; likewise, the right boundary is extended downward until it reaches the bottommost boundary of the first video image frame a1, giving the right boundary extension side L3; the bottom corner point of the left boundary extension side L2 is connected with the bottom corner point of the right boundary extension side L3 to obtain the lower boundary L4; the rectangle enclosed by the lower boundary L4, the right boundary extension side L3, the upper boundary L1 and the left boundary extension side L2 is the color bar coarse detection area;
step 105, in the color bar coarse detection area, n3 candidate color bar areas have already been identified; below these n3 candidate color bar areas, scanning continues at intervals of n1 lines within the coarse detection area, and the filtering processing is applied to each scanned line in real time so that the candidate color bar areas of those lines are identified; meanwhile, during this scanning and filtering at intervals of n1 lines, it is judged in real time whether the following requirement is met:
in the color bar coarse detection area, from top to bottom, the number of continuous candidate color bar areas exceeds the threshold dthresh;
if this requirement is not met, scanning and filtering continue downwards at intervals of n1 lines; if it is met, scanning and filtering stop, the current scanned line is taken as line C1, and step 106 is executed;
step 106, in the color bar coarse detection area, the area above line C1 is called the color bar identification frame area, which completes the fine extraction of the color bar area;
step 107, let N1 be the total number of lines scanned in the color bar identification frame area and N2 be the number of those lines identified as candidate color bar areas; the color bar confidence e is then calculated using the following formula:
e=N2/N1
step 108, if the color bar confidence e is greater than the color bar confidence threshold barThresh, the currently detected first video image frame a1 is a color bar abnormal frame, meaning that it contains color bars; otherwise, the currently detected first video image frame a1 contains no color bars;
step 2, a multi-frame-based color bar association judgment method, which specifically comprises:
reading the adjacent next second video image frame A2; according to the color bar identification frame region B1 located in the first video image frame A1, aligning the second video image frame A2 with the first video image frame A1 so that the color bar identification frame region B2 is located at the same position in the second video image frame A2;
performing the filtering processing at intervals of n1 lines within the color bar identification frame region B2, then calculating the color bar confidence e, and judging, according to the calculated color bar confidence e, whether the second video image frame A2 is a color bar abnormal frame;
in the same way, with the preset number of continuous frames being N3, color bar abnormal state recognition is carried out in turn on the remaining N3-1 video image frames following the first video image frame A1; when all N3 consecutive video image frames are color bar abnormal frames, it is finally concluded that these N3 consecutive frames are color bar abnormal frames and an alarm prompt is given.
The ultrahigh-definition video color bar anomaly real-time detection method based on filtering provided by the invention has the following advantages:
(1) it supports the detection of 100% and 75% standard color bars, as well as regional color bars, SMPTE color bars, vertically staggered color bars, and four- or eight-color bars; the application range is wide, and the comprehensiveness and accuracy of color bar detection are improved;
(2) it is highly resistant to interference, which improves detection precision;
(3) when detecting abnormal color bars, it does not need to scan every pixel of the whole video image frame, which improves the real-time performance of abnormal color bar detection.
Drawings
Fig. 1 is a schematic flow chart of a filtering-based ultrahigh-definition video color bar anomaly real-time detection method according to the present invention;
FIG. 2 is a schematic diagram of a candidate small region and an initial color bar region;
FIG. 3 is a schematic diagram of a color bar coarse detection area;
FIG. 4 is a schematic diagram of a color bar identification box area;
fig. 5 is a diagram of a specific application scene displayed in a color bar identification frame area.
Detailed Description
In order to make the technical problems, technical solutions and advantageous effects solved by the present invention more clearly apparent, the present invention is further described in detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
The invention provides a filtering-based ultrahigh-definition video color bar anomaly real-time detection method, which has the following advantages:
(1) the method supports the detection of 100% and 75% standard color bars, as well as regional color bars, SMPTE color bars, vertically staggered color bars, and four- or eight-color bars; the application range is wide, and the comprehensiveness and accuracy of color bar detection are improved;
(2) the method is highly resistant to interference, which improves detection precision;
the invention can detect various standard and non-standard color bars containing a center band, text, images and the like, solving the technical problem that existing color bar detection technology can only detect a single kind of standard color bar; at the same time, the invention can exclude channel packaging content such as station logos, corner logos, clocks and bottom crawl subtitles, as well as scenes that resemble color bars, improving the accuracy of color bar abnormal state detection.
(3) When detecting the abnormal color bar state, scanning every pixel of the whole video image frame is not needed, which improves the real-time performance of color bar abnormal state detection.
The method provided by the invention is applied to color bar anomaly detection for high/standard definition and 4K ultra-high-definition SDI signals; it can quickly locate fault points on the broadcast link in real time and provides a reliable basis for intelligent emergency switching.
The filtering-based ultrahigh-definition video color bar abnormal state real-time detection method provided by the invention is a universal method for detecting the abnormal states of both standard and non-standard color bars. After the video image frame is filtered and the color bars are identified, candidate color bar areas are obtained preliminarily; however, these candidate areas are often mixed with other lines on the video image frame that have similar characteristics (such as color bars in special-effect scenes, or striped or latticed backgrounds). To effectively remove such noise scenes, multi-frame association is combined to improve the accuracy of color bar detection on the video image frame.
Referring to fig. 1, the filtering-based ultrahigh-definition video color bar abnormal state real-time detection method includes the following steps:
step 1, acquiring a first video image frame a1 to be detected, filtering the first video image frame a1, and identifying whether the first video image frame a1 is a color bar abnormal frame, specifically comprising the following steps:
step 101, scanning the first video image frame a1 once every preset n1 lines, starting from the top line of the image and proceeding downwards; the specific value of n1 may be set flexibly according to actual detection requirements, for example 2 lines or 4 lines, and is not limited by the present invention.
Each time a line is scanned (all scanned lines having the same line height), the following filtering processing is performed on it:
Each scanning line consists of a plurality of pixel points from left to right; the pixel value of each pixel point is identified; a first pixel point quantity threshold confThresh_1 and a second pixel point quantity threshold confThresh_2 are preset; the first threshold confThresh_1 is the minimum width with which a color bar can appear in the video image, and the second threshold confThresh_2 is the maximum width with which a color bar can appear in the video image. In the left-to-right direction, whenever a run of consecutive pixel points with the same pixel value is identified, the number of pixel points in that run is denoted confThresh, and it is judged whether the following condition is met: confThresh_1 < confThresh < confThresh_2. As a specific implementation, assuming that the width of a scanning line is width, confThresh_1 can be set to width/8 and confThresh_2 can be set to width/4.
If the condition is not met, identification continues rightwards over the remaining pixel points; if it is met, the area formed by this run of consecutive pixel points with the same pixel value is called a candidate small area, and such a candidate small area is a uniform color band; identification then continues rightwards until all the pixel points of the scanning line have been examined. Thus, for one scanning line, several candidate small regions are identified.
Referring to fig. 2, h1 denotes the 1st scan line, h2 the 2nd scan line, and h3 the 3rd scan line. When scan filtering is performed on the 1st scan line, five uniform color bands are identified: color band f1, color band f2, color band f3, color band f4 and color band k1; that is, within each color band all pixels have the same pixel value. The widths of color bands f1, f2, f3 and f4 meet the requirement, while the width of color band k1 does not. Therefore, color bands f1, f2, f3 and f4 are candidate small regions and color band k1 is not: color band f1 is candidate small region f1, color band f2 is candidate small region f2, color band f3 is candidate small region f3, and color band f4 is candidate small region f4.
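For illustration, the per-line filtering of step 101 can be sketched in Python roughly as follows. This is a minimal sketch, not the patented implementation; the function name scan_candidate_regions, the tuple layout and the example values are illustrative assumptions.

```python
def scan_candidate_regions(line, conf_thresh_1, conf_thresh_2):
    """Run-length scan of one line: a run of identical pixel values whose
    length lies between the two width thresholds is a candidate small region."""
    regions = []          # list of (start, end, value) tuples
    i, n = 0, len(line)
    while i < n:
        j = i
        while j + 1 < n and line[j + 1] == line[i]:
            j += 1                      # extend the run of equal pixel values
        run_len = j - i + 1             # confThresh for this run
        if conf_thresh_1 < run_len < conf_thresh_2:
            regions.append((i, j, line[i]))
        i = j + 1
    return regions

# Example: width = 32, so confThresh_1 = 32 // 8 = 4 and confThresh_2 = 32 // 4 = 8
line = [0] * 6 + [100] * 6 + [50] * 6 + [100] * 6 + [30] * 2 + [80] * 6
print(scan_candidate_regions(line, 4, 8))   # the 2-pixel run is rejected as too narrow
```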
Step 102, assuming that n candidate small regions belonging to the same scan line have been identified, denoted the 1st candidate small region, the 2nd candidate small region, …, the nth candidate small region, the following processing is performed:
Step 1021, identifying the pixel value V1 of the 1st candidate small region, the pixel value V2 of the 2nd candidate small region, …, and the pixel value Vn of the nth candidate small region;
judging whether the pixel values V1, V2, …, Vn conform to the jump rule; if so, the area formed by the n candidate small areas is called an initial color bar area; then, go to step 1022;
wherein the jump rule is as follows: the pixel value rises from V1 to V2, falls from V2 to V3, rises from V3 to V4, and so on alternately; and the absolute value of the difference between any two adjacent pixel values is greater than the jump threshold pixThresh.
Still taking fig. 2 as an example, when the 1st scan line is scanned and filtered, there are 4 candidate small regions, from left to right: candidate small region f1, candidate small region f2, candidate small region f3 and candidate small region f4.
In this step, if the pixel values of candidate small regions f1, f2, f3 and f4 are 0, 100, 50 and 100 respectively, the 4 candidate small regions satisfy the jump rule; therefore, the region formed by these 4 candidate small regions is an initial color bar region, i.e. the rectangular region enclosed by the vertices r1, r2, r4 and r3 in fig. 2.
That is, a television color bar consists of a plurality of color bands; the pixel values of adjacent color bands satisfy the above jump rule, and the number of color bands lies within a specific range. For non-television color bars, the pixel values of adjacent color bands usually do not satisfy the jump rule, or the number of color bands does not meet the above requirement.
By the method, the color bar area can be preliminarily identified.
Step 1022, assuming that the identified initial ticker area consists of n2 candidate small areas, determining whether the n2 candidate small areas meet the following requirements:
for n2 candidate small regions, the distance between any two adjacent candidate small regions is less than the preset distance disThresh of the candidate small region, and n2 is greater than 4;
if the color bar area meets the requirement, the initial color bar area is a candidate color bar area; so far, the filtering processing of the scanned scanning lines is completed;
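Steps 1021 and 1022 can likewise be sketched as follows, under the same assumptions as the previous snippet; satisfies_jump_rule, is_candidate_color_bar_area and the parameter names are illustrative, not taken from the patent.

```python
def satisfies_jump_rule(values, pix_thresh):
    """Adjacent pixel values must alternately rise and fall, and every
    jump must exceed pix_thresh in absolute value."""
    if len(values) < 2:
        return False
    for k in range(len(values) - 1):
        diff = values[k + 1] - values[k]
        if abs(diff) <= pix_thresh:
            return False
        expected_rise = (k % 2 == 0)        # V1->V2 rises, V2->V3 falls, ...
        if (diff > 0) != expected_rise:
            return False
    return True

def is_candidate_color_bar_area(regions, pix_thresh, dis_thresh):
    """regions: (start, end, value) tuples from one scan line."""
    values = [v for _, _, v in regions]
    if not satisfies_jump_rule(values, pix_thresh):
        return False
    gaps_ok = all(regions[k + 1][0] - regions[k][1] < dis_thresh
                  for k in range(len(regions) - 1))
    return gaps_ok and len(regions) > 4     # step 1022: spacing and n2 > 4

print(satisfies_jump_rule([0, 100, 50, 100], pix_thresh=30))   # True: rise, fall, rise
```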
Step 103, sequentially filtering each scanned line from top to bottom, while judging in real time whether the following requirement is met:
the continuous number of candidate color bar areas is preset as n3; if, going from top to bottom, n3 consecutive candidate color bar areas appear, and for each pair of vertically adjacent candidate color bar areas both the distance between their left boundaries and the distance between their right boundaries are smaller than the set threshold, then scanning further lines downwards is stopped and step 104 is executed.
Referring to fig. 3, assume that n3 is 20; when the candidate color bar area ST20 is scanned and identified, the 20 candidate color bar areas ST1 through ST20 satisfy the above requirements, at which point downward scanning is not continued and step 104 is performed instead.
Step 104, identifying a color bar coarse detection area, wherein the method comprises the following steps:
determining the circumscribed rectangle of the n3 candidate color bar regions; then, the upper boundary L1 of the circumscribed rectangle is kept unchanged; the left boundary of the circumscribed rectangle is extended downward until it reaches the bottommost boundary of the first video image frame a1, giving the left boundary extension side L2; likewise, the right boundary is extended downward until it reaches the bottommost boundary of the first video image frame a1, giving the right boundary extension side L3; the bottom corner point of the left boundary extension side L2 is connected with the bottom corner point of the right boundary extension side L3 to obtain the lower boundary L4; the rectangle enclosed by the lower boundary L4, the right boundary extension side L3, the upper boundary L1 and the left boundary extension side L2 is the color bar coarse detection area;
referring to fig. 3, a rectangle enclosed by the vertices S1, S2, S3, and S4 is a color bar coarse detection area.
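One possible way to form the coarse detection rectangle of step 104 is sketched below, assuming each candidate color bar area is described by a (row, left, right) tuple and that frame_height gives the number of lines in the frame; the function name and tuple layout are illustrative.

```python
def coarse_detection_area(candidate_areas, frame_height):
    """candidate_areas: list of (row, left, right) for the n3 stacked
    candidate color bar areas.  The circumscribed rectangle's top edge is
    kept, and its left/right edges are extended down to the frame bottom."""
    top = min(row for row, _, _ in candidate_areas)       # upper boundary L1
    left = min(l for _, l, _ in candidate_areas)          # extended left side L2
    right = max(r for _, _, r in candidate_areas)         # extended right side L3
    bottom = frame_height - 1                             # lower boundary L4
    return top, left, right, bottom

# e.g. 20 stacked areas between rows 40 and 116 in a 2160-line frame
areas = [(40 + 4 * k, 300, 1800) for k in range(20)]
print(coarse_detection_area(areas, 2160))   # (40, 300, 1800, 2159)
```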
Step 105, in the color bar coarse detection area, n3 candidate color bar areas have already been identified; below these n3 candidate color bar areas, scanning continues at intervals of n1 lines within the coarse detection area, and the filtering processing is applied to each scanned line in real time so that the candidate color bar areas of those lines are identified; meanwhile, during this scanning and filtering at intervals of n1 lines, it is judged in real time whether the following requirement is met:
in the color bar coarse detection area, from top to bottom, the number of continuous candidate color bar areas exceeds the threshold dthresh;
if this requirement is not met, scanning and filtering continue downwards at intervals of n1 lines; if it is met, scanning and filtering stop, the current scanned line is taken as line C1, and step 106 is executed;
Step 106, in the color bar coarse detection area, the area above line C1 is called the color bar identification frame area, which completes the fine extraction of the color bar area.
referring to fig. 4, a rectangle enclosed by the vertices S1, S5, S6, and S4 is a color bar identification box region.
Referring to fig. 5, a specific application scenario is shown. In fig. 5, a rectangle enclosed by the vertices T1, T2, T3, and T4 is a color bar identification box region.
Step 107, let N1 be the total number of lines scanned in the color bar identification frame area and N2 be the number of those lines identified as candidate color bar areas; the color bar confidence e is then calculated using the following formula:
e=N2/N1
Step 108, if the color bar confidence e is greater than the color bar confidence threshold barThresh, the currently detected first video image frame a1 is a color bar abnormal frame, meaning that it contains color bars; otherwise, the currently detected first video image frame a1 contains no color bars.
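Steps 105 to 108 can be compressed into a sketch like the following, where is_candidate_color_bar_line is assumed to wrap the per-line filtering of the earlier snippets; d_thresh, bar_thresh and the return convention are illustrative assumptions, not part of the patent.

```python
def color_bar_confidence(frame, top, bottom, n1, d_thresh, bar_thresh,
                         is_candidate_color_bar_line):
    """Scan every n1-th row of the coarse detection area, stop at row C1 once
    more than d_thresh consecutive rows are candidate color bar areas, and
    compute e = N2 / N1 over the rows scanned so far (the color bar
    identification frame area lies above C1)."""
    n1_scanned, n2_candidates = 0, 0
    consecutive = 0
    c1 = bottom
    for row in range(top, bottom + 1, n1):
        n1_scanned += 1                              # N1: rows scanned so far
        if is_candidate_color_bar_line(frame, row):
            n2_candidates += 1                       # N2: candidate color bar rows
            consecutive += 1
            if consecutive > d_thresh:               # enough stacked candidate rows
                c1 = row                             # row C1: stop scanning downwards
                break
        else:
            consecutive = 0
    e = n2_candidates / n1_scanned if n1_scanned else 0.0
    return e > bar_thresh, e, c1                     # (abnormal-frame flag, confidence, C1)
```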
Through the above steps, the coarse extraction of the color bar area allows image data without color bars to be excluded from the video image while the image data containing color bars is retained, which simplifies processing, avoids unnecessary computation, and improves data processing efficiency.
Step 2, a multi-frame-based color bar association judgment method, which specifically comprises:
reading the adjacent next second video image frame A2; according to the color bar identification frame region B1 located in the first video image frame A1, aligning the second video image frame A2 with the first video image frame A1 so that the color bar identification frame region B2 is located at the same position in the second video image frame A2;
performing the filtering processing at intervals of n1 lines within the color bar identification frame region B2, then calculating the color bar confidence e, and judging, according to the calculated color bar confidence e, whether the second video image frame A2 is a color bar abnormal frame;
in the same way, with the preset number of continuous frames being N3, color bar abnormal state recognition is carried out in turn on the remaining N3-1 video image frames following the first video image frame A1; when all N3 consecutive video image frames are color bar abnormal frames, it is finally concluded that these N3 consecutive frames are color bar abnormal frames and an alarm prompt is given.
That is, each video image frame is compared within the same color bar identification frame area, and an alarm is given only when the color bars persist for a certain number of consecutive frames.
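The multi-frame association of step 2 can be sketched as follows; frames, frame_confidence and alarm are assumed to be supplied by the surrounding monitoring system and, like the other names here, are illustrative rather than defined by the patent.

```python
def multi_frame_color_bar_alarm(frames, region, n1, bar_thresh, n3_frames,
                                frame_confidence, alarm):
    """Evaluate each frame inside the same color bar identification frame
    region; raise an alarm only after n3_frames consecutive frames are
    judged to be color bar abnormal frames."""
    consecutive = 0
    for frame in frames:
        e = frame_confidence(frame, region, n1)     # confidence e for this frame
        if e > bar_thresh:                          # color bar abnormal frame
            consecutive += 1
            if consecutive >= n3_frames:
                alarm(consecutive)                  # persistent color bars: alert
                consecutive = 0
        else:
            consecutive = 0
```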
Therefore, after the color bar area is extracted to obtain the color bar identification frame area, the color bar confidence is calculated within this identification frame area over multiple frames, and a frame is judged to be a color bar abnormal frame when the confidence value is greater than the color bar confidence threshold barThresh. In this embodiment, determining the abnormal color bar state from the color bar confidence not only allows the sensitivity of color bar detection to be adjusted, but also avoids the false detection that occurs when an image comparison formula judges an actually static frame to be non-static.
Thus, the filtering-based ultrahigh-definition video color bar anomaly real-time detection method provided by the invention first obtains a video image frame to be detected, such as a 4K ultra-high-definition image; it then generates a filter map of the color bar image line by line from top to bottom, and at the same time determines a color bar detection area in the filter map according to the position information and continuity of the color bands produced by each line, completing the coarse extraction of the color bar area. Next, within the coarsely extracted color bar area, the color bar filter map continues to be generated line by line from top to bottom, the color bar identification frame is determined according to the vertical distribution characteristics of the color bars, the fine extraction of the color bar area is completed, and the color bar confidence is obtained. Finally, the consistency of consecutive frames within the color bar identification frame is judged according to multi-frame association, and the abnormal color bar state is determined.
Compared with the prior art, generating the color bar filter map, performing coarse extraction followed by fine extraction, and then confirming the color bar area through multi-frame association improves color bar detection precision, solving the technical problems that existing color bar detection technology identifies color bars with poor accuracy and is limited to recognizing 100% and 75% full-screen color bars.
As can be seen from the above description, the image to be detected may be acquired from a 4K ultra-high-definition signal and sent either to a terminal device or to a server for identification; the present invention does not specifically limit this.
The invention provides a filtering-based ultrahigh-definition video color bar anomaly real-time detection method, which has the following advantages:
1. The invention discloses a method for detecting color bars using a color bar filter map; prior information such as standard color bar values does not need to be known.
2. The invention discloses an innovative method for determining the reliability of a color bar area by means of the color bar confidence.
3. The invention creatively combines coarse extraction with fine extraction to improve the accuracy and efficiency of color bar detection and achieve real-time detection in 4K ultra-high-definition video.
4. The invention provides a universal abnormal state detection method for various standard and non-standard color bars: it can detect not only eight-color full-screen color bars but also regional color bars, SMPTE color bars, four-color bars and other non-standard abnormal color bars.
5. The invention does not need threshold adjustment for different videos and has good adaptability.
6. The invention excludes interference from channel packaging content such as station logos, corner logos, clocks and bottom crawl subtitles, as well as special-effect scenes, preventing missed detections.
It will be understood by those skilled in the art that all or part of the processes of the methods of the above embodiments may be implemented by a computer program instructing the relevant hardware; the program may be stored in a computer-readable storage medium and, when executed, may include the processes of the above method embodiments. The storage medium may be a magnetic disk, an optical disk, a read-only memory (ROM), a random access memory (RAM), or the like.
The foregoing is only a preferred embodiment of the present invention, and it should be noted that, for those skilled in the art, various modifications and improvements can be made without departing from the principle of the present invention, and such modifications and improvements should also be considered within the scope of the present invention.

Claims (1)

1. A filtering-based ultra-high-definition video color bar abnormal state real-time detection method, characterized by comprising the following steps:
step 1, acquiring a first video image frame a1 to be detected, filtering the first video image frame a1, and identifying whether the first video image frame a1 is a color bar abnormal frame, specifically comprising the following steps:
step 101, scanning the first video image frame a1 once every preset n1 lines, starting from the top line of the image and proceeding downwards, and performing the following filtering processing on each scanned line:
each scanning line consists of a plurality of pixel points from left to right; the pixel value of each pixel point is identified; a first pixel point quantity threshold confThresh_1 and a second pixel point quantity threshold confThresh_2 are preset; in the left-to-right direction, whenever a run of consecutive pixel points with the same pixel value is identified, the number of pixel points in that run is denoted confThresh, and it is judged whether the following condition is met: confThresh_1 < confThresh < confThresh_2; if not, identification continues rightwards over the remaining pixel points; if so, the area formed by this run of consecutive pixel points with the same pixel value is called a candidate small area; identification then continues rightwards until all the pixel points of the scanning line have been examined; thus, for one scanning line, several candidate small regions are identified;
step 102, assuming that n candidate small regions belonging to the same scan line have been identified, denoted the 1st candidate small region, the 2nd candidate small region, …, the nth candidate small region, performing the following processing:
step 1021, identifying the pixel value V1 of the 1st candidate small region, the pixel value V2 of the 2nd candidate small region, …, and the pixel value Vn of the nth candidate small region;
judging whether the pixel values V1, V2, …, Vn conform to the jump rule; if so, the area formed by the n candidate small areas is called an initial color bar area; then, go to step 1022;
wherein the jump rule is as follows: the pixel value rises from V1 to V2, falls from V2 to V3, rises from V3 to V4, and so on alternately; and the absolute value of the difference between any two adjacent pixel values is greater than the jump threshold pixThresh;
step 1022, assuming that the identified initial color bar area consists of n2 candidate small areas, determining whether the n2 candidate small areas meet the following requirements:
for the n2 candidate small regions, the distance between any two adjacent candidate small regions is less than the preset candidate small region spacing threshold disThresh, and n2 is greater than 4;
if these requirements are met, the initial color bar area is a candidate color bar area; at this point, the filtering processing of the scanned line is complete;
step 103, sequentially filtering each scanned line from top to bottom, while judging in real time whether the following requirement is met:
the continuous number of candidate color bar areas is preset as n3; if, going from top to bottom, n3 consecutive candidate color bar areas appear, and for each pair of vertically adjacent candidate color bar areas both the distance between their left boundaries and the distance between their right boundaries are smaller than the set threshold, then scanning further lines downwards is stopped and step 104 is executed;
step 104, identifying a color bar coarse detection area, wherein the method comprises the following steps:
determining the circumscribed rectangle of the n3 candidate color bar regions; then, the upper boundary L1 of the circumscribed rectangle is kept unchanged; the left boundary of the circumscribed rectangle is extended downward until it reaches the bottommost boundary of the first video image frame a1, giving the left boundary extension side L2; likewise, the right boundary is extended downward until it reaches the bottommost boundary of the first video image frame a1, giving the right boundary extension side L3; the bottom corner point of the left boundary extension side L2 is connected with the bottom corner point of the right boundary extension side L3 to obtain the lower boundary L4; the rectangle enclosed by the lower boundary L4, the right boundary extension side L3, the upper boundary L1 and the left boundary extension side L2 is the color bar coarse detection area;
step 105, in the color bar coarse detection area, n3 candidate color bar areas have already been identified; below these n3 candidate color bar areas, scanning continues at intervals of n1 lines within the coarse detection area, and the filtering processing is applied to each scanned line in real time so that the candidate color bar areas of those lines are identified; meanwhile, during this scanning and filtering at intervals of n1 lines, it is judged in real time whether the following requirement is met:
in the color bar coarse detection area, from top to bottom, the number of continuous candidate color bar areas exceeds the threshold dthresh;
if this requirement is not met, scanning and filtering continue downwards at intervals of n1 lines; if it is met, scanning and filtering stop, the current scanned line is taken as line C1, and step 106 is executed;
step 106, in the color bar coarse detection area, the area above line C1 is called the color bar identification frame area, which completes the fine extraction of the color bar area;
step 107, let N1 be the total number of lines scanned in the color bar identification frame area and N2 be the number of those lines identified as candidate color bar areas; the color bar confidence e is then calculated using the following formula:
e=N2/N1
step 108, if the color bar confidence e is greater than the color bar confidence threshold barThresh, the currently detected first video image frame a1 is a color bar abnormal frame, meaning that it contains color bars; otherwise, the currently detected first video image frame a1 contains no color bars;
step 2, a multi-frame-based color bar association judgment method, which specifically comprises:
reading the adjacent next second video image frame A2; according to the color bar identification frame region B1 located in the first video image frame A1, aligning the second video image frame A2 with the first video image frame A1 so that the color bar identification frame region B2 is located at the same position in the second video image frame A2;
performing the filtering processing at intervals of n1 lines within the color bar identification frame region B2, then calculating the color bar confidence e, and judging, according to the calculated color bar confidence e, whether the second video image frame A2 is a color bar abnormal frame;
in the same way, with the preset number of continuous frames being N3, color bar abnormal state recognition is carried out in turn on the remaining N3-1 video image frames following the first video image frame A1; when all N3 consecutive video image frames are color bar abnormal frames, it is finally concluded that these N3 consecutive frames are color bar abnormal frames and an alarm prompt is given.
CN202010989802.5A 2020-09-18 2020-09-18 Filtering-based ultra-high definition video color bar anomaly real-time detection method Active CN112261406B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010989802.5A CN112261406B (en) 2020-09-18 2020-09-18 Filtering-based ultra-high definition video color bar anomaly real-time detection method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010989802.5A CN112261406B (en) 2020-09-18 2020-09-18 Filtering-based ultra-high definition video color bar anomaly real-time detection method

Publications (2)

Publication Number Publication Date
CN112261406A true CN112261406A (en) 2021-01-22
CN112261406B CN112261406B (en) 2023-08-22

Family

ID=74232318

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010989802.5A Active CN112261406B (en) 2020-09-18 2020-09-18 Filtering-based ultra-high definition video color bar anomaly real-time detection method

Country Status (1)

Country Link
CN (1) CN112261406B (en)


Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1452331A (en) * 2003-04-24 2003-10-29 北京永新同方信息工程有限公司 Digitized real time multi-channel video and audio differential mode detecting method
CN101815222A (en) * 2009-02-25 2010-08-25 北大方正集团有限公司 Video color bar detecting method and device
KR101617428B1 (en) * 2014-11-24 2016-05-12 대한민국(국가기록원) Method and apparatus for degraded region detection in digital video file

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
李嘉林: "Latest Exploration and Implementation of SDI Signal Detection Methods for the Movie Channel" ("电影频道SDI信号检测方法最新探索和实现"), 现代电视技术 (Modern Television Technology), no. 01

Also Published As

Publication number Publication date
CN112261406B (en) 2023-08-22

Similar Documents

Publication Publication Date Title
US5546131A (en) Television receiver having an arrangement for vertically shifting subtitles
US8363132B2 (en) Apparatus for demosaicing colors and method thereof
US7606423B2 (en) Method and apparatus for blocking artifact detection and measurement in block-coded video
JP2005287049A (en) Motion compensation method and apparatus at vector-based image borders
CN110830787B (en) Method and device for detecting screen-patterned image
CN101620731A (en) Method for detecting layout areas in a video image and method for generating a reduced size image using the detection method
US7271850B2 (en) Method and apparatus for cross color/cross luminance suppression
US8319888B2 (en) Method of determining field dominance in a sequence of video frames
US8077774B1 (en) Automated monitoring of digital video image quality
US7822271B2 (en) Method and apparatus of false color suppression
US8891609B2 (en) System and method for measuring blockiness level in compressed digital video
CN112261406B (en) Filtering-based ultra-high definition video color bar anomaly real-time detection method
CN107666560B (en) Video de-interlacing method and device
CN110312133B (en) Image processing method and device
CN114071209A (en) Method and device for detecting video image display area in real time and electronic equipment
US20060218619A1 (en) Block artifacts detection
US10404936B2 (en) Method and apparatus for processing video signal
US8908093B2 (en) Determining aspect ratio for display of video
US7092575B2 (en) Moving image encoding apparatus and moving image encoding method
US20130156308A1 (en) Picture detection device, picture recording device, picture recording/reproduction device, picture detection method, picture recording method, and picture recording/reproduction method
JP3435334B2 (en) Apparatus and method for extracting character area in video and recording medium
Ekin et al. Spatial detection of TV channel logos as outliers from the content
JP3609236B2 (en) Video telop detection method and apparatus
CN114666649B (en) Identification method and device of subtitle cut video, electronic equipment and storage medium
CN108363981B (en) Title detection method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant