CN113038146A - Method and system for detecting consistency of self-adaptive IPTV live broadcast content - Google Patents
- Publication number
- CN113038146A (application CN201911249148.8A)
- Authority
- CN
- China
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- H04N21/2187: Live feed (source of audio or video content, e.g. local disk arrays)
- H04N21/23418: Processing of video elementary streams involving operations for analysing video streams, e.g. detecting features or characteristics
- H04N21/266: Channel or content management, e.g. generation and management of keys and entitlement messages in a conditional access system, merging a VOD unicast channel into a multicast channel
- H04N21/44008: Processing of video elementary streams involving operations for analysing video streams, e.g. detecting features or characteristics in the video stream
- H04N21/442: Monitoring of processes or resources, e.g. detecting the failure of a recording device, monitoring the downstream bandwidth, the number of times a movie has been viewed, the storage space available from the internal hard disk
Abstract
The invention discloses a method and a system for detecting the consistency of self-adaptive IPTV live content. The detection method comprises the following steps: a feature code extraction device extracts live video feature codes and uploads them to a comparison server; the comparison server synchronously compares the feature codes uploaded by different detection points to judge whether the live broadcast contents are consistent; and the comparison server selects the optimal feature code extraction and comparison method according to the synchronous matching condition of the feature codes, performs efficient consistency detection on the IPTV live video stream content, raises an alarm on comparison anomalies, and discovers in time the security problem of a live program being tampered with. The invention can perform efficient consistency detection on IPTV live video stream content and improve detection efficiency.
Description
Technical Field
The invention belongs to the technical field of electronic communication, relates to an IPTV live broadcast system, and particularly relates to a method and a system for detecting consistency of self-adaptive IPTV live broadcast content.
Background
In recent years, with the nationwide rollout of China's "tri-network convergence" policy, the construction and management of IPTV (interactive network television) have actively adapted to the broad trend of informatization and networking and developed rapidly: the IPTV user base is approaching three hundred million, and IPTV has become an important force for spreading positive energy and promoting mainstream values.
The IPTV live content consistency detection system mainly comprises two components, namely feature code extraction equipment and a comparison server. The feature code extraction equipment is deployed at different detection points (generally comprising a source node and a monitoring node) of the IPTV network, extracts live video feature codes and uploads the live video feature codes to the comparison server, and the comparison server synchronously compares the feature codes uploaded at the different detection points and judges whether live contents are consistent or not. At present, two types of feature code extraction methods, namely, TS code stream layer feature codes and image layer feature codes, are mainly adopted in an IPTV live content consistency detection system in the industry.
The TS code stream layer feature code is extracted by computing over the MPEG2-TS code stream commonly used in the broadcasting and television field, for example by calculating the MD5 value of the TS packet data content as the feature code. This feature code calculation is simple and efficient, and is suitable for content consistency detection in scenarios where the TS code stream carried in the IP video stream is not modified at the upstream and downstream nodes of the IPTV network. Although common IP network devices (such as switches and routers) do not modify the payload (TS code stream) of an IP packet, many IPTV network devices (such as encoding and transcoding devices, multiplexing devices, and IP stream matrices) do modify the TS code stream content (for example resolution conversion, PSI table insertion, or audio/video PID information modification). Such modifications change the TS code stream feature code even though the image itself does not change, thereby causing false alarms.
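The TS layer approach described above can be sketched in a few lines. The following is a minimal illustration, assuming 188-byte-aligned TS packets in the IP payload; the function name and the sync-byte error handling are illustrative assumptions, not details from the patent:

```python
import hashlib

TS_PACKET_SIZE = 188   # standard MPEG2-TS packet length in bytes
TS_SYNC_BYTE = 0x47    # every TS packet starts with this sync byte

def ts_stream_fingerprint(payload: bytes) -> str:
    """MD5 over the TS packets in an IP payload, as a TS-layer feature code.

    Hypothetical helper: the patent only states that an MD5 value of the
    TS packet data content is used; alignment and error handling are
    assumptions for the sketch.
    """
    digest = hashlib.md5()
    for offset in range(0, len(payload) - TS_PACKET_SIZE + 1, TS_PACKET_SIZE):
        packet = payload[offset:offset + TS_PACKET_SIZE]
        if packet[0] != TS_SYNC_BYTE:
            raise ValueError(f"lost TS sync at offset {offset}")
        digest.update(packet)
    return digest.hexdigest()
```

Because the hash covers the raw TS bytes, any in-network modification (a rewritten PID, an inserted PSI table) changes the fingerprint even when the decoded picture is identical, which is exactly the false-alarm risk described above.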
The image layer feature code is obtained by decoding the IPTV live video stream back into images and then processing the pixels of each image to calculate and extract a feature code. This extraction method is not affected by changes in the additional information of TS data packets (such as PSI tables and PID, i.e. packet identifier, information) or in the encoding format (such as H.264/H.265/AVS), and yields a video content consistency comparison result identical to that of the human eye, so it is applicable to various security monitoring scenarios. However, because image layer feature code calculation requires decoding the video stream and restoring images, it places high demands on the performance of the feature code extraction device: the number of video streams one image layer extraction device can process is far lower than that of a code stream layer device, which puts great pressure on the construction cost of a live broadcast security comparison system.
Both feature code extraction methods thus have defects, and neither alone can meet practical requirements. In view of this, there is a need to design a detection method that overcomes the above defects of the existing methods.
Disclosure of Invention
The invention provides a method and a system for detecting the consistency of IPTV live broadcast contents, which can efficiently detect the consistency of the IPTV live broadcast video stream contents and improve the detection efficiency.
In order to solve the technical problem, according to one aspect of the present invention, the following technical solutions are adopted:
a method for detecting consistency of IPTV live content comprises the following steps:
step S1, extracting live video feature codes by the feature code extraction equipment and uploading the live video feature codes to a comparison server; the detection point deployed in the IPTV network comprises a source node and a monitoring node; the feature code extraction equipment respectively collects ES feature codes and image feature codes of IPTV live video stream contents at a source node and a monitoring node, and uploads the ES feature codes and the image feature codes to the comparison server for video content consistency comparison;
step S2, the comparison server synchronously compares the feature codes uploaded by different detection points to judge whether the live broadcast contents are consistent; the comparison server selects the optimal feature code extraction and comparison method according to the synchronous matching condition of the two types of feature codes, performs efficient consistency detection on the IPTV live video stream content, raises an alarm on comparison anomalies, and discovers in time the security problem of a live program being tampered with;
the step S2 includes:
step S21: an ES (Elementary Stream) feature code synchronization step, which outputs an ES synchronization identifier Syn_ES indicating whether the ES feature codes of the source node and the monitoring node can be synchronously matched;
step S22: the ES characteristic code cyclic comparison step is used for cyclically acquiring and comparing the ES characteristic codes of the video streams of the source node and the monitoring node and outputting a comparison detection Result;
step S23: an image feature code synchronization step, which outputs an image synchronization identifier Syn_Pic indicating whether the image feature codes of the source node and the monitoring node can be synchronized;
step S24: circularly comparing the image characteristic codes, namely circularly acquiring and comparing the image characteristic codes of the video streams of the source node and the monitoring node, and outputting a comparison detection Result;
if Result = 1, the video contents of the source node and the monitoring node sampled in the current 5 seconds are consistent, and the program continues the cyclic comparison of feature codes;
if Result = 2, the video contents of the source node and the monitoring node sampled in the current 5 seconds are inconsistent, and the program outputs a "comparison abnormal" alarm;
if Result = 3, the video contents of the source node and the monitoring node have been inconsistent for six consecutive 5-second samples (30 seconds); the program outputs a "comparison long-term abnormal" alarm, and the ES/image feature code synchronization step is executed again;
the step S21 includes:
step S211, the probe devices of the source node and the monitoring node respectively collect the live ES video streams of their monitoring points;
step S212, the source node probe acquires the ES feature code of the video once per second and sends it to the source node buffer BufE_s; the monitoring node probe likewise acquires the ES feature code once per second and sends it to the monitoring node buffer BufE_d;
the process of collecting the ES feature code comprises the following steps: performing protocol analysis on the video ES stream and searching the data stream for the start code of a NALU (Network Abstraction Layer Unit); if the start bytes of a NALU are found, parsing the NALU header bytes and extracting the NALU type field nal_unit_type; otherwise, continuing the search in a loop; if nal_unit_type indicates a Slice, the NALU payload stores fragment information of the image, and the NALU payload is hashed with the SHA-1 secure hash algorithm; the SHA-1 value is then encrypted with the SM4 encryption algorithm and carries the UTC time tag of the current ES frame as the ES feature code;
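The NALU scan and hash described above can be sketched as follows. This assumes an H.264 Annex B stream with 4-byte start codes; the slice-type set and function name are illustrative, and the patent's SM4 encryption and UTC time tag are omitted from the sketch:

```python
import hashlib

NALU_START_CODE = b"\x00\x00\x00\x01"  # H.264 Annex B start code (4-byte form)
SLICE_TYPES = {1, 5}  # nal_unit_type 1 (non-IDR slice) and 5 (IDR slice)

def extract_es_feature_codes(es_stream: bytes):
    """Find slice NALUs in an Annex B ES buffer and SHA-1 their payloads.

    Sketch of the patent's ES feature-code step; SM4 encryption and the
    UTC tag are intentionally left out.
    """
    codes = []
    pos = es_stream.find(NALU_START_CODE)
    while pos != -1:
        start = pos + len(NALU_START_CODE)
        nxt = es_stream.find(NALU_START_CODE, start)
        nalu = es_stream[start:nxt if nxt != -1 else len(es_stream)]
        if nalu:
            nal_unit_type = nalu[0] & 0x1F  # low 5 bits of the NALU header byte
            if nal_unit_type in SLICE_TYPES:
                codes.append(hashlib.sha1(nalu[1:]).hexdigest())
        pos = nxt
    return codes
```

Hashing only slice NALUs makes the feature code insensitive to SPS/PPS or PSI-level rewrites, which is the point of working at the ES layer rather than the TS layer.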
step S213, if Len(BufE_s) >= 60 and Len(BufE_d) >= 30, the number of ES feature codes buffered in the source node buffer and the monitoring node buffer has reached the preset target required for the synchronization comparison;
step S214, the synchronization comparison process is executed: pattern matching is performed on the feature codes in the source node buffer BufE_s to detect whether any run of 5 consecutive seconds of feature code values in the monitoring node buffer BufE_d equals a run of 5 consecutive seconds of feature code values in BufE_s;
step S215, if the synchronization matching succeeds, that is, the feature codes of the source node and the monitoring node are completely equal for 5 consecutive seconds, Syn_ES = true is output; otherwise Syn_ES = false is output;
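The synchronization search of steps S214 and S215 amounts to finding a common run of feature codes in the two buffers. A minimal sketch, assuming the buffers are time-ordered lists of per-second feature-code strings and a 5-second window (names are illustrative):

```python
def sync_match(buf_src, buf_dst, window=5):
    """Look for `window` consecutive feature codes shared by both buffers.

    Returns (matched, src_offset, dst_offset); offsets are -1 on failure.
    Brute-force pattern matching, adequate for the patent's small
    buffers (at most 60 and 30 entries).
    """
    for j in range(len(buf_dst) - window + 1):
        pattern = buf_dst[j:j + window]
        for i in range(len(buf_src) - window + 1):
            if buf_src[i:i + window] == pattern:
                return True, i, j
    return False, -1, -1
```

The offset difference between the two matched runs also gives the transport delay between source and monitoring node, which is why the source buffer (60 entries) is kept deeper than the monitoring buffer (30 entries).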
the step S22 includes:
step S221, according to the ES feature code synchronization result of step S21, the ES feature code of the source node is written into the source ES feature code buffer BufX_s, and the ES feature code of the monitoring node is written into the monitoring ES feature code buffer BufX_d;
step S222, if Len(BufX_s) >= 60 and Len(BufX_d) >= 30, the number of feature codes in the source node buffer or the monitoring node buffer exceeds the buffer threshold, that is, an overflow has occurred; Result = 3 is output and step S22 is exited;
if Len(BufX_s) >= 5 and Len(BufX_d) >= 5, the feature codes in the source node buffer and the monitoring node buffer cover 5 seconds; the earliest 5 seconds of feature codes in the two buffers are compared, and the compared 5 seconds of feature codes are removed from the buffers after the comparison is completed;
if the 5-second feature codes of the source node and the monitoring node are consistent, Result = 1 is output and step S22 is exited;
if the feature code comparison has been inconsistent for 30 consecutive seconds, Result = 3 is output, otherwise Result = 2 is output, and step S22 is exited;
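The cyclic comparison of step S22 can be sketched as a single polling step with a consecutive-mismatch counter. This is an interpretive sketch: the patent's thresholds (60/30 buffer caps, 5-second window, 30-second long-term limit) are kept as defaults, while the `state` dictionary and the return value 0 for "not enough data yet" are assumptions:

```python
def compare_step(buf_src, buf_dst, state,
                 src_cap=60, dst_cap=30, window=5, long_term=30):
    """One polling pass of the cyclic feature-code comparison.

    Returns the patent's Result code: 1 (5-second windows match),
    2 (windows mismatch), 3 (buffer overflow or 30 s of continuous
    mismatch), or 0 (assumed code for: fewer than `window` seconds
    buffered, keep accumulating).
    """
    # Overflow: buffered codes reached the caps without being drained.
    if len(buf_src) >= src_cap and len(buf_dst) >= dst_cap:
        return 3
    # Not enough samples yet.
    if len(buf_src) < window or len(buf_dst) < window:
        return 0
    # Pop the earliest `window` seconds of codes from each buffer.
    src_win = [buf_src.pop(0) for _ in range(window)]
    dst_win = [buf_dst.pop(0) for _ in range(window)]
    if src_win == dst_win:
        state["mismatch_seconds"] = 0  # consistent: reset the counter
        return 1
    state["mismatch_seconds"] += window
    # 30 consecutive seconds of mismatch escalates to a long-term anomaly.
    return 3 if state["mismatch_seconds"] >= long_term else 2
```

A caller would loop on this function, raising the "comparison abnormal" alarm on 2 and falling back to resynchronization (step S21) on 3.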
the step S23 includes:
s231, respectively acquiring the live broadcast ES video streams of respective monitoring points by probe equipment of a source node and a monitoring node, positioning an I-frame position in the live broadcast ES video streams, and restoring the I-frame position into a picture;
s232, the source node probe acquires image feature codes of the video according to the frequency once per second and sends the image feature codes to a source node buffer area BufP _ S; the monitoring node probe also acquires the image feature code of the video according to the frequency once per second and sends the image feature code to a monitoring node buffer zone BufP _ d;
the image feature code acquisition process comprises the steps of carrying out gray processing on a picture decoded and restored from an ES video, namely converting color pixel points in the picture into gray pixel points; averagely dividing the grayed picture according to the block unit size of [ NxN ], wherein the default N is 64; calculating the average brightness value of the pixel points in each block, and storing the average brightness value into an image feature code matrix Lumi [ NxN ]; encrypting an image feature code matrix Lumi [ NxN ] by using an SM4 encryption algorithm, and carrying a UTC time label of a current frame as an image feature code;
step S233, if Len(BufP_s) >= 60 and Len(BufP_d) >= 30, the number of image feature codes buffered in the source node buffer and the monitoring node buffer has reached the preset target required for the synchronization comparison;
step S234, the synchronization comparison process is executed: pattern matching is performed on the feature codes in the source node buffer BufP_s to detect whether any run of 5 consecutive seconds of feature code values in the monitoring node buffer BufP_d equals a run of 5 consecutive seconds of feature code values in BufP_s;
step S235, if the synchronization matching succeeds, that is, the feature codes of the source node and the monitoring node are completely equal for 5 consecutive seconds, Syn_Pic = true is output; otherwise Syn_Pic = false is output;
the step S24 includes:
step S241, according to the image feature code synchronization result of step S23, the image feature code of the source node is written into the source image feature code buffer BufX_s, and the image feature code of the monitoring node is written into the monitoring image feature code buffer BufX_d;
step S242, if Len(BufX_s) >= 60 and Len(BufX_d) >= 30, the number of feature codes in the source node buffer or the monitoring node buffer exceeds the buffer threshold, that is, an overflow has occurred; Result = 3 is output and step S24 is exited;
if Len(BufX_s) >= 5 and Len(BufX_d) >= 5, the feature codes in the source node buffer and the monitoring node buffer cover 5 seconds; the earliest 5 seconds of feature codes in the two buffers are compared, and the compared 5 seconds of feature codes are removed from the buffers after the comparison is completed;
if the 5-second feature codes of the source node and the monitoring node are consistent, Result = 1 is output and step S24 is exited;
if the feature code comparison has been inconsistent for 30 consecutive seconds, Result = 3 is output, otherwise Result = 2 is output, and step S24 is exited.
A method for detecting consistency of IPTV live content comprises the following steps:
step S1, extracting live video feature codes by the feature code extraction equipment and uploading the live video feature codes to a comparison server; the detection point deployed in the IPTV network comprises a source node and a monitoring node; the feature code extraction equipment respectively collects ES feature codes and image feature codes of IPTV live video stream contents at a source node and a monitoring node, and uploads the ES feature codes and the image feature codes to the comparison server for video content consistency comparison;
step S2, the comparison server synchronously compares the feature codes uploaded by different detection points to judge whether the live broadcast contents are consistent; the comparison server selects the optimal feature code extraction and comparison method according to the synchronous matching condition of the two types of feature codes, performs efficient consistency detection on the IPTV live video stream content, and raises an alarm on comparison anomalies so as to discover in time the security problem of a live program being tampered with.
As an embodiment of the present invention, the step S2 includes:
step S21: an ES feature code synchronization step, which outputs an ES synchronization identifier Syn_ES indicating whether the ES feature codes of the source node and the monitoring node can be synchronously matched;
step S22: the ES characteristic code cyclic comparison step is used for cyclically acquiring and comparing the ES characteristic codes of the video streams of the source node and the monitoring node and outputting a comparison detection Result;
step S23: an image feature code synchronization step, which outputs an image synchronization identifier Syn_Pic indicating whether the image feature codes of the source node and the monitoring node can be synchronized;
step S24: circularly comparing the image characteristic codes, namely circularly acquiring and comparing the image characteristic codes of the video streams of the source node and the monitoring node, and outputting a comparison detection Result;
if Result = 1, the video contents of the source node and the monitoring node sampled in the current 5 seconds are consistent, and the program continues the cyclic comparison step of feature codes;
if Result = 2, the video contents of the source node and the monitoring node sampled in the current 5 seconds are inconsistent, and the program outputs a "comparison abnormal" alarm;
if Result = 3, the video contents of the source node and the monitoring node have been inconsistent for six consecutive 5-second samples (30 seconds); the program outputs a "comparison long-term abnormal" alarm, and the ES/image feature code synchronization step is executed again.
As an embodiment of the present invention, the step S21 includes:
step S211, the probe devices of the source node and the monitoring node respectively collect the live ES video streams of their monitoring points;
step S212, the source node probe acquires the ES feature code of the video once per second and sends it to the source node buffer BufE_s; the monitoring node probe likewise acquires the ES feature code once per second and sends it to the monitoring node buffer BufE_d;
step S213, if Len(BufE_s) >= 60 and Len(BufE_d) >= 30, the number of ES feature codes buffered in the source node buffer and the monitoring node buffer has reached the preset target required for the synchronization comparison;
step S214, the synchronization comparison process is executed: pattern matching is performed on the feature codes in the source node buffer BufE_s to detect whether any run of 5 consecutive seconds of feature code values in the monitoring node buffer BufE_d equals a run of 5 consecutive seconds of feature code values in BufE_s;
step S215, if the synchronization matching succeeds, that is, the feature codes of the source node and the monitoring node are completely equal for 5 consecutive seconds, Syn_ES = true is output; otherwise Syn_ES = false is output.
As an embodiment of the present invention, the step S22 includes:
step S221, according to the ES feature code synchronization result of step S21, the ES feature code of the source node is written into the source ES feature code buffer BufX_s, and the ES feature code of the monitoring node is written into the monitoring ES feature code buffer BufX_d;
step S222, if Len(BufX_s) >= 60 and Len(BufX_d) >= 30, the number of feature codes in the source node buffer or the monitoring node buffer exceeds the buffer threshold, that is, an overflow has occurred; Result = 3 is output and step S22 is exited;
if Len(BufX_s) >= 5 and Len(BufX_d) >= 5, the feature codes in the source node buffer and the monitoring node buffer cover 5 seconds; the earliest 5 seconds of feature codes in the two buffers are compared, and the compared 5 seconds of feature codes are removed from the buffers after the comparison is completed;
if the 5-second feature codes of the source node and the monitoring node are consistent, Result = 1 is output and step S22 is exited;
if the feature code comparison has been inconsistent for 30 consecutive seconds, Result = 3 is output, otherwise Result = 2 is output, and step S22 is exited.
As an embodiment of the present invention, the step S23 includes:
s231, respectively acquiring the live broadcast ES video streams of respective monitoring points by probe equipment of a source node and a monitoring node, positioning an I-frame position in the live broadcast ES video streams, and restoring the I-frame position into a picture;
s232, the source node probe acquires image feature codes of the video according to the frequency once per second and sends the image feature codes to a source node buffer area BufP _ S; the monitoring node probe also acquires the image feature code of the video according to the frequency once per second and sends the image feature code to a monitoring node buffer zone BufP _ d;
step S233, if Len (BufP _ S) > (60) and Len (BufP _ d) > (30), this indicates that the number of image feature codes buffered in the source node buffer area and the monitoring node buffer area has reached the preset target required for performing the synchronization comparison;
step S234, executing a synchronization comparison process, namely performing pattern matching search on the feature codes in the BufP _ S of the source node, and detecting whether the condition that the continuous 5-second feature code value in the BufP _ d of the monitoring node is equal to the continuous 5-second feature code value in the BufP _ S exists or not;
step S235, if the synchronization matching is successful, that is, if there is a situation that the signatures of the source node and the monitor node for 5 consecutive seconds are completely equal, outputting Syn _ Pic ═ true, otherwise, outputting Syn _ Pic ═ false.
As an embodiment of the present invention, the step S24 includes:
step S241, according to the image feature code synchronization result of step S23, the image feature code of the source node is written into the source image feature code buffer BufX_s, and the image feature code of the monitoring node is written into the monitoring image feature code buffer BufX_d;
step S242, if Len(BufX_s) >= 60 and Len(BufX_d) >= 30, the number of feature codes in the source node buffer or the monitoring node buffer exceeds the buffer threshold, that is, an overflow has occurred; Result = 3 is output and step S24 is exited;
if Len(BufX_s) >= 5 and Len(BufX_d) >= 5, the feature codes in the source node buffer and the monitoring node buffer cover 5 seconds; the earliest 5 seconds of feature codes in the two buffers are compared, and the compared 5 seconds of feature codes are removed from the buffers after the comparison is completed;
if the 5-second feature codes of the source node and the monitoring node are consistent, Result = 1 is output and step S24 is exited;
if the feature code comparison has been inconsistent for 30 consecutive seconds, Result = 3 is output, otherwise Result = 2 is output, and step S24 is exited.
An adaptive IPTV live content consistency detection system, the system comprising at least one feature code extraction device and a comparison server, the comparison server being connected to each feature code extraction device;
the feature code extraction devices are deployed at different detection points of the IPTV network and are used for extracting feature codes of the live video and uploading them to the comparison server; the detection points deployed in the IPTV network comprise a source node and monitoring nodes; the feature code extraction devices collect ES feature codes and image feature codes of the IPTV live video stream content at the source node and the monitoring nodes respectively, and upload them to the comparison server for video content consistency comparison;
the comparison server is used for synchronously comparing the feature codes uploaded from the different detection points and judging whether the live content is consistent; according to the synchronization matching condition of the two types of feature codes, the comparison server selects the optimal feature code extraction and comparison method to perform efficient IPTV live video stream content consistency detection, raises alarms on comparison anomalies, and discovers in time the security problem of a live program being tampered with;
the comparison server comprises:
the ES feature code synchronization module is used for outputting the ES synchronization identifier Syn_ES, indicating whether the ES feature codes of the source node and the monitoring node can be synchronously matched;
the image feature code synchronization module is used for outputting the image synchronization identifier Syn_Pic, indicating whether the image feature codes of the source node and the monitoring node can be synchronized;
the feature code cyclic comparison module is used for cyclically acquiring and comparing the ES feature codes of the source node and monitoring node video streams and outputting the comparison detection Result; the feature code cyclic comparison module is also used for cyclically acquiring and comparing the image feature codes of the source node and monitoring node video streams and outputting the comparison detection Result;
if Result = 1, the video contents of the source node and the monitoring node sampled in the current 5 seconds are consistent, and the program continues executing the feature code cyclic comparison module;
if Result = 2, the video contents of the source node and the monitoring node sampled in the current 5 seconds are inconsistent, and the program outputs a "comparison anomaly" alarm;
if Result = 3, the video contents of the source node and the monitoring node have been inconsistent for 30 consecutive seconds (6 consecutive 5-second rounds), and the program outputs a "long-term comparison anomaly" alarm;
the ES feature code synchronization module acquires the live ES video streams of the respective monitoring points through the probe devices of the source node and the monitoring node;
the source node probe acquires the ES feature code of the video once per second and feeds it into the source node buffer BufE_s; the monitoring node probe likewise acquires the ES feature code of the video once per second and feeds it into the monitoring node buffer BufE_d;
if Len(BufE_s) >= 60 and Len(BufE_d) >= 30, the number of ES feature codes buffered at the source node and the monitoring node has reached the preset target required for the synchronization comparison;
the synchronization comparison process is then executed, namely a pattern-matching search over the feature codes in the source node's BufE_s, detecting whether any run of feature code values covering 5 consecutive seconds in the monitoring node's BufE_d equals a run covering 5 consecutive seconds in BufE_s;
if the synchronization matching succeeds, namely the buffers of the source node and the monitoring node hold completely equal feature codes for 5 consecutive seconds, the module outputs Syn_ES = true; otherwise, it outputs Syn_ES = false;
the probe devices of the source node and the monitoring node comprise an ES feature code extraction module for acquiring the ES feature codes of the video:
the ES feature code extraction module performs protocol analysis on the video ES stream and searches the data stream for the start code of a NALU data unit; if a NALU start byte is found, it parses the NALU header byte and extracts the NALU type field nal_unit_type from it, otherwise it continues the cyclic search; if nal_unit_type indicates a slice, the payload of the NALU data unit carries fragment information of the image; the NALU payload is then digested with the SHA-1 secure hash algorithm to generate a SHA-1 value; the SHA-1 value is encrypted with the SM4 encryption algorithm and, carrying the UTC time tag of the current ES frame, serves as the ES feature code;
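A minimal sketch of this extraction path for an H.264 ES buffer follows, under stated assumptions: the three-byte start code and the `& 0x1F` type mask come from the H.264 NAL syntax, slice types 1 and 5 (non-IDR/IDR slice) stand in for the patent's slice test, emulation-prevention bytes are ignored, and the SM4 encryption step is omitted because the Python standard library has no SM4 implementation.

```python
import hashlib
import time

def extract_es_feature(es_bytes):
    """Scan an H.264 ES buffer for a slice NALU and hash its payload.
    Illustrative only: real streams need emulation-prevention handling,
    and the patent's SM4 step is not performed here."""
    start = es_bytes.find(b"\x00\x00\x01")        # NALU start code
    while start != -1:
        if start + 3 >= len(es_bytes):            # truncated header
            break
        header = es_bytes[start + 3]
        nal_unit_type = header & 0x1F             # low 5 bits (H.264)
        if nal_unit_type in (1, 5):               # non-IDR / IDR slice
            nxt = es_bytes.find(b"\x00\x00\x01", start + 3)
            payload = es_bytes[start + 4: nxt if nxt != -1 else None]
            digest = hashlib.sha1(payload).hexdigest()   # 40 hex chars
            return {"sha1": digest, "utc": time.time()}  # UTC-tagged code
        start = es_bytes.find(b"\x00\x00\x01", start + 3)
    return None
```

A non-slice NALU (e.g. an SPS, type 7) is skipped and the search continues, matching the "otherwise, continue the cyclic search" branch in the text.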
the image feature code synchronization module acquires the live ES video streams of the respective monitoring points through the probe devices of the source node and the monitoring node, locates the I-frame positions in the live ES video streams, and decodes the I-frames back into pictures;
the source node probe acquires the image feature code of the video once per second and feeds it into the source node buffer BufP_s; the monitoring node probe likewise acquires the image feature code of the video once per second and feeds it into the monitoring node buffer BufP_d;
if Len(BufP_s) >= 60 and Len(BufP_d) >= 30, the number of image feature codes buffered at the source node and the monitoring node has reached the preset target required for the synchronization comparison;
the synchronization comparison process can then be executed, namely a pattern-matching search over the feature codes in the source node's BufP_s, detecting whether any run of feature code values covering 5 consecutive seconds in the monitoring node's BufP_d equals a run covering 5 consecutive seconds in BufP_s;
if the synchronization matching succeeds, namely the buffers of the source node and the monitoring node hold completely equal feature codes for 5 consecutive seconds, the image feature code synchronization module outputs Syn_Pic = true; otherwise, it outputs Syn_Pic = false;
the image feature code extraction module performs graying processing on the picture decoded and restored from the ES video, namely converting the color pixels in the picture into grayscale pixels;
the grayed picture is divided evenly into N×N block units, with N defaulting to 64; the average luminance of the pixels in each block is calculated and stored in the image feature code matrix Lumi[N×N];
the image feature code matrix Lumi[N×N] is encrypted with the SM4 encryption algorithm and, carrying the UTC time tag of the current frame, serves as the image feature code;
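The block-luminance step can be sketched as below. The text is ambiguous about whether N×N is the block size or the grid size; since the result matrix is Lumi[N×N], this sketch reads it as an N×N grid of blocks. It uses a tiny n and pure-Python lists for clarity (the patent's default is N = 64), and omits the graying conversion, the SM4 encryption, and the UTC tag.

```python
def luminance_matrix(gray, n=2):
    """Average luminance of each block in an n-by-n grid over a
    grayscale image (list of rows of 0-255 values). Sketch of the
    Lumi[N x N] feature; image dimensions assumed divisible by n."""
    h, w = len(gray), len(gray[0])
    bh, bw = h // n, w // n                       # block height/width
    lumi = [[0.0] * n for _ in range(n)]
    for bi in range(n):
        for bj in range(n):
            block = [gray[r][c]
                     for r in range(bi * bh, (bi + 1) * bh)
                     for c in range(bj * bw, (bj + 1) * bw)]
            lumi[bi][bj] = sum(block) / len(block)
    return lumi

# 4x4 grayscale image, reduced to a 2x2 grid of average luminances
img = [[0, 0, 100, 100],
       [0, 0, 100, 100],
       [50, 50, 200, 200],
       [50, 50, 200, 200]]
```

Averaging over blocks makes the feature tolerant of transcoding noise while still exposing large visual changes, which suits the tamper-detection goal.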
the feature code cyclic comparison module, according to the ES and image feature code synchronization results described in the ES feature code synchronization module and the image feature code synchronization module, writes the ES or image feature code of the source node into the source ES or image feature code buffer BufX_s, and writes the ES or image feature code of the monitoring node into the monitoring ES or image feature code buffer BufX_d;
if Len(BufX_s) >= 60 or Len(BufX_d) >= 30, the number of feature codes in the source node buffer or the monitoring node buffer exceeds the buffer threshold, i.e., an overflow has occurred; the feature code cyclic comparison module then outputs Result = 3 and exits the cyclic comparison; BufX is a generic notation in which X stands for "E" or "P", so BufX denotes BufE or BufP;
if Len(BufX_s) >= 5 and Len(BufX_d) >= 5, the buffered feature codes of both the source node and the monitoring node cover at least 5 seconds; the earliest 5 seconds of feature codes in the two buffers are compared, and the compared 5-second feature codes are removed from the buffers once the comparison completes;
if the 5-second feature codes of the source node and the monitoring node compare as consistent, Result = 1 is output and the cyclic comparison exits;
if the feature code comparison has been inconsistent for 30 consecutive seconds, Result = 3 is output; otherwise Result = 2 is output; the cyclic comparison then exits.
An adaptive IPTV live content consistency detection system, the system comprising at least one feature code extraction device and a comparison server, the comparison server being connected to each feature code extraction device;
the feature code extraction devices are deployed at different detection points of the IPTV network and are used for extracting feature codes of the live video and uploading them to the comparison server; the detection points deployed in the IPTV network comprise a source node and monitoring nodes;
the comparison server is used for synchronously comparing the feature codes uploaded from the different detection points and judging whether the live content is consistent;
the feature code extraction devices collect ES feature codes and image feature codes of the IPTV live video stream content at the source node and the monitoring nodes respectively, and upload them to the comparison server for video content consistency comparison;
according to the synchronization matching condition of the two types of feature codes, the comparison server selects the optimal feature code extraction and comparison method to perform efficient IPTV live video stream content consistency detection, raises alarms on comparison anomalies, and discovers in time the security problem of a live program being tampered with;
the comparison server comprises:
the ES feature code synchronization module is used for outputting the ES synchronization identifier Syn_ES, indicating whether the ES feature codes of the source node and the monitoring node can be synchronously matched;
the image feature code synchronization module is used for outputting the image synchronization identifier Syn_Pic, indicating whether the image feature codes of the source node and the monitoring node can be synchronized;
the feature code cyclic comparison module is used for cyclically acquiring and comparing the ES feature codes of the source node and monitoring node video streams and outputting the comparison detection Result; the feature code cyclic comparison module is also used for cyclically acquiring and comparing the image feature codes of the source node and monitoring node video streams and outputting the comparison detection Result;
if Result = 1, the video contents of the source node and the monitoring node sampled in the current 5 seconds are consistent, and the program continues executing the feature code cyclic comparison module;
if Result = 2, the video contents of the source node and the monitoring node sampled in the current 5 seconds are inconsistent, and the program outputs a "comparison anomaly" alarm;
if Result = 3, the video contents of the source node and the monitoring node have been inconsistent for 6 consecutive 5-second rounds (i.e., 30 seconds); the program outputs a "long-term comparison anomaly" alarm, and the ES/image feature code synchronization module is executed again.
As an embodiment of the present invention, the ES feature code synchronization module is configured to acquire the live ES video streams of the respective monitoring points through the probe devices of the source node and the monitoring node;
the source node probe acquires the ES feature code of the video once per second and feeds it into the source node buffer BufE_s; the monitoring node probe likewise acquires the ES feature code of the video once per second and feeds it into the monitoring node buffer BufE_d;
if Len(BufE_s) >= 60 and Len(BufE_d) >= 30, the number of ES feature codes buffered at the source node and the monitoring node has reached the preset target required for the synchronization comparison;
the synchronization comparison process is then executed, namely a pattern-matching search over the feature codes in the source node's BufE_s, detecting whether any run of feature code values covering 5 consecutive seconds in the monitoring node's BufE_d equals a run covering 5 consecutive seconds in BufE_s;
if the synchronization matching succeeds, namely the buffers of the source node and the monitoring node hold completely equal feature codes for 5 consecutive seconds, the module outputs Syn_ES = true; otherwise, it outputs Syn_ES = false;
the probe devices of the source node and the monitoring node comprise an ES feature code extraction module for acquiring the ES feature codes of the video:
the image feature code synchronization module is used for acquiring the live ES video streams of the respective monitoring points through the probe devices of the source node and the monitoring node, locating the I-frame positions in the live ES video streams, and decoding the I-frames back into pictures;
the source node probe acquires the image feature code of the video once per second and feeds it into the source node buffer BufP_s; the monitoring node probe likewise acquires the image feature code of the video once per second and feeds it into the monitoring node buffer BufP_d;
if Len(BufP_s) >= 60 and Len(BufP_d) >= 30, the number of image feature codes buffered at the source node and the monitoring node has reached the preset target required for the synchronization comparison;
the synchronization comparison process can then be executed, namely a pattern-matching search over the feature codes in the source node's BufP_s, detecting whether any run of feature code values covering 5 consecutive seconds in the monitoring node's BufP_d equals a run covering 5 consecutive seconds in BufP_s;
if the synchronization matching succeeds, namely the buffers of the source node and the monitoring node hold completely equal feature codes for 5 consecutive seconds, the module outputs Syn_Pic = true; otherwise, it outputs Syn_Pic = false;
the feature code cyclic comparison module, according to the ES and image feature code synchronization results described in the ES feature code synchronization module and the image feature code synchronization module, writes the ES or image feature code of the source node into the source ES or image feature code buffer BufX_s, and writes the ES or image feature code of the monitoring node into the monitoring ES or image feature code buffer BufX_d;
if Len(BufX_s) >= 60 or Len(BufX_d) >= 30, the number of feature codes in the source node buffer or the monitoring node buffer exceeds the buffer threshold, i.e., an overflow has occurred; the feature code cyclic comparison module then outputs Result = 3 and exits the cyclic comparison;
if Len(BufX_s) >= 5 and Len(BufX_d) >= 5, the buffered feature codes of both the source node and the monitoring node cover at least 5 seconds; the earliest 5 seconds of feature codes in the two buffers are compared, and the compared 5-second feature codes are removed from the buffers once the comparison completes;
if the 5-second feature codes of the source node and the monitoring node compare as consistent, Result = 1 is output and the feature code cyclic comparison module exits;
if the feature code comparison has been inconsistent for 30 consecutive seconds, Result = 3 is output; otherwise Result = 2 is output; the cyclic comparison then exits.
The beneficial effects of the invention are as follows: the adaptive IPTV live content consistency detection method and system can efficiently detect the consistency of IPTV live video stream content and improve detection efficiency.
Drawings
Fig. 1 is a flowchart of the detecting method step S2 according to an embodiment of the invention.
FIG. 2 is a flowchart illustrating the ES signature synchronization step in the detection method according to an embodiment of the present invention.
FIG. 3 is a flowchart illustrating an image feature code synchronization step in the detection method according to an embodiment of the invention.
FIG. 4 is a flowchart illustrating a cyclic comparison step of feature codes in the detection method according to an embodiment of the present invention.
FIG. 5 is a flowchart illustrating the ES feature code extracting step in the detecting method according to an embodiment of the invention.
Fig. 6 is a flowchart of an image feature code extraction step in the detection method according to an embodiment of the present invention.
Fig. 7 is a diagram of a protocol layer structure of a video stream in the detection method according to an embodiment of the invention.
Detailed Description
Preferred embodiments of the present invention will be described in detail below with reference to the accompanying drawings.
For a further understanding of the invention, reference will now be made to the preferred embodiments of the invention by way of example, and it is to be understood that the description is intended to further illustrate features and advantages of the invention, and not to limit the scope of the claims.
The description in this section covers several exemplary embodiments only, and the present invention is not limited to the scope of the embodiments described. Interchanging some features of the embodiments with the same or similar prior-art means also falls within the scope of the disclosure and protection of the present invention.
The invention discloses an adaptive IPTV live content consistency detection method, which comprises the following steps:
step S1, the feature code extraction devices extract live video feature codes and upload them to the comparison server; the detection points deployed in the IPTV network comprise a source node and monitoring nodes; the feature code extraction devices collect ES feature codes and image feature codes of the IPTV live video stream content at the source node and the monitoring nodes respectively, and upload them to the comparison server for video content consistency comparison;
step S2, the comparison server synchronously compares the feature codes uploaded from the different detection points to judge whether the live content is consistent; according to the synchronization matching condition of the two types of feature codes, the comparison server selects the optimal feature code extraction and comparison method to perform efficient IPTV live video stream content consistency detection, and raises alarms on comparison anomalies so that the security problem of a live program being tampered with is discovered in time.
FIG. 1 is a flowchart of the detecting method step S2 according to an embodiment of the present invention; referring to fig. 1, in an embodiment of the present invention, the step S2 includes:
step S21: an ES feature code synchronization step, outputting the ES synchronization identifier Syn_ES, indicating whether the ES feature codes of the source node and the monitoring node can be synchronously matched;
step S22: an ES feature code cyclic comparison step, cyclically acquiring and comparing the ES feature codes of the source node and monitoring node video streams, and outputting the comparison detection Result;
step S23: an image feature code synchronization step, outputting the image synchronization identifier Syn_Pic, indicating whether the image feature codes of the source node and the monitoring node can be synchronized;
step S24: an image feature code cyclic comparison step, cyclically acquiring and comparing the image feature codes of the source node and monitoring node video streams, and outputting the comparison detection Result;
if Result = 1, the video contents of the source node and the monitoring node sampled in the current 5 seconds are consistent, and the program continues the feature code cyclic comparison;
if Result = 2, the video contents of the source node and the monitoring node sampled in the current 5 seconds are inconsistent, and the program outputs a "comparison anomaly" alarm;
if Result = 3, the video contents of the source node and the monitoring node have been inconsistent for 6 consecutive 5-second rounds (i.e., 30 seconds); the program outputs a "long-term comparison anomaly" alarm and executes the ES/image feature code synchronization step again.
FIG. 2 is a flowchart illustrating the synchronization step of ES signature in the detection method according to an embodiment of the present invention; referring to fig. 2, in an embodiment of the present invention, the step S21 includes:
step S211, the probe devices of the source node and the monitoring node respectively collect the live ES video streams of their monitoring points;
step S212, the source node probe acquires the ES feature code of the video once per second and feeds it into the source node buffer BufE_s; the monitoring node probe likewise acquires the ES feature code of the video once per second and feeds it into the monitoring node buffer BufE_d;
FIG. 5 is a flowchart of the ES feature code extraction step in the detection method according to an embodiment of the present invention; referring to fig. 5, in an embodiment of the present invention, the process of collecting ES feature codes comprises performing protocol parsing on the video ES stream and searching the data stream for the start code of a NALU data unit; if a NALU start byte is found, the NALU header byte is parsed and the NALU type field nal_unit_type is extracted from it, otherwise the cyclic search continues; if nal_unit_type indicates a slice, the payload of the NALU data unit carries fragment information of the image; the NALU payload is then digested with the SHA-1 secure hash algorithm to generate a SHA-1 value; the SHA-1 value is encrypted with the SM4 encryption algorithm and, carrying the UTC time tag of the current ES frame, serves as the ES feature code;
SHA-1 (Secure Hash Algorithm 1) is a cryptographic hash function designed by the United States National Security Agency and published by the National Institute of Standards and Technology (NIST) as a Federal Information Processing Standard (FIPS). SHA-1 generates a 160-bit (20-byte) hash value, called a message digest, typically rendered as 40 hexadecimal digits.
step S213, if Len(BufE_s) >= 60 and Len(BufE_d) >= 30, the number of ES feature codes buffered at the source node and the monitoring node has reached the preset target required for the synchronization comparison;
step S214, the synchronization comparison process is executed, namely a pattern-matching search over the feature codes in the source node's BufE_s, detecting whether any run of feature code values covering 5 consecutive seconds in the monitoring node's BufE_d equals a run covering 5 consecutive seconds in BufE_s;
step S215, if the synchronization matching succeeds, namely the source node and the monitoring node hold completely equal feature codes for 5 consecutive seconds, Syn_ES = true is output; otherwise, Syn_ES = false is output.
FIG. 4 is a flowchart illustrating a cyclic comparison of feature codes in the detection method according to an embodiment of the present invention; referring to fig. 4, in an embodiment of the present invention, the step S22 includes:
step S221, according to the ES feature code synchronization result of step S21, writing the ES feature code of the source node into the source ES feature code buffer BufX_s, and writing the ES feature code of the monitoring node into the monitoring ES feature code buffer BufX_d;
step S222, if Len(BufX_s) >= 60 or Len(BufX_d) >= 30, the number of feature codes in the source node buffer or the monitoring node buffer exceeds the buffer threshold, i.e., an overflow has occurred; output Result = 3 and exit step S22;
if Len(BufX_s) >= 5 and Len(BufX_d) >= 5, the buffered feature codes of both the source node and the monitoring node cover at least 5 seconds; compare the earliest 5 seconds of feature codes in the two buffers, and remove the compared 5-second feature codes from the buffers once the comparison completes;
if the 5-second feature codes of the source node and the monitoring node compare as consistent, output Result = 1 and exit step S22;
if the feature code comparison has been inconsistent for 30 consecutive seconds, output Result = 3; otherwise output Result = 2; then exit step S22.
FIG. 3 is a flowchart illustrating an image signature synchronization step in the detection method according to an embodiment of the present invention; referring to fig. 3, in an embodiment of the present invention, the step S23 includes:
step S231, the probe devices of the source node and the monitoring node respectively collect the live ES video streams of their monitoring points, locate the I-frame positions in the live ES video streams, and decode the I-frames back into pictures.
step S232, the source node probe acquires the image feature code of the video once per second and feeds it into the source node buffer BufP_s; the monitoring node probe likewise acquires the image feature code of the video once per second and feeds it into the monitoring node buffer BufP_d.
FIG. 6 is a flowchart of the image feature code extraction step in the detection method according to an embodiment of the invention; referring to fig. 6, in an embodiment of the present invention, the process of collecting image feature codes comprises performing graying processing on the picture decoded and restored from the ES video, namely converting the color pixels in the picture into grayscale pixels; dividing the grayed picture evenly into N×N block units, with N defaulting to 64; calculating the average luminance of the pixels in each block and storing it in the image feature code matrix Lumi[N×N]; and encrypting the matrix Lumi[N×N] with the SM4 encryption algorithm, carrying the UTC time tag of the current frame, as the image feature code.
step S233, if Len(BufP_s) >= 60 and Len(BufP_d) >= 30, the number of image feature codes buffered at the source node and the monitoring node has reached the preset target required for the synchronization comparison.
step S234, the synchronization comparison process is executed, namely a pattern-matching search over the feature codes in the source node's BufP_s, detecting whether any run of feature code values covering 5 consecutive seconds in the monitoring node's BufP_d equals a run covering 5 consecutive seconds in BufP_s.
step S235, if the synchronization matching succeeds, namely the source node and the monitoring node hold completely equal feature codes for 5 consecutive seconds, Syn_Pic = true is output; otherwise, Syn_Pic = false is output.
FIG. 4 is a flowchart illustrating a cyclic comparison of feature codes in the detection method according to an embodiment of the present invention; referring to fig. 4, in an embodiment of the present invention, the step S24 includes:
step S241, according to the image feature code synchronization result of step S23, writing the image feature code of the source node into the source image feature code buffer BufX_s, and writing the image feature code of the monitoring node into the monitoring image feature code buffer BufX_d;
step S242, if Len(BufX_s) >= 60 or Len(BufX_d) >= 30, the number of feature codes in the source node buffer or the monitoring node buffer exceeds the buffer threshold, i.e., an overflow has occurred; output Result = 3 and exit step S24;
if Len(BufX_s) >= 5 and Len(BufX_d) >= 5, the buffered feature codes of both the source node and the monitoring node cover at least 5 seconds; compare the earliest 5 seconds of feature codes in the two buffers, and remove the compared 5-second feature codes from the buffers once the comparison completes;
if the 5-second feature codes of the source node and the monitoring node compare as consistent, output Result = 1 and exit step S24;
if the feature code comparison has been inconsistent for 30 consecutive seconds, output Result = 3; otherwise output Result = 2; then exit step S24.
FIG. 7 is a diagram of a protocol layer structure of a video stream in the detection method according to an embodiment of the present invention; in one embodiment of the invention, the protocol of the video stream employs the hierarchical structure shown in FIG. 7.
The invention also discloses a system for detecting the consistency of the self-adaptive IPTV live content, which comprises the following components: the system comprises at least one feature code extraction device and a comparison server, wherein the comparison server is respectively connected with the feature code extraction devices.
The feature code extraction equipment is deployed at different detection points of the IPTV network and used for extracting feature codes of live videos and uploading the feature codes to the comparison server; the detection point deployed in the IPTV network comprises a source node and a monitoring node; the feature code extraction equipment respectively collects ES feature codes and image feature codes of IPTV live video stream contents at a source node and a monitoring node, and uploads the ES feature codes and the image feature codes to the comparison server to perform video content consistency comparison.
The comparison server is used for synchronously comparing the feature codes uploaded by different detection points and judging whether the live broadcast contents are consistent or not; the comparison server selects an optimal feature code extraction and comparison method according to the synchronous matching condition of the two types of feature codes to carry out efficient IPTV live broadcast video stream content consistency detection, raises an alarm on comparison anomalies, and thereby discovers in time the security problem of a live program being tampered with.
In an embodiment of the present invention, the comparison server includes: the system comprises an ES characteristic code synchronization module, an image characteristic code synchronization module and a characteristic code cyclic comparison module.
The ES feature code synchronization module is used for outputting an ES synchronization identifier Syn _ ES and indicating whether the ES feature codes of the source node and the monitoring node can realize synchronous matching or not. The image feature code synchronization module is used for outputting an image synchronization identifier Syn _ Pic and indicating whether the image feature codes of the source node and the monitoring node can realize synchronization or not.
The characteristic code cyclic comparison module is used for cyclically acquiring and comparing the ES characteristic codes of the video streams of the source node and the monitoring node and outputting a comparison detection Result; the characteristic code cyclic comparison module is also used for cyclically acquiring and comparing the image characteristic codes of the video streams of the source node and the monitoring node and outputting a comparison detection Result.
If Result is 1, it indicates that the video contents of the source node and the monitoring node sampled in the current 5 seconds are consistent, and the program continues to execute the feature code circular comparison module.
If Result is 2, it indicates that the video contents of the source node and the monitoring node sampled in the current 5 seconds are inconsistent, and the program outputs a "comparison abnormal" alarm.
If Result is 3, it indicates that the video contents of the source node and the monitoring node are inconsistent for 6 consecutive 5-second samples (30 seconds), and the program outputs a "comparison long-term abnormal" alarm.
The ES feature code synchronization module is used for respectively acquiring the live broadcast ES video streams of the monitoring points through probe equipment of the source node and the monitoring nodes.
The source node probe acquires the ES feature code of the video at a frequency of once per second and sends it into a source node buffer BufE_s; the monitoring node probe likewise acquires the ES feature code of the video once per second and sends it into a monitoring node buffer BufE_d. If Len(BufE_s) >= 60 and Len(BufE_d) >= 30, this indicates that the number of ES feature codes buffered in the source node buffer and the monitoring node buffer has reached the preset target required for performing the synchronization comparison.
A synchronization comparison process is then executed, that is, a pattern matching search is performed on the feature codes in BufE_s of the source node to detect whether any 5 consecutive seconds of feature code values in BufE_d of the monitoring node are equal to 5 consecutive seconds of feature code values in BufE_s. If the synchronization matching is successful, that is, the buffers of the source node and the monitoring node contain 5 consecutive seconds of completely equal feature codes, the module outputs Syn_ES = true; otherwise, it outputs Syn_ES = false.
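The pattern matching search described above amounts to looking for a common run of 5 consecutive feature codes in the two buffers. A minimal sketch (the function name and the set-based search strategy are illustrative assumptions, not part of the disclosure):

```python
def sync_match(buf_s, buf_d, run=5):
    """Report whether some run of `run` consecutive feature codes
    occurs in both the source buffer and the monitoring buffer."""
    if len(buf_s) < run or len(buf_d) < run:
        return False
    # collect every run-length window of the monitoring buffer
    runs_d = {tuple(buf_d[i:i + run]) for i in range(len(buf_d) - run + 1)}
    # search the source buffer for a window also present in the monitoring buffer
    return any(tuple(buf_s[i:i + run]) in runs_d
               for i in range(len(buf_s) - run + 1))
```

Syn_ES (and, in the image case, Syn_Pic) would simply be the boolean this search returns.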
The probe devices of the source node and the monitoring node comprise an ES feature code extraction module for collecting the ES feature codes of videos. The ES feature code extraction module parses the protocol of the video ES stream and searches the data stream for the start code of a NALU data unit; if the start byte of a NALU data unit is found, the bytes of the NALU header are parsed and the NALU type field nal_unit_type in the header is extracted; otherwise, the cyclic search continues. If nal_unit_type indicates a Slice, the payload of the NALU data unit stores the fragment information of the image; at this moment, the NALU payload is hashed with a secure hash algorithm to generate an SHA-1 value. The SHA-1 value is encrypted using the SM4 encryption algorithm and carries the UTC timestamp of the current ES frame as the ES feature code.
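The NALU scan can be sketched for an H.264 Annex B elementary stream as follows. This is a simplified illustration under stated assumptions: H.264 start-code framing, slice nal_unit_type values {1, 5}, and the `es_feature_codes` name are choices of this sketch, and the SM4 encryption and UTC time-stamping steps are omitted.

```python
import hashlib

def es_feature_codes(es_stream: bytes):
    """Scan an H.264 Annex B byte stream, and for each NALU whose
    payload carries slice data, emit the SHA-1 hash of that payload."""
    SLICE_TYPES = {1, 5}                 # non-IDR and IDR slices in H.264
    codes = []
    # split on the 3-byte start code; a 4-byte start code leaves a leading b"\x00"
    for nalu in es_stream.split(b"\x00\x00\x01"):
        nalu = nalu.lstrip(b"\x00")
        if not nalu:
            continue
        nal_unit_type = nalu[0] & 0x1F   # low 5 bits of the NALU header byte
        if nal_unit_type in SLICE_TYPES:
            codes.append(hashlib.sha1(nalu[1:]).hexdigest())
    return codes
```

A production probe would parse the stream incrementally and handle emulation-prevention bytes rather than splitting a whole buffer.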
The image feature code synchronization module is used for respectively acquiring the live broadcast ES video streams of the monitoring points through probe equipment of the source node and the monitoring nodes, positioning I-frame positions in the live broadcast ES video streams, and restoring the I-frame positions into pictures.
The source node probe acquires the image feature code of the video at a frequency of once per second and sends it into a source node buffer BufP_s; the monitoring node probe likewise acquires the image feature code of the video once per second and sends it into a monitoring node buffer BufP_d. If Len(BufP_s) >= 60 and Len(BufP_d) >= 30, it indicates that the number of image feature codes buffered in the source node buffer and the monitoring node buffer has reached the preset target required for performing the synchronization comparison.
At this time, a synchronization comparison process can be executed, that is, a pattern matching search is performed on the feature codes in BufP_s of the source node to detect whether any 5 consecutive seconds of feature code values in BufP_d of the monitoring node are equal to 5 consecutive seconds of feature code values in BufP_s. If the synchronization matching is successful, that is, the buffers of the source node and the monitoring node contain 5 consecutive seconds of completely equal feature codes, the image feature code synchronization module outputs Syn_Pic = true; otherwise, it outputs Syn_Pic = false.
The image feature code extraction module performs grayscale processing on the image decoded and restored from the ES video, that is, converts the color pixels in the image into grayscale pixels. The grayscale picture is evenly divided into [N×N] blocks, where N is 64 by default; the average luminance value of the pixels in each block is calculated and stored into an image feature code matrix Lumi[N×N]. The image feature code matrix Lumi[N×N] is encrypted using the SM4 encryption algorithm and carries the UTC time tag of the current frame as the image feature code.
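The construction of the Lumi[N×N] matrix can be sketched as below. This is an illustrative sketch only: the grayscale image is represented as a 2-D list of 0-255 values whose height and width are assumed divisible by N, and the SM4 encryption and UTC time-tag steps are omitted.

```python
def luminance_matrix(gray, n=64):
    """Divide a grayscale image into n x n blocks and store the average
    luminance of each block in an n x n matrix (the Lumi[NxN] matrix;
    the text's default N is 64)."""
    h, w = len(gray), len(gray[0])
    bh, bw = h // n, w // n          # block height and width in pixels
    lumi = [[0.0] * n for _ in range(n)]
    for bi in range(n):              # block row
        for bj in range(n):          # block column
            total = sum(gray[bi * bh + y][bj * bw + x]
                        for y in range(bh) for x in range(bw))
            lumi[bi][bj] = total / (bh * bw)
    return lumi
```

Two Lumi matrices from the source and monitoring nodes can then be compared element-wise, optionally with a tolerance to absorb transcoding noise.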
The feature code cyclic comparison module writes the ES or image feature code of the source node into a source ES or image feature code buffer BufX_s according to the synchronization results described for the ES feature code synchronization module and the image feature code synchronization module, and writes the ES or image feature code of the monitoring node into a monitoring ES or image feature code buffer BufX_d. If Len(BufX_s) >= 60 or Len(BufX_d) >= 30, this indicates that the number of feature codes in the source node buffer or the monitoring node buffer exceeds the buffer threshold, that is, an overflow occurs; the feature code cyclic comparison module then outputs a Result of 3 and exits the cyclic comparison. If Len(BufX_s) >= 5 and Len(BufX_d) >= 5, the feature codes in the source node buffer and the monitoring node buffer both cover 5 seconds; at this moment, the feature codes of the earliest 5 seconds in the two buffers are compared, and after the comparison the 5-second feature codes are removed from the buffers.
If the 5-second feature codes of the source node and the monitoring node are consistent, a Result of 1 is output and the cyclic comparison exits. If the feature code comparison is inconsistent for 30 consecutive seconds, a Result of 3 is output; otherwise, a Result of 2 is output, and the cyclic comparison exits.
In summary, the method and system for detecting consistency of the adaptive IPTV live broadcast content provided by the present invention can efficiently detect consistency of the IPTV live broadcast video stream content, thereby improving detection efficiency.
The technical features of the embodiments described above may be arbitrarily combined, and for the sake of brevity, all possible combinations of the technical features in the embodiments described above are not described, but should be considered as being within the scope of the present specification as long as there is no contradiction between the combinations of the technical features.
The description and applications of the invention herein are illustrative and are not intended to limit the scope of the invention to the embodiments described above. Variations and modifications of the embodiments disclosed herein are possible, and alternative and equivalent various components of the embodiments will be apparent to those skilled in the art. It will be clear to those skilled in the art that the present invention may be embodied in other forms, structures, arrangements, proportions, and with other components, materials, and parts, without departing from the spirit or essential characteristics thereof. Other variations and modifications of the embodiments disclosed herein may be made without departing from the scope and spirit of the invention.
Claims (10)
1. A method for detecting consistency of IPTV live content is characterized in that the detection method comprises the following steps:
step S1, extracting live video feature codes by the feature code extraction equipment and uploading the live video feature codes to a comparison server; the detection point deployed in the IPTV network comprises a source node and a monitoring node; the feature code extraction equipment respectively collects ES feature codes and image feature codes of IPTV live video stream contents at a source node and a monitoring node, and uploads the ES feature codes and the image feature codes to the comparison server for video content consistency comparison;
step S2, the comparison server synchronously compares the feature codes uploaded by different detection points to judge whether the live broadcast contents are consistent; the comparison server selects an optimal feature code extraction and comparison method according to the synchronous matching condition of the two types of feature codes to carry out efficient IPTV live broadcast video stream content consistency detection, raises an alarm on comparison anomalies, and discovers in time the security problem of a live program being tampered with;
the step S2 includes:
step S21: an ES characteristic code synchronization step, namely outputting an ES synchronization identifier Syn _ ES to indicate whether the ES characteristic codes of the source node and the monitoring node can realize synchronous matching or not;
step S22: the ES characteristic code cyclic comparison step is used for cyclically acquiring and comparing the ES characteristic codes of the video streams of the source node and the monitoring node and outputting a comparison detection Result;
step S23: an image feature code synchronization step, namely outputting an image synchronization identifier Syn _ Pic, and indicating whether the image feature codes of the source node and the monitoring node can realize synchronization or not;
step S24: circularly comparing the image characteristic codes, namely circularly acquiring and comparing the image characteristic codes of the video streams of the source node and the monitoring node, and outputting a comparison detection Result;
if Result is 1, the video contents of the source node and the monitoring node sampled in the current 5 seconds are consistent, and the program continues to execute the cyclic comparison of the feature codes;
if Result is 2, the video contents of the source node and the monitoring node sampled in the current 5 seconds are inconsistent, and the program outputs an abnormal comparison alarm;
if Result is 3, the video contents of the source node and the monitoring node are inconsistent for 6 consecutive 5-second samples, the program outputs a "comparison long-term abnormal" alarm, and the ES/image feature code synchronization step is executed again;
the step S21 includes:
step S211, the probe devices of the source node and the monitoring node respectively collect the live broadcast ES video streams of the monitoring points;
s212, the source node probe acquires the ES characteristic code of the video according to the frequency once per second and sends the ES characteristic code to a source node buffer area BufE _ S; the monitoring node probe also acquires the ES characteristic code of the video according to the frequency once per second and sends the ES characteristic code to a monitoring node buffer area BufE _ d;
the process of collecting the ES feature code comprises the following steps: performing protocol analysis on the video ES stream and searching the data stream for the start code of a NALU data unit; if the start byte of a NALU data unit is found, parsing the bytes of the NALU header and extracting the NALU type field nal_unit_type in the header; otherwise, continuing the cyclic search; if nal_unit_type indicates a Slice, the payload of the NALU data unit stores the fragment information of the image, and at this moment the NALU payload is hashed with a secure hash algorithm to generate an SHA-1 value; encrypting the SHA-1 value using the SM4 encryption algorithm, and carrying the UTC time tag of the current ES frame as the ES feature code;
step S213, if Len(BufE_s) >= 60 and Len(BufE_d) >= 30, this indicates that the number of ES feature codes buffered in the source node buffer and the monitoring node buffer has reached the preset target required for performing the synchronization comparison;
step S214, a synchronization comparison process is executed, namely, pattern matching searching is carried out on the feature codes in the BufE _ S of the source node, and whether the situation that the continuous 5-second feature code value in the BufE _ d of the monitoring node is equal to the continuous 5-second feature code value in the BufE _ S exists or not is detected;
step S215, if the synchronization matching is successful, that is, if the feature codes of the source node and the monitoring node are completely equal for 5 consecutive seconds, outputting Syn_ES = true; otherwise, outputting Syn_ES = false;
the step S22 includes:
step S221, writing the ES characteristic code of the source node into a source ES characteristic code buffer BufX _ S according to the ES characteristic code synchronization result described in the step S21; writing the ES characteristic code of the monitoring node into a monitoring ES characteristic code buffer BufX _ d;
step S222, if Len(BufX_s) >= 60 and Len(BufX_d) >= 30, this indicates that the number of feature codes in the source node buffer or the monitoring node buffer exceeds the buffer threshold, that is, an overflow occurs; the output Result is then 3, and step S22 is exited;
if Len(BufX_s) >= 5 and Len(BufX_d) >= 5, this indicates that the feature codes in the source node buffer and the monitoring node buffer both cover 5 seconds; the feature codes of the earliest 5 seconds in the two buffers are compared, and the compared 5-second feature codes are removed from the buffers after the comparison is completed;
if the 5-second feature codes of the source node and the monitoring node are consistent, a Result equal to 1 is output, and step S22 is exited;
if the feature code comparison is inconsistent for 30 consecutive seconds, a Result equal to 3 is output; otherwise, a Result equal to 2 is output, and step S22 is exited;
the step S23 includes:
s231, respectively acquiring the live broadcast ES video streams of respective monitoring points by probe equipment of a source node and a monitoring node, positioning an I-frame position in the live broadcast ES video streams, and restoring the I-frame position into a picture;
s232, the source node probe acquires image feature codes of the video according to the frequency once per second and sends the image feature codes to a source node buffer area BufP _ S; the monitoring node probe also acquires the image feature code of the video according to the frequency once per second and sends the image feature code to a monitoring node buffer zone BufP _ d;
the image feature code acquisition process comprises the steps of carrying out gray processing on a picture decoded and restored from an ES video, namely converting color pixel points in the picture into gray pixel points; averagely dividing the grayed picture according to the block unit size of [ NxN ], wherein the default N is 64; calculating the average brightness value of the pixel points in each block, and storing the average brightness value into an image feature code matrix Lumi [ NxN ]; encrypting an image feature code matrix Lumi [ NxN ] by using an SM4 encryption algorithm, and carrying a UTC time label of a current frame as an image feature code;
step S233, if Len(BufP_s) >= 60 and Len(BufP_d) >= 30, this indicates that the number of image feature codes buffered in the source node buffer and the monitoring node buffer has reached the preset target required for performing the synchronization comparison;
step S234, executing a synchronization comparison process, that is, performing a pattern matching search on the feature codes in BufP_s of the source node, and detecting whether any 5 consecutive seconds of feature code values in BufP_d of the monitoring node are equal to 5 consecutive seconds of feature code values in BufP_s;
step S235, if the synchronization matching is successful, that is, if the feature codes of the source node and the monitoring node are completely equal for 5 consecutive seconds, outputting Syn_Pic = true; otherwise, outputting Syn_Pic = false;
the step S24 includes:
step S241, writing the image feature code of the source node into a source image feature code buffer BufX_s according to the image feature code synchronization result described in the step S23; writing the image feature code of the monitoring node into a monitoring image feature code buffer BufX_d;
step S242, if Len(BufX_s) >= 60 and Len(BufX_d) >= 30, this indicates that the number of feature codes in the source node buffer or the monitoring node buffer exceeds the buffer threshold, that is, an overflow occurs; the output Result is then 3, and step S24 is exited;
if Len(BufX_s) >= 5 and Len(BufX_d) >= 5, this indicates that the feature codes in the source node buffer and the monitoring node buffer both cover 5 seconds; the feature codes of the earliest 5 seconds in the two buffers are compared, and the compared 5-second feature codes are removed from the buffers after the comparison is completed;
if the 5-second feature codes of the source node and the monitoring node are consistent, a Result equal to 1 is output, and step S24 is exited;
if the feature code comparison is inconsistent for 30 consecutive seconds, a Result equal to 3 is output; otherwise, a Result equal to 2 is output, and step S24 is exited.
2. A method for detecting consistency of IPTV live content is characterized in that the detection method comprises the following steps:
step S1, extracting live video feature codes by the feature code extraction equipment and uploading the live video feature codes to a comparison server; the detection point deployed in the IPTV network comprises a source node and a monitoring node; the feature code extraction equipment respectively collects ES feature codes and image feature codes of IPTV live video stream contents at a source node and a monitoring node, and uploads the ES feature codes and the image feature codes to the comparison server for video content consistency comparison;
step S2, the comparison server synchronously compares the feature codes uploaded by different detection points to judge whether the live broadcast contents are consistent; the comparison server selects an optimal feature code extraction and comparison method according to the synchronous matching condition of the two types of feature codes to carry out efficient IPTV live video stream content consistency detection, raises an alarm on comparison anomalies, and discovers in time the security problem of a live program being tampered with.
3. The detection method according to claim 2, characterized in that:
the step S2 includes:
step S21: an ES characteristic code synchronization step, namely outputting an ES synchronization identifier Syn _ ES to indicate whether the ES characteristic codes of the source node and the monitoring node can realize synchronous matching or not;
step S22: the ES characteristic code cyclic comparison step is used for cyclically acquiring and comparing the ES characteristic codes of the video streams of the source node and the monitoring node and outputting a comparison detection Result;
step S23: an image feature code synchronization step, namely outputting an image synchronization identifier Syn _ Pic, and indicating whether the image feature codes of the source node and the monitoring node can realize synchronization or not;
step S24: circularly comparing the image characteristic codes, namely circularly acquiring and comparing the image characteristic codes of the video streams of the source node and the monitoring node, and outputting a comparison detection Result;
if Result is 1, the video contents of the source node and the monitoring node sampled in the current 5 seconds are consistent, and the program continues to execute the cyclic comparison step of the feature codes;
if Result is 2, the video contents of the source node and the monitoring node sampled in the current 5 seconds are inconsistent, and the program outputs an abnormal comparison alarm;
if Result is 3, it indicates that the video contents of the source node and the monitoring node which are continuously sampled for 6 seconds are inconsistent, the program outputs a 'comparison long-term anomaly' alarm, and the ES/image feature code synchronization step is executed again.
4. The detection method according to claim 3, characterized in that:
the step S21 includes:
step S211, the probe devices of the source node and the monitoring node respectively collect the live broadcast ES video streams of the monitoring points;
s212, the source node probe acquires the ES characteristic code of the video according to the frequency once per second and sends the ES characteristic code to a source node buffer area BufE _ S; the monitoring node probe also acquires the ES characteristic code of the video according to the frequency once per second and sends the ES characteristic code to a monitoring node buffer area BufE _ d;
step S213, if Len(BufE_s) >= 60 and Len(BufE_d) >= 30, this indicates that the number of ES feature codes buffered in the source node buffer and the monitoring node buffer has reached the preset target required for performing the synchronization comparison;
step S214, a synchronization comparison process is executed, namely, pattern matching searching is carried out on the feature codes in the BufE _ S of the source node, and whether the situation that the continuous 5-second feature code value in the BufE _ d of the monitoring node is equal to the continuous 5-second feature code value in the BufE _ S exists or not is detected;
step S215, if the synchronization matching is successful, that is, if the feature codes of the source node and the monitoring node are completely equal for 5 consecutive seconds, outputting Syn_ES = true; otherwise, outputting Syn_ES = false.
5. The detection method according to claim 3, characterized in that:
the step S22 includes:
step S221, writing the ES characteristic code of the source node into a source ES characteristic code buffer BufX _ S according to the ES characteristic code synchronization result described in the step S21; writing the ES characteristic code of the monitoring node into a monitoring ES characteristic code buffer BufX _ d;
step S222, if Len(BufX_s) >= 60 and Len(BufX_d) >= 30, this indicates that the number of feature codes in the source node buffer or the monitoring node buffer exceeds the buffer threshold, that is, an overflow occurs; the output Result is then 3, and step S22 is exited;
if Len(BufX_s) >= 5 and Len(BufX_d) >= 5, this indicates that the feature codes in the source node buffer and the monitoring node buffer both cover 5 seconds; the feature codes of the earliest 5 seconds in the two buffers are compared, and the compared 5-second feature codes are removed from the buffers after the comparison is completed;
if the 5-second feature codes of the source node and the monitoring node are consistent, a Result equal to 1 is output, and step S22 is exited;
if the feature code comparison is inconsistent for 30 consecutive seconds, a Result equal to 3 is output; otherwise, a Result equal to 2 is output, and step S22 is exited.
6. The detection method according to claim 3, characterized in that:
the step S23 includes:
s231, respectively acquiring the live broadcast ES video streams of respective monitoring points by probe equipment of a source node and a monitoring node, positioning an I-frame position in the live broadcast ES video streams, and restoring the I-frame position into a picture;
s232, the source node probe acquires image feature codes of the video according to the frequency once per second and sends the image feature codes to a source node buffer area BufP _ S; the monitoring node probe also acquires the image feature code of the video according to the frequency once per second and sends the image feature code to a monitoring node buffer zone BufP _ d;
step S233, if Len(BufP_s) >= 60 and Len(BufP_d) >= 30, this indicates that the number of image feature codes buffered in the source node buffer and the monitoring node buffer has reached the preset target required for performing the synchronization comparison;
step S234, executing a synchronization comparison process, that is, performing a pattern matching search on the feature codes in BufP_s of the source node, and detecting whether any 5 consecutive seconds of feature code values in BufP_d of the monitoring node are equal to 5 consecutive seconds of feature code values in BufP_s;
step S235, if the synchronization matching is successful, that is, if the feature codes of the source node and the monitoring node are completely equal for 5 consecutive seconds, outputting Syn_Pic = true; otherwise, outputting Syn_Pic = false.
7. The detection method according to claim 3, characterized in that:
the step S24 includes:
step S241, writing the image feature code of the source node into a source image feature code buffer BufX _ S according to the image feature code synchronization result described in the step S23; writing the image feature code of the monitoring node into a monitoring image feature code buffer BufX _ d;
step S242, if Len(BufX_s) >= 60 and Len(BufX_d) >= 30, this indicates that the number of feature codes in the source node buffer or the monitoring node buffer exceeds the buffer threshold, that is, an overflow occurs; the output Result is then 3, and step S24 is exited;
if Len(BufX_s) >= 5 and Len(BufX_d) >= 5, this indicates that the feature codes in the source node buffer and the monitoring node buffer both cover 5 seconds; the feature codes of the earliest 5 seconds in the two buffers are compared, and the compared 5-second feature codes are removed from the buffers after the comparison is completed;
if the 5-second feature codes of the source node and the monitoring node are consistent, a Result equal to 1 is output, and step S24 is exited;
if the feature code comparison is inconsistent for 30 consecutive seconds, a Result equal to 3 is output; otherwise, a Result equal to 2 is output, and step S24 is exited.
8. An adaptive IPTV live content consistency detection system, characterized in that the system comprises: the system comprises at least one feature code extraction device and a comparison server, wherein the comparison server is respectively connected with the feature code extraction devices;
the feature code extraction equipment is deployed at different detection points of the IPTV network and used for extracting feature codes of live videos and uploading the feature codes to the comparison server; the detection point deployed in the IPTV network comprises a source node and a monitoring node; the feature code extraction equipment respectively collects ES feature codes and image feature codes of IPTV live video stream contents at a source node and a monitoring node, and uploads the ES feature codes and the image feature codes to the comparison server for video content consistency comparison;
the comparison server is used for synchronously comparing the feature codes uploaded by different detection points and judging whether the live broadcast contents are consistent or not; the comparison server selects an optimal feature code extraction and comparison method according to the synchronous matching condition of the two types of feature codes to carry out efficient IPTV live broadcast video stream content consistency detection, alarm for abnormity by comparison and find out the safety problem that the live broadcast program is tampered in time;
the comparison server comprises:
the ES feature code synchronization module is used for outputting an ES synchronization flag Syn_ES, indicating whether the ES feature codes of the source node and the monitoring node can be synchronously matched;
the image feature code synchronization module is used for outputting an image synchronization flag Syn_Pic, indicating whether the image feature codes of the source node and the monitoring node can be synchronized;
the feature code cyclic comparison module is used for cyclically acquiring and comparing the ES feature codes of the video streams of the source node and the monitoring node and outputting a comparison detection Result; the feature code cyclic comparison module is also used for cyclically acquiring and comparing the image feature codes of the video streams of the source node and the monitoring node and outputting a comparison detection Result;
if Result = 1, the video contents sampled at the source node and the monitoring node in the current 5 seconds are consistent, and the program continues to execute the feature code cyclic comparison module;
if Result = 2, the video contents sampled at the source node and the monitoring node in the current 5 seconds are inconsistent, and the program outputs a "comparison anomaly" alarm;
if Result = 3, the video contents continuously sampled at the source node and the monitoring node for 65 seconds are inconsistent, and the program outputs a "long-term comparison anomaly" alarm;
the ES feature code synchronization module acquires the live ES video stream of the respective monitoring point through the probe devices of the source node and the monitoring node;
the source node probe acquires the ES feature code of the video once per second and feeds it into the source node buffer BufE_s; the monitoring node probe likewise acquires the ES feature code of the video once per second and feeds it into the monitoring node buffer BufE_d;
if Len(BufE_s) ≥ 60 and Len(BufE_d) ≥ 30, the number of ES feature codes buffered at the source node and the monitoring node has reached the preset target required for the synchronization comparison;
the synchronization comparison process is then executed, namely a pattern matching search is performed over the feature codes in the source node buffer BufE_s, detecting whether any run of 5 consecutive seconds of feature code values in the monitoring node buffer BufE_d equals a run of 5 consecutive seconds of feature code values in BufE_s;
if the synchronization match succeeds, namely runs of 5 consecutive seconds of feature codes that are completely equal exist in the buffers of the source node and the monitoring node, the module outputs Syn_ES = true; otherwise it outputs Syn_ES = false;
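The pattern matching search just described can be sketched as a brute-force window scan. This is an illustrative sketch; the function name and offsets it returns are assumptions, not part of the patent:

```python
def sync_match(buf_s, buf_d, window=5):
    """Pattern-matching search from the claim: look for a run of
    `window` consecutive feature codes identical in both buffers.

    Returns the (source, monitoring) start offsets of the first match,
    or None when no run lines up (synchronization fails, Syn = false).
    Brute force, O(len(buf_s) * len(buf_d) * window); acceptable at
    the claimed buffer sizes of 60 and 30 entries.
    """
    for i in range(len(buf_s) - window + 1):
        for j in range(len(buf_d) - window + 1):
            if buf_s[i:i + window] == buf_d[j:j + window]:
                return (i, j)
    return None
```

A successful match also yields the relative delay between the two streams (the difference of the two offsets), which is what allows the subsequent cyclic comparison to align the buffers.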
the probe devices of the source node and the monitoring node comprise an ES feature code extraction module for acquiring the ES feature codes of the video:
the ES feature code extraction module parses the protocol of the video ES stream and searches the data stream for the start code of a NALU data unit; if the start byte of a NALU data unit is found, it parses the NALU header byte and extracts the NALU type field nal_unit_type from the header, otherwise the cyclic search continues; if nal_unit_type indicates a slice, the payload of the NALU data unit carries fragment information of the image, and at this point the NALU payload is digested with the secure hash algorithm to generate an SHA-1 value; the SHA-1 value is encrypted with the SM4 algorithm and, carrying the UTC time tag of the current ES frame, serves as the ES feature code;
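The NALU scan and hashing steps above can be sketched as follows. This is an assumption-laden sketch: the `SLICE_TYPES` set, the three-byte start code, and the "hash|UTC" output format are illustrative choices for H.264, and the claim's SM4 encryption step is omitted because SM4 is not in the Python standard library:

```python
import hashlib

# Illustrative sketch of the ES feature code extraction described above.
# SLICE_TYPES and the output format are assumptions; the SM4 encryption
# step from the claim is omitted (no stdlib SM4 implementation).
SLICE_TYPES = {1, 5}          # H.264 nal_unit_type values for coded slices
START_CODE = b"\x00\x00\x01"  # NALU start code


def es_feature_code(es_bytes, utc_tag):
    """Scan an ES buffer for NALU start codes, read nal_unit_type from
    the header byte, and SHA-1 the payload of the first slice NALU."""
    i = es_bytes.find(START_CODE)
    while i != -1:
        nal_unit_type = es_bytes[i + 3] & 0x1F  # low 5 bits of NALU header
        nxt = es_bytes.find(START_CODE, i + 3)
        if nal_unit_type in SLICE_TYPES:
            end = nxt if nxt != -1 else len(es_bytes)
            payload = es_bytes[i + 4:end]
            return hashlib.sha1(payload).hexdigest() + "|" + utc_tag
        i = nxt
    return None  # no slice NALU found; caller continues the cyclic search
```

Hashing only the slice payload (not the whole packet) keeps the feature code tied to the picture data itself, which is what makes tampering with the image content detectable.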
the image feature code synchronization module acquires the live ES video stream of the respective monitoring point through the probe devices of the source node and the monitoring node, locates an I-frame position in the live ES video stream, and restores the I-frame into a picture;
the source node probe acquires the image feature code of the video once per second and feeds it into the source node buffer BufP_s; the monitoring node probe likewise acquires the image feature code of the video once per second and feeds it into the monitoring node buffer BufP_d;
if Len(BufP_s) ≥ 60 and Len(BufP_d) ≥ 30, the number of image feature codes buffered at the source node and the monitoring node has reached the preset target required for the synchronization comparison;
the synchronization comparison process can then be executed, namely a pattern matching search is performed over the feature codes in the source node buffer BufP_s, detecting whether any run of 5 consecutive seconds of feature code values in the monitoring node buffer BufP_d equals a run of 5 consecutive seconds of feature code values in BufP_s;
if the synchronization match succeeds, namely runs of 5 consecutive seconds of feature codes that are completely equal exist in the buffers of the source node and the monitoring node, the image feature code synchronization module outputs Syn_Pic = true; otherwise it outputs Syn_Pic = false;
the image feature code extraction module performs grayscale processing on the image decoded and restored from the ES video, namely converting the color pixels of the image into grayscale pixels; the grayscaled picture is divided evenly into blocks of size [N×N], where N defaults to 64; the average luminance value of the pixels in each block is calculated and stored into the image feature code matrix Lumi[N×N]; the image feature code matrix Lumi[N×N] is encrypted with the SM4 algorithm and, carrying the UTC time tag of the current frame, serves as the image feature code;
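The block-average luminance matrix Lumi[N×N] described above can be sketched in pure Python as follows. This sketch assumes the image dimensions divide evenly by n, uses a small n instead of the claimed default N = 64 for readability, and omits the grayscale conversion and SM4 encryption steps:

```python
def luminance_matrix(gray, n=4):
    """Divide a grayscale image (rows of 0-255 values) into an n x n
    grid of equal blocks and return the average luminance per block.

    The claim's default is N = 64; n = 4 keeps the sketch small.
    Assumes both image dimensions are exact multiples of n.
    """
    h, w = len(gray), len(gray[0])
    bh, bw = h // n, w // n  # block height and width in pixels
    lumi = []
    for by in range(n):
        row = []
        for bx in range(n):
            block = [gray[y][x]
                     for y in range(by * bh, (by + 1) * bh)
                     for x in range(bx * bw, (bx + 1) * bw)]
            row.append(sum(block) / len(block))
        lumi.append(row)
    return lumi
```

Averaging per block makes the feature code robust to the small per-pixel differences introduced by transcoding, while still changing noticeably if a region of the picture is replaced.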
the feature code cyclic comparison module, according to the ES and image feature code synchronization results described in the ES feature code synchronization module and the image feature code synchronization module, writes the ES or image feature code of the source node into the source ES or image feature code buffer BufX_s, and writes the ES or image feature code of the monitoring node into the monitoring ES or image feature code buffer BufX_d;
if Len(BufX_s) ≥ 60 or Len(BufX_d) ≥ 30, the number of feature codes in the source node buffer or the monitoring node buffer exceeds the buffer threshold, that is, an overflow has occurred; the feature code cyclic comparison module then outputs a Result of 3 and exits the cyclic comparison of the feature codes; here BufX is a generic term in which X stands for "E" or "P", so that BufX denotes BufE or BufP;
if Len(BufX_s) ≥ 5 and Len(BufX_d) ≥ 5, the feature codes buffered at the source node and the monitoring node both cover 5 seconds; the earliest 5 seconds of feature codes in the source node buffer and the monitoring node buffer are compared, and the compared 5-second feature codes are removed from the buffers after the comparison is completed;
if the 5-second feature codes of the source node and the monitoring node compare as consistent, a Result of 1 is output and the cyclic comparison of the feature codes is exited;
if the feature code comparison is inconsistent for 30 consecutive seconds, a Result of 3 is output; otherwise a Result of 2 is output, and the cyclic comparison of the feature codes is exited.
9. An adaptive IPTV live content consistency detection system, characterized in that the system comprises at least one feature code extraction device and a comparison server, wherein the comparison server is connected to each feature code extraction device;
the feature code extraction devices are deployed at different detection points of the IPTV network and are used for extracting feature codes of the live video and uploading them to the comparison server; the detection points deployed in the IPTV network comprise a source node and monitoring nodes;
the comparison server is used for synchronously comparing the feature codes uploaded from the different detection points and judging whether the live content is consistent;
the feature code extraction devices collect ES feature codes and image feature codes of the IPTV live video stream content at the source node and the monitoring nodes respectively, and upload them to the comparison server for video content consistency comparison;
the comparison server selects the optimal feature code extraction and comparison method according to the synchronous matching condition of the two types of feature codes, so as to perform efficient consistency detection on the IPTV live video stream content, raise an alarm on comparison anomalies, and discover in time the security problem of a live program having been tampered with;
the comparison server comprises:
the ES feature code synchronization module is used for outputting an ES synchronization flag Syn_ES, indicating whether the ES feature codes of the source node and the monitoring node can be synchronously matched;
the image feature code synchronization module is used for outputting an image synchronization flag Syn_Pic, indicating whether the image feature codes of the source node and the monitoring node can be synchronized;
the feature code cyclic comparison module is used for cyclically acquiring and comparing the ES feature codes of the video streams of the source node and the monitoring node and outputting a comparison detection Result; the feature code cyclic comparison module is also used for cyclically acquiring and comparing the image feature codes of the video streams of the source node and the monitoring node and outputting a comparison detection Result;
if Result = 1, the video contents sampled at the source node and the monitoring node in the current 5 seconds are consistent, and the program continues to execute the feature code cyclic comparison module;
if Result = 2, the video contents sampled at the source node and the monitoring node in the current 5 seconds are inconsistent, and the program outputs a "comparison anomaly" alarm;
if Result = 3, the video contents continuously sampled at the source node and the monitoring node for 6 seconds are inconsistent; the program outputs a "long-term comparison anomaly" alarm and re-executes the ES/image feature code synchronization module.
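The dispatch on the three Result codes described above can be sketched as a small handler. The function name and action strings are illustrative assumptions, not patent terminology:

```python
def handle_result(result):
    """Map the comparison Result codes described above to the next
    action; the action strings are illustrative, not from the patent."""
    if result == 1:
        return "continue cyclic comparison"  # contents consistent
    if result == 2:
        return "raise comparison anomaly alarm"
    if result == 3:
        # long-term inconsistency: alarm, then re-run synchronization
        return "raise long-term anomaly alarm and resynchronize"
    raise ValueError("unknown Result code: %r" % result)
```

The key design point the claim encodes is that only Result = 3 forces a return to the synchronization modules; a one-off mismatch (Result = 2) alarms without discarding the established stream alignment.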
10. The adaptive IPTV live content consistency detection system of claim 9, wherein:
the ES feature code synchronization module acquires the live ES video stream of the respective monitoring point through the probe devices of the source node and the monitoring node;
the source node probe acquires the ES feature code of the video once per second and feeds it into the source node buffer BufE_s; the monitoring node probe likewise acquires the ES feature code of the video once per second and feeds it into the monitoring node buffer BufE_d;
if Len(BufE_s) ≥ 60 and Len(BufE_d) ≥ 30, the number of ES feature codes buffered at the source node and the monitoring node has reached the preset target required for the synchronization comparison;
the synchronization comparison process is then executed, namely a pattern matching search is performed over the feature codes in the source node buffer BufE_s, detecting whether any run of 5 consecutive seconds of feature code values in the monitoring node buffer BufE_d equals a run of 5 consecutive seconds of feature code values in BufE_s;
if the synchronization match succeeds, namely runs of 5 consecutive seconds of feature codes that are completely equal exist in the buffers of the source node and the monitoring node, the module outputs Syn_ES = true; otherwise it outputs Syn_ES = false;
the probe devices of the source node and the monitoring node comprise an ES feature code extraction module for acquiring the ES feature codes of the video;
the image feature code synchronization module acquires the live ES video stream of the respective monitoring point through the probe devices of the source node and the monitoring node, locates an I-frame position in the live ES video stream, and restores the I-frame into a picture;
the source node probe acquires the image feature code of the video once per second and feeds it into the source node buffer BufP_s; the monitoring node probe likewise acquires the image feature code of the video once per second and feeds it into the monitoring node buffer BufP_d;
if Len(BufP_s) ≥ 60 and Len(BufP_d) ≥ 30, the number of image feature codes buffered at the source node and the monitoring node has reached the preset target required for the synchronization comparison;
the synchronization comparison process can then be executed, namely a pattern matching search is performed over the feature codes in the source node buffer BufP_s, detecting whether any run of 5 consecutive seconds of feature code values in the monitoring node buffer BufP_d equals a run of 5 consecutive seconds of feature code values in BufP_s;
if the synchronization match succeeds, namely runs of 5 consecutive seconds of feature codes that are completely equal exist in the buffers of the source node and the monitoring node, the module outputs Syn_Pic = true; otherwise it outputs Syn_Pic = false;
the feature code cyclic comparison module, according to the ES and image feature code synchronization results described in the ES feature code synchronization module and the image feature code synchronization module, writes the ES or image feature code of the source node into the source ES or image feature code buffer BufX_s, and writes the ES or image feature code of the monitoring node into the monitoring ES or image feature code buffer BufX_d;
if Len(BufX_s) ≥ 60 or Len(BufX_d) ≥ 30, the number of feature codes in the source node buffer or the monitoring node buffer exceeds the buffer threshold, that is, an overflow has occurred; the feature code cyclic comparison module then outputs a Result of 3 and exits the cyclic comparison of the feature codes;
if Len(BufX_s) ≥ 5 and Len(BufX_d) ≥ 5, the feature codes buffered at the source node and the monitoring node both cover 5 seconds; the earliest 5 seconds of feature codes in the source node buffer and the monitoring node buffer are compared, and the compared 5-second feature codes are removed from the buffers after the comparison is completed;
if the 5-second feature codes of the source node and the monitoring node compare as consistent, a Result of 1 is output and the feature code cyclic comparison module is exited;
if the feature code comparison is inconsistent for 30 consecutive seconds, a Result of 3 is output; otherwise a Result of 2 is output, and the cyclic comparison of the feature codes is exited.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201911249148.8A CN113038146A (en) | 2019-12-09 | 2019-12-09 | Method and system for detecting consistency of self-adaptive IPTV live broadcast content |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201911249148.8A CN113038146A (en) | 2019-12-09 | 2019-12-09 | Method and system for detecting consistency of self-adaptive IPTV live broadcast content |
Publications (1)
Publication Number | Publication Date |
---|---|
CN113038146A true CN113038146A (en) | 2021-06-25 |
Family
ID=76450941
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201911249148.8A Pending CN113038146A (en) | 2019-12-09 | 2019-12-09 | Method and system for detecting consistency of self-adaptive IPTV live broadcast content |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN113038146A (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114205645A (en) * | 2021-12-10 | 2022-03-18 | 北京凯视达信息技术有限公司 | Distributed video content auditing method and device |
Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20020082731A1 (en) * | 2000-11-03 | 2002-06-27 | International Business Machines Corporation | System for monitoring audio content in a video broadcast |
CN101222626A (en) * | 2008-02-01 | 2008-07-16 | 中国传媒大学 | Digital television signal code stream characteristic extraction and recognition method and equipment thereof |
CN101944064A (en) * | 2010-10-12 | 2011-01-12 | 中国人民解放军国防科学技术大学 | Control flow error detection optimizing method based on reconstructed control flow graph |
CN103974061A (en) * | 2014-05-27 | 2014-08-06 | 合一网络技术(北京)有限公司 | Play test method and system |
CN105611326A (en) * | 2015-12-28 | 2016-05-25 | 上海昌视网络科技有限公司 | Stream media on demand auditing and checking method |
CN106454426A (en) * | 2016-10-27 | 2017-02-22 | 四川长虹电器股份有限公司 | Method for identifying analog channel of intelligent television |
CN106658071A (en) * | 2016-11-28 | 2017-05-10 | 北京蓝拓扑电子技术有限公司 | Method and device for determining transmission state of bit stream |
JP2018156344A (en) * | 2017-03-17 | 2018-10-04 | 日本放送協会 | Video stream consistency determination program |
CN108769742A (en) * | 2018-07-10 | 2018-11-06 | 江苏省公用信息有限公司 | A kind of IPTV multicast contents tamper resistant method |
CN109729390A (en) * | 2019-02-01 | 2019-05-07 | 浪潮软件集团有限公司 | A kind of IPTV program monitoring method, apparatus and system |
- 2019-12-09 CN CN201911249148.8A patent/CN113038146A/en active Pending
Patent Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20020082731A1 (en) * | 2000-11-03 | 2002-06-27 | International Business Machines Corporation | System for monitoring audio content in a video broadcast |
CN101222626A (en) * | 2008-02-01 | 2008-07-16 | 中国传媒大学 | Digital television signal code stream characteristic extraction and recognition method and equipment thereof |
CN101944064A (en) * | 2010-10-12 | 2011-01-12 | 中国人民解放军国防科学技术大学 | Control flow error detection optimizing method based on reconstructed control flow graph |
CN103974061A (en) * | 2014-05-27 | 2014-08-06 | 合一网络技术(北京)有限公司 | Play test method and system |
CN105611326A (en) * | 2015-12-28 | 2016-05-25 | 上海昌视网络科技有限公司 | Stream media on demand auditing and checking method |
CN106454426A (en) * | 2016-10-27 | 2017-02-22 | 四川长虹电器股份有限公司 | Method for identifying analog channel of intelligent television |
CN106658071A (en) * | 2016-11-28 | 2017-05-10 | 北京蓝拓扑电子技术有限公司 | Method and device for determining transmission state of bit stream |
JP2018156344A (en) * | 2017-03-17 | 2018-10-04 | 日本放送協会 | Video stream consistency determination program |
CN108769742A (en) * | 2018-07-10 | 2018-11-06 | 江苏省公用信息有限公司 | A kind of IPTV multicast contents tamper resistant method |
CN109729390A (en) * | 2019-02-01 | 2019-05-07 | 浪潮软件集团有限公司 | A kind of IPTV program monitoring method, apparatus and system |
Non-Patent Citations (3)
Title |
---|
KANG Xiaojun; WANG Jinqiang; WANG Yun: "Evaluation method for control-flow fault tolerance of spaceborne software based on extended blocks", Spacecraft Recovery & Remote Sensing, no. 03 *
文振? et al.: "Tamper detection of live video based on an adaptive hash algorithm", Journal of Shenzhen University (Science and Engineering), no. 02, 30 March 2017 (2017-03-30) *
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114205645A (en) * | 2021-12-10 | 2022-03-18 | 北京凯视达信息技术有限公司 | Distributed video content auditing method and device |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20210127179A1 (en) | System and method for signaling security and database population | |
US9241156B2 (en) | Method for estimating the type of the group of picture structure of a plurality of video frames in a video stream | |
CN103067778B (en) | Data monitoring system and data monitoring method | |
US8649278B2 (en) | Method and system of multimedia service performance monitoring | |
CN108769742A (en) | A kind of IPTV multicast contents tamper resistant method | |
CN108401135A (en) | Electric charging station monitor video data processing method and device | |
CN104853244A (en) | Method and apparatus for managing audio visual, audio or visual content | |
KR102035912B1 (en) | Method And Apparatus for Repairing and Detecting Packet Loss | |
CN113038146A (en) | Method and system for detecting consistency of self-adaptive IPTV live broadcast content | |
Dalal et al. | Video steganalysis to obstruct criminal activities for digital forensics: A survey | |
CN112954371A (en) | Live broadcast content ES feature code extraction method and live broadcast content consistency comparison method | |
KR102392888B1 (en) | Method and Apparatus for Improving Packet Loss Recovery | |
CN110366049B (en) | Integrity protection method for streaming video | |
CN112954448A (en) | Live broadcast content image feature code extraction method and live broadcast content consistency comparison method | |
CN117201845A (en) | Live program head-cast and replay content consistency monitoring method based on frame comparison | |
CN111460217A (en) | Video retrieval system and method for operating a video retrieval system | |
CN114827617B (en) | Video coding and decoding method and system based on perception model | |
US20230328308A1 (en) | Synchronization of multiple content streams | |
KR101849092B1 (en) | Method and Apparatus for Detecting Picture Breaks for Video Service of Real Time | |
WO2018157336A1 (en) | Data processing device and method | |
CN113378633A (en) | Method and system for detecting quality of streaming media signal | |
CN112437278A (en) | Cooperative monitoring system, device and method | |
CN112822156B (en) | Confidential information monitoring system and method | |
Iqbal et al. | Compressed-domain spatial adaptation resilient perceptual encryption of live H.264 video |
CN117979067B (en) | Identification method for broadcasting television broadcasting without regulation |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
| WD01 | Invention patent application deemed withdrawn after publication | Application publication date: 20210625 |