CN115103224B - Video intelligent analysis method supporting GAT1400 protocol - Google Patents

Video intelligent analysis method supporting GAT1400 protocol

Info

Publication number
CN115103224B
CN115103224B
Authority
CN
China
Prior art keywords
video
stream data
video stream
frame
picture
Prior art date
Legal status
Active
Application number
CN202210636432.6A
Other languages
Chinese (zh)
Other versions
CN115103224A (en)
Inventor
余丹
唐霆岳
兰雨晴
邢智涣
王丹星
Current Assignee
China Standard Intelligent Security Technology Co Ltd
Original Assignee
China Standard Intelligent Security Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by China Standard Intelligent Security Technology Co Ltd filed Critical China Standard Intelligent Security Technology Co Ltd
Priority to CN202210636432.6A priority Critical patent/CN115103224B/en
Publication of CN115103224A publication Critical patent/CN115103224A/en
Application granted granted Critical
Publication of CN115103224B publication Critical patent/CN115103224B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 - Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 - Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43 - Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/44 - Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs
    • H04N21/44008 - Processing of video elementary streams involving operations for analysing video streams, e.g. detecting features or characteristics in the video stream
    • H04N21/45 - Management operations performed by the client for facilitating the reception of or the interaction with the content or administrating data related to the end-user or to the client device itself, e.g. learning user preferences for recommending movies, resolving scheduling conflicts
    • H04N21/462 - Content or additional data management, e.g. creating a master electronic program guide from data received from the Internet and a Head-end, controlling the complexity of a video stream by scaling the resolution or bit-rate based on the client capabilities
    • H04N21/60 - Network structure or processes for video distribution between server and client or between remote clients; Control signalling between clients, server and network components; Transmission of management data between server and client, e.g. sending from server to client commands for recording incoming content stream; Communication details between server and client
    • H04N21/63 - Control signaling related to video distribution between client, server and network components; Network processes for video distribution between server and clients or between remote clients, e.g. transmitting basic layer and enhancement layers over different transmission paths, setting up a peer-to-peer communication via Internet between remote STB's; Communication protocols; Addressing
    • H04N21/643 - Communication protocols
    • Y - GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 - TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D - CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D30/00 - Reducing energy consumption in communication networks
    • Y02D30/70 - Reducing energy consumption in communication networks in wireless communication networks

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Databases & Information Systems (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)

Abstract

The invention provides a video intelligent analysis method supporting the GAT1400 protocol. An access management subsystem and a video analysis subsystem are constructed in the terminal equipment to manage connections with external equipment and to analyze video stream data, respectively, so that video analysis does not have to be performed within the GAT1400 framework and the external equipment only needs to be subsequently controlled and managed by a GAT1400 remote data platform, thereby improving the accuracy and reliability of video stream data processing.

Description

Video intelligent analysis method supporting GAT1400 protocol
Technical Field
The invention relates to the technical field of intelligent analysis of video streams, in particular to an intelligent analysis method of video supporting GAT1400 protocol.
Background
Currently, not all terminal devices support the GAT1400 protocol, and even among those that do, support for the protocol is often incomplete. In particular, when a terminal device analyzes and processes video stream data, the data must first undergo an additional format conversion. This increases the terminal device's workload for processing the video stream data, and accurate and reliable analysis of the video picture quality of the converted data cannot be guaranteed, which reduces the accuracy and reliability of video stream data processing.
Disclosure of Invention
Aiming at the defects in the prior art, the invention provides a video intelligent analysis method supporting the GAT1400 protocol. The method constructs an access management subsystem and a video analysis subsystem in the terminal equipment; the access management subsystem manages the connection state between the terminal equipment and external equipment and acquires the video stream data uploaded by the external equipment; the video analysis subsystem performs YUV analysis processing on the monitored video stream data to obtain a video picture content quality identification result and transmits the result to a GAT1400 remote data platform through an HTTP port; finally, the GAT1400 remote data platform controls the video stream data uploading state of the external equipment. Because the access management subsystem and the video analysis subsystem inside the terminal equipment respectively manage the connection with the external equipment and analyze the video stream data, video analysis itself does not need to be carried out within the GAT1400 framework; the external equipment only needs to be subsequently controlled and managed by the GAT1400 remote data platform, which improves the accuracy and reliability of video stream data processing.
The invention provides a video intelligent analysis method supporting GAT1400 protocol, which comprises the following steps:
step S1, an access management subsystem and a video analysis subsystem are built in terminal equipment, and system isolation is carried out on the access management subsystem and the video analysis subsystem; after the terminal equipment is awakened, the access management subsystem and the video analysis subsystem are instructed to enter a working state;
step S2, the access management subsystem is instructed to manage the connection state of the terminal equipment and the external equipment, and the terminal equipment is instructed to acquire video stream data uploaded by the external equipment;
step S3, the video analysis subsystem is instructed to monitor video stream data received by the terminal equipment; performing YUV analysis processing on the monitored video stream data to obtain a video picture content quality identification result of the video stream data;
step S4, the video analysis subsystem is instructed to transmit the video picture content quality identification result to a GAT1400 remote data platform through an HTTP port; and then the video stream data uploading state of the external equipment is controlled by the GAT1400 remote data platform.
Further, in the step S1, an access management subsystem and a video analysis subsystem are built in a terminal device, and the system isolation of the access management subsystem and the video analysis subsystem specifically includes:
and respectively constructing an access management subsystem and a video analysis subsystem in the virtual machine corresponding to the terminal equipment, and installing the access management subsystem and the video analysis subsystem in different memory partitions of the virtual machine, so as to realize system isolation between the access management subsystem and the video analysis subsystem.
Further, in the step S1, after the terminal device is awakened, the instructing the access management subsystem and the video analysis subsystem to enter the working state specifically includes:
and after the terminal equipment is awakened, the access management subsystem and the video analysis subsystem are instructed to enter a background working state.
Further, in the step S2, the instructing the access management subsystem to manage the connection state between the terminal device and the external device specifically includes:
the access management subsystem is instructed to monitor the connection state of the external port of the terminal device and the external device periodically;
when the connection between the external port of the terminal equipment and the external equipment is determined, the access management subsystem is instructed to periodically acquire a connection request from the external equipment, authentication processing of a request protocol type is carried out on the connection request, and the communication protocol type of the connection request sent by the external equipment is determined;
if the communication protocol type is supported by the terminal equipment to be compatible, allowing the external equipment to perform data interaction communication with the terminal equipment;
and if the communication protocol type is not supported by the terminal equipment to be compatible, the external equipment and the terminal equipment are not allowed to carry out data interactive communication.
Further, in the step S2, the instructing the terminal device to acquire the video stream data uploaded from the external device specifically includes:
when the external equipment and the terminal equipment are allowed to carry out data interactive communication, the access management subsystem is used for identifying the data frame head and the data frame tail of video stream data uploaded by the external equipment;
and then, according to the data frame head and the data frame tail of the video stream data, packing the video stream data uploaded by the external equipment to form a complete video stream data packet, and uploading the video stream data packet into the designated buffer space of the terminal equipment.
Further, in the step S2, according to the data frame header and the data frame tail of the video stream data, the video stream data uploaded from the external device is packed to form a complete video stream data packet, and uploading the video stream data packet into the designated buffer space of the terminal device specifically includes:
the video stream data is compressed when it is packed, and the video analysis subsystem is then instructed to decode the transferred video stream data packet and to extract video picture frames from it, where the video picture frame extraction is the data decompression of the video stream data; YUV analysis processing is then carried out on the extracted video picture frames to obtain picture content brightness distribution information and picture content color saturation distribution information of the video picture frames, and these two pieces of information are taken together as the video picture content quality identification result,
step S201, using the following formula (1), performing data compression processing on the video stream data at the time of packetization,
[Formula (1) is given as an image in the original publication and is not reproduced here.]
in the above formula (1), [r_a(i,j), g_a(i,j), b_a(i,j)] represents the RGB values of the pixel point in the i-th row and j-th column of the image of the a-th frame after the video stream data has been compressed at packing time; [R_1(i,j), G_1(i,j), B_1(i,j)] represents the RGB values of the pixel point in the i-th row and j-th column of the image of the 1st frame of the video stream data before compression; [R_a(i,j), G_a(i,j), B_a(i,j)] represents the RGB values of the pixel point in the i-th row and j-th column of the image of the a-th frame of the video stream data before compression; [R_(a-1)(i,j), G_(a-1)(i,j), B_(a-1)(i,j)] represents the RGB values of the pixel point in the i-th row and j-th column of the image of the (a-1)-th frame of the video stream data before compression;
step S202, after the video analysis subsystem decodes the transferred video stream data packet, it decompresses the decoded video stream data using the following formula (2) and extracts the video picture frames,
[Formula (2) is given as an image in the original publication and is not reproduced here.]
in the above formula (2), [R'_a(i,j), G'_a(i,j), B'_a(i,j)] represents the RGB values of the pixel point in the i-th row and j-th column of the image of the a-th video picture frame obtained by decompressing and extracting the decoded video stream data; [r'_1(i,j), g'_1(i,j), b'_1(i,j)] represents the RGB values of the pixel point in the i-th row and j-th column of the image of the decoded 1st frame of video stream data; [r'_k(i,j), g'_k(i,j), b'_k(i,j)] represents the RGB values of the pixel point in the i-th row and j-th column of the image of the decoded k-th frame of video stream data;
step S203, utilizing the following formula (3), carrying out YUV analysis processing on the extracted video picture frame to obtain picture content brightness distribution information and picture content color saturation distribution information of the video picture frame,
[Formula (3) is given as an image in the original publication and is not reproduced here.]
in the above formula (3), Y_a(i,j) represents the brightness value of the pixel point in the i-th row and j-th column of the image of the a-th video picture frame; [U_a(i,j), V_a(i,j)] represents the chromaticity of the pixel point in the i-th row and j-th column of the image of the a-th video picture frame, where the chromaticity comprises the color value U and the saturation value V; M represents the number of pixel points in each row of each frame image of the video picture; N represents the number of pixel points in each column of each frame image of the video picture; E_a represents the comprehensive brightness distribution value of the a-th video picture frame; S_a represents the comprehensive color saturation distribution value of the a-th video picture frame.
Further, in the step S3, the instructing the video analysis subsystem to monitor the video stream data received by the terminal device specifically includes:
and the video analysis subsystem is instructed to monitor the appointed buffer space of the terminal equipment, and when the video stream data packet is uploaded in the appointed buffer space, the uploaded video stream data packet is directly shifted to the video analysis subsystem.
Further, in the step S3, YUV analysis processing is performed on the monitored video stream data, so as to obtain a video picture content quality identification result of the video stream data, which specifically includes:
instructing the video analysis subsystem to decode the transferred video stream data packet and extract video picture frames;
and then carrying out YUV analysis processing on the extracted video picture frame to obtain picture content brightness distribution information and picture content color saturation distribution information of the video picture frame, and taking the picture content brightness distribution information and the picture content color saturation distribution information as a video picture content quality identification result.
Further, in the step S4, the video analysis subsystem is instructed to transmit the video picture content quality identification result to a GAT1400 remote data platform through an HTTP port; the subsequent control of the video stream data uploading state of the external equipment by the GAT1400 remote data platform specifically includes the following steps:
instructing the video analysis subsystem to transmit the picture content brightness distribution information and the picture content color saturation distribution information to the GAT1400 remote data platform through the HTTP port;
judging whether the picture quality of the current video stream data is qualified or not according to the picture content brightness distribution information and the picture content color saturation distribution information through a GAT1400 remote data platform;
if the picture quality of the current video stream data is not qualified, the current external equipment is forbidden to continuously upload the video stream data to the terminal equipment through the GAT1400 remote data platform.
Compared with the prior art, the intelligent video analysis method supporting the GAT1400 protocol constructs an access management subsystem and a video analysis subsystem in the terminal equipment, manages the connection state of the terminal equipment and the external equipment through the access management subsystem, and acquires video stream data uploaded by the external equipment; carrying out YUV analysis processing on the monitored video stream data through a video analysis subsystem to obtain a video picture content quality identification result of the video stream data, and transmitting the video picture content quality identification result to a GAT1400 remote data platform through an HTTP port; finally, controlling the video stream data uploading state of the external equipment through the GAT1400 remote data platform; according to the analysis method, the access management subsystem and the video analysis subsystem are constructed in the terminal equipment to respectively manage connection with external equipment and realize analysis of video stream data, so that video analysis is not required under the GAT1400 frame, the external equipment is only required to be controlled and managed by using the GAT1400 remote data platform subsequently, and the accuracy and reliability of processing the video stream data are improved.
Additional features and advantages of the invention will be set forth in the description which follows, and in part will be obvious from the description, or may be learned by practice of the invention. The objectives and other advantages of the invention will be realized and attained by the structure particularly pointed out in the written description and claims thereof as well as the appended drawings.
The technical scheme of the invention is further described in detail through the drawings and the embodiments.
Drawings
In order to more clearly illustrate the embodiments of the invention or the technical solutions in the prior art, the drawings that are required in the embodiments or the description of the prior art will be briefly described, it being obvious that the drawings in the following description are only some embodiments of the invention, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
Fig. 1 is a schematic flow chart of a video intelligent analysis method supporting GAT1400 protocol provided by the invention.
Detailed Description
The following description of the embodiments of the present invention will be made clearly and completely with reference to the accompanying drawings, in which it is apparent that the embodiments described are only some embodiments of the present invention, but not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
Referring to fig. 1, a flow chart of a video intelligent analysis method supporting GAT1400 protocol according to an embodiment of the present invention is shown. The intelligent video analysis method supporting the GAT1400 protocol comprises the following steps:
step S1, an access management subsystem and a video analysis subsystem are built in terminal equipment, and system isolation is carried out on the access management subsystem and the video analysis subsystem; after the terminal equipment is awakened, the access management subsystem and the video analysis subsystem are instructed to enter a working state;
step S2, the access management subsystem is instructed to manage the connection state of the terminal equipment and the external equipment, and the terminal equipment is instructed to acquire video stream data uploaded by the external equipment;
step S3, the video analysis subsystem is instructed to monitor the video stream data received by the terminal equipment; performing YUV analysis processing on the monitored video stream data to obtain a video picture content quality identification result of the video stream data;
step S4, the video analysis subsystem is instructed to transmit the video picture content quality identification result to the GAT1400 remote data platform through the HTTP port; and then the video stream data uploading state of the external equipment is controlled by the GAT1400 remote data platform.
The beneficial effects of the technical scheme are as follows: the intelligent video analysis method supporting GAT1400 protocol constructs an access management subsystem and a video analysis subsystem in terminal equipment, manages the connection state of the terminal equipment and external equipment through the access management subsystem, and acquires video stream data uploaded from the external equipment; carrying out YUV analysis processing on the monitored video stream data through a video analysis subsystem to obtain a video picture content quality identification result of the video stream data, and transmitting the video picture content quality identification result to a GAT1400 remote data platform through an HTTP port; finally, controlling the video stream data uploading state of the external equipment through the GAT1400 remote data platform; according to the analysis method, the access management subsystem and the video analysis subsystem are constructed in the terminal equipment to respectively manage connection with external equipment and realize analysis of video stream data, so that video analysis is not required under the GAT1400 frame, the external equipment is only required to be controlled and managed by using the GAT1400 remote data platform subsequently, and the accuracy and reliability of processing the video stream data are improved.
Preferably, in the step S1, an access management subsystem and a video analysis subsystem are built in a terminal device, and performing system isolation on the access management subsystem and the video analysis subsystem specifically includes:
and respectively constructing an access management subsystem and a video analysis subsystem in the virtual machine corresponding to the terminal equipment, and installing the access management subsystem and the video analysis subsystem in different memory partitions of the virtual machine, so as to realize system isolation between the access management subsystem and the video analysis subsystem.
The beneficial effects of the technical scheme are as follows: an access management subsystem and a video analysis subsystem are simultaneously built in a virtual machine corresponding to the terminal equipment, and the two subsystems serve as virtual subsystems to respectively bear two different functions of connection with external equipment and analysis of video stream data; the access management subsystem and the video analysis subsystem are arranged in different memory partitions of the virtual machine, so that the two subsystems can be effectively isolated, interference of the two subsystems in the working process is avoided, and the running stability of the two subsystems is improved.
Preferably, in the step S1, after the terminal device is awakened, the instructing the access management subsystem and the video analysis subsystem to enter the working state specifically includes:
and after the terminal equipment is awakened, the access management subsystem and the video analysis subsystem are instructed to enter a background working state.
The beneficial effects of the technical scheme are as follows: after the terminal equipment is awakened, the access management subsystem and the video analysis subsystem are instructed to enter a background working state, so that the normal and stable working of the two subsystems can be ensured.
Preferably, in the step S2, the instructing the access management subsystem to manage the connection state of the terminal device and the external device specifically includes:
the access management subsystem is instructed to monitor the connection state of the external port of the terminal device and the external device periodically;
when the connection between the external port of the terminal equipment and the external equipment is determined, the access management subsystem is instructed to periodically acquire a connection request from the external equipment, authentication processing of a request protocol type is carried out on the connection request, and the communication protocol type of the connection request sent by the external equipment is determined;
if the communication protocol type is supported by the terminal equipment to be compatible, allowing the external equipment to perform data interaction communication with the terminal equipment;
if the communication protocol type is not supported by the terminal equipment to be compatible, the external equipment and the terminal equipment are not allowed to carry out data interaction communication.
The beneficial effects of the technical scheme are as follows: in this way, the external equipment and the terminal equipment are guaranteed to carry out data interaction communication under an agreed communication protocol, the situation in which the communication protocols of the external equipment and the terminal equipment are incompatible is avoided, and the stability of communication between the external equipment and the terminal equipment is improved.
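As a rough sketch of this admission logic, the snippet below checks the protocol type of a connection request against the terminal's compatibility list before allowing data interaction. The supported-protocol set and the request fields are assumptions made for illustration; the patent does not specify concrete message formats.

```python
# Hypothetical sketch of the access management subsystem's protocol authentication.
# SUPPORTED_PROTOCOLS and the request fields are illustrative assumptions.
SUPPORTED_PROTOCOLS = {"GAT1400", "GB28181", "RTSP"}  # assumed compatibility list

def authenticate_connection_request(request: dict) -> bool:
    """Return True if the external device may start data interaction."""
    return request.get("protocol_type") in SUPPORTED_PROTOCOLS

def poll_external_port(get_request, allow, deny) -> None:
    """Periodic check of the external port, as described in step S2."""
    request = get_request()            # fetch the latest connection request, if any
    if request is None:
        return
    if authenticate_connection_request(request):
        allow(request["device_id"])    # permit data interaction with the terminal
    else:
        deny(request["device_id"])     # refuse data interaction
```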
Preferably, in the step S2, instructing the terminal device to acquire video stream data uploaded from the external device specifically includes:
when the external equipment and the terminal equipment are allowed to carry out data interactive communication, the access management subsystem is used for identifying the data frame head and the data frame tail of video stream data uploaded by the external equipment;
and then, according to the data frame head and the data frame tail of the video stream data, packing the video stream data uploaded by the external equipment to form a complete video stream data packet, and uploading the video stream data packet into the designated buffer space of the terminal equipment.
The beneficial effects of the technical scheme are as follows: when the external equipment and the terminal equipment are allowed to carry out data interaction communication, the access management subsystem identifies the data frame head and the data frame tail of the video stream data uploaded by the external equipment and uses them as the start and end marks of the video stream data, which ensures that the video stream data is received in its entirety and avoids data loss during reception.
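A minimal sketch of this packing step follows, assuming illustrative byte patterns for the frame head and frame tail markers; the actual markers depend on the stream format and are not specified in the patent.

```python
# Sketch of packing uploaded video stream data into a complete packet using
# frame-head / frame-tail markers (step S2). Marker values are assumptions.
FRAME_HEAD = b"\x00\x00\x00\x01"
FRAME_TAIL = b"\xFF\xD9"

def pack_video_stream(raw: bytes) -> bytes | None:
    """Cut one complete video stream segment out of the raw upload."""
    start = raw.find(FRAME_HEAD)
    end = raw.find(FRAME_TAIL, start)
    if start == -1 or end == -1:
        return None                        # incomplete data, wait for more
    return raw[start:end + len(FRAME_TAIL)]

designated_buffer: list[bytes] = []         # stands in for the terminal's buffer space

def upload_to_buffer(raw: bytes) -> None:
    packet = pack_video_stream(raw)
    if packet is not None:
        designated_buffer.append(packet)    # uploaded into the designated buffer space
```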
Preferably, in the step S2, the step of packing the video stream data uploaded from the external device into a complete video stream data packet according to the data frame header and the data frame tail of the video stream data, and uploading the video stream data packet into the designated buffer space of the terminal device specifically includes:
the video stream data is compressed when it is packed, and the video analysis subsystem is then instructed to decode the transferred video stream data packet and to extract video picture frames from it, where the video picture frame extraction is the data decompression of the video stream data; YUV analysis processing is then carried out on the extracted video picture frames to obtain picture content brightness distribution information and picture content color saturation distribution information of the video picture frames, and these two pieces of information are taken together as the video picture content quality identification result,
step S201, using the following formula (1), performing data compression processing on the video stream data at the time of packetization,
[Formula (1) is given as an image in the original publication and is not reproduced here.]
in the above formula (1), [r_a(i,j), g_a(i,j), b_a(i,j)] represents the RGB values of the pixel point in the i-th row and j-th column of the image of the a-th frame after the video stream data has been compressed at packing time; [R_1(i,j), G_1(i,j), B_1(i,j)] represents the RGB values of the pixel point in the i-th row and j-th column of the image of the 1st frame of the video stream data before compression; [R_a(i,j), G_a(i,j), B_a(i,j)] represents the RGB values of the pixel point in the i-th row and j-th column of the image of the a-th frame of the video stream data before compression; [R_(a-1)(i,j), G_(a-1)(i,j), B_(a-1)(i,j)] represents the RGB values of the pixel point in the i-th row and j-th column of the image of the (a-1)-th frame of the video stream data before compression;
step S202, after the video analysis subsystem decodes the transferred video stream data packet, it decompresses the decoded video stream data using the following formula (2) and extracts the video picture frames,
[Formula (2) is given as an image in the original publication and is not reproduced here.]
in the above formula (2), [R'_a(i,j), G'_a(i,j), B'_a(i,j)] represents the RGB values of the pixel point in the i-th row and j-th column of the image of the a-th video picture frame obtained by decompressing and extracting the decoded video stream data; [r'_1(i,j), g'_1(i,j), b'_1(i,j)] represents the RGB values of the pixel point in the i-th row and j-th column of the image of the decoded 1st frame of video stream data; [r'_k(i,j), g'_k(i,j), b'_k(i,j)] represents the RGB values of the pixel point in the i-th row and j-th column of the image of the decoded k-th frame of video stream data;
step S203, utilizing the following formula (3), carrying out YUV analysis processing on the extracted video picture frame to obtain picture content brightness distribution information and picture content color saturation distribution information of the video picture frame,
[Formula (3) is given as an image in the original publication and is not reproduced here.]
in the above formula (3), Y_a(i,j) represents the brightness value of the pixel point in the i-th row and j-th column of the image of the a-th video picture frame; [U_a(i,j), V_a(i,j)] represents the chromaticity of the pixel point in the i-th row and j-th column of the image of the a-th video picture frame, where the chromaticity comprises the color value U and the saturation value V; M represents the number of pixel points in each row of each frame image of the video picture; N represents the number of pixel points in each column of each frame image of the video picture; E_a represents the comprehensive brightness distribution value of the a-th video picture frame; S_a represents the comprehensive color saturation distribution value of the a-th video picture frame.
The beneficial effects of the technical scheme are as follows: formula (1) compresses the video stream data at packing time in order to reduce the memory occupied by the video stream data packet in the designated buffer space, freeing memory and increasing its effective storage capacity; formula (2) then decompresses the decoded video stream data to extract the video picture frames, so that the frames are extracted accurately, intelligently and automatically, reflecting the intelligent and automated character of the system; finally, formula (3) performs YUV analysis processing on the extracted video picture frames to obtain picture content brightness distribution information and picture content color saturation distribution information, in which the comprehensive brightness distribution value and the comprehensive color saturation distribution value of each video picture frame are obtained in a weighted manner, so that the subsequent qualification judgment is simplified, unnecessary computation is avoided, and the working efficiency of the system is improved.
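As a rough illustration of steps S201 to S203, the sketch below implements one plausible reading of this pipeline in Python. Because formulas (1) to (3) appear only as images in the original publication, the concrete expressions are assumptions: compression is modelled as inter-frame differencing against the previous frame, decompression as a cumulative sum of the differences, and the YUV analysis uses the standard BT.601 RGB-to-YUV conversion with simple per-frame averages standing in for the weighted comprehensive values E_a and S_a.

```python
import numpy as np

def compress_frames(frames: np.ndarray) -> np.ndarray:
    """frames: (A, rows, cols, 3) RGB. Keep frame 1, store later frames as differences."""
    out = frames.astype(np.int16)
    out[1:] = frames[1:].astype(np.int16) - frames[:-1].astype(np.int16)
    return out

def decompress_frames(diffs: np.ndarray) -> np.ndarray:
    """Invert compress_frames by cumulatively summing the stored differences."""
    return np.cumsum(diffs, axis=0).clip(0, 255).astype(np.uint8)

def yuv_analysis(frame: np.ndarray) -> tuple[float, float]:
    """Return (E_a, S_a): mean luminance and mean chroma magnitude of one frame."""
    rgb = frame.astype(np.float64)
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    y = 0.299 * r + 0.587 * g + 0.114 * b          # luminance (BT.601)
    u = -0.147 * r - 0.289 * g + 0.436 * b         # chrominance, color component
    v = 0.615 * r - 0.515 * g - 0.100 * b          # chrominance, saturation component
    e_a = float(y.mean())                          # assumed comprehensive brightness value
    s_a = float(np.sqrt(u ** 2 + v ** 2).mean())   # assumed comprehensive saturation value
    return e_a, s_a
```

With this interpretation, decompress_frames(compress_frames(x)) reproduces x exactly, which is consistent with the description of the frame extraction step as the data decompression of the video stream data.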
Preferably, in the step S3, the instructing the video analysis subsystem to monitor the video stream data received by the terminal device specifically includes:
the video analysis subsystem is instructed to monitor the appointed buffer space of the terminal equipment, and when the video stream data packet is uploaded in the appointed buffer space, the uploaded video stream data packet is directly shifted to the video analysis subsystem.
The beneficial effects of the technical scheme are as follows: the video analysis subsystem is instructed to monitor the designated buffer space of the terminal equipment, and as soon as a video stream data packet is detected in that space it is transferred directly to the video analysis subsystem, so that packets are moved out in a timely manner, the designated buffer space is not left occupied by video stream data packets, and the utilization of the designated buffer space is improved.
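A minimal sketch of this monitoring loop follows, under the assumption that the designated buffer space can be modelled as a simple in-memory queue; the polling interval and the analyse_packet callback are illustrative and not part of the patent.

```python
import time

def monitor_designated_buffer(designated_buffer: list, analyse_packet, interval: float = 0.1) -> None:
    """Poll the buffer and hand packets to the analysis pipeline as soon as they appear."""
    while True:
        while designated_buffer:                 # a packet was uploaded into the buffer space
            packet = designated_buffer.pop(0)    # transfer it out immediately
            analyse_packet(packet)               # decode, extract frames, run YUV analysis
        time.sleep(interval)                     # assumed polling interval
```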
Preferably, in the step S3, YUV analysis processing is performed on the monitored video stream data, so as to obtain a video picture content quality identification result of the video stream data, where the video picture content quality identification result specifically includes:
instructing the video analysis subsystem to perform decoding processing and video picture frame extraction processing on the transferred video stream data packet;
and then carrying out YUV analysis processing on the extracted video picture frame to obtain picture content brightness distribution information and picture content color saturation distribution information of the video picture frame, and taking the picture content brightness distribution information and the picture content color saturation distribution information as a video picture content quality identification result.
The beneficial effects of the technical scheme are as follows: by carrying out YUV analysis processing on the extracted video picture frames, the brightness and color saturation of the video picture content can be analyzed, providing the basis for judging the quality of the video picture content in terms of its brightness distribution and color saturation distribution.
Preferably, in the step S4, the video analysis subsystem is instructed to transmit the video picture content quality identification result to the GAT1400 remote data platform through the HTTP port; the subsequent control of the video stream data uploading state of the external equipment by the GAT1400 remote data platform specifically includes the following steps:
instructing the video analysis subsystem to transmit the picture content brightness distribution information and the picture content color saturation distribution information to the GAT1400 remote data platform through the HTTP port;
judging whether the picture quality of the current video stream data is qualified or not according to the picture content brightness distribution information and the picture content color saturation distribution information through a GAT1400 remote data platform;
if the picture quality of the current video stream data is not qualified, the current external equipment is forbidden to continuously upload the video stream data to the terminal equipment through the GAT1400 remote data platform.
The beneficial effects of the technical scheme are as follows: the video analysis subsystem is instructed to transmit the picture content brightness distribution information and the picture content color saturation distribution information to the GAT1400 remote data platform through the HTTP port, so that only the qualification judgment of the video stream data is carried out within the GAT1400 framework; processing the video stream data itself within the GAT1400 framework, which would reduce data processing efficiency, is avoided. In addition, if the picture quality of the current video stream data is not qualified, the GAT1400 remote data platform prohibits the current external equipment from continuing to upload video stream data to the terminal equipment, which prevents the situation in which the terminal equipment cannot display the video normally because the external equipment keeps uploading video stream data of unqualified quality.
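The sketch below illustrates one possible shape of this reporting and control loop. The endpoint URL, JSON field names and quality thresholds are assumptions made for illustration; GA/T 1400 defines its own interfaces and data structures, which are not reproduced here.

```python
import requests

PLATFORM_URL = "http://gat1400-platform.example.com/VIID/quality"  # assumed endpoint

def report_quality(device_id: str, e_a: float, s_a: float) -> None:
    """Send the per-frame quality identification result to the remote data platform."""
    payload = {"DeviceID": device_id, "Brightness": e_a, "Saturation": s_a}
    requests.post(PLATFORM_URL, json=payload, timeout=5)

def platform_decision(e_a: float, s_a: float,
                      min_brightness: float = 30.0, min_saturation: float = 5.0) -> bool:
    """Platform-side check: return False if the picture quality is unqualified (assumed thresholds)."""
    return e_a >= min_brightness and s_a >= min_saturation

def handle_frame_result(device_id: str, e_a: float, s_a: float, disable_upload) -> None:
    report_quality(device_id, e_a, s_a)
    if not platform_decision(e_a, s_a):
        disable_upload(device_id)   # prohibit further uploads from this device
```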
As can be seen from the foregoing embodiments, the video intelligent analysis method supporting the GAT1400 protocol constructs an access management subsystem and a video analysis subsystem in a terminal device, manages a connection state between the terminal device and an external device through the access management subsystem, and obtains video stream data uploaded from the external device; carrying out YUV analysis processing on the monitored video stream data through a video analysis subsystem to obtain a video picture content quality identification result of the video stream data, and transmitting the video picture content quality identification result to a GAT1400 remote data platform through an HTTP port; finally, controlling the video stream data uploading state of the external equipment through the GAT1400 remote data platform; according to the analysis method, the access management subsystem and the video analysis subsystem are constructed in the terminal equipment to respectively manage connection with external equipment and realize analysis of video stream data, so that video analysis is not required under the GAT1400 frame, the external equipment is only required to be controlled and managed by using the GAT1400 remote data platform subsequently, and the accuracy and reliability of processing the video stream data are improved.
It will be apparent to those skilled in the art that various modifications and variations can be made to the present invention without departing from the spirit or scope of the invention. Thus, it is intended that the present invention also include such modifications and alterations insofar as they come within the scope of the appended claims or the equivalents thereof.

Claims (6)

1. The intelligent video analysis method supporting the GAT1400 protocol is characterized by comprising the following steps:
step S1, an access management subsystem and a video analysis subsystem are built in terminal equipment, and system isolation is carried out on the access management subsystem and the video analysis subsystem; after the terminal equipment is awakened, the access management subsystem and the video analysis subsystem are instructed to enter a working state;
step S2, the access management subsystem is instructed to manage the connection state of the terminal equipment and the external equipment, and the terminal equipment is instructed to acquire video stream data uploaded by the external equipment;
step S3, the video analysis subsystem is instructed to monitor video stream data received by the terminal equipment; performing YUV analysis processing on the monitored video stream data to obtain a video picture content quality identification result of the video stream data;
step S4, the video analysis subsystem is instructed to transmit the video picture content quality identification result to a GAT1400 remote data platform through an HTTP port; then the video stream data uploading state of the external equipment is controlled through the GAT1400 remote data platform;
in the step S2, the step of instructing the terminal device to obtain the video stream data uploaded from the external device specifically includes:
when the external equipment and the terminal equipment are allowed to carry out data interactive communication, the access management subsystem is used for identifying the data frame head and the data frame tail of video stream data uploaded by the external equipment;
then, according to the data frame head and the data frame tail of the video stream data, the video stream data uploaded from the external equipment is packed to form a complete video stream data packet, and the video stream data packet is uploaded into the designated buffer space of the terminal equipment;
in the step S3, the instructing the video analysis subsystem to monitor the video stream data received by the terminal device specifically includes:
the video analysis subsystem is instructed to monitor the designated buffer space of the terminal equipment, and when a video stream data packet is uploaded into the designated buffer space, the uploaded video stream data packet is transferred directly to the video analysis subsystem;
in the step S3, YUV analysis is performed on the monitored video stream data to obtain a video picture content quality identification result of the video stream data, where the video picture content quality identification result specifically includes:
instructing the video analysis subsystem to decode the transferred video stream data packet and extract video picture frames;
and then carrying out YUV analysis processing on the extracted video picture frame to obtain picture content brightness distribution information and picture content color saturation distribution information of the video picture frame, and taking the picture content brightness distribution information and the picture content color saturation distribution information as a video picture content quality identification result.
2. The intelligent analysis method for video supporting GAT1400 protocol as claimed in claim 1, wherein:
in the step S1, an access management subsystem and a video analysis subsystem are built in a terminal device, and system isolation for the access management subsystem and the video analysis subsystem specifically includes:
and respectively constructing an access management subsystem and a video analysis subsystem in the virtual machine corresponding to the terminal equipment, and installing the access management subsystem and the video analysis subsystem in different memory partitions of the virtual machine, so as to realize system isolation between the access management subsystem and the video analysis subsystem.
3. The intelligent analysis method for video supporting GAT1400 protocol according to claim 2, wherein:
in the step S1, after the terminal device is awakened, the step of indicating the access management subsystem and the video analysis subsystem to enter a working state specifically includes:
and after the terminal equipment is awakened, the access management subsystem and the video analysis subsystem are instructed to enter a background working state.
4. The intelligent analysis method for video supporting GAT1400 protocol as recited in claim 1, wherein:
in the step S2, the instructing the access management subsystem to manage the connection state between the terminal device and the external device specifically includes:
the access management subsystem is instructed to monitor the connection state of the external port of the terminal device and the external device periodically;
when the connection between the external port of the terminal equipment and the external equipment is determined, the access management subsystem is instructed to periodically acquire a connection request from the external equipment, authentication processing of a request protocol type is carried out on the connection request, and the communication protocol type of the connection request sent by the external equipment is determined;
if the communication protocol type is supported by the terminal equipment to be compatible, allowing the external equipment to perform data interaction communication with the terminal equipment;
and if the communication protocol type is not supported by the terminal equipment to be compatible, the external equipment and the terminal equipment are not allowed to carry out data interactive communication.
5. The intelligent analysis method for video supporting GAT1400 protocol as claimed in claim 1, wherein:
in the step S2, according to the data frame header and the data frame tail of the video stream data, the video stream data uploaded from the external device is packed to form a complete video stream data packet, and the uploading of the video stream data packet into the designated buffer space of the terminal device specifically includes:
the video stream data is compressed when it is packed, and the video analysis subsystem is then instructed to decode the transferred video stream data packet and to extract video picture frames from it, where the video picture frame extraction is the data decompression of the video stream data;
YUV analysis processing is then carried out on the extracted video picture frames to obtain picture content brightness distribution information and picture content color saturation distribution information of the video picture frames, and these two pieces of information are taken together as the video picture content quality identification result,
step S201, using the following formula (1), performing data compression processing on the video stream data at the time of packetization,
[Formula (1) is given as an image in the original publication and is not reproduced here.]
in the above formula (1), [r_a(i,j), g_a(i,j), b_a(i,j)] represents the RGB values of the pixel point in the i-th row and j-th column of the image of the a-th frame after the video stream data has been compressed at packing time; [R_1(i,j), G_1(i,j), B_1(i,j)] represents the RGB values of the pixel point in the i-th row and j-th column of the image of the 1st frame of the video stream data before compression; [R_a(i,j), G_a(i,j), B_a(i,j)] represents the RGB values of the pixel point in the i-th row and j-th column of the image of the a-th frame of the video stream data before compression; [R_(a-1)(i,j), G_(a-1)(i,j), B_(a-1)(i,j)] represents the RGB values of the pixel point in the i-th row and j-th column of the image of the (a-1)-th frame of the video stream data before compression;
step S202, after the video analysis subsystem decodes the transferred video stream data packet, it decompresses the decoded video stream data using the following formula (2) and extracts the video picture frames,
[Formula (2) is given as an image in the original publication and is not reproduced here.]
in the above formula (2), [R'_a(i,j), G'_a(i,j), B'_a(i,j)] represents the RGB values of the pixel point in the i-th row and j-th column of the image of the a-th video picture frame obtained by decompressing and extracting the decoded video stream data; [r'_1(i,j), g'_1(i,j), b'_1(i,j)] represents the RGB values of the pixel point in the i-th row and j-th column of the image of the decoded 1st frame of video stream data; [r'_k(i,j), g'_k(i,j), b'_k(i,j)] represents the RGB values of the pixel point in the i-th row and j-th column of the image of the decoded k-th frame of video stream data;
step S203, utilizing the following formula (3), carrying out YUV analysis processing on the extracted video picture frame to obtain picture content brightness distribution information and picture content color saturation distribution information of the video picture frame,
[Formula (3) is given as an image in the original publication and is not reproduced here.]
in the above formula (3), Y_a(i,j) represents the brightness value of the pixel point in the i-th row and j-th column of the image of the a-th video picture frame; [U_a(i,j), V_a(i,j)] represents the chromaticity of the pixel point in the i-th row and j-th column of the image of the a-th video picture frame, where the chromaticity comprises the color value U and the saturation value V; M represents the number of pixel points in each row of each frame image of the video picture; N represents the number of pixel points in each column of each frame image of the video picture; E_a represents the comprehensive brightness distribution value of the a-th video picture frame; S_a represents the comprehensive color saturation distribution value of the a-th video picture frame.
6. The intelligent analysis method for video supporting GAT1400 protocol as claimed in claim 1, wherein:
in the step S4, the video analysis subsystem is instructed to transmit the video picture content quality identification result to the GAT1400 remote data platform through the HTTP port; and the subsequent control of the video stream data uploading state of the external equipment by the GAT1400 remote data platform specifically includes the following steps:
instructing the video analysis subsystem to transmit the picture content brightness distribution information and the picture content color saturation distribution information to the GAT1400 remote data platform through the HTTP port;
judging whether the picture quality of the current video stream data is qualified or not according to the picture content brightness distribution information and the picture content color saturation distribution information through a GAT1400 remote data platform;
if the picture quality of the current video stream data is not qualified, the current external equipment is forbidden to continuously upload the video stream data to the terminal equipment through the GAT1400 remote data platform.
CN202210636432.6A 2022-06-07 2022-06-07 Video intelligent analysis method supporting GAT1400 protocol Active CN115103224B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210636432.6A CN115103224B (en) 2022-06-07 2022-06-07 Video intelligent analysis method supporting GAT1400 protocol

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210636432.6A CN115103224B (en) 2022-06-07 2022-06-07 Video intelligent analysis method supporting GAT1400 protocol

Publications (2)

Publication Number Publication Date
CN115103224A CN115103224A (en) 2022-09-23
CN115103224B (en) 2023-04-25

Family

ID=83289716

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210636432.6A Active CN115103224B (en) 2022-06-07 2022-06-07 Video intelligent analysis method supporting GAT1400 protocol

Country Status (1)

Country Link
CN (1) CN115103224B (en)

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100073362A1 (en) * 2008-09-23 2010-03-25 Ike Ikizyan Method And System For Scene Adaptive Dynamic 3-D Color Management
CN102413355A (en) * 2011-12-30 2012-04-11 武汉烽火众智数字技术有限责任公司 Detecting method for video signal deletion in video quality diagnostic system
CN104378657A (en) * 2014-09-01 2015-02-25 国家电网公司 Video security access system based on agency and isolation and method of video security access system
CN108173697B (en) * 2018-01-17 2021-10-15 北京科来神州科技有限公司 Video private network safety operation and maintenance early warning management and control system
CN109391844A (en) * 2018-11-20 2019-02-26 国网安徽省电力有限公司信息通信分公司 Video quality diagnosing method and system based on video conference scene
CN109618139A (en) * 2019-01-10 2019-04-12 深圳市华金盾信息科技有限公司 A kind of intelligent video monitoring system and method for view-based access control model routing
CN111405035A (en) * 2020-03-13 2020-07-10 北京旷视科技有限公司 Data transmission method and data channel system
CN112702595A (en) * 2020-12-21 2021-04-23 公安部第一研究所 SVAC2.0 video comparison method and system thereof
CN113660427A (en) * 2021-09-22 2021-11-16 广州网路通电子有限公司 Image analysis system and method applied to video monitoring tester

Also Published As

Publication number Publication date
CN115103224A (en) 2022-09-23

Similar Documents

Publication Publication Date Title
WO2021114708A1 (en) Method and apparatus for implementing multi-person video live-streaming service, and computer device
CN106851386B (en) Method and device for realizing augmented reality in television terminal based on Android system
CN109089173B (en) Method and system for detecting advertisement delivery of smart television terminal
US10360913B2 (en) Speech recognition method, device and system based on artificial intelligence
CN107094244A (en) Intelligent passenger flow monitoring device and method capable of being managed and controlled in centralized mode
CN105847825A (en) Encoding, index storage and access methods for video encoding code stream and corresponding apparatus
US20180131949A1 (en) Method and system for encoding and decoding, encoder and decoder
US20120106650A1 (en) Method and System for Block and DVC Compression
CN106331603A (en) Video monitoring method, apparatus, system and server
US20140022382A1 (en) Video setting method
CN115103224B (en) Video intelligent analysis method supporting GAT1400 protocol
CN111107394A (en) System and method for integrating video streams across platforms
CN105450980B (en) A kind of high definition is taken photo by plane control and video retransmission method and system
CN103747191B (en) Network interaction high-definition character superimposition system
CN115955568B (en) Low-delay video compression and intelligent target identification method based on Hai Si chip
CN106412518A (en) Wireless video transmission system based on TD-LTE emergency communication
CN114329126A (en) Device and intelligent terminal for controlling intelligent LED display screen
WO2021217467A1 (en) Method and apparatus for testing intelligent camera
CN105472467B (en) interface display method and system
US20190306500A1 (en) Bit rate optimization system and method
CN112887293A (en) Streaming media processing method and device and electronic equipment
CN112379856B (en) Display picture reconstruction device and method
CN202272595U (en) Elevator monitoring system
CN114245070B (en) Method and system for centralized viewing of regional monitoring content
CN116320536B (en) Video processing method, device, computer equipment and computer readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant