CN115776601A - Video streaming information evaluation method, device, equipment and storage medium - Google Patents

Video streaming information evaluation method, device, equipment and storage medium

Info

Publication number
CN115776601A
CN115776601A
Authority
CN
China
Prior art keywords
video stream
key frame
collision
video
determining
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211431577.9A
Other languages
Chinese (zh)
Inventor
刘旸
Current Assignee
Shenzhen Ailing Network Co ltd
Original Assignee
Shenzhen Ailing Network Co ltd
Priority date
Filing date
Publication date
Application filed by Shenzhen Ailing Network Co ltd filed Critical Shenzhen Ailing Network Co ltd
Priority to CN202211431577.9A
Publication of CN115776601A
Legal status: Pending

Landscapes

  • Data Exchanges In Wide-Area Networks (AREA)

Abstract

The application provides a video stream transmission information evaluation method, device, equipment and storage medium, belonging to the technical field of data processing. The method comprises the following steps: receiving the video streams transmitted by the video stream transmission sources, wherein each video stream comprises a plurality of key frames arranged in time order and each key frame consists of a plurality of data packets; determining first transmission information of each video stream based on the arrival times of the data packets in each key frame, wherein the first transmission information characterizes the transmission quality of the key frames in the video stream; and determining second transmission information of each video stream based on the duration of the key frames in which collisions occur, wherein the second transmission information characterizes how each video stream is affected by the other video streams. The method improves the accuracy and diversity of the evaluation result and enables the transmission quality during video stream transmission to be evaluated and determined more comprehensively.

Description

Video streaming information evaluation method, device, equipment and storage medium
Technical Field
The present application relates to the field of data processing technologies, and in particular, to a method, an apparatus, a device, and a storage medium for evaluating video streaming information.
Background
In scenarios involving multiple video streams, one receiving device typically needs to receive the video streams sent by multiple sending devices. For example, a monitoring processing device acting as the receiving device receives the video streams transmitted by a plurality of monitoring cameras acting as the sending devices.
In the prior art, the receiving device receives and processes these video streams, but it can only analyze and process them according to service requirements. Although existing receiving devices can perform a certain degree of quality evaluation on a video stream, they evaluate only the transmission quality of a single video stream in isolation and do not consider that the transmission quality of multiple concurrently transmitted video streams may be mutually affected, so the evaluation result is limited.
Disclosure of Invention
The application aims to provide a method, a device, equipment and a storage medium for evaluating video stream transmission information, which can improve the accuracy and diversity of the evaluation result and evaluate the transmission quality during video stream transmission more comprehensively.
The embodiment of the application is realized as follows:
in one aspect of the embodiments of the present application, a method for evaluating video streaming information is provided, where the method is applied to a video streaming processing device, and the video streaming processing device is in communication connection with a plurality of video streaming transmission sources, and the method includes:
receiving the video streams transmitted by the video stream transmission sources, wherein each video stream comprises a plurality of key frames arranged in time order, and each key frame consists of a plurality of data packets;
determining first transmission information of the video stream based on the arrival times of the data packets in each key frame, wherein the first transmission information characterizes the transmission quality of the key frames in each video stream;
and determining second transmission information of the video streams based on the duration of the key frames in which collisions occur, wherein the second transmission information characterizes how each video stream is affected by the other video streams.
Optionally, the first transmission information includes: arrival delay;
determining the first transmission information of the video stream based on the arrival times of the data packets in each key frame comprises:
acquiring the start time of the video stream transmission source corresponding to the video stream;
and determining the arrival delay based on the arrival time of the first data packet in the first key frame of the video stream and the start time of the video stream transmission source.
Optionally, the first transmission information includes: phase jitter information;
determining the first transmission information of the video stream based on the arrival times of the data packets in each key frame comprises:
determining whether the phase jitter of each key frame in the video stream exceeds a preset threshold based on the arrival time of the first data packet in each key frame of the video stream.
Optionally, determining whether the phase jitter of each key frame in the video stream exceeds a preset threshold based on the arrival time of the first data packet in each key frame of the video stream includes:
determining a first time difference based on the arrival time of a first data packet in a first key frame and the arrival time of a first data packet in a second key frame of the video stream;
determining a second time difference based on the arrival time of a first data packet in a second key frame and the arrival time of a first data packet in a third key frame of the video stream, wherein the first key frame, the second key frame and the third key frame are sequentially adjacent key frames;
determining whether the phase jitter of the first key frame exceeds a preset threshold based on the first time difference and the second time difference.
Optionally, the second transmission information includes: collision severity;
determining the second transmission information of the video stream based on the duration of the key frames in the video stream in which collisions occur comprises:
determining the key frames of the video stream in which collisions occur and at least one colliding video stream that collides with the video stream during the collision;
determining the number of collisions corresponding to the video stream and the accumulated duration of a single collision based on the duration of the key frames of the video stream in which collisions occur;
and determining the collision severity of the video stream based on the number of collisions, the accumulated duration of a single collision, and the number of colliding video streams.
Optionally, determining the collision severity of the video stream based on the number of collisions, the accumulated duration of a single collision, and the number of colliding video streams comprises:
determining a weight for each collision based on the accumulated duration of the single collision and the number of colliding video streams;
and calculating a collision severity score of the video stream based on the number of collisions and the weight of each collision.
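As an illustration of the optional severity computation above, the following Python sketch derives a weight for each collision from its accumulated duration and the number of colliding streams, then combines the weights into a score. The linear weighting and the `duration_scale_ms` normalization are assumptions for illustration; the claims do not fix a formula.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Collision:
    accumulated_duration_ms: float  # accumulated duration of this single collision
    num_colliding_streams: int      # number of video streams that collided

def collision_severity(collisions: List[Collision],
                       duration_scale_ms: float = 100.0) -> float:
    """Score the collision severity of one video stream: each collision
    contributes a weight based on its accumulated duration and the number
    of colliding streams (linear combination assumed for illustration)."""
    score = 0.0
    for c in collisions:
        weight = (c.accumulated_duration_ms / duration_scale_ms) * c.num_colliding_streams
        score += weight
    return score
```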
Optionally, determining the key frames of the video stream in which collisions occur and at least one colliding video stream that collides with the video stream during the collision comprises:
for a target key frame currently received in the video stream, determining whether there exists a key frame that has the same receiving time as the target key frame and belongs to an associated video stream other than the video stream; if so, determining the target key frame to be a key frame in which a collision occurs, and determining the associated video stream to be a colliding video stream that collides with the video stream.
In another aspect of the embodiments of the present application, there is provided a video stream transmission information evaluation apparatus applied to a video stream processing device which is communicatively connected to a plurality of video stream transmission sources, the apparatus including: the device comprises a receiving module, a first information determining module and a second information determining module;
a receiving module, configured to receive a video stream sent by each video stream sending source, where the video stream includes: a plurality of key frames arranged according to a time sequence, wherein each key frame consists of a plurality of data packets;
the first information determining module is configured to determine the first transmission information of the video stream based on the arrival times of the data packets in each key frame, wherein the first transmission information characterizes the transmission quality of the key frames in each video stream;
and the second information determining module is configured to determine the second transmission information of the video streams based on the duration of the key frames in which collisions occur, wherein the second transmission information characterizes how each video stream is affected by the other video streams.
Optionally, the first transmission information includes: arrival delay; the first information determining module is specifically configured to acquire the start time of the video stream transmission source corresponding to the video stream, and to determine the arrival delay based on the arrival time of the first data packet in the first key frame of the video stream and the start time of the video stream transmission source.
Optionally, the first transmission information includes: phase jitter information; the first information determining module is specifically configured to determine whether phase jitter of each key frame in the video stream exceeds a preset threshold based on an arrival time of a first data packet in each key frame of the video stream.
Optionally, the first information determining module is specifically configured to determine a first time difference based on the arrival time of the first data packet in the first key frame of the video stream and the arrival time of the first data packet in the second key frame of the video stream; determine a second time difference based on the arrival time of the first data packet in the second key frame and the arrival time of the first data packet in the third key frame of the video stream, wherein the first key frame, the second key frame and the third key frame are sequentially adjacent key frames; and determine whether the phase jitter of the first key frame exceeds a preset threshold based on the first time difference and the second time difference.
Optionally, the second transmission information includes: collision severity; the second information determining module is specifically configured to determine the key frames of the video stream in which collisions occur and at least one colliding video stream that collides with the video stream during the collision; determine the number of collisions corresponding to the video stream and the accumulated duration of a single collision based on the duration of the key frames of the video stream in which collisions occur; and determine the collision severity of the video stream based on the number of collisions, the accumulated duration of a single collision, and the number of colliding video streams.
Optionally, the second information determining module is specifically configured to determine a weight of each collision based on an accumulated duration of a single collision and the number of collision video streams; a video stream collision severity score is calculated based on the number of collisions and the weight of each collision.
Optionally, the second information determining module is further specifically configured to determine, for a currently received target key frame in the video stream, whether a key frame of an associated video stream that has the same receiving time as the target key frame and belongs to a different video stream exists, and if so, determine that the target key frame is a key frame that generates a collision, and determine that the associated video stream is a collision video stream that collides with the video stream.
In another aspect of the embodiments of the present application, there is provided a computer device, including: the device comprises a memory and a processor, wherein a computer program capable of running on the processor is stored in the memory, and when the processor executes the computer program, the steps of the video stream transmission information evaluation method are realized.
In another aspect of the embodiments of the present application, a computer-readable storage medium is provided, and a computer program is stored on the storage medium, and when being executed by a processor, the computer program implements the steps of the video streaming information evaluation method.
The beneficial effects of the embodiment of the application include:
In the method, apparatus, device, and storage medium for evaluating video stream transmission information provided in the embodiments of the present application, the video stream transmitted by each video stream transmission source may be received; the first transmission information of the video stream may then be determined based on the arrival times of the data packets in each key frame, and the second transmission information of the video stream may be determined based on the duration of the key frames in which collisions occur. The first transmission information characterizes the transmission quality of the key frames in each video stream, and the second transmission information characterizes how each video stream is affected by the other video streams. By determining both, the transmission quality of the key frames in each video stream and the influence of other video streams on each video stream can be evaluated together, which improves the accuracy and diversity of the evaluation result and allows the transmission quality during video stream transmission to be evaluated and determined more comprehensively.
Drawings
To more clearly illustrate the technical solutions of the embodiments of the present application, the drawings needed in the embodiments will be briefly described below, it should be understood that the following drawings only illustrate some embodiments of the present application and therefore should not be considered as limiting the scope, and those skilled in the art can also obtain other related drawings based on the drawings without inventive efforts.
Fig. 1 is a schematic view of an application scenario of a video stream transmission information evaluation method according to an embodiment of the present application;
fig. 2 is a schematic flowchart of a video streaming information evaluation method according to an embodiment of the present disclosure;
fig. 3 is another schematic flowchart of a video streaming information evaluation method according to an embodiment of the present disclosure;
fig. 4 is another schematic flow chart of a video streaming information evaluation method according to an embodiment of the present application;
fig. 5 is another schematic flow chart of a video streaming information evaluation method according to an embodiment of the present application;
fig. 6 is another schematic flow chart of a video streaming information evaluation method according to an embodiment of the present application;
fig. 7 is a schematic structural diagram of an apparatus for evaluating video streaming information according to an embodiment of the present application;
fig. 8 is a schematic structural diagram of a computer device according to an embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present application clearer, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some embodiments of the present application, but not all embodiments. The components of the embodiments of the present application, as generally described and illustrated in the figures herein, could be arranged and designed in a wide variety of different configurations.
Thus, the following detailed description of the embodiments of the present application, presented in the accompanying drawings, is not intended to limit the scope of the claimed application, but is merely representative of selected embodiments of the application. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments in the present application without making any creative effort belong to the protection scope of the present application.
It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, it need not be further defined and explained in subsequent figures.
In the description of the present application, it is noted that the terms "first", "second", "third", and the like are used merely for distinguishing between descriptions and are not intended to indicate or imply relative importance.
The following specifically explains a practical scenario to which the embodiments of the present application are applied.
Fig. 1 is a schematic view of an application scenario of a video stream transmission information evaluation method according to an embodiment of the present application, please refer to fig. 1, where the scenario may include a video stream processing device 110 and a plurality of video stream transmission sources 120, where the video stream processing device 110 is communicatively connected to the plurality of video stream transmission sources 120.
Optionally, the video stream processing device 110 may specifically be any device that processes video streams, for example a computer, a server, a mobile phone, an edge computing node, a 5G base station, or other related computer device; it is not specifically limited herein, as long as it can implement video stream processing.
The video stream transmission source 120 may be a device having a video recording function, such as a camera or a mobile phone, or any type of device capable of forwarding a video stream; it is not specifically limited herein. There may be a plurality of video stream transmission sources 120, each of which may be communicatively connected to the video stream processing device 110.
It should be noted that the video stream may specifically be a file in a video format, in particular a video file that is played and transmitted frame by frame; the specific video format is not limited herein.
The following specifically explains a specific implementation process of the video streaming information evaluation method provided in the embodiment of the present application based on the above application scenario.
Fig. 2 is a flowchart illustrating a method for evaluating video streaming information according to an embodiment of the present application, please refer to fig. 2, where the method includes:
s210: video streams transmitted by the respective video stream transmission sources are received.
Wherein the video stream comprises a plurality of key frames arranged in time order, and each key frame consists of a plurality of data packets.
Optionally, the execution body of the method may be the video stream processing device in the above application scenario, which is communicatively connected to a plurality of video stream transmission sources, each of which is capable of transmitting a video stream to the video stream processing device.
The video stream may be generated and transmitted by the video stream transmission source, or may be forwarded, which is not limited herein.
It should be noted that the video stream may specifically be a video file that is played and transmitted frame by frame. That is, the video stream contains a plurality of key frames arranged in time order, and each key frame comprises a plurality of data packets. During transmission, the video stream is transmitted key frame by key frame, and each key frame is transmitted data packet by data packet; the data packets are arranged in a preset order within a key frame, and correspondingly the key frames are arranged in the preset order within the video stream.
S220: first transmission information of the video stream is determined based on arrival times of the data packets in the key frames.
Wherein the first transmission information is used to characterize the transmission quality of the key frames in each video stream, i.e., the transmission quality of each individual key frame. For example, every key frame in video stream A may have its own first transmission information, and the first transmission information of different key frames may differ.
Optionally, the first transmission information may include multiple types; any parameter or attribute that can characterize the transmission quality of a key frame may be used, which is not limited herein.
It should be noted that the size of each key frame is the sum of the sizes of all data packets constituting it. During transmission of a key frame, the arrival time of its first data packet is the start time of the key frame, and the arrival time of its last data packet is the end time of the key frame.
Wherein the arrival time refers to the time at which a data packet arrives at the video stream processing device.
When determining the first transmission information, the first transmission information of the video stream may be determined based on the arrival times of the data packets in each key frame; different types of first transmission information may be determined from the arrival times of different data packets, which is not limited herein.
S230: second transmission information of the video stream is determined based on a duration of a key frame in the video stream that generates a collision.
Wherein the second transmission information is used to characterize how each video stream is affected by the other video streams.
Optionally, both the second transmission information and the first transmission information may be used as data for performing quality evaluation on the video stream, that is, the video stream transmission information referred to in this embodiment refers to the first transmission information and the second transmission information.
It should be noted that the influence of other video streams on each video stream, as characterized by the second transmission information, may arise as follows: while one video stream is being transmitted, other video streams are transmitted at the same time; when the transmission bandwidth is fixed, too many simultaneously transmitted video streams may cause phenomena such as packet loss during transmission. The second transmission information may therefore be, for example, information about collisions occurring between the transmitted video streams during transmission.
The duration of a key frame may specifically be the average of the differences between the arrival time of the first data packet and the arrival time of the last data packet of the key frame, taken over key frames whose phase jitter is below a preset threshold.
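This definition of key frame duration can be sketched as follows; a minimal Python illustration (not from the patent), assuming each key frame is given by the arrival times of its first and last data packets plus a flag for whether its phase jitter is below the preset threshold:

```python
from statistics import mean
from typing import List, Tuple

def key_frame_duration(frames: List[Tuple[float, float]],
                       jitter_ok: List[bool]) -> float:
    """Average of (last-packet arrival - first-packet arrival), taken only
    over key frames whose phase jitter is below the preset threshold.

    frames    -- (first_packet_arrival, last_packet_arrival) per key frame
    jitter_ok -- True where the key frame's phase jitter is acceptable
    """
    spans = [last - first
             for (first, last), ok in zip(frames, jitter_ok) if ok]
    return mean(spans) if spans else 0.0
```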
Alternatively, in determining the second transmission information, the second transmission information of the video stream may be determined based on the duration of each key frame that generates a collision in the video stream.
It should be noted that steps S220 and S230 may be executed simultaneously or sequentially; this is not specifically limited. Fig. 2 illustrates simultaneous execution as an example, and the order may be adjusted as needed in actual implementation.
The first transmission information and the second transmission information respectively obtained based on the above steps can be used as evaluation results to evaluate the transmission quality of the video stream.
In the video stream transmission information evaluation method provided by the embodiment of the present application, the video streams sent by the video stream transmission sources may be received; the first transmission information of the video streams may then be determined based on the arrival times of the data packets in each key frame, and the second transmission information of the video streams may be determined based on the duration of the key frames in which collisions occur. The first transmission information characterizes the transmission quality of the key frames in each video stream, and the second transmission information characterizes how each video stream is affected by the other video streams. By determining both, the transmission quality of the key frames in each video stream and the influence of other video streams on each video stream can be evaluated together, which improves the accuracy and diversity of the evaluation result and allows the transmission quality during video stream transmission to be evaluated and determined more comprehensively.
Another specific implementation procedure in the video stream transmission information evaluation method provided in the embodiment of the present application is specifically explained below.
Fig. 3 is another schematic flow chart of a method for evaluating video stream transmission information according to an embodiment of the present application. Referring to fig. 3, the first transmission information includes the arrival delay, and determining the first transmission information of the video stream based on the arrival times of the data packets in each key frame comprises:
s310, the starting time of the video stream transmission source corresponding to the video stream is obtained.
Optionally, the video stream transmission source is communicatively connected to the video stream processing device, and the time at which transmission starts may be sent to the video stream processing device as accompanying data while the video stream transmission source transmits the video stream. The video stream processing device may receive this start time in the course of receiving the video stream.
The start time may specifically be a time at which the video stream starts to be transmitted from the video stream transmission source.
S320: the arrival delay is determined based on the arrival time of the first data packet in the first key frame of the video stream and the start time of the video stream transmission source.
It should be noted that, for determining the arrival times, an observation window may be set in the video stream processing device, with the start time of the observation window synchronized with the start time of the video stream transmission sources. The length of the observation window is not less than N times the maximum frame period of the key frames across all video streams (N being the number of video streams), to ensure that a number of complete key frame periods can be observed within the window. To calculate the mean and variance of the key frame arrival delay, sampling may be repeated multiple times, that is, the camera is restarted multiple times and the arrival time of the first data packet of the first key frame of the current video stream is recorded each time.
Alternatively, after the video stream processing device receives the video stream, the arrival time of the first data packet in the first key frame in the video stream can be determined.
The specific calculation formula is as follows:
T0 = T1 - T2
where T0 is the arrival delay of the video stream, T1 is the arrival time of the first data packet in the first key frame of the video stream, and T2 is the start time of the video stream.
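The formula T0 = T1 - T2, together with the repeated sampling described above for the mean and variance of the arrival delay, can be sketched as follows (hypothetical helper names):

```python
from statistics import mean, pvariance
from typing import List, Tuple

def arrival_delay(first_packet_arrival: float, source_start_time: float) -> float:
    """T0 = T1 - T2: arrival delay of a video stream."""
    return first_packet_arrival - source_start_time

def arrival_delay_stats(delay_samples: List[float]) -> Tuple[float, float]:
    """Mean and (population) variance of arrival delays collected over
    repeated restarts of the transmission source."""
    return mean(delay_samples), pvariance(delay_samples)
```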
When determining the arrival times, the number of all video streams detected within the current time window may be recorded, and each video stream is numbered with a flow_ID to distinguish the data of the different video streams. For example, if N video streams are detected in the current window, the ith video stream is flow_ID_i (1 ≤ i ≤ N).
Illustratively, if N video streams are detected within the current window, then there are:
flow_ID=[1,2,3,…,N];
according to the video stream numbers, one or more pieces of key frame information included in each video stream in the observation window can be recorded, including but not limited to the start time and the end time of the key frame, the number of all data packets and the packet size included in the current key frame, and the like.
Exemplarily, suppose that within the current observation window the ith video stream flow_ID_i contains M key frames. The start time of the jth key frame is recorded as T_start(i, j), and the end time of the jth key frame is recorded as T_end(i, j), where 1 ≤ j ≤ M.
Assuming that the current key frame includes S data packets, the size of the kth data packet in the current key frame is recorded as pktSize_k (1 ≤ k ≤ S).
After the arrival delay is calculated in the above manner, it can be used as first transmission information in the evaluation of the transmission quality of the video stream.
In the video stream transmission information evaluation method provided in the embodiment of the present application, the start time of a video stream transmission source corresponding to a video stream may be obtained, and then the arrival time delay is determined based on the arrival time of a first data packet in a first key frame of the video stream and the start time of the video stream transmission source; after the arrival delay is calculated in the above manner, the quality of video streaming transmission can be evaluated by using the arrival delay, so that the quality of video streaming transmission can be determined more comprehensively.
It should be noted that the first transmission information may include phase jitter information in addition to the arrival delay.
Optionally, determining the first transmission information of the video stream based on the arrival time of the data packet in each key frame includes: and determining whether the phase jitter of each key frame in the video stream exceeds a preset threshold value or not based on the arrival time of the first data packet in each key frame of the video stream.
The phase jitter information specifically refers to the inconsistency of the difference between the arrival times of the first data packets in two adjacent key frames in the video stream.
Optionally, it may be determined whether the phase jitter of each key frame in the video stream exceeds a preset threshold based on the arrival time of the first data packet in each key frame in the video stream, where the preset threshold may be a fixed value preset according to actual requirements.
The following specifically explains a specific implementation procedure for determining phase jitter information in the video streaming information evaluation method provided in the embodiment of the present application.
Fig. 4 is another flow chart of a video stream transmission information evaluation method according to an embodiment of the present application. Referring to fig. 4, determining whether the phase jitter of each key frame in the video stream exceeds a preset threshold based on the arrival time of the first data packet in each key frame of the video stream includes:
S410: a first time difference is determined based on the arrival time of a first packet in a first key frame and the arrival time of a first packet in a second key frame of the video stream.
It should be noted that, after the video stream transmission source starts transmission, the phase jitter between the first few key frames received by the video stream processing device is relatively obvious; as the number of received key frames increases, the key frame phase jitter gradually decreases and remains relatively stable. When the phase jitter of a key frame exceeds the threshold, the key frame concerned does not participate in the calculation of related parameters such as the key frame size, the key frame period, and the key frame duration.
For example: the arrival time of the first data packet in the first key frame is T_i, and the arrival time of the first data packet in the second key frame is T_{i+1}.

The first time difference b is: b = T_{i+1} - T_i.
S420: the second time difference is determined based on the arrival time of the first packet in the second key frame and the arrival time of the first packet in the third key frame of the video stream.
For example: the arrival time of the first data packet in the third key frame is T_{i+2}.

The second time difference a is: a = T_{i+2} - T_{i+1}.
The first key frame, the second key frame and the third key frame are sequentially adjacent key frames. That is, the first key frame, the second key frame and the third key frame occur in a fixed order and are adjacent to one another.
S430: determining whether the phase jitter of the first key frame exceeds a preset threshold based on the first moveout and the second moveout.
Optionally, the specific calculation formula is as follows:
c=(b-a)/a;
where c is the phase jitter, and after the phase jitter is obtained, it may be determined whether the phase jitter exceeds a preset threshold, for example, the preset threshold may be 0.05.
When c > 0.05, it can be determined that the key frame phase jitter exceeds the threshold, that is, the phase jitter information is too large; accordingly, when c ≦ 0.05, it may be determined that the key frame phase jitter does not exceed the threshold, i.e., the phase jitter information is normal.
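A minimal sketch of this jitter check, using the formula c = (b - a)/a above (function names are illustrative):

```python
def phase_jitter(t_first, t_second, t_third):
    """c = (b - a) / a for the arrival times of the first packets of
    three consecutive key frames."""
    b = t_second - t_first   # first time difference
    a = t_third - t_second   # second time difference
    return (b - a) / a

def jitter_exceeds_threshold(c, threshold=0.05):
    return c > threshold

# Arrivals at 0, 42 and 82 ms: b = 42, a = 40, c = 0.05 -> not over.
c = phase_jitter(0, 42, 82)
print(jitter_exceeds_threshold(c))  # False
```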
In order to calculate the key frame phase jitter mean and variance, sampling may be repeated multiple times; that is, the video stream transmission source is restarted multiple times, the specific values of those key frame phase jitters of the video stream that exceed the threshold are re-recorded each time, and the mean and variance of the results of the multiple recordings are calculated.
After the phase jitter information is determined in the above manner, the phase jitter information may be used as the first transmission information as the result of evaluating the transmission quality of the video stream.
In the video stream transmission information evaluation method provided in the embodiment of the present application, a first time difference may be determined based on the arrival time of a first data packet in a first key frame and the arrival time of a first data packet in a second key frame of a video stream; determining a second time difference based on the arrival time of the first packet in the second key frame and the arrival time of the first packet in the third key frame of the video stream; and determining whether the phase jitter of the first key frame exceeds a preset threshold based on the first time difference and the second time difference. The phase jitter is calculated and the phase jitter threshold is compared in the above manner, so that the corresponding phase jitter information can be determined, and the quality of video streaming can be evaluated by using the phase jitter information, so that the quality of video streaming can be determined more comprehensively.
Another specific implementation procedure in the video stream transmission information evaluation method provided in the embodiment of the present application is specifically explained below.
Fig. 5 is another flow chart of a video stream transmission information evaluation method according to an embodiment of the present application. Referring to fig. 5, the second transmission information includes: collision severity; determining second transmission information of the video stream based on the duration of the key frame generating the collision in the video stream includes:
s510: determining a key frame of the video stream generating the collision and at least one collision video stream colliding with the video stream in the collision process.
It should be noted that the collision severity specifically refers to the severity of the collision of the video stream.
Optionally, the specific key frames that generate collisions in the video stream may be determined first; a collision may involve a plurality of continuous key frames or only part of a certain key frame, and a transmission collision may occur during the transmission process. After the key frames generating collisions are determined, at least one collision video stream colliding with the video stream during the collision may be determined: if only one other video stream collides with the video stream at a certain time, there is one collision video stream; correspondingly, if a plurality of other video streams collide with the video stream, there are a plurality of collision video streams.
S520: and determining the number of collisions corresponding to the video stream and the accumulated time length of the single collision based on the duration of the key frame of the video stream generating the collision.
Alternatively, the number of collisions of the video stream may be determined based on the discrete intervals of the durations of the collided key frames of the video stream, and further, the duration of each discrete interval may be determined as the accumulated duration of a single collision.
S530: determining the collision severity of the video streams based on the number of collisions, the accumulated duration of a single collision, and the number of colliding video streams.
Optionally, after the number of collisions, the accumulated duration of a single collision, and the number of collision video streams are respectively determined in the above manner, the collision severity of the video stream may be determined by calculation; specifically, a collision severity score may be calculated to represent the collision severity of the video stream.
The following specifically explains a specific implementation procedure for determining the collision severity of a video stream in the video streaming information evaluation method provided in the embodiment of the present application.
Fig. 6 is another flow chart of the video stream transmission information evaluation method according to an embodiment of the present application. Referring to fig. 6, determining the collision severity of a video stream based on the number of collisions, the accumulated duration of a single collision, and the number of collision video streams includes:
s610: the weight for each collision is determined based on the accumulated duration of a single collision and the number of colliding video streams.
Alternatively, assuming that at most M collisions occur within the current observation window, cumulative timers for a total of M collision states [1, 2, 3, …, M] are set, and M corresponding weights are assigned. That is, the higher the number of colliding frames, the higher the weight, representing a more severe overall collision; conversely, the lower the number of colliding frames, the lower the weight, representing a slighter overall collision.
The weight may specifically be determined according to the accumulated duration of each collision and the number of collision video streams. When the number of collision video streams is the same, the longer the accumulated duration of the collision, the larger the weight; when the accumulated duration of the collision is the same, the greater the number of collision video streams, the higher the weight, and vice versa, which is not described again here.
S620: a video stream collision severity score is calculated based on the number of collisions and the weight of each collision.
Alternatively, a video stream collision severity score may be determined based on the number of collisions and the weight of each collision, and the calculated score may be used as the evaluation result of the video stream collision severity.
It should be noted that, when calculating the video stream collision severity score, the calculation may be performed based on a preset calculation formula. For example, the formula may operate on the number of collisions and the weight of each collision, or on the accumulated duration of a single collision and the number of collision video streams; this is not specifically limited here, and any formula capable of representing the severity may be used.
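The patent leaves the exact formula open; one plausible weighted-sum sketch, in which the weight of each collision is simply its accumulated duration times the number of colliding streams (an assumption, not the patented formula), is:

```python
def collision_severity_score(collisions):
    """collisions: list of (accumulated_duration, n_streams) tuples,
    one per collision. The weight of a collision grows with both its
    accumulated duration and the number of colliding video streams;
    the score is the sum of the weights over all collisions."""
    return sum(duration * n_streams for duration, n_streams in collisions)

# Two collisions: 10 ms among 3 streams and 5 ms among 2 streams.
print(collision_severity_score([(10, 3), (5, 2)]))  # 40
```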
Optionally, determining a key frame of the video stream generating the collision and at least one collision video stream colliding with the video stream during the collision comprises: and determining whether a key frame which has the same receiving time with the target key frame and belongs to an associated video stream other than the video stream exists or not aiming at the currently received target key frame in the video stream, if so, determining the target key frame as a key frame generating collision, and determining the associated video stream as a collision video stream colliding with the video stream.
Optionally, in the specific process of determining the collision video stream, it is determined whether there is a key frame belonging to an associated video stream other than the video stream, the key frame having the same receiving time as the target key frame, with respect to a currently received target key frame in the video stream.
Specifically, the determination may be made in the following manner:
When a key frame collision occurs, the start time and end time of the collision are recorded; the collision duration then equals the collision end time minus the collision start time. The combination of video streams participating in the key frame collision is recorded at the same time. Timers for different collision scenes are set, and the accumulated collision time under each collision scene is calculated separately. The accumulated time under the ith key frame collision scene in the current observation window is denoted t_i, and the duration of the jth collision in the ith collision scene is denoted Timer_i^j (j ≥ 1). Then:

t_i = Σ_j Timer_i^j
For example, assume that the following collision information is recorded in the current observation window:

one five-frame collision, with collision duration Timer_5^1 and colliding stream combination:

flow_ID = [3, 4, 5, 6, 8]

three three-frame collisions, with collision durations Timer_3^1, Timer_3^2, Timer_3^3 and colliding stream combinations:

flow_ID = [2, 5, 6], [5, 8, 9], [2, 5, 6]

five two-frame collisions, with collision durations Timer_2^1, Timer_2^2, Timer_2^3, Timer_2^4, Timer_2^5 and colliding stream combinations:

flow_ID = [1, 2], [4, 7], [8, 9], [1, 2], [4, 7]

The following accumulated collision time under each collision scene can then be obtained:

t_5 = Timer_5^1

t_3 = Timer_3^1 + Timer_3^2 + Timer_3^3

t_2 = Timer_2^1 + Timer_2^2 + Timer_2^3 + Timer_2^4 + Timer_2^5
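The accumulated-time bookkeeping t_i = Σ_j Timer_i^j can be sketched as follows, keying each collision scene by the number of key frames involved (a sketch; names are illustrative):

```python
from collections import defaultdict

def accumulate_collision_time(events):
    """events: (n_frames, start, end) for each recorded collision, where
    n_frames identifies the collision scene. Each duration Timer_i^j is
    end - start; the result maps each scene to t_i = sum of its timers."""
    t = defaultdict(float)
    for n_frames, start, end in events:
        t[n_frames] += end - start
    return dict(t)

# One five-frame collision (4 ms) and two two-frame collisions (2 + 3 ms).
print(accumulate_collision_time([(5, 0, 4), (2, 10, 12), (2, 20, 23)]))
# {5: 4.0, 2: 5.0}
```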
accordingly, a collision video stream, which may be plural, may be determined each time a collision is generated.
The following describes a device, an apparatus, and a storage medium for executing the video stream transmission information evaluation method provided by the present application; for their specific implementation processes and technical effects, reference is made to the above description, and details are not repeated below.
Fig. 7 is a schematic structural diagram of a video stream transmission information evaluation apparatus according to an embodiment of the present application, and please refer to fig. 7, the apparatus is applied to a video stream processing device, the video stream processing device is communicatively connected to a plurality of video stream transmission sources, and the apparatus includes: a receiving module 710, a first information determining module 720, and a second information determining module 730;
a receiving module 710, configured to receive a video stream sent by each video stream sending source, where the video stream includes: a plurality of key frames arranged according to a time sequence, wherein each key frame consists of a plurality of data packets;
a first information determining module 720, configured to determine first transmission information of the video stream based on the arrival time of the data packet in each key frame, where the first transmission information is used to characterize the transmission quality of the key frame in each video stream;
and a second information determining module 730, configured to determine second transmission information of the video streams based on the duration of the key frame generating the collision in the video streams, where the second transmission information is used to represent information that each video stream is affected by other video streams.
Optionally, the first transmission information includes: time delay of arrival; a first information determining module 720, configured to specifically obtain start time of a video stream transmission source corresponding to a video stream; and determining the arrival time delay based on the arrival time of the first data packet in the first key frame of the video stream and the starting time of a video stream transmission source.
Optionally, the first transmission information includes: phase jitter information; the first information determining module 720 is specifically configured to determine whether the phase jitter of each key frame in the video stream exceeds a preset threshold based on the arrival time of the first data packet in each key frame of the video stream.
Optionally, the first information determining module 720 is specifically configured to determine a first time difference based on an arrival time of a first data packet in a first key frame of the video stream and an arrival time of a first data packet in a second key frame; determining a second time difference based on the arrival time of a first data packet in a second key frame and the arrival time of a first data packet in a third key frame of the video stream, wherein the first key frame, the second key frame and the third key frame are sequentially adjacent key frames; determining whether the phase jitter of the first key frame exceeds a preset threshold based on the first moveout and the second moveout.
Optionally, the second transmission information includes: collision severity; the second information determining module 730 is specifically configured to determine a key frame of the video stream that generates a collision and at least one collision video stream that collides with the video stream during the collision; determine the number of collisions corresponding to the video stream and the accumulated duration of a single collision based on the duration of the key frames of the video stream generating collisions; and determine the collision severity of the video stream based on the number of collisions, the accumulated duration of a single collision, and the number of collision video streams.
Optionally, the second information determining module 730 is specifically configured to determine a weight of each collision based on the accumulated duration of a single collision and the number of collision video streams, and to calculate a video stream collision severity score based on the number of collisions and the weight of each collision.
Optionally, the second information determining module 730 is specifically further configured to determine, for a currently received target key frame in the video stream, whether a key frame of an associated video stream that has the same receiving time as the target key frame and belongs to a different video stream exists, and if so, determine the target key frame as a key frame that generates a collision and determine that the associated video stream is a collision video stream that collides with the video stream.
In the video stream transmission information evaluation device provided in the embodiment of the present application, the video streams sent by the video stream sending sources may be received, and further, the first transmission information of the video streams may be determined based on the arrival time of the data packet in each key frame, and the second transmission information of the video streams may be determined based on the duration of the key frame that generates the collision in the video streams. The first transmission information is used for representing the transmission quality of key frames in each video stream, the second transmission information is used for representing the influence information of each video stream by other video streams, the transmission quality of the key frames in each video stream and the related information of each video stream influenced by other video streams can be comprehensively evaluated and determined by determining the first transmission information and the second transmission information, so that the accuracy and diversity of evaluation results can be improved, and the condition of the transmission quality in the video stream transmission process can be comprehensively evaluated and determined.
The above-mentioned apparatus is used for executing the method provided by the foregoing embodiment, and the implementation principle and technical effect are similar, which are not described herein again.
These modules may be one or more integrated circuits configured to implement the above methods, for example: one or more Application Specific Integrated Circuits (ASICs), one or more microprocessors, or one or more Field Programmable Gate Arrays (FPGAs), etc. For another example, when one of the above modules is implemented in the form of program code scheduled by a processing element, the processing element may be a general-purpose processor, such as a Central Processing Unit (CPU) or another processor capable of calling program code. For another example, these modules may be integrated together and implemented in the form of a system-on-a-chip (SoC).
Fig. 8 is a schematic structural diagram of a computer device according to an embodiment of the present application, and referring to fig. 8, the computer device includes: the memory 810 and the processor 820, wherein the memory 810 stores a computer program operable on the processor 820, and the processor 820 implements the steps of the video streaming information evaluation method when executing the computer program.
Optionally, the computer device may be specifically a video stream processing device.
In another aspect of the embodiments of the present application, a computer-readable storage medium is further provided, where a computer program is stored on the storage medium, and when the computer program is executed by a processor, the computer program implements the steps of the video streaming information evaluation method.
In the several embodiments provided in the present invention, it should be understood that the disclosed apparatus and method may be implemented in other manners. For example, the above-described apparatus embodiments are merely illustrative, and for example, a division of a unit is only a logical division, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
Units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit may be implemented in the form of hardware, or in the form of hardware plus a software functional unit.
The integrated unit implemented in the form of a software functional unit may be stored in a computer readable storage medium. The software functional unit is stored in a storage medium and includes several instructions to enable a computer device (which may be a personal computer, a server, or a network device) or a processor (in english: processor) to execute some steps of the methods according to the embodiments of the present invention. And the aforementioned storage medium includes: a U-disk, a portable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other media capable of storing program codes.
The above description is only for the specific embodiments of the present application, but the scope of the present application is not limited thereto, and any person skilled in the art can easily think of the changes or substitutions within the technical scope of the present application, and shall cover the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.
The above description is only a preferred embodiment of the present application and is not intended to limit the present application, and various modifications and changes may be made to the present application by those skilled in the art. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present application shall be included in the protection scope of the present application.

Claims (10)

1. A video stream transmission information evaluation method applied to a video stream processing apparatus which is communicatively connected to a plurality of video stream transmission sources, comprising:
receiving a video stream transmitted by each video stream transmission source, the video stream including: a plurality of key frames arranged according to a time sequence, wherein each key frame consists of a plurality of data packets;
determining first transmission information of the video stream based on the arrival time of the data packet in each key frame, wherein the first transmission information is used for representing the transmission quality of the key frame in each video stream;
and determining second transmission information of the video streams based on the duration of the key frames which generate the collision in the video streams, wherein the second transmission information is used for representing the influence information of other video streams on each video stream.
2. The video streaming information evaluation method of claim 1, wherein the first transmission information includes: time delay of arrival;
the determining the first transmission information of the video stream based on the arrival time of the data packet in each key frame comprises:
acquiring the starting time of a video stream transmission source corresponding to the video stream;
and determining the arrival time delay based on the arrival time of the first data packet in the first key frame of the video stream and the starting time of the video stream transmission source.
3. The video streaming information evaluation method of claim 1, wherein the first transmission information comprises: phase jitter information;
the determining the first transmission information of the video stream based on the arrival time of the data packet in each key frame comprises:
and determining whether the phase jitter of each key frame in the video stream exceeds a preset threshold value based on the arrival time of the first data packet in each key frame of the video stream.
4. The method of claim 3, wherein the determining whether the phase jitter of each key frame of the video stream exceeds a preset threshold based on the arrival time of the first data packet in each key frame of the video stream comprises:
determining a first time difference based on the arrival time of a first data packet in a first key frame and the arrival time of a first data packet in a second key frame of the video stream;
determining a second time difference based on the arrival time of a first data packet in a second key frame and the arrival time of a first data packet in a third key frame of the video stream, wherein the first key frame, the second key frame and the third key frame are sequentially adjacent key frames;
determining whether the phase jitter of the first key frame exceeds a preset threshold based on the first time difference and the second time difference.
5. The video streaming information evaluation method of any of claims 1 to 4, wherein the second transmission information comprises: collision severity;
the determining second transmission information of the video stream based on the duration of the key frame generating the collision in the video stream comprises:
determining a key frame of the video stream generating a collision and at least one collision video stream colliding with the video stream in a collision process;
determining the number of collisions corresponding to the video stream and the accumulated time length of single collision based on the duration of the key frames of the video stream generating the collisions;
determining the collision severity of the video streams based on the number of collisions, the cumulative duration of the single collision, and the number of colliding video streams.
6. The method of claim 5, wherein determining the severity of the collision for the video stream based on the number of collisions, the cumulative duration of the single collision, and the number of colliding video streams comprises:
determining the weight of each collision based on the accumulated duration of the single collision and the number of collision video streams;
calculating the video stream collision severity score based on the number of collisions and the weight of each collision.
7. The video streaming information evaluation method of claim 5, wherein the determining of the collision-generating key frames of the video stream and the at least one colliding video stream that collides with the video stream during the collision includes:
and determining whether a key frame which has the same receiving time with the target key frame and belongs to an associated video stream other than the video stream exists or not aiming at a target key frame currently received in the video stream, if so, determining the target key frame as a key frame generating collision, and determining the associated video stream as a collision video stream colliding with the video stream.
8. A video stream transmission information evaluation apparatus applied to a video stream processing device which is communicatively connected to a plurality of video stream transmission sources, comprising: the device comprises a receiving module, a first information determining module and a second information determining module;
the receiving module is configured to receive a video stream sent by each of the video stream sending sources, where the video stream includes: a plurality of key frames arranged according to a time sequence, wherein each key frame consists of a plurality of data packets;
the first information determining module is configured to determine first transmission information of the video stream based on an arrival time of a data packet in each of the key frames, where the first transmission information is used to characterize transmission quality of the key frames in each of the video streams;
the second information determining module is configured to determine second transmission information of the video streams based on the duration of the key frame generating the collision in the video streams, where the second transmission information is used to represent information that each video stream is affected by other video streams.
9. A computer device, comprising: a memory and a processor, the memory storing a computer program executable on the processor, wherein the processor, when executing the computer program, carries out the steps of the method according to any one of claims 1 to 7.
10. A computer-readable storage medium, characterized in that a computer program is stored on the storage medium, which computer program, when being executed by a processor, carries out the steps of the method according to any one of claims 1 to 7.
CN202211431577.9A 2022-11-15 2022-11-15 Video streaming information evaluation method, device, equipment and storage medium Pending CN115776601A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211431577.9A CN115776601A (en) 2022-11-15 2022-11-15 Video streaming information evaluation method, device, equipment and storage medium

Publications (1)

Publication Number Publication Date
CN115776601A true CN115776601A (en) 2023-03-10

Family

ID=85389181

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211431577.9A Pending CN115776601A (en) 2022-11-15 2022-11-15 Video streaming information evaluation method, device, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN115776601A (en)

Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6104757A (en) * 1998-05-15 2000-08-15 North Carolina State University System and method of error control for interactive low-bit rate video transmission
WO2010051834A1 (en) * 2008-11-04 2010-05-14 Telefonaktiebolaget L M Ericsson (Publ) Method and system for determining a quality value of a video stream
US20110292993A1 (en) * 2001-10-11 2011-12-01 At&T Intellectual Property Ii, L.P. Texture replacement in video sequences and images
US20160105684A1 (en) * 2014-10-14 2016-04-14 Huawei Technologies Co., Ltd. System and Method for Video Communication
WO2018032491A1 (en) * 2016-08-19 2018-02-22 Huizhou Tcl Mobile Communication Co., Ltd Methods, base stations, and user equipment for reliable video streaming transmission
WO2018093814A1 (en) * 2016-11-21 2018-05-24 Cisco Technology, Inc. Keyframe mitigation for video streams with multiple receivers
US20180198989A1 (en) * 2017-01-12 2018-07-12 Gopro, Inc. Phased Camera Array System For Generation of High Quality Images and Video
WO2020133465A1 (en) * 2018-12-29 2020-07-02 Zhejiang Dahua Technology Co., Ltd. Systems and methods for multi-video stream transmission
KR20210044746A (en) * 2020-08-04 2021-04-23 베이징 바이두 넷컴 사이언스 앤 테크놀로지 코., 엘티디. Video coding method, device, electronic equipment and computer readable storage medium
EP3941036A1 (en) * 2020-07-17 2022-01-19 Amlogic (Shanghai) Co., Ltd. Method, electronic device, and storage medium for selecting reference frame
CN114513651A (en) * 2020-11-16 2022-05-17 浙江大华技术股份有限公司 Video equipment key frame collision detection method, data transmission method and related device
CN114520891A (en) * 2020-11-20 2022-05-20 浙江大华技术股份有限公司 Data transmission method based on multiple front-end video devices and related device

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Lu Chun; Zhu Xiuchang: "Video transmission optimization scheme for wireless Mesh networks based on 802.11e", Video Engineering (电视技术), no. 02, 17 February 2010 (2010-02-17) *

Similar Documents

Publication Publication Date Title
CN110475124B (en) Video jamming detection method and device
JP2007006497A (en) Apparatus and method for providing enhanced wireless communication
CN109286813B (en) Video communication quality detection method and device
CN114520891B (en) Data transmission method and related device based on multiple front-end video devices
CN108322350B (en) Service monitoring method and device and electronic equipment
CN111683273A (en) Method and device for determining video blockage information
CN109040830B (en) Live broadcast pause prediction method, switching method and device
CN110677718B (en) Video identification method and device
CN112752113B (en) Method and device for determining abnormal factors of live broadcast server
EP3101844A1 (en) Packet loss detection method, apparatus, and system
CN106390451B (en) Method and device for testing capacity of game server
CN114928758A (en) Live broadcast abnormity detection processing method and device
CN110996180B (en) Network live broadcast chatting method, system and server
CN111479161B (en) Live broadcast quality data reporting method and device
CN115776601A (en) Video streaming information evaluation method, device, equipment and storage medium
CN114513651A (en) Video equipment key frame collision detection method, data transmission method and related device
US9485458B2 (en) Data processing method and device
CN114900477B (en) Message processing method, server, electronic equipment and storage medium
CN113840157B (en) Access detection method, system and device
CN113515670B (en) Film and television resource state identification method, equipment and storage medium
CN113873278B (en) Broadcast content auditing method and device and electronic equipment
CN114679570A (en) Video transmission method and device, storage medium and electronic device
CN107465743B (en) Method and device for processing request
CN115734009A (en) Method, device and equipment for eliminating collision of video stream and storage medium
CN113840131A (en) Video call quality evaluation method and device, electronic equipment and readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination