CN108270738B - Video processing method and network equipment - Google Patents

Video processing method and network equipment Download PDF

Info

Publication number
CN108270738B
CN108270738B CN201611264554.8A
Authority
CN
China
Prior art keywords
video
information
terminal
quality
network device
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201611264554.8A
Other languages
Chinese (zh)
Other versions
CN108270738A (en)
Inventor
冯力刚
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Huawei Digital Technologies Co Ltd
Original Assignee
Beijing Huawei Digital Technologies Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Huawei Digital Technologies Co Ltd filed Critical Beijing Huawei Digital Technologies Co Ltd
Priority to CN201611264554.8A priority Critical patent/CN108270738B/en
Publication of CN108270738A publication Critical patent/CN108270738A/en
Application granted granted Critical
Publication of CN108270738B publication Critical patent/CN108270738B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41Structure of client; Structure of client peripherals
    • H04N21/426Internal components of the client ; Characteristics thereof
    • H04N21/42684Client identification by a unique number or address, e.g. serial number, MAC address, socket ID
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L65/00Network arrangements, protocols or services for supporting real-time applications in data packet communication
    • H04L65/80Responding to QoS
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/437Interfacing the upstream path of the transmission network, e.g. for transmitting client requests to a VOD server
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/44Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs
    • H04N21/44008Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs involving operations for analysing video streams, e.g. detecting features or characteristics in the video stream
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/442Monitoring of processes or resources, e.g. detecting the failure of a recording device, monitoring the downstream bandwidth, the number of times a movie has been viewed, the storage space available from the internal hard disk
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/60Network structure or processes for video distribution between server and client or between remote clients; Control signalling between clients, server and network components; Transmission of management data between server and client, e.g. sending from server to client commands for recording incoming content stream; Communication details between server and client 
    • H04N21/65Transmission of management data between client and server
    • H04N21/658Transmission by the client directed to the server
    • H04N21/6581Reference data, e.g. a movie identifier for ordering a movie or a product identifier in a home shopping application

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Databases & Information Systems (AREA)
  • Power Engineering (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)

Abstract

The present application relates to the field of video technologies, and in particular, to a video processing method and a network device. The method comprises the following steps: acquiring the ID and video information of a video from video messages communicated between a server and a terminal; acquiring the video stall information sent by the terminal to the server; associating the stall information with the ID of the video; and determining the quality information of the video according to the video information and the stall information. Because the stall information is provided by the terminal, that is, detected by the terminal while playing the video and reported to the server, it is more accurate than stall information obtained by a simulated player on the network device, so the subsequent determination of the video quality information is also more accurate.

Description

Video processing method and network equipment
Technical Field
The present application relates to the field of video technologies, and in particular, to a video processing method and a network device.
Background
In a video transmission mode based on the Hypertext Transfer Protocol (HTTP), when a user watches a video through a terminal, the terminal receives video data from the server over HTTP and then plays the video data.
An existing network device for calculating video stalls simulates the behavior of a player. It is located between the terminal and the server and can acquire the data communicated between them. Based on the content downloaded by the user, i.e. the video data received from the server, the network device records two durations: the playable duration of the downloaded content and the viewing duration of the user. The difference between the two is the duration the video buffer can still sustain. Once the terminal starts playing the video, the network device simulates a player to calculate the playable duration from the downloaded video data, and calculates the viewing duration from the time the player has been playing. When the buffered video data falls below the normal-playing threshold, the buffer is considered insufficient to sustain continuous playback, i.e. a stall is detected.
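The buffer bookkeeping described above can be sketched as follows. This is an illustrative reconstruction: the bitrate constant and the stall threshold (`BITRATE_BPS`, `STALL_THRESHOLD_S`) are assumed values, not figures from the patent.

```python
# Hypothetical sketch of the simulated-player stall check described above.
# BITRATE_BPS and STALL_THRESHOLD_S are illustrative assumptions.

BITRATE_BPS = 2_000_000        # assumed video bitrate (bits per second)
STALL_THRESHOLD_S = 0.5        # assumed "normal playing threshold" (seconds)

def detect_stall(downloaded_bytes: int, watched_seconds: float) -> bool:
    """Return True when the buffered duration can no longer sustain playback."""
    playable_seconds = downloaded_bytes * 8 / BITRATE_BPS  # duration of downloaded content
    buffered_seconds = playable_seconds - watched_seconds  # duration left in the buffer
    return buffered_seconds < STALL_THRESHOLD_S
```

As the Background notes, the weakness of this approach is that the real player's parameters may differ from the simulated ones, so the computed buffer state can drift from the terminal's actual state.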
However, because the simulated player may deviate from the actual player on the terminal, the stall information it calculates can contain errors, which in turn biases the video quality evaluation.
Disclosure of Invention
The embodiments of the present application provide a video processing method and a network device, to solve the problem that errors in video stall calculation easily cause deviation in video quality evaluation.
In view of the above, a first aspect of the present application provides a video processing method, in which a network device located between a terminal and a server first obtains an identification (ID) and video information of a video from video messages communicated between the server and the terminal, and then acquires the video stall information sent by the terminal to the server. Because the stall information is provided by the terminal, which plays the video through its own player, the stall information is accurate. The network device then associates the stall information with the ID of the video, so that the stall information corresponds to a specific video, and finally determines the quality information of the video according to the video information and the stall information.
The method first obtains the ID and the video information of the video, then directly receives the stall information from the terminal, associates the ID of the video with the stall information, and finally determines the quality information from the video information and the stall information.
In some embodiments, the network device may obtain the ID and the video information of the video from the video messages communicated between the server and the terminal as follows: first, the network device acquires the ID and the bitrate of the video from a video response message, where the video response message is a video message sent by the server to the terminal; then the network device may simulate playback according to the video data messages and the video bitrate to determine the initial buffering duration and the download rate of the video, where a video data message is a video message sent by the server to the terminal. It can be seen that the simulated playback only determines the initial buffering duration and the download rate; the initial buffering duration is the time required to load the video, from connecting to the server until playback can start. This video information is provided to the network device so that it can judge the quality information of the video.
In some embodiments, the method further comprises: determining the number of stalls and the stall duration from the stall information. The number and duration of stalls can be determined from the stall information sent by the terminal to the server, which facilitates the subsequent judgment of the quality information of the video.
In some embodiments, the network device may determine the quality information of the video according to the initial buffering duration, the download rate, and the stall information of the video corresponding to the video ID. The video quality can thus be judged from the video information, and the judgment may use video quality evaluation algorithms, so that the video quality is judged accurately.
In some embodiments, the quality information includes a quality grade of the video, a key quality indicator (KQI) indicating the grade of the video information, and a key performance indicator (KPI) indicating the video network quality grade (e.g., derived from the number of stalls and the stall duration). The quality information can express quality from several angles: the KQI reflects the video information and the stall information, while the KPI indicates quality in the video network, so together they reflect the overall quality of the video from different perspectives.
In some embodiments, the network device outputs the quality information corresponding to the video ID. The output may directly display the quality information, i.e. the quality grade and the KQI and KPI indicators of the video; the quality grade is an overall evaluation, while the KQI and KPI are more detailed, per-angle evaluations.
In some embodiments, when displaying the quality information corresponding to the video ID, the network device may further output quality information organized by area, network element, terminal, or video website. The quality information displayed from different angles can differ; for example, the information shown from the terminal's angle and from the video website's angle may be prioritized differently according to which parameters each cares about most. In this way, the parameters a user needs can be obtained from different angles.
In some embodiments, the network device may associate the stall information with the ID of the video by associating the stall information with a characteristic field in the HTTP data frames used for transmitting the video. Since these characteristic fields differ between terminals, and data frames received in the same time period are not necessarily between the same terminal and the server, this distinguishing association is necessary in order to accurately monitor the video playback of each terminal.
In some embodiments, the network device may associate the stall information with the ID of the video by associating the stall information with IP feature information, where the IP feature information comprises at least one of an IP address, an IP-ID, and a Transmission Control Protocol (TCP) timestamp. Since the IP feature information of different terminals differs, different terminals can be monitored by associating the IP feature information with the stall information.
A second aspect of the embodiments of the present application further provides a network device, including at least one module configured to execute the video processing method provided in the first aspect or any implementation of the first aspect.
The third aspect of the embodiments of the present application further provides a network device, where the network device includes a processor, a transceiver, and a memory, where the memory is configured to store instructions, and the processor is configured to execute the instructions to perform the video processing method provided in the first aspect of the present application or any implementation manner of the first aspect.
A fourth aspect of the present application provides a storage medium, where program code is stored, and when the program code is executed by a base station, the video processing method provided by the first aspect or any implementation of the first aspect is performed. The storage medium includes, but is not limited to, a flash memory, a hard disk drive (HDD), or a solid state drive (SSD).
Drawings
FIG. 1 is a schematic illustration of HTTP-based video transmission;
FIG. 2 is a diagram of an embodiment of a video processing method according to an embodiment of the present application;
FIG. 3 is a diagram of one embodiment of a network device of an embodiment of the present application;
FIG. 4 is a diagram of an embodiment of a network device according to an embodiment of the present application.
Detailed Description
The embodiments of the present application provide a video processing method and a network device, which can determine more accurate video quality information by acquiring video information together with accurate stall information.
To make the technical solutions better understood by those skilled in the art, the technical solutions in the embodiments of the present application are described below clearly and completely with reference to the drawings; obviously, the described embodiments are only some of the embodiments of the present application, not all of them.
Details are described below.
The terms "first," "second," "third," "fourth," and the like in the description and in the claims of the present application and in the above-described drawings (if any) are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It will be appreciated that the data so used may be interchanged under appropriate circumstances such that the embodiments described herein may be practiced otherwise than as specifically illustrated or described herein.
In the video transmission mode based on the HTTP protocol, please refer to FIG. 1, which is a schematic diagram of HTTP-based video transmission. The system includes a server and a plurality of terminals corresponding to the server, with a network device for monitoring video located between them; all data sent by the terminals to the server can be monitored by the network device. The terminal plays a video as follows: when the player in the terminal connects to the video to be played, it buffers for a period of time, i.e. the terminal first receives part of the video data from the server and uses it as the player's buffer, and as the video keeps playing, video data received from the server is continuously injected into the buffer.
When the network device calculates video stalls in this playing mode by simulating a player, a problem arises: the players on different terminals differ, there are many kinds of players with different playing parameters, and the network device cannot simulate all of them. Deviations therefore occur between the simulation and the actual playback of some players, and the calculated stall information is incorrect.
In view of the above, the video processing method provided by the embodiments of the present application solves this problem. Please refer to FIG. 2, which is a diagram of an embodiment of the video processing method. The method involves a terminal, a network device, and a server, and the specific process (starting from establishing the video connection) may include:
201. The terminal sends a video request message to the server.
When a user wants to watch an online video and opens a video page, the terminal sends a video request message to the server, so that the player on the video page can connect to the server and play the corresponding video.
202. The server sends a video response message to the terminal.
After receiving the video request message, the server generates a response and sends the corresponding video response message to the terminal.
203. The network device acquires the ID and the bitrate of the video from the video response message.
Both the video request message and the video response message can carry the video ID, and the video response message can carry the video bitrate, so one way is to extract both from the video response message; of course, the video ID may also be extracted from the video request message and the bitrate from the video response message.
204. The server sends video data messages to the terminal.
After the video connection is established, the server sends video data messages to the terminal. While receiving them, the terminal does not immediately start playing the video, but first uses the video data received within a period of time as the buffer.
205. The network device calculates the initial buffering duration and the download rate of the video by simulating a player.
The network device can simulate a player to calculate the initial buffering duration and the download rate of the video. Since the initial buffering duration is just the time spent receiving the video data used as the buffer, it depends only on the download rate and the video bitrate, so the difference between this calculated value and the one the terminal itself would calculate is very small; the download rate can be obtained simply by monitoring the video data messages.
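Step 205 might be sketched as follows. This is a hypothetical illustration: the packet representation and the playback-start threshold (`START_BUFFER_S`) are assumptions, not part of the patent.

```python
# Illustrative estimate of the two values computed in step 205.
# START_BUFFER_S and the (time, bytes) packet shape are assumptions.

START_BUFFER_S = 2.0  # assumed playable seconds needed before playback starts

def initial_buffering_and_rate(packets, bitrate_bps):
    """packets: list of (arrival_time_s, payload_bytes) for one video stream."""
    t0 = packets[0][0]
    received = 0
    buffering = None
    for t, size in packets:
        received += size
        playable = received * 8 / bitrate_bps
        if buffering is None and playable >= START_BUFFER_S:
            buffering = t - t0  # time from connection until playback can start
    elapsed = packets[-1][0] - t0
    rate_bps = received * 8 / elapsed if elapsed > 0 else 0.0
    return buffering, rate_bps
```

Note that, as the text says, both values depend only on the observed data messages and the bitrate, not on the behavior of any particular player.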
206. When the player of the terminal judges that the data in the buffer is below the normal-playing threshold, it generates stall information.
Influenced by the video bitrate, the video data consumed from the buffer by playback may outpace the video data received from the server, leaving less data in the buffer. When the buffered video data falls below the normal-playing threshold, the player considers the buffer insufficient to sustain continuous playback, a video stall occurs, and the terminal generates stall information at this time. The stall information includes the number of stalls within a period of time and, for each stall, the stall duration from its beginning to its end.
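The terminal-side bookkeeping of step 206 might look like the following sketch; the threshold value and the shape of the reported stall information are illustrative assumptions.

```python
# Sketch of terminal-side stall bookkeeping (step 206). The threshold and
# the report format are assumptions for illustration only.

class StallTracker:
    def __init__(self, threshold_s=0.5):
        self.threshold_s = threshold_s
        self.stalls = []          # list of (start_time_s, end_time_s)
        self._stall_start = None

    def update(self, now_s, buffered_s):
        """Called periodically with the current buffered duration."""
        stalled = buffered_s < self.threshold_s
        if stalled and self._stall_start is None:
            self._stall_start = now_s                        # stall begins
        elif not stalled and self._stall_start is not None:
            self.stalls.append((self._stall_start, now_s))   # stall ends
            self._stall_start = None

    def report(self):
        """Stall information as reported to the server: count and durations."""
        return {
            "stall_count": len(self.stalls),
            "stall_durations_s": [e - s for s, e in self.stalls],
        }
```

Because this runs in the real player, the report reflects the actual playback state rather than a simulation of it.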
207. The terminal sends the stall information to the server.
After generating the stall information, the terminal sends it to the server.
208. The network device determines the number of stalls and the stall durations from the stall information.
After acquiring the stall information, the network device can extract from it the number of stalls and the stall durations. Because the duration of each stall is not necessarily the same, and the number of stalls varies between periods, the counts and durations need to be recorded separately.
209. The network device associates the stall information with the ID of the video.
After acquiring the stall information, the network device associates it with the ID of the video. Since the stall information is generated by one terminal while the network device is connected to one or more terminals, in the multi-terminal case the stall information needs to be distinguished; associating it with the video ID allows accurate distinction whether several terminals play several videos or one terminal plays one video.
It should be noted that there are two ways of association. One is to associate the stall information with a characteristic field in the data frames of the HTTP protocol; the characteristic field may be the HOST, the Uniform Resource Locator (URL), or an HTTP header field. These fields can distinguish different terminals, or different messages sent by the same terminal: the HOST contains the IP address and host name, the URL identifies the site address from which the video messages are obtained, and the information carried in an HTTP header field can also uniquely identify the video website or the terminal.
The other is to associate the stall information with IP feature information, which includes at least one of an IP address, an IP-ID, and a TCP timestamp. Similar to the characteristic fields in data frames, an IP address can uniquely identify a terminal or a server; the IP identification (IP-ID) differs in every video data message and can be used to distinguish the video data messages of a terminal's different videos; the TCP timestamp is similar, differing in each transmission over TCP, and can likewise distinguish a terminal's different videos.
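A minimal sketch of the association in step 209 using IP feature information follows; the key structure and field names are illustrative assumptions, since the patent only names the candidate features.

```python
# Illustrative association (step 209): key the stall report by IP feature
# information so reports from different terminals map to the right video ID.
# The tuple key and dict shapes are assumptions for illustration.

def association_key(ip_address, ip_id=None, tcp_timestamp=None):
    """Build a lookup key from at least one IP feature, as the method allows."""
    return (ip_address, ip_id, tcp_timestamp)

video_by_key = {}   # filled in while parsing video response/data messages

def associate(stall_report, key):
    """Attach the video ID matching this key to the terminal's stall report."""
    video_id = video_by_key.get(key)
    if video_id is not None:
        stall_report["video_id"] = video_id
    return stall_report
```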
210. The network device determines the quality information of the video according to the stall information, the initial buffering duration, and the download rate.
The network device determines the quality information of the video from the number of stalls, the stall durations, the initial buffering duration, the video bitrate, and the download rate obtained earlier. Other video parameters may also be included, such as the video resolution (indicating image clarity), the video frame rate (indicating fluency), and the viewing duration. From these parameters, the corresponding quality information can be calculated by video quality evaluation algorithms.
The quality information may include a quality grade, i.e. a score given to the video. There may also be a key quality indicator (KQI) indicating the grade of the video information, e.g. a grade produced by a comprehensive analysis of the video bitrate, frame rate, and resolution; and a key performance indicator (KPI) indicating the video network quality grade, e.g. a grade produced by analysing network-quality-related parameters such as the number of stalls and the stall durations.
For example, the corresponding quality information may be calculated with the U-vMOS algorithm or the vMOS algorithm, both described below.
First, U-vMOS is introduced:
U-vMOS aims to build a unified, cross-screen, cross-network, multi-service objective standard for video experience, reflecting the quality of the video service experience through objective evaluation criteria.
The evaluation model of the U-vMOS is mainly divided into three parts, the U-vMOS is a comprehensive score, and the U-vMOS further has three quality scores (sQuality), interaction experience scores (sInteraction), and viewing experience scores (sView) of the sub-videos, and the specific evaluation granularity, i.e. the score, can refer to the following table 1:
TABLE 1
[Table 1 is provided as an image in the original publication and is not reproduced here.]
The real-time sView scoring principle in a real-time monitoring scenario uses the following definitions:
Stall event: a stall event lasts from the moment the picture enters the playing-wait state until the picture resumes playing.
Stall interval: the interval from the end of one stall to the beginning of the next is a stall interval.
Continuous stall event: multiple consecutive stalls form one continuous stall event; when the intervals between stalls are smaller than a fixed duration, the stalls are judged to be continuous.
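The continuous-stall rule above can be sketched as follows; the merge interval value is an assumption, since the text only says the gap must be smaller than a fixed duration.

```python
# Sketch of the "continuous stall event" rule: stalls whose gaps are below
# a fixed interval are merged into one event. MERGE_GAP_S is an assumption.

MERGE_GAP_S = 3.0

def merge_stalls(events):
    """events: time-sorted list of (start_s, end_s) stall events."""
    merged = []
    for start, end in events:
        if merged and start - merged[-1][1] < MERGE_GAP_S:
            merged[-1] = (merged[-1][0], end)   # gap too small: same event
        else:
            merged.append((start, end))
    return merged
```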
The vMOS is described below.
vMOS is an index weighted-average algorithm; that is, the quality scores of several indicators are combined into one score by weighted averaging.
vMOS is typically divided into 5 grades, as shown in Table 2 below:
TABLE 2

Experience grade    vMOS value
Excellent           4.5 to 5
Good                3.5 to 4.5
Fair                2.5 to 3.5
Poor                1.5 to 2.5
Bad                 0 to 1.5
The vMOS value is calculated by multiplying the MOS value of each KPI by its weight, i.e. MOS(KPI) * WEIGHT, and summing all such products to obtain the comprehensive vMOS value. For example, with 5 KPIs:
MOS = MOS(KPI1)*WEIGHT1 + MOS(KPI2)*WEIGHT2 + MOS(KPI3)*WEIGHT3 + MOS(KPI4)*WEIGHT4 + MOS(KPI5)*WEIGHT5
The KPIs may be various parameters of the video: for the video itself, the video resolution, bitrate, and frame rate; for the video network layer, the buffering duration, download speed, stall duration, number of stalls, and so on may all be used as KPIs in the vMOS calculation.
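The weighted average above, together with the grade mapping of Table 2, can be sketched as follows; the KPI names and weight values are illustrative only, not prescribed by the patent.

```python
# Worked sketch of the vMOS weighted average and the Table 2 grade mapping.
# KPI names and the equal weights are illustrative assumptions.

def vmos(kpi_scores, weights):
    """Weighted average: sum of MOS(KPI_i) * WEIGHT_i; weights sum to 1."""
    return sum(kpi_scores[k] * weights[k] for k in kpi_scores)

def grade(score):
    """Map a vMOS value to the experience grade of Table 2."""
    if score >= 4.5:
        return "Excellent"
    if score >= 3.5:
        return "Good"
    if score >= 2.5:
        return "Fair"
    if score >= 1.5:
        return "Poor"
    return "Bad"

scores  = {"resolution": 4.0, "bitrate": 4.5, "frame_rate": 4.0,
           "stall_count": 3.0, "stall_duration": 3.5}
weights = {"resolution": 0.2, "bitrate": 0.2, "frame_rate": 0.2,
           "stall_count": 0.2, "stall_duration": 0.2}
```

With these assumed inputs the comprehensive score is 3.8, which Table 2 maps to the "Good" grade.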
211. The network device outputs the quality information of the video corresponding to the video ID.
After the calculation of the video quality information is completed, the quality information can be output, for example to a display device.
It should be noted that the output can be presented from different angles. If the focus is a region or a network element, network quality is likely the main concern, so the KPI columns may be presented first and in detail, with the quality grades and KQI arranged after them. If the focus is users or video websites, the quality grade of the videos matters more, so the quality grade or KQI columns may be presented first and in detail; a video website can also display comparisons between different video resources. In this way, the method can present video quality from different angles and meet different requirements.
Having described the video processing method of the embodiments of the present application, a network device of the embodiments is described below with reference to FIG. 3, which is a diagram of an embodiment of a network device. The network device may include:
an acquiring module 301, configured to acquire the ID and video information of a video from video messages communicated between the server and the terminal;
the acquiring module 301 is further configured to acquire the stall information of the video sent by the terminal to the server;
a processing module 302, configured to associate the stall information with the ID of the video;
the processing module 302 is further configured to determine the quality information of the video according to the video information and the stall information.
The acquiring module 301 can implement steps 203 and 204 in the embodiment shown in FIG. 2, and the processing module 302 can implement steps 205, 206, 208, 209, 210, and 211; for the specific functions of the acquiring module 301 and the processing module 302, refer to the embodiment shown in FIG. 2, which is not repeated here.
Optionally, the acquiring module 301 is specifically configured to:
acquire the ID and the bitrate of the video from a video response message, where the video response message is a video message sent by the server to the terminal; and
simulate playback according to the video data messages and the video bitrate to determine the initial buffering duration and the download rate of the video, where a video data message is a video message sent by the server to the terminal.
The specific functions of the processing module 302 can refer to the embodiment shown in fig. 2, and are not described herein again.
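The "simulated playback" step above can be sketched as follows: replay the captured video data messages against the advertised bitrate to estimate the initial buffering duration and the download rate. The packet representation and the 2-second start-buffer threshold are illustrative assumptions, not parameters from the embodiment:

```python
# Hedged sketch of simulated playback over captured video data messages.
# packets: list of (arrival_time_s, payload_bytes), sorted by arrival time.

def simulate_playback(packets, bitrate_bps, start_buffer_s=2.0):
    t0 = packets[0][0]
    buffered_bytes = 0
    # bytes of media the player is assumed to need before starting playback
    start_bytes = bitrate_bps / 8 * start_buffer_s
    initial_buffering_s = None
    for t, size in packets:
        buffered_bytes += size
        if initial_buffering_s is None and buffered_bytes >= start_bytes:
            initial_buffering_s = t - t0  # time until playback could start
    total_bytes = sum(size for _, size in packets)
    duration = (packets[-1][0] - t0) or 1e-9  # guard single-packet case
    download_rate_bps = total_bytes * 8 / duration
    return initial_buffering_s, download_rate_bps
```

A real implementation would also track buffer drain during playback; this sketch only recovers the two outputs named in the text.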
Optionally, the processing module 302 is further configured to:
determine the number of stalls and the stall duration from the stall information.
The specific functions of the processing module 302 can refer to the embodiment shown in fig. 2, and are not described herein again.
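A minimal sketch of deriving the number of stalls and the total stall duration from the stall information; the event format (a list of start/end timestamps) is an assumption for illustration:

```python
# Sketch: summarize the stall information reported by the terminal.
# stall_events: list of (start_s, end_s) stall intervals, assumed format.

def summarize_stalls(stall_events):
    count = len(stall_events)                              # number of stalls
    total_s = sum(end - start for start, end in stall_events)  # total duration
    return count, total_s
```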
Optionally, the processing module 302 is specifically configured to:
determining the quality information of the video according to the initial buffering duration, the download rate, and the stall information of the video corresponding to the video ID.
The specific functions of the processing module 302 can refer to the embodiment shown in fig. 2, and are not described herein again.
Optionally, the quality information includes a quality level of the video, a key quality indicator (KQI) indicating the level of the video information and the stall information, and a key performance indicator (KPI) indicating the quality level of the video at the network layer.
For the description of the quality information, refer to the description of step 210 in the embodiment shown in fig. 2, and the description thereof is omitted here.
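One way the quality level, KQI, and KPI might be combined can be sketched as follows; all thresholds and field names here are illustrative assumptions rather than values from the embodiment:

```python
# Illustrative-only scoring: combine initial buffering duration, download
# rate and stall statistics into KQI/KPI values and a quality level.

def video_quality(initial_buffering_s, download_rate_bps, bitrate_bps,
                  stall_count, stall_total_s, play_time_s):
    kqi = {
        "initial_buffering_s": initial_buffering_s,
        "stall_count": stall_count,
        "stall_ratio": stall_total_s / max(play_time_s, 1e-9),
    }
    kpi = {
        "download_rate_bps": download_rate_bps,
        "rate_margin": download_rate_bps / bitrate_bps,  # headroom over bitrate
    }
    # Assumed thresholds: heavy stalling or very slow start -> poor;
    # any stall or slow start -> fair; otherwise good.
    if kqi["stall_ratio"] > 0.05 or initial_buffering_s > 5:
        level = "poor"
    elif stall_count > 0 or initial_buffering_s > 2:
        level = "fair"
    else:
        level = "good"
    return {"quality_level": level, "KQI": kqi, "KPI": kpi}
```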
Optionally, the processing module 302 is further configured to:
outputting the quality information of the video corresponding to the video ID.
The specific functions of the processing module 302 can refer to the embodiment shown in fig. 2, and are not described herein again.
Optionally, the processing module 302 is specifically configured to:
outputting quality information corresponding to a region, a network element, a terminal, or a video website.
The specific functions of the processing module 302 can refer to the embodiment shown in fig. 2, and are not described herein again.
Optionally, the processing module 302 is further configured to:
associating the stall information with a feature field in a data frame of the HTTP (Hypertext Transfer Protocol) used for transmitting the video.
The specific functions of the processing module 302 can refer to the embodiment shown in fig. 2, and are not described herein again.
Optionally, the processing module 302 is further configured to:
associating the stall information with Internet Protocol (IP) feature information, wherein the IP feature information includes at least one of an IP address, an IP-ID, and a Transmission Control Protocol (TCP) timestamp.
The specific functions of the processing module 302 can refer to the embodiment shown in fig. 2, and are not described herein again.
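The association of a stall report with the video flow it belongs to, via IP feature information, can be sketched as follows; the matching policy (any one of the three fields, compared only when present on both sides) is an assumption for illustration:

```python
# Sketch: match a terminal's stall report to a known video flow using
# IP feature information (IP address, IP-ID, or TCP timestamp).

def match_flow(stall_report, flows):
    """flows: dict mapping video_id -> IP feature dict for that flow."""
    keys = ("ip", "ip_id", "tcp_ts")
    for video_id, feat in flows.items():
        for k in keys:
            # compare a field only when both sides carry it
            if k in stall_report and k in feat and stall_report[k] == feat[k]:
                return video_id
    return None  # no flow matched; stall report cannot be associated
```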
Referring to fig. 4, fig. 4 is a diagram of an embodiment of a network device according to an embodiment of the present application. The network device 4 may include at least one processor 401, at least one transceiver 402, and a memory 403, all connected to a bus. A network device according to an embodiment of the present application may have more or fewer components than those shown in fig. 4, may combine two or more components, or may have a different component configuration or arrangement. Each component may be implemented in hardware, software, or a combination of hardware and software including one or more signal processing and/or application-specific integrated circuits.
Specifically, for the embodiment shown in fig. 3, the processor 401 can implement the functions of the obtaining module 301 and the processing module 302, the transceiver 402 is configured for the network device to receive or send information, the memory 403 is configured to store program instructions, and the processor 401 is configured to execute the instructions in the memory 403 to implement the video processing method.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described systems, apparatuses and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the several embodiments provided in the present application, it should be understood that the disclosed system, apparatus and method may be implemented in other manners. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the units is only one logical division, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present application may be substantially implemented or contributed to by the prior art, or all or part of the technical solution may be embodied in a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present application. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes.
The above embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be replaced; and the modifications or the substitutions do not make the corresponding technical solutions depart from the scope of the technical solutions of the embodiments of the present application.

Claims (18)

1. A video processing method, comprising:
the network device obtains an ID of a video and video information from a video message communicated between a server and a terminal;
the network device obtains stall information of the video sent by the terminal to the server;
the network device associates the stall information with the ID of the video;
and the network device determines quality information of the video according to the video information and the stall information.
2. The video processing method of claim 1, wherein the network device obtaining the video ID and the video information from the video message communicated between the server and the terminal comprises:
the network device obtains the ID and the bitrate of the video from a video response message, wherein the video response message is a video message sent by the server to the terminal;
the network device simulates playback of the video according to a video data message and the bitrate to determine an initial buffering duration and a download rate of the video, wherein the video data message is a video message sent by the server to the terminal.
3. The video processing method of claim 2, wherein the method further comprises:
and the network device determines the number of stalls and the stall duration from the stall information.
4. The video processing method of claim 3, wherein the network device determining the quality information of the video according to the video information and the stall information comprises:
and the network device determines the quality information of the video according to the initial buffering duration, the download rate, and the stall information of the video corresponding to the video ID.
5. The video processing method according to claim 4, wherein the quality information comprises a quality level of the video, a key quality indicator (KQI) indicating the level of the video information and the stall information, and a key performance indicator (KPI) indicating a quality level of the video at the network layer.
6. The video processing method of claim 5, wherein the method further comprises:
and the network device outputs the quality information of the video corresponding to the video ID.
7. The video processing method of claim 6, wherein the outputting, by the network device, the quality information corresponding to the video ID comprises:
the network device outputs quality information corresponding to a region, a network element, a terminal, or a video website.
8. The video processing method according to any of claims 1 to 7, wherein the method further comprises:
and the network device associates the stall information with a feature field in a data frame of the HTTP protocol used for transmitting the video.
9. The video processing method of claim 1, wherein the method further comprises:
the network device associates the stall information with IP feature information, wherein the IP feature information comprises at least one of an IP address, an IP-ID, and a TCP timestamp of the terminal.
10. A network device, comprising:
an obtaining module, configured to obtain an ID of a video and video information from a video message communicated between a server and a terminal;
the obtaining module is further configured to obtain stall information of the video sent by the terminal to the server;
a processing module, configured to associate the stall information with the ID of the video;
the processing module is further configured to determine quality information of the video according to the video information and the stall information.
11. The network device of claim 10, wherein the obtaining module is specifically configured to:
obtaining the ID and the bitrate of the video from a video response message, wherein the video response message is a video message sent by the server to the terminal; and
simulating playback of the video according to a video data message and the bitrate to determine the initial buffering duration and the download rate of the video, wherein the video data message is a video message sent by the server to the terminal.
12. The network device of claim 11, wherein the processing module is further configured to:
determine the number of stalls and the stall duration from the stall information.
13. The network device of claim 12, wherein the processing module is specifically configured to:
determining the quality information of the video according to the initial buffering duration, the download rate, and the stall information of the video corresponding to the video ID.
14. The network device of claim 13, wherein the quality information comprises a quality level of the video, a key quality indicator (KQI) indicating the level of the video information and the stall information, and a key performance indicator (KPI) indicating a quality level of the video at the network layer.
15. The network device of claim 14, wherein the processing module is further configured to:
outputting the quality information of the video corresponding to the video ID.
16. The network device of claim 15, wherein the processing module is specifically configured to:
outputting quality information corresponding to a region, a network element, a terminal, or a video website.
17. The network device of any of claims 10-16, wherein the processing module is further configured to:
associating the stall information with a feature field in a data frame of the HTTP (Hypertext Transfer Protocol) used for transmitting the video.
18. The network device of claim 10, wherein the processing module is further configured to:
associating the stall information with Internet Protocol (IP) feature information, wherein the IP feature information comprises at least one of an IP address, an IP-ID, and a Transmission Control Protocol (TCP) timestamp of the terminal.
CN201611264554.8A 2016-12-30 2016-12-30 Video processing method and network equipment Active CN108270738B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201611264554.8A CN108270738B (en) 2016-12-30 2016-12-30 Video processing method and network equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201611264554.8A CN108270738B (en) 2016-12-30 2016-12-30 Video processing method and network equipment

Publications (2)

Publication Number Publication Date
CN108270738A CN108270738A (en) 2018-07-10
CN108270738B true CN108270738B (en) 2021-11-19

Family

ID=62755288

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201611264554.8A Active CN108270738B (en) 2016-12-30 2016-12-30 Video processing method and network equipment

Country Status (1)

Country Link
CN (1) CN108270738B (en)

Families Citing this family (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107241595A (en) * 2017-07-14 2017-10-10 北京奇艺世纪科技有限公司 A kind of video failure monitoring method, device, system and electronic equipment
CN110971916B (en) * 2018-09-28 2022-02-08 武汉斗鱼网络科技有限公司 Live broadcast fluency monitoring method and system
CN111131764B (en) * 2018-11-01 2021-06-15 腾讯科技(深圳)有限公司 Resource exchange video data processing method, computer equipment and storage medium
CN111327964B (en) * 2018-12-17 2022-11-25 中国移动通信集团北京有限公司 Method and device for positioning video playing pause
CN110868622B (en) * 2019-10-30 2022-03-04 北京奇艺世纪科技有限公司 Canton analysis method and device, electronic equipment and storage medium
CN111225387B (en) * 2020-01-16 2023-01-31 广州万码科技有限公司 Mobile network analysis method, system, device and medium based on video playing
CN112019873A (en) * 2020-09-08 2020-12-01 北京金山云网络技术有限公司 Video code rate adjusting method and device and electronic equipment
CN116114254A (en) * 2020-09-14 2023-05-12 华为技术有限公司 Communication method and device
CN112260961B (en) * 2020-09-23 2024-06-14 北京金山云网络技术有限公司 Network traffic scheduling method and device, electronic equipment and storage medium
CN112511818B (en) * 2020-11-24 2022-08-19 上海哔哩哔哩科技有限公司 Video playing quality detection method and device
CN113541832B (en) * 2021-06-24 2023-11-03 青岛海信移动通信技术有限公司 Terminal, network transmission quality detection method and storage medium
CN114554208A (en) * 2022-01-14 2022-05-27 百果园技术(新加坡)有限公司 Video coding configuration method, system, equipment and storage medium
CN114598924B (en) * 2022-03-10 2024-03-22 恒安嘉新(北京)科技股份公司 Method, device, equipment and medium for detecting comprehensive video playing state of client
CN114444984B (en) * 2022-04-11 2022-07-08 深圳市度易科技有限公司 Remote education-based school internal and external management system

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8041344B1 (en) * 2007-06-26 2011-10-18 Avaya Inc. Cooling off period prior to sending dependent on user's state
CN103024598A (en) * 2013-01-10 2013-04-03 深信服网络科技(深圳)有限公司 Device and method for acquiring network video playing fluency
CN103533454A (en) * 2013-10-29 2014-01-22 北京国双科技有限公司 Detection method and device for video playing fluency
CN106254902A (en) * 2016-08-19 2016-12-21 恒安嘉新(北京)科技有限公司 Method and system for mobile Internet video user perception and analysis

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103916716B (en) * 2013-01-08 2017-06-20 北京信威通信技术股份有限公司 Bitrate smoothing method for real-time video transmission in a wireless network
CN103561354B (en) * 2013-10-29 2017-02-08 北京国双科技有限公司 Method and device for calculating and processing video smoothness
CN105991364B (en) * 2015-02-28 2020-07-17 中兴通讯股份有限公司 User perception evaluation method and device
CN105872854A (en) * 2015-12-14 2016-08-17 乐视网信息技术(北京)股份有限公司 Watermark showing method and device
CN106060663B (en) * 2016-06-24 2018-11-27 武汉斗鱼网络科技有限公司 The method and system of monitor video smoothness degree during net cast

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8041344B1 (en) * 2007-06-26 2011-10-18 Avaya Inc. Cooling off period prior to sending dependent on user's state
CN103024598A (en) * 2013-01-10 2013-04-03 深信服网络科技(深圳)有限公司 Device and method for acquiring network video playing fluency
CN103533454A (en) * 2013-10-29 2014-01-22 北京国双科技有限公司 Detection method and device for video playing fluency
CN106254902A (en) * 2016-08-19 2016-12-21 恒安嘉新(北京)科技有限公司 Method and system for mobile Internet video user perception and analysis

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
OVQMS: A Passive and Adaptive System for Over-the-Top Video Quality Monitoring; Changtong Che et al.; 11th International Conference on Wireless Communications, Networking and Mobile Computing (WiCOM 2015); 2016-04-07; entire document *

Also Published As

Publication number Publication date
CN108270738A (en) 2018-07-10

Similar Documents

Publication Publication Date Title
CN108270738B (en) Video processing method and network equipment
Ghadiyaram et al. A subjective and objective study of stalling events in mobile streaming videos
US9781221B2 (en) Method and apparatus for passively monitoring online video viewing and viewer behavior
US10462504B2 (en) Targeting videos based on viewer similarity
EP3313043B1 (en) System and method for determining quality of a media stream
WO2017107649A1 (en) Video transmission method and device
EP3291551A1 (en) Image delay detection method and system
CN110418170B (en) Detection method and device, storage medium and electronic device
US8577996B2 (en) Method and apparatus for tracing users of online video web sites
CN106604078B (en) A kind of network video recommended method and device
CN104735473B (en) A kind of detection method and device of video render
CN104040953A (en) Quality of user experience testing for video transmissions
US20180376195A1 (en) Live streaming quick start method and system
CN103856789A (en) System and method for achieving OTT service quality guarantee based on user behavior analysis
CN109714622B (en) Video data processing method and device and electronic equipment
CN109587521B (en) Video stuck judgment method and device
EP2654225A2 (en) Fault detection in streaming media
EP3754998B1 (en) Streaming media quality monitoring method and system
WO2016134564A1 (en) User perception estimation method and apparatus
CN109982068B (en) Method, apparatus, device and medium for evaluating quality of synthesized video
US11570228B2 (en) System and method for managing video streaming quality of experience
CN111970150B (en) Log information processing method, device, server and storage medium
CN110087141A (en) Method of transmitting video data, device, client and server
CN106911927A (en) Assess method, device and the DPI equipment of Internet video user experience quality
US20160269787A1 (en) Data processing method and apparatus for counting audience rating

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant