CN114422866B - Video processing method and device, electronic equipment and storage medium - Google Patents

Video processing method and device, electronic equipment and storage medium

Info

Publication number
CN114422866B
CN114422866B
Authority
CN
China
Prior art keywords
video
play
change rate
frame
preset
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202210048030.4A
Other languages
Chinese (zh)
Other versions
CN114422866A (en)
Inventor
吕华
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen TCL New Technology Co Ltd
Original Assignee
Shenzhen TCL New Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen TCL New Technology Co Ltd filed Critical Shenzhen TCL New Technology Co Ltd
Priority to CN202210048030.4A priority Critical patent/CN114422866B/en
Publication of CN114422866A publication Critical patent/CN114422866A/en
Application granted granted Critical
Publication of CN114422866B publication Critical patent/CN114422866B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/60Network structure or processes for video distribution between server and client or between remote clients; Control signalling between clients, server and network components; Transmission of management data between server and client, e.g. sending from server to client commands for recording incoming content stream; Communication details between server and client 
    • H04N21/63Control signaling related to video distribution between client, server and network components; Network processes for video distribution between server and clients or between remote clients, e.g. transmitting basic layer and enhancement layers over different transmission paths, setting up a peer-to-peer communication via Internet between remote STB's; Communication protocols; Addressing
    • H04N21/647Control signaling between network components and server or clients; Network processes for video distribution between server and clients, e.g. controlling the quality of the video stream, by dropping packets, protecting content from unauthorised alteration within the network, monitoring of network load, bridging between two different networks, e.g. between IP and wireless
    • H04N21/64723Monitoring of network processes or resources, e.g. monitoring of network load
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L65/00Network arrangements, protocols or services for supporting real-time applications in data packet communication
    • H04L65/80Responding to QoS
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/60Network structure or processes for video distribution between server and client or between remote clients; Control signalling between clients, server and network components; Transmission of management data between server and client, e.g. sending from server to client commands for recording incoming content stream; Communication details between server and client 
    • H04N21/63Control signaling related to video distribution between client, server and network components; Network processes for video distribution between server and clients or between remote clients, e.g. transmitting basic layer and enhancement layers over different transmission paths, setting up a peer-to-peer communication via Internet between remote STB's; Communication protocols; Addressing
    • H04N21/643Communication protocols
    • H04N21/6437Real-time Transport Protocol [RTP]

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Computer Security & Cryptography (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)

Abstract

The embodiment of the application discloses a video processing method, a video processing device, electronic equipment and a storage medium. The video processing method can receive the sequence identification of video frames in a playing video; detect video frame loss in the playing video based on the sequence identification; identify the picture change rate of the playing video when video frame loss is detected; and perform play adjustment processing on the playing video based on the picture change rate, thereby improving the playing effect of the playing video and the user's experience of watching it.

Description

Video processing method and device, electronic equipment and storage medium
Technical Field
The present application relates to the field of communications technologies, and in particular, to a video processing method, a device, an electronic apparatus, and a storage medium.
Background
With the development of information technology and communication technology, watching video has become a part of people's lives. In the prior art, the Transmission Control Protocol (TCP) or the User Datagram Protocol (UDP) is generally used for real-time transmission of the playing video. When video frame loss occurs during TCP or UDP transmission, the way the loss is handled is not combined with the specific video playing scene, which reduces the playing effect of the playing video and the user's experience of watching it.
Disclosure of Invention
The embodiment of the application provides a video processing method, a video processing device, electronic equipment and a storage medium, which can improve the playing effect of playing video, thereby improving the experience of watching the playing video for users.
The embodiment of the application provides a video processing method, which comprises the following steps:
receiving a sequence identifier of a video frame in a playing video;
based on the sequence identification, detecting the loss of video frames of the played video;
when detecting that the video frame of the playing video is lost, identifying the picture change rate of the playing video;
and performing play adjustment processing on the play video based on the picture change rate.
Correspondingly, the embodiment of the application also provides a video processing device, which comprises:
the receiving unit is used for receiving the sequence identification of the video frames in the playing video;
the loss detection unit is used for detecting the loss of video frames of the playing video based on the sequence identification;
the identification unit is used for identifying the picture change rate of the playing video when detecting that the video frame of the playing video is lost;
and the play adjusting unit is used for carrying out play adjusting processing on the play video based on the picture change rate.
In an embodiment, the loss detection unit includes:
the identification matching subunit is used for matching the sequence identification with a preset reference sequence identification;
a detection subunit, configured to detect whether a preset data pool includes a video frame corresponding to the preset reference sequence identifier when the sequence identifier is matched with the preset reference sequence identifier;
and the overtime judging subunit is used for judging whether the acquisition time of the video frame corresponding to the preset reference sequence identifier is overtime or not when the video frame corresponding to the preset reference sequence identifier is not included in the preset data pool.
In an embodiment, the timeout determination subunit includes:
the extraction module is used for extracting the acquisition time of the video frame corresponding to the preset reference sequence identifier when the video frame corresponding to the preset reference sequence identifier is not included in the preset data pool;
the time matching module is used for matching the acquired time with a preset time threshold;
and the overtime module is used for determining, when the acquisition time does not match the preset time threshold, that acquisition of the video frame corresponding to the preset reference sequence identifier has timed out and that a video frame of the playing video has been lost.
In an embodiment, the loss detection unit further includes:
an updating subunit, configured to update the preset reference sequence identifier when the preset data pool includes a video frame corresponding to the sequence identifier, so as to obtain an updated reference sequence identifier;
and the receiving subunit is used for receiving the rest video frames in the playing video based on the updated reference sequence identification.
In an embodiment, the identification unit comprises:
a time calculating subunit, configured to calculate time information of the video frame of the playing video;
the statistical processing subunit is used for carrying out statistical processing on the time information to obtain the statistical time information of the video frame;
and the logic operation processing subunit is used for carrying out logic operation processing on the statistical time information to obtain the picture change rate of the playing video.
In an embodiment, the play adjusting unit includes:
the comparison subunit is used for comparing the picture change rate with a preset picture change rate to obtain a comparison result;
and the play adjustment subunit is used for carrying out play adjustment processing on the play video by adopting a corresponding adjustment mode based on the comparison result.
In an embodiment, the play-adjusting subunit includes:
the first play adjustment module is used for carrying out play adjustment processing on the play video in a first adjustment mode when the picture change rate accords with the preset picture change rate;
and the second play adjustment module is used for carrying out play adjustment processing on the play video by adopting a second adjustment mode when the picture change rate does not accord with the preset picture change rate.
Correspondingly, the embodiment of the application also provides electronic equipment, which comprises a memory and a processor; the memory stores a computer program, and the processor is configured to run the computer program in the memory to execute the video processing method provided in any one of the embodiments of the present application.
Accordingly, the embodiments of the present application further provide a storage medium storing a computer program, where the computer program when executed by a processor implements the video processing method provided in any one of the embodiments of the present application.
The embodiment of the application can receive the sequence identification of the video frames in the playing video; based on the sequence identification, detecting the loss of video frames of the played video; when detecting that the video frame loss exists in the played video, identifying the picture change rate of the played video; and performing play adjustment processing on the play video based on the picture change rate, so that the play effect of the play video is improved, and the experience of watching the play video by a user is improved.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are needed in the description of the embodiments will be briefly introduced below, it being obvious that the drawings in the following description are only some embodiments of the present application, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
Fig. 1 is a schematic view of a video processing method according to an embodiment of the present application;
fig. 2 is a schematic flow chart of a video processing method according to an embodiment of the present application;
FIG. 3 is a schematic flow chart of a video processing method according to an embodiment of the present disclosure;
fig. 4 is a schematic view of still another scenario of the video processing method provided in the embodiment of the present application;
FIG. 5 is a schematic flow chart of a video processing method according to an embodiment of the present disclosure;
FIG. 6 is a schematic flow chart of a video processing method according to an embodiment of the present disclosure;
FIG. 7 is a schematic flow chart of a video processing method according to an embodiment of the present disclosure;
fig. 8 is a schematic structural diagram of a video processing apparatus according to an embodiment of the present application;
fig. 9 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described clearly and completely below with reference to the accompanying drawings in the embodiments of the present application. Obviously, the described embodiments are only some, rather than all, of the embodiments of the present application. All other embodiments obtained by those skilled in the art based on the embodiments herein without inventive effort shall fall within the scope of protection of the present application.
The embodiment of the application provides a video processing method which can be executed by a video processing device, and the video processing device can be integrated in an electronic device. The electronic device may include at least one of a terminal, a server, and the like. I.e. the video processing method may be performed by the terminal or by the server.
The terminal may include a personal computer, a tablet computer, a smart television, a smart phone, a smart home, a wearable electronic device, a VR/AR device, a vehicle-mounted computer, and the like.
The server may be an interworking server or a background server among a plurality of heterogeneous systems, an independent physical server, a server cluster or distributed system formed by a plurality of physical servers, or a cloud server providing basic cloud computing services such as cloud services, cloud databases, cloud computing, cloud functions, cloud storage, network services, cloud communication, middleware services, domain name services, security services, big data, and artificial intelligence platforms.
In an embodiment, as shown in fig. 1, the video processing apparatus may be integrated on an electronic device such as a terminal or a server, so as to implement the video processing method provided in the embodiment of the present application. Specifically, the electronic device may receive a sequence identification of a video frame in the play video; based on the sequence identification, detecting the loss of video frames of the played video; when detecting that the video frame loss exists in the played video, identifying the picture change rate of the played video; and performing play adjustment processing on the play video based on the picture change rate.
The following detailed description is given, respectively, of the embodiments, and the description sequence of the following embodiments is not to be taken as a limitation of the preferred sequence of the embodiments.
The video processing method according to the embodiment of the present application will be described in terms of integrating the video processing apparatus in the electronic device.
As shown in fig. 2, a video processing method is provided, and the specific flow includes:
101. Receiving the sequence identification of the video frames in the playing video.
In one embodiment, real-time transmission of the playing video is typically performed using the Transmission Control Protocol (TCP) or the User Datagram Protocol (UDP).
For example, when network congestion is encountered while the playing video is transmitted in real time over TCP, the transmitting end discards old data that has not yet been sent, and the transmission network does not discard data. At this time, the user at the receiving end sees a picture jump (also called a frame skip) on the display of the playing video, but no broken-picture phenomenon (also called a mosaic) occurs.
For another example, when network congestion is encountered while the playing video is transmitted in real time over UDP, the transmitting end does not discard data, but the transmission network does. At this time, the user at the receiving end may see part of the picture of the playing video lost and a mosaic displayed.
In an embodiment, when the playing video has video frame loss, no consideration is given to the user's usage scenario when deciding whether the playing video shows a frame skip or a mosaic. For example, when the picture of the playing video changes little, a mosaic is easily perceived by the user, and the playing effect is better if the frame skip mode is adopted. For another example, when the picture of the playing video changes rapidly, a slight mosaic is hardly perceived by the user, while a frame skip is easily perceived.
Therefore, the embodiment of the application provides a video processing method, which can receive the sequence identification of video frames in the playing video; detect video frame loss in the playing video based on the sequence identification; identify the picture change rate of the playing video when video frame loss is detected; and perform play adjustment processing on the playing video based on the picture change rate. By identifying the picture change rate of the playing video, different adjustment modes can be adopted for different picture change rates, so that the playing effect of the playing video is improved.
Wherein playing the video may include playing the video in real time.
The video frames may be the units that constitute the playing video, i.e., the playing video is composed of individual video frames.
The sequence identifier may indicate which frame of the playing video a given video frame is. By means of the sequence identification, it is possible to know whether the playing video is missing a video frame.
In an embodiment, to ensure that the play video can be correctly reassembled at the receiving end, when the sending end transmits the play video, the play video is often split into a plurality of video frames, and then each video frame is sent to the receiving end. Then, after receiving the video frames of the playing video, the receiving end can reorganize the video frames, so as to obtain the playing video. In order to enable the receiving end to correctly reorganize video frames of the playing video, the transmitting end can add corresponding sequence identifiers for each video frame of the playing video before transmitting the playing video, so that the receiving end can correctly reorganize the video frames according to the sequence identifiers of the video frames.
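As an illustration only (not part of the patent), the following minimal Python sketch shows one way the sending end could tag each frame with a sequence identifier and the receiving end could reassemble the frames in order; the VideoFrame type and the function names are hypothetical.

```python
from dataclasses import dataclass
from typing import Iterable, Iterator, List

@dataclass
class VideoFrame:
    seq: int        # sequence identifier added by the sending end
    payload: bytes  # encoded frame data

def tag_frames(encoded_frames: Iterable[bytes], first_seq: int = 0) -> Iterator[VideoFrame]:
    """Attach a monotonically increasing sequence identifier to each frame."""
    for offset, payload in enumerate(encoded_frames):
        yield VideoFrame(seq=first_seq + offset, payload=payload)

def reassemble(received: List[VideoFrame]) -> bytes:
    """Reorder received frames by sequence identifier and concatenate their payloads."""
    return b"".join(f.payload for f in sorted(received, key=lambda f: f.seq))
```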
In one embodiment, before receiving the sequence identifier of the video frame in the play video, a communication connection is further established between the sending end and the receiving end, so that the play video can be transmitted between the sending end and the receiving end through the communication connection.
In one embodiment, the video compression mode and the video transmission format can be negotiated between the transmitting end and the receiving end. A communication connection may then be established based on negotiating the video compression scheme and the video transmission format, and video transmitted based on the communication connection.
For example, the transmitting end may transmit the video compression mode and the video transmission format that can be supported by the transmitting end to the receiving end, and then the receiving end may screen the target video compression mode and the target video transmission format between the receiving end and the transmitting end from the video compression mode and the video transmission format that can be supported by the transmitting end in combination with the video compression mode and the video transmission format that can be supported by the receiving end. Then, the receiving end and the transmitting end can establish a communication connection based on the target video compression mode and the target video transmission format, and transmit the playing video in real time through the communication connection.
For example, the transmitting end and the receiving end may establish a video transmission channel and a control channel. The video transmission channel is used for transmitting the playing video. For example, the video transmission channel may be based on a real-time transmission protocol (Realtime Transport Protocol, RTP) to transmit the play video. Wherein the RTP protocol is a real-time transport protocol based on UDP protocol. When the video transmission channel is based on the RTP protocol for transmitting the play video, a play video is split into a plurality of RTP packets, and each RTP packet can be regarded as a video frame.
The control channel can be used by the two transmission parties to negotiate the video compression mode and the video transmission format. In addition, when a video frame of the playing video is lost, the control channel can also be used to transmit the video frame loss information sent by the receiving end to the sending end, so that the sending end can resend the video frame lost by the receiving end as soon as possible and the playing video can return to normal playback.
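For reference, when video frames are carried as RTP packets, the 16-bit sequence number sits in bytes 2-3 of the fixed 12-byte RTP header defined in RFC 3550. The sketch below is illustrative only and is not taken from the patent.

```python
import struct

def parse_rtp_sequence_number(packet: bytes) -> int:
    """Return the 16-bit sequence number from a fixed RTP header (RFC 3550).

    Bytes 0-1 hold version, flags and payload type; bytes 2-3 hold the
    sequence number in network (big-endian) byte order. The sequence
    number wraps around after 65535.
    """
    if len(packet) < 12:
        raise ValueError("packet is shorter than the 12-byte fixed RTP header")
    (seq,) = struct.unpack_from("!H", packet, 2)
    return seq
```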
102. Detecting video frame loss in the played video based on the sequence identification.
In one embodiment, when there is a loss of video frames in the play video, it is indicated that the video processing apparatus needs to adjust the display condition of the play video. For example, a frame skip is displayed on a display page of the play video, or a mosaic is displayed on a display page of the play video. Therefore, the video processing device needs to detect the video frame loss of the playing video, and when detecting that the playing video has the video frame loss, the video processing device can identify the picture change rate of the playing video and perform play adjustment processing on the playing video based on the picture change rate.
103. When detecting that the playing video has video frame loss, identifying the picture change rate of the playing video.
In an embodiment, when the video processing device detects that the video frame of the play video is lost, the video processing device may identify a frame change rate of the play video and perform play adjustment processing on the play video based on the frame change rate.
The picture change rate may refer to a change rate corresponding to a playing picture when the playing video is played.
In one embodiment, the sensitivity of the user to the change of the playing video can be determined by identifying the frame change rate of the playing video. For example, when the frame variation of the played video is not large, the mosaic is easily perceived by the user, and the frame skip is not easily perceived by the user. For another example, when the frame of the played video varies greatly, a slight mosaic is hardly perceived by the user, and a frame skip is easily perceived. Therefore, the video processing apparatus can recognize the picture change rate of the play video and perform adjustment processing on the play video based on the picture change rate.
104. Performing play adjustment processing on the play video based on the picture change rate.
In an embodiment, after the video processing apparatus identifies the frame rate of change of the played video, the play adjustment process may be performed on the played video based on the frame rate of change.
For example, when playing video, when the transmission network has sporadic congestion or errors and the picture change rate is slow, when displaying the played video, a frame skip strategy is adopted, so that a user cannot easily feel that the video playing is not smooth. For another example, when the picture change rate is high, a mosaic playing strategy may be adopted when the video is displayed, so that the user does not easily feel that the video is not smoothly played.
The embodiment of the application provides a video processing method, which comprises the following steps: receiving a sequence identifier of a video frame in a playing video; based on the sequence identification, detecting the loss of video frames of the played video; when detecting that the video frame loss exists in the played video, identifying the picture change rate of the played video; and performing play adjustment processing on the play video based on the picture change rate. According to the method provided by the embodiment of the application, different play adjustment processing can be performed on the play video according to different picture change rates of the play video, so that a user cannot feel the condition that the play video has video frame loss, and the play effect of the play video and the experience of watching the play video by the user are improved.
According to the method described in the above embodiments, examples are described in further detail below.
The method of the embodiment of the present application will be described by taking the example that the video processing method is integrated on the receiving end (where the receiving end may be a terminal). Specifically, as shown in fig. 3, the flow of the video processing method provided in the embodiment of the present application may include:
201. The receiving end receives the sequence identification of the video frames in the playing video.
In one embodiment, the receiving end may establish a communication connection with the transmitting end before the receiving end receives the sequence identification of the video frames in the play video.
For example, as shown in fig. 4, the transmitting end and the receiving end may negotiate a video compression mode, a video transmission format and a variable frame rate to be supported, and establish a communication connection based on the negotiated video compression mode, video transmission format and variable frame rate to be supported.
The video compression mode may be H.264. A video is the result of continuously playing a sequence of pictures, and adjacent pictures change very little, so an intermediate picture only needs to carry the difference relative to the previous picture. For example, a video is composed of 100 consecutive pictures, each called a frame. A simple compression keeps picture 1 intact, keeps only the difference between picture 2 and picture 1, and so on. The retained portion of each picture is then compressed again to form a series of compressed data, which constitutes the compressed video stream. In H.264, a full-picture frame is called an I frame (the first frame in the example), and a frame that carries only the difference portion is called a P frame. Practical compression is much more complex than described above; there is also another type of difference frame, the B frame, which is not used in real-time transmission. Therefore, for the receiving end to display the picture, an I frame must be received first, otherwise the picture cannot be displayed; when a P frame is lost, only the difference portion cannot be displayed, and when the picture changes very little, the user can hardly perceive the difference.
The frame rate is the number of pictures contained in one second; the larger the frame rate, the better the continuity, but the higher the transmission requirement. Current video encoding and decoding support variable frame rates, i.e., a picture is delivered only when it changes, and not when it does not. Fixed-frame-rate delivery is not used, and no data is transmitted if the sender's picture is unchanged.
In one embodiment, after the communication connection is established, the receiving end and the transmitting end may transmit the play video based on the communication connection. For example, when the receiving end is a television and the transmitting end is a computer, the video can be transmitted and played between the television and the computer based on the established communication connection.
In an embodiment, as shown in fig. 4, during the process of transmitting the playing video between the sending end and the receiving end, the receiving end may continuously perform video frame loss detection on the playing video. When the receiving end detects that the video frame of the playing video is lost, the receiving end can request the picture synchronization from the sending end.
202. The receiving end performs video frame loss detection on the played video based on the sequence identification.
In an embodiment, when the video is transmitted and played between the receiving end and the sending end, the video can be transmitted based on the RTP protocol. Wherein the RTP protocol is a real-time transport protocol based on UDP protocol. When the video transmission channel is based on the RTP protocol for transmitting the play video, a play video is split into a plurality of RTP packets, and each RTP packet can be regarded as a video frame. Wherein each RTP packet has a sequence identity. In general, the sequence identifier corresponding to the consecutive RTP packets is continuous, and if not, the identifier indicates that the video frame is lost. Therefore, the receiving end can perform video frame loss detection on the played video based on the sequence identification.
In an embodiment, after the receiving end receives the sequence identifier of the video frame of the playing video, it may be determined whether the video frame is the first frame based on the sequence identifier of the video frame.
When the video frame is the first frame, the sequence identifier of the video frame may be made to be a preset reference sequence identifier. The preset reference sequence identifier can be used as a basis for judging whether a video frame is lost or not. The preset reference sequence identifier may be equivalent to a video frame that the receiving end wants to receive. For example, when the preset reference sequence identifier is 12, it indicates that the receiving end wants to receive the video frame with the sequence identifier of 12.
When the video frame is not the first frame, the sequence identifier of the video frame can be matched against the preset reference sequence identifier. When the sequence identifier matches the preset reference sequence identifier, it is detected whether the preset data pool includes the video frame corresponding to the preset reference sequence identifier. When the preset data pool does not include the video frame corresponding to the preset reference sequence identifier, it is determined whether the acquisition time of the video frame corresponding to the preset reference sequence identifier has timed out.
When the acquisition time of the video frame corresponding to the preset reference sequence identifier is overtime, the video frame is lost.
For example, as shown in fig. 5, the receiving end obtains a new video frame (a Fragment) of the playing video, where the sequence identifier of the Fragment is SEQ. Then, it can be determined whether the Fragment is the first frame; if so, the preset reference sequence identifier (NextFragmentSEQ) is set to SEQ. When the Fragment is not the first frame, it can be determined whether SEQ is greater than or equal to NextFragmentSEQ. When SEQ is smaller than NextFragmentSEQ, it indicates that the receiving end has already received the video frame, and therefore the receiving end can accept a new video frame. When SEQ is greater than or equal to NextFragmentSEQ, the Fragment can be added to the preset data pool. Then, the receiving end can determine whether the data pool includes the video frame whose sequence identifier equals NextFragmentSEQ. When the preset data pool does not include the video frame corresponding to NextFragmentSEQ, it indicates that the receiving end has not received the desired video frame; at this time, the receiving end may determine whether the acquisition time of the video frame corresponding to NextFragmentSEQ has timed out.
The preset data pool can be used for storing video frames received by the receiving end.
In one embodiment, the multiple video frames of the playing video may not arrive at the receiving end in sequence, which is determined by the characteristics of an IP network. Therefore, when the receiving end receives video frames, the video frames need to be reordered, and the reordering has a time limit, such as 100 ms. For example, the video frame with NextFragmentSEQ = 11 currently needs to be received, but the video frame with SEQ = 13 arrives first, so the video frame with SEQ = 13 can be saved first. If the video frame with SEQ = 11 still has not arrived after more than 100 ms, the video frame with SEQ = 11 is declared lost; the receiving end then waits for the video frame with SEQ = 12, and NextFragmentSEQ is updated to 12.
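The flow of fig. 5 can be pictured with the following Python sketch. It is illustrative only: the class and attribute names (FrameReceiver, next_seq, pool) are not from the patent, and the 100 ms threshold is the example value above.

```python
import time

REORDER_TIMEOUT_S = 0.100  # preset time threshold, 100 ms as in the example above

class FrameReceiver:
    """Buffers out-of-order frames and declares loss, mirroring the fig. 5 flow."""

    def __init__(self):
        self.pool = {}             # preset data pool: SEQ -> payload
        self.next_seq = None       # NextFragmentSEQ, the preset reference sequence identifier
        self.waiting_since = None  # when we began waiting for next_seq

    def on_fragment(self, seq, payload):
        """Handle a newly received Fragment; return any frames that are now in order."""
        if self.next_seq is None:  # first frame
            self.next_seq = seq
        if seq < self.next_seq:    # already handled; go on accepting new frames
            return []
        self.pool[seq] = payload
        return self._drain()

    def poll_timeout(self):
        """Declare next_seq lost if it has not arrived within the preset threshold."""
        if self.waiting_since is None:
            return None, []
        if time.monotonic() - self.waiting_since > REORDER_TIMEOUT_S:
            lost = self.next_seq
            self.next_seq += 1     # update the reference sequence identifier
            self.waiting_since = None
            return lost, self._drain()  # caller can now request picture synchronization
        return None, []

    def _drain(self):
        """Hand over consecutive frames from the pool; start the wait timer otherwise."""
        ready = []
        while self.next_seq in self.pool:
            ready.append(self.pool.pop(self.next_seq))
            self.next_seq += 1
            self.waiting_since = None
        if self.waiting_since is None and self.pool:
            self.waiting_since = time.monotonic()
        return ready
```

A caller would feed each received Fragment to on_fragment and call poll_timeout periodically; a non-None lost value corresponds to the point where the receiving end would request picture synchronization from the sending end.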
In an embodiment, the step of "when the preset data pool does not include the video frame corresponding to the sequence identifier, determining whether the acquisition time of the video frame corresponding to the preset reference sequence identifier is timeout" may include:
when the preset data pool does not comprise the video frames corresponding to the sequence identifications, extracting the acquisition time of the video frames corresponding to the preset reference sequence identifications;
matching the acquired time with a preset time threshold;
when the acquisition time is not matched with the preset time threshold, the acquisition of the video frame corresponding to the preset reference sequence identifier is overtime, and the video frame of the playing video is lost.
The acquisition time of the video frame may refer to the time when the receiving end actually acquires the video frame. If the receiving end does not receive the video frame all the time, the acquisition time of the video frame is accumulated all the time.
The preset time threshold may be a basis for determining whether the video frame has acquired a timeout.
For example, when the acquisition time of the video frame exceeds the preset time threshold, it indicates that the receiving end has not received the video frame whose sequence identifier is the preset reference sequence identifier within the specified time; therefore, acquisition of the video frame corresponding to the preset reference sequence identifier has timed out, and a video frame of the playing video has been lost.
For example, the preset time threshold may be set to 100 ms. When the acquisition time of the video frame exceeds 100 ms, the receiving end has not received the video frame whose sequence identifier is the preset reference sequence identifier within the specified time; therefore, acquisition of the video frame corresponding to the preset reference sequence identifier has timed out, and a video frame of the playing video has been lost.
In an embodiment, the video processing method provided in the embodiment of the present application further includes:
when the preset data pool comprises a video frame corresponding to a preset reference sequence identifier, updating the preset reference sequence identifier to obtain an updated reference sequence identifier;
the remaining video frames in the play video are received based on the updated reference sequence identification.
For example, as shown in fig. 5, when the preset data pool includes the video frame corresponding to NextFragmentSEQ, it indicates that the video frames in the preset data pool are continuous; at this time, the newly received Fragment may be removed from the preset data pool, and NextFragmentSEQ may be increased by 1. Then, the remaining video frames in the playing video are received based on the updated reference sequence identifier, and whether the playing video has video frame loss is detected based on the updated reference sequence identifier.
In addition, as shown in fig. 5, after updating the preset reference sequence identifier, the receiving end may further determine whether the received video frames can be reassembled into the playing video. If they cannot yet be reassembled, the receiving end can receive the remaining video frames in the playing video based on the updated reference sequence identifier.
In an embodiment, when the video frame corresponding to the preset reference sequence identifier cannot be received within the specified time, the preset reference sequence identifier may also be updated to obtain the updated reference sequence identifier. Then, the receiving end informs the sending end that a video frame of the playing video has been lost and that picture synchronization is needed. The synchronization picture may refer to requiring the transmitting end to transmit a complete picture frame, for example, an IDR frame of the H.264 protocol.
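As a purely illustrative sketch (the patent does not define the control-channel message format), the receiving end's synchronization request might be built as follows; the JSON layout and field names are assumptions.

```python
import json

def build_sync_request(lost_seq: int) -> bytes:
    """Build a control-channel message asking the sending end for a picture
    synchronization frame (e.g. an H.264 IDR frame). The message format shown
    here is hypothetical and used only to illustrate the idea."""
    message = {"type": "picture_sync_request", "lost_seq": lost_seq}
    return json.dumps(message).encode("utf-8")
```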
203. When detecting that the video frame loss exists in the played video, the receiving end identifies the picture change rate of the played video.
In an embodiment, when the receiving end detects that the video frame of the playing video is lost, the receiving end can identify the frame change rate of the playing video. Specifically, the step of identifying a frame change rate of the play video when detecting that the play video has a video frame loss may include:
calculating time information of video frames of the played video;
carrying out statistical processing on the time information to obtain statistical time information of the video frames;
and carrying out logic operation processing on the counted time information to obtain the picture change rate of the played video.
Wherein the time information of the video frame may refer to a relationship between an actual receiving time and an expected receiving time of the video frame.
In an embodiment, the time information of the video frame may refer to a difference between an actual receiving time and an expected receiving time of the video frame, and so on.
The receiving end may acquire an actual receiving time of the video frame, subtract an expected receiving time of the video frame from the actual receiving time of the video frame, and use the obtained time difference as time information of the video frame.
For example, as shown in fig. 6, the time information of the video frame may be equal to the difference between the actual reception time and the expected reception time of the video frame.
Wherein when a plurality of video frames of a play video are received, time information of each video frame can be calculated separately. For example, when 10 video frames of a play video are received, time information of the 10 video frames may be calculated, respectively.
In one embodiment, the time information may be statistically processed to obtain statistical time information for the video frame.
The statistical processing may be processing the time information by using a mathematical statistical method. For example, statistical processing may include averaging, variance or standard deviation, and so forth.
For example, after calculating the time information of each video frame in the play video, the time information of a plurality of video frames may be averaged and the average value may be used as the statistical time information.
For example, as shown in fig. 6, the average of the time differences of the multiple video frames may be calculated. For example, if there are 10 video frames, the sum of their time differences may be divided by 10 to obtain the statistical time information.
In an embodiment, after the statistical time information is obtained, logic operation processing may be further performed on the statistical time information, so as to obtain a frame change rate of the playing video.
For example, as shown in fig. 6, the picture change rate may be equal to the inverse of the statistical time information.
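Putting the three steps above together, a minimal Python sketch of the computation could look like the following; the function name and the handling of non-positive averages are assumptions for illustration.

```python
from statistics import mean
from typing import Sequence

def picture_change_rate(actual_times: Sequence[float],
                        expected_times: Sequence[float]) -> float:
    """Picture change rate as described above:
    time information       = actual receiving time - expected receiving time (per frame),
    statistical time info  = average of those differences,
    picture change rate    = reciprocal of the average."""
    diffs = [actual - expected for actual, expected in zip(actual_times, expected_times)]
    avg = mean(diffs)
    if avg <= 0:
        # Frames arrive on or ahead of schedule; treat the picture as changing rapidly.
        return float("inf")
    return 1.0 / avg
```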
In an embodiment, when detecting that the video frame of the playing video is lost, the receiving end can identify the picture change rate of the playing video, so that play adjustment processing can be performed on the playing video based on the picture change rate, and the playing effect of the playing video and the experience of watching the playing video by a user are improved.
204. The receiving end performs play adjustment processing on the play video based on the picture change rate.
In one embodiment, after the receiving end recognizes the frame rate, the play-back adjustment process may be performed on the play-back video based on the frame rate. The receiving end can judge whether the picture change rate of the played video is high or low, because the receiving end can perform different play adjustment processing on the played video according to different picture change rates. Specifically, the step of performing a play adjustment process on a play video based on a picture change rate may include:
Comparing the picture change rate with a preset picture change rate to obtain a comparison result;
and carrying out play adjustment processing on the play video by adopting a corresponding adjustment mode based on the comparison result.
The preset frame change rate may be a frame change rate preset by a developer. By comparing the picture change rate of the played video with the preset picture change rate, it is possible to determine whether the picture change rate of the played video is a high change rate or a low change rate.
In an embodiment, the step of performing the play adjustment processing on the play video by adopting a corresponding adjustment manner based on the comparison result may include:
when the picture change rate accords with the preset picture change rate, adopting a first adjustment mode to play and adjust the played video;
and when the picture change rate does not accord with the preset picture change rate, adopting a second adjustment mode to carry out play adjustment processing on the play video.
In an embodiment, the picture change rate conforming to the preset picture change rate may mean that the picture change rate is greater than the preset picture change rate. At this time, the picture change rate of the playing video is a high change rate, and the playing video can be play-adjusted in the first adjustment mode.
The first adjustment mode may refer to performing play adjustment processing on the playing video in a mosaic manner, that is, displaying a mosaic when the playing video has video frame loss, cannot be displayed normally, and has a high picture change rate.
In an embodiment, the picture change rate not conforming to the preset picture change rate may mean that the picture change rate is less than or equal to the preset picture change rate. At this time, the picture change rate of the playing video is a low change rate, and the playing video can be play-adjusted in the second adjustment mode.
The second adjustment mode may refer to performing play adjustment processing on the playing video in a frame skip manner, that is, the playing picture shows a frame skip when the playing video has video frame loss, cannot be displayed normally, and has a low picture change rate.
In an embodiment, as shown in fig. 7, after identifying the picture change rate of the playing video, the receiving end may determine whether the playing video is paused. When the playing video is not paused and the picture change rate is high, a mosaic can be displayed. When the playing video is not paused and the picture change rate is low, a frame skip can be displayed.
When the playing video pauses its display, it indicates that the playing video may be abnormal; for example, video frames of the playing video have already been lost. In that case, it can be determined whether the latest received video frame is a picture synchronization frame (i.e., an IDR frame). If the latest video frame is a picture synchronization frame, the picture of the playing video is restored. If it is not, the latest video frame can be discarded, and the playing video continues to be paused.
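The decision flow of fig. 7 can be summarized in the following sketch; the threshold value and the names are illustrative, since the patent leaves the preset picture change rate to the implementer.

```python
from enum import Enum, auto

class PlayAction(Enum):
    SHOW_MOSAIC = auto()       # first adjustment mode: keep decoding, tolerate mosaics
    SHOW_FRAME_SKIP = auto()   # second adjustment mode: hold the last good picture
    RESUME = auto()            # picture synchronization (IDR) frame received while paused
    DISCARD_AND_WAIT = auto()  # still paused: drop the frame, keep waiting for an IDR

PRESET_CHANGE_RATE = 5.0  # illustrative preset picture change rate, not from the patent

def choose_play_action(paused: bool, latest_is_idr: bool, change_rate: float,
                       preset_rate: float = PRESET_CHANGE_RATE) -> PlayAction:
    """Mirror the fig. 7 decision flow described above."""
    if paused:
        # Only a picture synchronization frame can restore the paused picture.
        return PlayAction.RESUME if latest_is_idr else PlayAction.DISCARD_AND_WAIT
    if change_rate > preset_rate:
        return PlayAction.SHOW_MOSAIC    # high change rate: mosaics are hardly noticed
    return PlayAction.SHOW_FRAME_SKIP    # low change rate: frame skip is hardly noticed
```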
The embodiment of the application provides a video processing method, which comprises the following steps: receiving a sequence identifier of a video frame in a playing video; based on the sequence identification, detecting the loss of video frames of the played video; when detecting that the video frame loss exists in the played video, identifying the picture change rate of the played video; and performing play adjustment processing on the play video based on the picture change rate. According to the method provided by the embodiment of the application, different play adjustment processing can be performed on the play video according to different picture change rates of the play video, so that a user cannot feel the condition that the play video has video frame loss, and the play effect of the play video and the experience of watching the play video by the user are improved.
With the video processing method provided by the embodiment of the application, when network conditions are poor and data is lost during transmission, one of two modes, mosaic or frame skip, can be selected according to the picture change rate. Frame skip is selected when the picture change rate is low, and mosaic is selected when the picture change rate is high, thereby improving the user experience.
The video processing method provided by the embodiment of the application can be applied to a scenario in which a mobile phone or a computer casts its screen to a television, where the content is usually a playing video or a basically still picture, such as a shared computer screen or a PPT lecture. When the picture of the playing video is basically unchanged, the frame skip mode can be used for display, ensuring that the television shows a clear picture. When data is lost, the television stops updating until it receives the picture synchronization I frame; at the same time, the television, as the receiving end, can immediately request the computer to send a picture synchronization frame (an I frame of the H.264 protocol), and playback can recover within 100 ms depending on the current network transmission speed. If the computer is playing a video, the television can continue playing when data is lost, ensuring smoothness; at this moment, mosaics of varying degrees may appear, and the computer is also requested to send an I frame.
In order to better implement the video processing method provided in the embodiments of the present application, in an embodiment, a video processing apparatus is also provided, where the video processing apparatus may be integrated in an electronic device. The meaning of the nouns is the same as that in the video processing method, and specific implementation details can be referred to in the description of the method embodiment.
In one embodiment, a video processing apparatus is provided, which may be integrated in an electronic device, as shown in fig. 8, and includes: the receiving unit 301, the loss detecting unit 302, the identifying unit 303, and the play adjusting unit 304 are specifically as follows:
a receiving unit 301, configured to receive a sequence identifier of a video frame in a play video;
a loss detection unit 302, configured to perform video frame loss detection on the played video based on the sequence identifier;
an identifying unit 303, configured to identify a frame change rate of the playing video when it is detected that the playing video has video frame loss;
and a play adjustment unit 304, configured to perform play adjustment processing on the play video based on the frame change rate.
In an embodiment, the loss detection unit 302 includes:
The identification matching subunit is used for matching the sequence identification with a preset reference sequence identification;
a detection subunit, configured to detect whether a preset data pool includes a video frame corresponding to the preset reference sequence identifier when the sequence identifier is matched with the preset reference sequence identifier;
and the overtime judging subunit is used for judging whether the acquisition time of the video frame corresponding to the preset reference sequence identifier is overtime or not when the video frame corresponding to the preset reference sequence identifier is not included in the preset data pool.
In an embodiment, the timeout determination subunit includes:
the extraction module is used for extracting the acquisition time of the video frame corresponding to the preset reference sequence identifier when the video frame corresponding to the preset reference sequence identifier is not included in the preset data pool;
the time matching module is used for matching the acquired time with a preset time threshold;
and the overtime module is used for determining, when the acquisition time does not match the preset time threshold, that acquisition of the video frame corresponding to the preset reference sequence identifier has timed out and that a video frame of the playing video has been lost.
In an embodiment, the loss detection unit 302 further includes:
An updating subunit, configured to update the preset reference sequence identifier when the preset data pool includes a video frame corresponding to the sequence identifier, so as to obtain an updated reference sequence identifier;
and the receiving subunit is used for receiving the rest video frames in the playing video based on the updated reference sequence identification.
In an embodiment, the identifying unit 303 includes:
a time calculating subunit, configured to calculate time information of the video frame of the playing video;
the statistical processing subunit is used for carrying out statistical processing on the time information to obtain the statistical time information of the video frame;
and the logic operation processing subunit is used for carrying out logic operation processing on the statistical time information to obtain the picture change rate of the playing video.
In an embodiment, the play adjusting unit 304 includes:
the comparison subunit is used for comparing the picture change rate with a preset picture change rate to obtain a comparison result;
and the play adjustment subunit is used for carrying out play adjustment processing on the play video by adopting a corresponding adjustment mode based on the comparison result.
In an embodiment, the play-adjusting subunit includes:
The first play adjustment module is used for carrying out play adjustment processing on the play video in a first adjustment mode when the picture change rate accords with the preset picture change rate;
and the second play adjustment module is used for carrying out play adjustment processing on the play video by adopting a second adjustment mode when the picture change rate does not accord with the preset picture change rate.
In the implementation, each unit may be implemented as an independent entity, or may be implemented as the same entity or several entities in any combination, and the implementation of each unit may be referred to the foregoing method embodiment, which is not described herein again.
The video processing device can improve the reliability of video processing.
The embodiment of the application also provides electronic equipment, which can comprise a terminal or a server; for example, the electronic device may be a server, such as a video processing server, or the like. As shown in fig. 9, a schematic structural diagram of a terminal according to an embodiment of the present application is shown, specifically:
the electronic device may include one or more processing cores 'processors 401, one or more computer-readable storage media's memory 402, power supply 403, and input unit 404, among other components. It will be appreciated by those skilled in the art that the electronic device structure shown in fig. 9 is not limiting of the electronic device and may include more or fewer components than shown, or may combine certain components, or a different arrangement of components. Wherein:
The processor 401 is a control center of the electronic device, connects various parts of the entire electronic device using various interfaces and lines, and performs various functions of the electronic device and processes data by running or executing software programs and/or modules stored in the memory 402, and calling data stored in the memory 402, thereby performing overall monitoring of the electronic device. Optionally, processor 401 may include one or more processing cores; preferably, the processor 401 may integrate an application processor and a modem processor, wherein the application processor mainly processes an operating system, a user page, an application program, etc., and the modem processor mainly processes wireless communication. It will be appreciated that the modem processor described above may not be integrated into the processor 401.
The memory 402 may be used to store software programs and modules, and the processor 401 executes various functional applications and data processing by executing the software programs and modules stored in the memory 402. The memory 402 may mainly include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program (such as a sound playing function, an image playing function, etc.) required for at least one function, and the like; the storage data area may store data created according to the use of the computer device, etc. In addition, memory 402 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other volatile solid-state storage device. Accordingly, the memory 402 may also include a memory controller to provide the processor 401 with access to the memory 402.
The electronic device further comprises a power supply 403 for supplying power to the various components, preferably the power supply 403 may be logically connected to the processor 401 by a power management system, so that functions of managing charging, discharging, and power consumption are performed by the power management system. The power supply 403 may also include one or more of any of a direct current or alternating current power supply, a recharging system, a power failure detection circuit, a power converter or inverter, a power status indicator, and the like.
The electronic device may further comprise an input unit 404, which input unit 404 may be used for receiving input digital or character information and generating keyboard, mouse, joystick, optical or trackball signal inputs in connection with user settings and function control.
Although not shown, the electronic device may further include a display unit or the like, which is not described herein. In particular, in this embodiment, the processor 401 in the electronic device loads executable files corresponding to the processes of one or more application programs into the memory 402 according to the following instructions, and the processor 401 executes the application programs stored in the memory 402, so as to implement various functions as follows:
receiving a sequence identifier of a video frame in a playing video;
based on the sequence identifier, detecting video frame loss of the playing video;
when video frame loss of the playing video is detected, identifying the picture change rate of the playing video;
and performing play adjustment processing on the playing video based on the picture change rate.
For the specific implementation of each of the above operations, reference may be made to the foregoing embodiments; details are not repeated here.
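By way of non-limiting illustration only, the following sketch shows how the four operations above could be wired together at a receiving end. All names, the 30 fps frame interval, and the preset change-rate threshold are assumptions of the sketch, not part of the claimed implementation.

```python
# Illustrative sketch only; names, the frame interval, and the threshold are assumptions.

PRESET_CHANGE_RATE = 2.0        # assumed preset picture change rate (1/second)
FRAME_INTERVAL = 1.0 / 30       # assumed nominal inter-frame interval (30 fps)


class FrameLossMonitor:
    def __init__(self) -> None:
        self.reference_seq = 0      # next sequence identifier expected at the receiver
        self.expected_time = None   # expected receiving time of the next frame
        self.time_diffs = []        # per-frame (actual - expected) receiving-time differences

    def on_frame(self, seq: int, actual_time: float) -> None:
        """Receive one frame's sequence identifier and run the four operations."""
        if self.expected_time is None:
            self.expected_time = actual_time
        # operation 2: a jump past the reference identifier indicates frame loss
        lost = seq > self.reference_seq
        # time information: actual receiving time minus expected receiving time
        self.time_diffs.append(actual_time - self.expected_time)
        self.reference_seq = seq + 1
        self.expected_time += FRAME_INTERVAL
        if lost:
            rate = self.picture_change_rate()   # operation 3: identify the picture change rate
            self.adjust_playback(rate)          # operation 4: adjust playback accordingly

    def picture_change_rate(self) -> float:
        mean_diff = sum(self.time_diffs) / len(self.time_diffs)
        return float("inf") if mean_diff <= 0 else 1.0 / mean_diff

    def adjust_playback(self, rate: float) -> None:
        if rate > PRESET_CHANGE_RATE:
            print("fast-changing picture: keep playing (tolerate pattern screen / mosaic)")
        else:
            print("slow-changing picture: skip to the next complete frame")
```

For example, feeding on_frame(0, t0) and then on_frame(2, t0 + 0.066) into the monitor would flag the frame with sequence identifier 1 as missing and trigger the adjustment decision.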
According to one aspect of the present application, there is provided a computer program product or computer program comprising computer instructions stored in a computer-readable storage medium. The processor of a computer device reads the computer instructions from the computer-readable storage medium, and executes the computer instructions to cause the computer device to perform the methods provided in the various alternative implementations of the above embodiments.
It will be appreciated by those of ordinary skill in the art that all or part of the steps of the various methods in the above embodiments may be completed by a computer program, or by related hardware controlled by a computer program; the computer program may be stored in a computer-readable storage medium and loaded and executed by a processor.
To this end, the embodiments of the present application also provide a storage medium in which a computer program is stored, the computer program being capable of being loaded by a processor to perform the steps of any of the video processing methods provided by the embodiments of the present application. For example, the computer program may perform the steps of:
receiving a sequence identifier of a video frame in a playing video;
based on the sequence identifier, detecting video frame loss of the playing video;
when video frame loss of the playing video is detected, identifying the picture change rate of the playing video;
and performing play adjustment processing on the playing video based on the picture change rate.
For the specific implementation of each of the above operations, reference may be made to the foregoing embodiments; details are not repeated here.
Because the computer program stored in the storage medium can execute the steps of any video processing method provided in the embodiments of the present application, it can achieve the beneficial effects attainable by any video processing method provided in the embodiments of the present application; these are detailed in the previous embodiments and are not repeated here.
The foregoing describes in detail the video processing method, apparatus, electronic device, and storage medium provided in the embodiments of the present application. Specific examples are used herein to explain the principles and implementations of the present application, and the description of the above embodiments is only intended to help understand the method and core idea of the present application. Meanwhile, those skilled in the art may make changes to the specific embodiments and the scope of application in accordance with the idea of the present application. In view of the above, the content of this description should not be construed as limiting the present application.

Claims (7)

1. A video processing method, comprising:
receiving a sequence identifier of a video frame in a playing video;
based on the sequence identifier, detecting video frame loss of the playing video;
when video frame loss of the playing video is detected, calculating time information of the video frames of the playing video, wherein the time information refers to the difference between the actual receiving time and the expected receiving time of a video frame;
performing statistical processing on the time information to obtain statistical time information of the video frames;
calculating the reciprocal of the statistical time information to obtain the picture change rate of the playing video;
comparing the picture change rate with a preset picture change rate to obtain a comparison result;
when the picture change rate is greater than the preset picture change rate, performing play adjustment processing on the playing video in a pattern screen or mosaic mode;
and when the picture change rate is less than or equal to the preset picture change rate, performing play adjustment processing on the playing video in a frame skipping mode.
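Purely as a non-limiting illustration of the computation recited above, the sketch below derives the picture change rate as the reciprocal of the averaged (actual minus expected) receiving-time differences and selects the adjustment mode against a preset threshold; choosing the mean as the statistical processing step and the concrete threshold value are assumptions of the sketch.

```python
# Non-limiting sketch of the decision in claim 1; the mean as the "statistical
# processing" step and the default threshold value are assumptions.
def picture_change_rate(actual_times, expected_times):
    # time information: difference between actual and expected receiving times
    diffs = [a - e for a, e in zip(actual_times, expected_times)]
    stat = sum(diffs) / len(diffs)                     # statistical time information
    return float("inf") if stat <= 0 else 1.0 / stat   # reciprocal -> picture change rate


def choose_adjustment(rate, preset_rate=2.0):
    # greater rate -> picture changes quickly: keep playing with pattern screen / mosaic
    # otherwise    -> skip frames until the next complete frame arrives
    return "pattern screen / mosaic" if rate > preset_rate else "frame skipping"
```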
2. The method of claim 1, wherein the detecting video frame loss of the playing video based on the sequence identifier comprises:
matching the sequence identifier with a preset reference sequence identifier, wherein the preset reference sequence identifier is the sequence identifier of the video frame to be received by the receiving end;
when the sequence identifier is greater than or equal to the preset reference sequence identifier, detecting whether a preset data pool comprises a video frame corresponding to the preset reference sequence identifier or not;
when the preset data pool does not comprise the video frame corresponding to the preset reference sequence identifier, determining whether the acquisition time of the video frame corresponding to the preset reference sequence identifier has timed out.
3. The method according to claim 2, wherein when the preset data pool does not include the video frame corresponding to the preset reference sequence identifier, determining whether the acquisition time of the video frame corresponding to the preset reference sequence identifier has timed out comprises:
when the preset data pool does not comprise the video frames corresponding to the preset reference sequence identifiers, extracting the acquisition time of the video frames corresponding to the preset reference sequence identifiers;
matching the acquisition time with a preset time threshold;
and when the acquisition time does not match the preset time threshold, determining that acquisition of the video frame corresponding to the preset reference sequence identifier has timed out and that a video frame of the playing video is lost.
4. The method according to claim 2, wherein the method further comprises:
when the preset data pool comprises a video frame corresponding to the preset reference sequence identifier, updating the preset reference sequence identifier to obtain an updated reference sequence identifier;
and receiving the remaining video frames of the playing video based on the updated reference sequence identifier.
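As a non-limiting sketch combining the detection flow of claims 2 to 4, the function below matches the received sequence identifier against the preset reference sequence identifier, checks the data pool, falls back to an acquisition-time check, and advances the reference when the frame is present; the dictionary data pool, the monotonic clock, and the 200 ms threshold are assumptions of the sketch.

```python
import time

ACQUISITION_TIMEOUT = 0.2   # assumed preset time threshold (seconds)


def detect_loss(seq, reference_seq, data_pool, acquire_start):
    """Return (frame_lost, updated_reference_seq).

    seq            -- sequence identifier just received
    reference_seq  -- preset reference sequence identifier (next frame to be received)
    data_pool      -- dict mapping sequence identifier -> buffered video frame
    acquire_start  -- time at which acquisition of the reference frame began
    """
    if seq < reference_seq:                 # stale or duplicate packet: nothing to do
        return False, reference_seq
    if reference_seq in data_pool:          # frame present in the pool: advance the reference
        return False, reference_seq + 1
    # frame absent from the data pool, so check the acquisition time
    if time.monotonic() - acquire_start > ACQUISITION_TIMEOUT:
        return True, reference_seq          # acquisition timed out: the frame is lost
    return False, reference_seq             # still within the threshold: keep waiting
```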
5. A video processing apparatus, comprising:
the receiving unit is used for receiving the sequence identifier of a video frame in a playing video;
the loss detection unit is used for detecting video frame loss of the playing video based on the sequence identifier;
the identification unit is used for calculating time information of the video frames of the playing video when video frame loss of the playing video is detected, performing statistical processing on the time information to obtain statistical time information of the video frames, and calculating the reciprocal of the statistical time information to obtain the picture change rate of the playing video, wherein the time information refers to the difference between the actual receiving time and the expected receiving time of a video frame;
and the play adjustment unit is used for comparing the picture change rate with a preset picture change rate to obtain a comparison result, performing play adjustment processing on the playing video in a pattern screen or mosaic mode when the picture change rate is greater than the preset picture change rate, and performing play adjustment processing on the playing video in a frame skipping mode when the picture change rate is less than or equal to the preset picture change rate.
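For orientation only, the units of the apparatus above could be mapped onto hypothetical classes roughly as follows; this mirrors the method sketches given earlier and does not represent the claimed structure itself.

```python
# Purely illustrative mapping of the apparatus units onto hypothetical classes.
class ReceivingUnit:
    def receive(self, packet: dict) -> int:
        return packet["seq"]                        # sequence identifier of the video frame


class LossDetectionUnit:
    def detect(self, seq: int, reference_seq: int, data_pool: dict) -> bool:
        return seq >= reference_seq and reference_seq not in data_pool


class IdentificationUnit:
    def change_rate(self, time_diffs: list) -> float:
        stat = sum(time_diffs) / len(time_diffs)    # statistical time information
        return float("inf") if stat <= 0 else 1.0 / stat


class PlayAdjustmentUnit:
    def adjust(self, rate: float, preset_rate: float = 2.0) -> str:
        return "pattern screen / mosaic" if rate > preset_rate else "frame skipping"
```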
6. An electronic device comprising a memory and a processor; the memory stores a computer program, and the processor is configured to execute the computer program in the memory to perform the steps in the video processing method according to any one of claims 1 to 4.
7. A storage medium storing a plurality of computer programs adapted to be loaded and run by a processor to perform the steps of the video processing method of any of claims 1 to 4.
CN202210048030.4A 2022-01-17 2022-01-17 Video processing method and device, electronic equipment and storage medium Active CN114422866B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210048030.4A CN114422866B (en) 2022-01-17 2022-01-17 Video processing method and device, electronic equipment and storage medium

Publications (2)

Publication Number Publication Date
CN114422866A (en) 2022-04-29
CN114422866B (en) 2023-07-25

Family

ID=81273486

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210048030.4A Active CN114422866B (en) 2022-01-17 2022-01-17 Video processing method and device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN114422866B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116055790B (en) * 2022-07-29 2024-03-19 荣耀终端有限公司 Video playing method and system and electronic equipment

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109640168A (en) * 2018-11-27 2019-04-16 Oppo广东移动通信有限公司 Method for processing video frequency, device, electronic equipment and computer-readable medium
CN110072123A (en) * 2018-01-24 2019-07-30 中兴通讯股份有限公司 A kind of recovery playback method, video playing terminal and the server of video
CN110572695A (en) * 2019-08-07 2019-12-13 苏州科达科技股份有限公司 media data encoding and decoding methods and electronic equipment
CN111491201A (en) * 2020-04-08 2020-08-04 深圳市昊一源科技有限公司 Method for adjusting video code stream and video frame loss processing method
CN112073823A (en) * 2020-09-02 2020-12-11 深圳创维数字技术有限公司 Frame loss processing method, video playing terminal and computer readable storage medium
CN113099272A (en) * 2021-04-12 2021-07-09 上海商汤智能科技有限公司 Video processing method and device, electronic equipment and storage medium
WO2021238940A1 (en) * 2020-05-26 2021-12-02 维沃移动通信有限公司 Video data processing method and apparatus, and electronic device
WO2021244440A1 (en) * 2020-06-04 2021-12-09 深圳市万普拉斯科技有限公司 Method, apparatus, and system for adjusting image quality of television, and television set

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111416997B (en) * 2020-03-31 2022-11-08 百度在线网络技术(北京)有限公司 Video playing method and device, electronic equipment and storage medium

Also Published As

Publication number Publication date
CN114422866A (en) 2022-04-29

Similar Documents

Publication Publication Date Title
EP2328349B1 (en) Information processing system and information processing device
CN109089130B (en) Method and device for adjusting timestamp of live video
CN108810657B (en) Method and system for setting video cover
CN104702563A (en) Method and device for acquiring streaming media play time
CN110662017B (en) Video playing quality detection method and device
CN111093094A (en) Video transcoding method, device and system, electronic equipment and readable storage medium
US20190245945A1 (en) Rapid optimization of media stream bitrate
CN109714622A (en) A kind of video data handling procedure, device and electronic equipment
CN114422866B (en) Video processing method and device, electronic equipment and storage medium
CN112787945A (en) Data transmission method and device, computer readable medium and electronic equipment
CN113741762A (en) Multimedia playing method, device, electronic equipment and storage medium
CN114025389A (en) Data transmission method and device, computer equipment and storage medium
CN111935497B (en) Video stream management method and data server for traffic police system
CN113259729B (en) Data switching method, server, system and storage medium
CN106791714B (en) The matching process and equipment of IP Camera and server device
EP3255893A1 (en) Communication systems
CN115209189B (en) Video stream transmission method, system, server and storage medium
CN112437332B (en) Playing method and device of target multimedia information
CN109999490B (en) Method and system for reducing networking cloud application delay
CN115514980A (en) Push stream live broadcast management method and device, computer and readable storage medium
CN114416013A (en) Data transmission method, data transmission device, electronic equipment and computer-readable storage medium
CN108024121B (en) Voice barrage synchronization method and system
CN113973215A (en) Data deduplication method and device and storage medium
CN114244843A (en) Streaming media downloading method, electronic equipment and storage medium
CN113542813A (en) Data transmission method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant