CN110798725A - Data processing method and device - Google Patents

Data processing method and device

Info

Publication number
CN110798725A
Authority
CN
China
Prior art keywords
video
data
audio
frame
playing
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201810870810.0A
Other languages
Chinese (zh)
Inventor
焦克新 (Jiao Kexin)
安君超 (An Junchao)
韩杰 (Han Jie)
王艳辉 (Wang Yanhui)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Visionvera Information Technology Co Ltd
Original Assignee
Visionvera Information Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Visionvera Information Technology Co Ltd
Priority to CN201810870810.0A
Publication of CN110798725A
Legal status: Pending

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/4302Content synchronisation processes, e.g. decoder synchronisation
    • H04N21/4307Synchronising the rendering of multiple content streams or additional data on devices, e.g. synchronisation of audio on a mobile phone with the video output on the TV screen
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47End-user applications
    • H04N21/472End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content
    • H04N21/47202End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content for requesting content on demand, e.g. video on demand

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Databases & Information Systems (AREA)
  • Human Computer Interaction (AREA)
  • Data Exchanges In Wide-Area Networks (AREA)

Abstract

The embodiment of the invention provides a data processing method and a data processing apparatus. In the method, a video networking terminal acquires a video-on-demand instruction and generates a corresponding video-on-demand request according to the instruction; sends the video-on-demand request to a video networking server, so that the video networking server obtains, according to the request, a terminal identifier of a virtual terminal in an on-demand server; receives the terminal identifier returned by the video networking server and establishes a video networking data channel with the virtual terminal corresponding to the terminal identifier; receives video data over the video networking data channel, the video data being sent by the virtual terminal according to the video-on-demand request and comprising audio data and image data; determines a synchronization reference clock according to the audio sampling rate of the audio data; and plays the audio data and the image data synchronously according to the synchronization reference clock. The problem of images and sound falling out of synchronization can thereby be effectively solved.

Description

Data processing method and device
Technical Field
The present invention relates to the field of data processing technologies, and in particular, to a data processing method and a data processing apparatus.
Background
The video network is an important milestone in network development. A higher-level form of the Internet and a real-time network, it can achieve the full-network real-time transmission of high-definition video that the existing Internet cannot, pushing many Internet applications toward high definition. It thus enables large-scale high-definition video services, such as high-definition video conferencing, video surveillance, intelligent surveillance analysis, and emergency command, on a single platform.
The video network supports both live video and video on demand, so a user can watch the videos they choose through video on demand on a video networking terminal. During video playback, however, sound and picture often fall out of synchronization, which harms the user experience; sound-picture synchronization is therefore a problem that urgently needs to be solved.
Disclosure of Invention
The embodiment of the invention provides a data processing method, which aims to effectively solve the problem of sound and picture being out of synchronization in a video.
Correspondingly, the embodiment of the invention also provides a data processing device, which is used for ensuring the realization and the application of the method.
In order to solve the above problems, the present invention discloses a data processing method, which is applied to a video networking video system, wherein the video networking video system comprises a video networking terminal, a video networking server and an on-demand server, and the method comprises the following steps: the video networking terminal acquires a video-on-demand instruction and generates a corresponding video-on-demand request according to the instruction; sends the video-on-demand request to the video networking server, so that the video networking server obtains a terminal identifier of a virtual terminal in the on-demand server according to the video-on-demand request; receives the terminal identifier returned by the video networking server, and establishes a video networking data channel with the virtual terminal corresponding to the terminal identifier; receives video data through the video networking data channel, wherein the video data are sent by the virtual terminal according to the video-on-demand request and comprise audio data and image data; determines a synchronization reference clock according to the audio sampling rate of the audio data; and plays the audio data and the image data synchronously according to the synchronization reference clock.
The invention also discloses a data processing apparatus, which is applied to the video networking terminal in a video networking video system, wherein the video networking video system comprises the video networking terminal, a video networking server and an on-demand server. The apparatus comprises: a request generation module, configured to acquire a video-on-demand instruction and generate a corresponding video-on-demand request according to the instruction; a request sending module, configured to send the video-on-demand request to the video networking server, so that the video networking server obtains a terminal identifier of a virtual terminal in the on-demand server according to the video-on-demand request; a channel establishing module, configured to receive the terminal identifier returned by the video networking server and establish a video networking data channel with the virtual terminal corresponding to the terminal identifier; a data receiving module, configured to receive video data through the video networking data channel, the video data being sent by the virtual terminal according to the video-on-demand request and comprising audio data and image data; a clock determining module, configured to determine a synchronization reference clock according to the audio sampling rate of the audio data; and a synchronous playing module, configured to play the audio data and the image data synchronously according to the synchronization reference clock.
Compared with the prior art, the embodiment of the invention has the following advantages:
in the embodiment of the invention, the video networking terminal can interact with the on-demand server to acquire the corresponding video data: after acquiring an on-demand instruction, the video networking terminal generates a corresponding video-on-demand request and sends it to the video networking server, so that the video networking server obtains the terminal identifier of a virtual terminal in the on-demand server according to the request; the terminal then receives the terminal identifier returned by the video networking server and establishes a video networking data channel with the virtual terminal corresponding to that identifier, over which it receives video data comprising audio data and image data. While playing the video data, the video networking terminal determines a synchronization reference clock according to the audio sampling rate of the audio data and plays the audio data and the image data synchronously according to that clock, so the problem of images and sound falling out of synchronization can be effectively solved.
Drawings
FIG. 1 is a schematic networking diagram of a video network of the present invention;
FIG. 2 is a schematic diagram of a hardware architecture of a node server according to the present invention;
FIG. 3 is a schematic diagram of a hardware structure of an access switch of the present invention;
FIG. 4 is a schematic diagram of a hardware structure of an Ethernet protocol conversion gateway according to the present invention;
FIG. 5 is a schematic diagram of a video networking video system framework according to an embodiment of the invention;
FIG. 6 is a flow chart of the steps of a data processing method embodiment of the present invention;
FIG. 7 is a flow chart of steps in another data processing method embodiment of the present invention;
FIG. 8 is a block diagram of an embodiment of a data processing apparatus of the present invention;
FIG. 9 is a block diagram of another data processing apparatus embodiment of the present invention.
Detailed Description
In order to make the aforementioned objects, features and advantages of the present invention comprehensible, embodiments accompanied with figures are described in further detail below.
The video network adopts real-time high-definition video switching technology and can integrate dozens of required services, such as video, voice, pictures, text, communication and data, on a single network platform: high-definition video conferencing, video surveillance, intelligent surveillance analysis, emergency command, digital broadcast television, time-shifted television, network teaching, live broadcast, video on demand (VOD), television mail, Personal Video Recorder (PVR), intranet (self-office) channels, intelligent video broadcast control, information distribution and the like, delivering high-definition-quality video broadcast through a television or a computer.
To better understand the embodiments of the present invention, the video network is described below.
Some of the technologies applied in the video network are as follows:
network Technology (Network Technology)
Network technology innovation in the video network improves on traditional Ethernet to handle the potentially enormous video traffic on the network. Unlike pure network Packet Switching or network Circuit Switching, the video networking technology adopts Packet Switching to satisfy the demands of streaming. The video networking technology has the flexibility, simplicity and low cost of packet switching while providing the quality and security guarantees of circuit switching, realizing seamless connection of whole-network switched virtual circuits and of data formats.
Switching Technology (Switching Technology)
The video network adopts the two advantages of Ethernet, asynchronism and packet switching, and eliminates Ethernet's defects on the premise of full compatibility. It offers end-to-end seamless connection across the whole network, communicates directly with user terminals, and directly carries IP data packets; user data requires no format conversion anywhere in the network. The video network is a higher-level form of Ethernet and a real-time switching platform; it can achieve the full-network, large-scale, real-time transmission of high-definition video that the existing Internet cannot, pushing many network video applications toward high definition and unification.
Server Technology (Server Technology)
Server technology on the video network and unified video platform differs from that of traditional servers: its streaming media transmission is built on a connection-oriented basis, its data processing capability is independent of traffic and communication time, and a single network layer can carry both signaling and data transmission. For voice and video services, streaming media processing on the video network and unified video platform is much simpler than general data processing, and efficiency is improved more than a hundredfold over that of a traditional server.
Storage Technology (Storage Technology)
To handle media content of very large capacity and very large traffic, the ultra-high-speed storage technology of the unified video platform adopts the most advanced real-time operating system: the program information in a server instruction is mapped to a specific hard disk space, and media content no longer passes through the server but is sent directly and instantly to the user terminal, with typical user waiting time under 0.2 second. Optimized sector distribution greatly reduces the mechanical seek movement of the hard disk head; resource consumption is only 20% of that of an IP Internet system of the same grade, while generating concurrent traffic 3 times larger than that of a traditional hard disk array, for an overall efficiency improvement of more than 10 times.
Network Security Technology (Network Security Technology)
Through per-service independent permission control, complete isolation of equipment and user data, and similar measures, the structural design of the video network structurally eliminates the network security problems that trouble the Internet. It generally needs no antivirus programs or firewalls, avoids hacker and virus attacks, and provides users with a structurally worry-free secure network.
Service Innovation Technology (Service Innovation Technology)
The unified video platform integrates services with transmission: whether for a single user, a private-network user or a network aggregate, everything is just one automatic connection. The user terminal, set-top box or PC connects directly to the unified video platform to obtain multimedia video services in various forms. The unified video platform replaces traditional complex application programming with a menu-style configuration table, so that complex applications can be realized with very little code, enabling unlimited new service innovation.
Networking of the video network is as follows:
the video network is a centralized control network structure, and the network can be a tree network, a star network, a ring network and the like, but on the basis of the centralized control node, the whole network is controlled by the centralized control node in the network.
As shown in fig. 1, the video network is divided into an access network and a metropolitan network.
The devices of the access network part can be mainly classified into 3 types: node server, access switch, terminal (including various set-top boxes, coding boards, memories, etc.). The node server is connected to an access switch, which may be connected to a plurality of terminals and may be connected to an ethernet network.
The node server is a node which plays a centralized control function in the access network and can control the access switch and the terminal. The node server can be directly connected with the access switch or directly connected with the terminal.
Similarly, devices of the metropolitan network portion may also be classified into 3 types: a metropolitan area server, a node switch and a node server. The metro server is connected to a node switch, which may be connected to a plurality of node servers.
The node server is a node server of the access network part, namely the node server belongs to both the access network part and the metropolitan area network part.
The metropolitan area server is a node which plays a centralized control function in the metropolitan area network and can control a node switch and a node server. The metropolitan area server can be directly connected with the node switch or directly connected with the node server.
Therefore, the whole video network is a network structure with layered centralized control, and the network controlled by the node server and the metropolitan area server can be in various structures such as tree, star and ring.
The access network part can form a unified video platform (the part in the dotted circle), and a plurality of unified video platforms can form a video network; each unified video platform may be interconnected via metropolitan area and wide area video networking.
Video networking device classification
1.1 devices in the video network of the embodiment of the present invention can be mainly classified into 3 types: servers, switches (including ethernet gateways), terminals (including various set-top boxes, code boards, memories, etc.). The video network as a whole can be divided into a metropolitan area network (or national network, global network, etc.) and an access network.
1.2 wherein the devices of the access network part can be mainly classified into 3 types: node servers, access switches (including ethernet gateways), terminals (including various set-top boxes, code boards, memories, etc.).
The specific hardware structure of each access network device is as follows:
a node server:
as shown in fig. 2, the system mainly includes a network interface module 201, a switching engine module 202, a CPU module 203, and a disk array module 204;
the network interface module 201, the CPU module 203, and the disk array module 204 all enter the switching engine module 202; the switching engine module 202 performs an operation of looking up the address table 205 on the incoming packet, thereby obtaining the direction information of the packet; and stores the packet in a queue of the corresponding packet buffer 206 based on the packet's steering information; if the queue of the packet buffer 206 is nearly full, it is discarded; the switching engine module 202 polls all packet buffer queues for forwarding if the following conditions are met: 1) the port send buffer is not full; 2) the queue packet counter is greater than zero. The disk array module 204 mainly implements control over the hard disk, including initialization, read-write, and other operations on the hard disk; the CPU module 203 is mainly responsible for protocol processing with an access switch and a terminal (not shown in the figure), configuring an address table 205 (including a downlink protocol packet address table, an uplink protocol packet address table, and a data packet address table), and configuring the disk array module 204.
The access switch:
as shown in fig. 3, the network interface module mainly includes a network interface module (a downlink network interface module 301 and an uplink network interface module 302), a switching engine module 303 and a CPU module 304;
wherein, the packet (uplink data) coming from the downlink network interface module 301 enters the packet detection module 305; the packet detection module 305 detects whether the Destination Address (DA), the Source Address (SA), the packet type, and the packet length of the packet meet the requirements, and if so, allocates a corresponding stream identifier (stream-id) and enters the switching engine module 303, otherwise, discards the stream identifier; the packet (downstream data) coming from the upstream network interface module 302 enters the switching engine module 303; the data packet coming from the CPU module 204 enters the switching engine module 303; the switching engine module 303 performs an operation of looking up the address table 306 on the incoming packet, thereby obtaining the direction information of the packet; if the packet entering the switching engine module 303 is from the downstream network interface to the upstream network interface, the packet is stored in the queue of the corresponding packet buffer 307 in association with the stream-id; if the queue of the packet buffer 307 is nearly full, it is discarded; if the packet entering the switching engine module 303 is not from the downlink network interface to the uplink network interface, the data packet is stored in the queue of the corresponding packet buffer 307 according to the guiding information of the packet; if the queue of the packet buffer 307 is nearly full, it is discarded.
The switching engine module 303 polls all packet buffer queues, which in this embodiment of the present invention is divided into two cases:
if the queue is from the downlink network interface to the uplink network interface, the following conditions are met for forwarding: 1) the port send buffer is not full; 2) the queued packet counter is greater than zero; 3) obtaining a token generated by a code rate control module;
if the queue is not from the downlink network interface to the uplink network interface, the following conditions are met for forwarding: 1) the port send buffer is not full; 2) the queue packet counter is greater than zero.
The code rate control module 308 is configured by the CPU module 304 and, at programmable intervals, generates tokens for all packet buffer queues going from downstream network interfaces to upstream network interfaces, so as to control the rate of upstream forwarding.
The CPU module 304 is mainly responsible for protocol processing with the node server, configuration of the address table 306, and configuration of the code rate control module 308.
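The token-gated upstream forwarding described in the two cases above can be sketched as follows. This is an illustrative model only: the class names, the per-interval token count, and the argument names are assumptions, while the three forwarding conditions are those listed in the text.

```python
class RateController:
    """Sketch of the code rate control module: it periodically grants tokens
    to downlink-to-uplink queues, and a queue may only forward upstream
    when it obtains a token. Interval and token counts are illustrative."""
    def __init__(self, tokens_per_interval=1):
        self.tokens = 0
        self.tokens_per_interval = tokens_per_interval

    def tick(self):
        # Called at the programmable interval configured by the CPU module.
        self.tokens += self.tokens_per_interval

    def take_token(self):
        if self.tokens > 0:
            self.tokens -= 1
            return True
        return False

def may_forward_upstream(queue_len, send_buffer_free, controller):
    # Conditions 1-3 for downlink-to-uplink queues; short-circuit order
    # ensures a token is only consumed when the other conditions hold.
    return send_buffer_free and queue_len > 0 and controller.take_token()

rc = RateController()
assert not may_forward_upstream(5, True, rc)   # no token granted yet
rc.tick()
assert may_forward_upstream(5, True, rc)       # token obtained and consumed
assert not may_forward_upstream(5, True, rc)   # token already spent
```

For queues not going from downlink to uplink, only the first two conditions apply, as in the node server sketch.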
Ethernet protocol conversion gateway
As shown in fig. 4, the apparatus mainly includes a network interface module (a downlink network interface module 401 and an uplink network interface module 402), a switching engine module 403, a CPU module 404, a packet detection module 405, a rate control module 408, an address table 406, a packet buffer 407, a MAC adding module 409, and a MAC deleting module 410.
Wherein a data packet coming from the downlink network interface module 401 enters the packet detection module 405; the packet detection module 405 detects whether the Ethernet MAC DA, Ethernet MAC SA, Ethernet length or frame type, video networking destination address DA, video networking source address SA, video networking packet type, and packet length of the packet meet the requirements; if so, it allocates a corresponding stream identifier (stream-id), the MAC deletion module 410 strips the MAC DA, MAC SA, and length or frame type (2 bytes), and the packet enters the corresponding receiving buffer; otherwise, the packet is discarded;
the downlink network interface module 401 detects the send buffer of the port; if there is a packet, it obtains the Ethernet MAC DA of the corresponding terminal according to the destination address DA of the packet, prepends the terminal's Ethernet MAC DA, the Ethernet MAC SA of the Ethernet protocol conversion gateway, and the Ethernet length or frame type, and sends the packet.
The other modules in the Ethernet protocol conversion gateway function similarly to those of the access switch.
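The MAC-deletion and MAC-adding steps of the gateway amount to stripping and prepending a 14-byte Ethernet header (MAC DA, MAC SA, 2-byte length/frame type). A minimal sketch follows; the function names and the 0x0800 type value are illustrative assumptions, not taken from the patent.

```python
ETH_HEADER_LEN = 6 + 6 + 2  # MAC DA + MAC SA + length/frame type

def strip_ethernet_header(frame: bytes) -> bytes:
    """MAC deletion: remove MAC DA, MAC SA and the 2-byte length/frame-type
    field, leaving the inner video networking packet."""
    return frame[ETH_HEADER_LEN:]

def add_ethernet_header(packet: bytes, terminal_mac: bytes, gateway_mac: bytes,
                        eth_type: bytes = b"\x08\x00") -> bytes:
    """MAC adding: prepend the terminal's MAC DA, the gateway's MAC SA and a
    length/frame-type field before sending toward the terminal.
    The 0x0800 type value is an illustrative placeholder."""
    assert len(terminal_mac) == 6 and len(gateway_mac) == 6
    return terminal_mac + gateway_mac + eth_type + packet

inner = b"\x01" * 20                       # stand-in video networking packet
framed = add_ethernet_header(inner, b"\xaa" * 6, b"\xbb" * 6)
assert strip_ethernet_header(framed) == inner
assert len(framed) == len(inner) + ETH_HEADER_LEN
```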
A terminal:
the system mainly comprises a network interface module, a service processing module and a CPU module; for example, the set-top box mainly comprises a network interface module, a video and audio coding and decoding engine module and a CPU module; the coding board mainly comprises a network interface module, a video and audio coding engine module and a CPU module; the memory mainly comprises a network interface module, a CPU module and a disk array module.
1.3 devices of the metropolitan area network part can be mainly classified into 2 types: node switch and metropolitan area server. The node switch mainly comprises a network interface module, a switching engine module and a CPU module; the metropolitan area server mainly comprises a network interface module, a switching engine module and a CPU module.
2. Video networking packet definition
2.1 Access network packet definition
The data packet of the access network mainly includes the following parts: Destination Address (DA), Source Address (SA), reserved bytes, payload (PDU), and CRC.
As shown in Table 1, the fields are laid out as follows:
DA (8 bytes) | SA (8 bytes) | Reserved (2 bytes) | Payload (PDU) | CRC (4 bytes)
TABLE 1
Wherein:
the Destination Address (DA) is composed of 8 bytes (byte), the first byte represents the type of the data packet (such as various protocol packets, multicast data packets, unicast data packets, etc.), there are 256 possibilities at most, the second byte to the sixth byte are metropolitan area network addresses, and the seventh byte and the eighth byte are access network addresses;
the Source Address (SA) is also composed of 8 bytes (byte), defined as the same as the Destination Address (DA);
the reserved byte consists of 2 bytes;
the payload part has different lengths according to the type of the datagram: 64 bytes if the datagram is one of the various protocol packets, and 32 + 1024 = 1056 bytes if it is a unicast data packet; of course, the length is not limited to these 2 cases;
the CRC consists of 4 bytes and is calculated in accordance with the standard ethernet CRC algorithm.
2.2 metropolitan area network packet definition
The topology of a metropolitan area network is a graph, and there may be 2 or even more than 2 connections between two devices; that is, there may be more than 2 connections between a node switch and a node server, between a node switch and another node switch, or between a node switch and a metropolitan area server. However, the metro network address of a metro network device is unique, so in order to accurately describe the connection relationship between metro network devices, a parameter is introduced in the embodiment of the present invention: a label, to uniquely describe a metropolitan area network device.
In this specification, the definition of the label is similar to that of an MPLS (Multi-Protocol Label Switching) label: assuming there are two connections between device A and device B, a packet going from device A to device B has 2 possible labels, and a packet going from device B to device A likewise has 2. Labels are classified into incoming labels and outgoing labels: assuming the label of a packet entering device A (the incoming label) is 0x0000, the label of the packet leaving device A (the outgoing label) may become 0x0001. The network access process of the metro network is one under centralized control; that is, both address allocation and label allocation in the metro network are dominated by the metropolitan area server, with the node switches and node servers executing passively. This differs from label allocation in MPLS, where labels are the result of mutual negotiation between switch and server.
As shown in Table 2, the data packet of the metropolitan area network mainly includes the following parts: Destination Address (DA), Source Address (SA), reserved bytes (Reserved), label, payload (PDU), and CRC.
DA (8 bytes) | SA (8 bytes) | Reserved (2 bytes) | Label (4 bytes) | Payload (PDU) | CRC (4 bytes)
TABLE 2
The format of the label may be defined as follows: the label is 32 bits, with the upper 16 bits reserved and only the lower 16 bits used; its position is between the reserved bytes and the payload of the packet.
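The label placement and the incoming-to-outgoing label swap described above can be sketched as follows. The functions and the big-endian byte order are illustrative assumptions; the 0x0000 to 0x0001 swap mirrors the example given in the text, and the label table stands in for the mapping that the metropolitan area server assigns under centralized control.

```python
def insert_label(access_packet_body: bytes, label: int) -> bytes:
    """Insert the 32-bit label between the reserved bytes and the payload:
    upper 16 bits reserved (zero), lower 16 bits carry the label value.
    `access_packet_body` is DA(8) + SA(8) + Reserved(2) + payload, no CRC."""
    assert 0 <= label <= 0xFFFF
    head, payload = access_packet_body[:18], access_packet_body[18:]
    return head + label.to_bytes(4, "big") + payload  # byte order is an assumption

def swap_label(packet: bytes, label_map: dict) -> bytes:
    """Label swap on a metro device: the incoming label is replaced by the
    outgoing label that the metropolitan area server assigned."""
    in_label = int.from_bytes(packet[18:22], "big") & 0xFFFF
    out_label = label_map[in_label]
    return packet[:18] + out_label.to_bytes(4, "big") + packet[22:]

body = b"\xda" * 8 + b"\x5a" * 8 + b"\x00\x00" + b"pdu"
labelled = insert_label(body, 0x0000)
swapped = swap_label(labelled, {0x0000: 0x0001})   # the text's example swap
assert int.from_bytes(swapped[18:22], "big") == 0x0001
assert swapped[22:] == b"pdu"
```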
The data processing method provided by the embodiment of the invention is applied to a video network video system, and as shown in fig. 5, the video network video system comprises a video network terminal, a video network server and an on-demand server; the video network terminal is connected with the video network server, the video network server is connected with the on-demand server, and the video network terminal performs data interaction with the on-demand server through the video network server. The video network server is used for providing data forwarding service, and the on-demand server is a video network server which can be used for providing video service.
Referring to fig. 6, a flowchart illustrating steps of an embodiment of a data processing method of the present invention specifically includes:
step 601, the video network terminal obtains a request order and generates a corresponding video request according to the request order.
In the embodiment of the invention, a user can watch recorded video data using a video networking terminal. After the user performs a selection operation on the video data to be watched, the video networking terminal obtains an on-demand instruction corresponding to that selection and generates a corresponding video-on-demand request according to the instruction. The video-on-demand request is used to indicate which video data to acquire, and may include various information, such as terminal information of the video networking terminal (for example, a terminal identifier) and information about the video data (for example, a video identifier).
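Since the text only says the request "may include" terminal information and video information, a request can be sketched as a small record; the field names and the instruction dictionary keys below are purely hypothetical.

```python
from dataclasses import dataclass, asdict

@dataclass
class VideoOnDemandRequest:
    """Illustrative request fields; the patent names a terminal identifier
    and a video identifier as examples of what the request may carry."""
    terminal_id: str
    video_id: str

def make_request(on_demand_instruction: dict) -> VideoOnDemandRequest:
    # The on-demand instruction produced by the user's selection carries
    # the chosen video; these key names are assumptions for the sketch.
    return VideoOnDemandRequest(
        terminal_id=on_demand_instruction["terminal_id"],
        video_id=on_demand_instruction["selected_video"],
    )

req = make_request({"terminal_id": "vnt-001", "selected_video": "vid-42"})
assert asdict(req) == {"terminal_id": "vnt-001", "video_id": "vid-42"}
```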
Step 602, sending the video-on-demand request to the video networking server, so that the video networking server obtains a terminal identifier of a virtual terminal in the on-demand server according to the video-on-demand request.
Step 603, receiving a terminal identifier returned by the video network server, and establishing a video network data channel between the virtual terminals corresponding to the terminal identifier.
In the embodiment of the invention, the video data may be stored in a database of the on-demand server or in an independent file server whose video data the video networking server can read and write. The video networking terminal therefore sends the video-on-demand request to the video networking server through a signaling channel, such as channel A in fig. 5, and establishes a video networking data channel with the on-demand server through the video networking server. After receiving the video-on-demand request, the video networking server may forward it to the on-demand server through a signaling channel, such as channel B in fig. 5. The on-demand server can create a virtual terminal according to the video-on-demand request and return the terminal identifier of that virtual terminal to the video networking server; the virtual terminal is a virtualized video networking terminal, i.e. a virtual terminal device. After the video networking server receives the terminal identifier of the virtual terminal, on the one hand, a video networking data channel corresponding to the terminal identifier, such as channel 1 in fig. 5, can be established between the video networking server and the virtual terminal; on the other hand, the terminal identifier can be returned to the video networking terminal. After receiving the terminal identifier, the video networking terminal may establish a video networking data channel corresponding to the identifier, such as channel 2 in fig. 5, between itself and the video networking server, thus completing the video networking data channel between the video networking terminal and the virtual terminal in the on-demand server.
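The signaling sequence of steps 602-603 can be modeled as a toy exchange between the two servers. All class names, identifier formats, and channel labels here are illustrative assumptions; only the ordering of operations (forward request, create virtual terminal, return its identifier, establish the per-identifier channels) follows the description.

```python
class OnDemandServer:
    """Toy model: creates a virtual terminal per request and returns its id."""
    def __init__(self):
        self._next = 0
        self.virtual_terminals = {}

    def handle_request(self, vod_request):
        self._next += 1
        vt_id = f"vt-{self._next}"              # identifier format is made up
        self.virtual_terminals[vt_id] = vod_request
        return vt_id

class VideoNetworkingServer:
    """Toy model: forwards the request (channel B), records the server-side
    data channel (channel 1), and returns the id to the terminal."""
    def __init__(self, on_demand_server):
        self.on_demand_server = on_demand_server
        self.channels = {}                       # terminal-id -> data channel

    def forward_request(self, vod_request):
        vt_id = self.on_demand_server.handle_request(vod_request)
        self.channels[vt_id] = "channel-1"       # server <-> virtual terminal
        return vt_id                             # returned to the terminal

server = VideoNetworkingServer(OnDemandServer())
vt = server.forward_request({"video_id": "vid-42"})
terminal_channels = {vt: "channel-2"}            # terminal <-> server side
assert vt in server.channels and vt in terminal_channels
```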
And step 604, receiving video data through the video networking data channel, wherein the video data is sent by the virtual terminal according to the video on demand request, and the video data comprises audio data and image data.
The virtual terminal may then acquire the video data corresponding to the video-on-demand request and send it to the video networking terminal through the video networking data channel, so that the video networking terminal receives the video data through that channel; the video data may include audio data and image data.
Step 605, determining a synchronous reference clock according to the audio sampling rate of the audio data.
And step 606, synchronously playing the audio data and the image data according to the synchronous reference clock.
After receiving the video data, the video networking terminal may determine a synchronous reference clock and then play the audio data and the image data according to the synchronous reference clock, so as to ensure that sound and image are played synchronously. Because audio data transmission is more stable than image data transmission, the synchronous reference clock may be determined according to the audio sampling rate of the audio data.
In the embodiment of the present invention, the video networking terminal can interact with the on-demand server to acquire the corresponding video data: after acquiring an on-demand instruction, the video networking terminal generates a corresponding video-on-demand request and sends it to the video networking server, so that the video networking server acquires the terminal identifier of a virtual terminal in the on-demand server according to the request; the terminal then receives the terminal identifier returned by the video networking server, establishes a video networking data channel with the virtual terminal corresponding to the identifier, and receives the video data, including audio data and image data, over that channel. During playback, the video networking terminal determines a synchronous reference clock according to the audio sampling rate of the audio data and plays the audio data and the image data synchronously according to that clock, which effectively solves the problem of image and sound being out of synchronization.
In another embodiment of the present invention, each frame of audio data and each frame of image data may be played according to the playing duration of one frame of audio data, the playing duration of one frame of image data, and a synchronization reference clock, so as to ensure the synchronization of sound and image.
Referring to fig. 7, a flowchart illustrating steps of another data processing method according to an embodiment of the present invention specifically includes the following steps:
Step 701, the video networking terminal acquires a video-on-demand instruction and generates a corresponding video-on-demand request according to the video-on-demand instruction.
In the embodiment of the present invention, a user may use a video networking terminal to watch recorded-and-broadcast video data. After the user performs a selection operation on the video data to be watched, the video networking terminal obtains an on-demand instruction corresponding to the selection operation and then generates a corresponding video-on-demand request according to the instruction. The video-on-demand request is used to indicate which video data to acquire and may include various information, such as terminal information of the video networking terminal (for example, a terminal identifier) and related information of the video data (for example, a video identifier).
Step 702, sending the video-on-demand request to a video networking server, so that the video networking server obtains the terminal identifier of a virtual terminal in the on-demand server according to the video-on-demand request.
Step 703, receiving the terminal identifier returned by the video networking server, and establishing a video networking data channel with the virtual terminal corresponding to the terminal identifier.
In the embodiment of the present invention, the video data may be stored in a database of the on-demand server, or in an independent file server whose video data can be read and written by the video networking server. The video networking terminal may therefore send the video-on-demand request to the video networking server through a signaling channel, such as channel A in fig. 5, and establish a video networking data channel with the on-demand server through the video networking server. After receiving the video-on-demand request, the video networking server may forward it to the on-demand server through another signaling channel, such as channel B in fig. 5. After receiving the video-on-demand request, the on-demand server may check whether the terminal identifier of an available virtual terminal exists; if it exists, the virtual terminal corresponding to that terminal identifier may be created, and if it does not exist, no virtual terminal can be created and no video networking data channel can be established with the video networking terminal. The on-demand server then returns the terminal identifier of the virtual terminal to the video networking server. After the video networking server receives the terminal identifier, on one hand, a video networking data channel corresponding to the terminal identifier, such as channel 1 in fig. 5, may be established between the video networking server and the virtual terminal; on the other hand, the terminal identifier may be returned to the video networking terminal. After receiving the terminal identifier, the video networking terminal may establish a video networking data channel corresponding to the terminal identifier, such as channel 2 in fig. 5, between itself and the video networking server, thereby completing the video networking data channel between the video networking terminal and the virtual terminal in the on-demand server.
Step 704, receiving video data through the video networking data channel, where the video data is sent by the virtual terminal according to the video on demand request, and the video data includes audio data and image data.
In the embodiment of the present invention, after the virtual terminal determines that the video networking data channel between itself and the video networking terminal has been established successfully, it may obtain identification information, such as a video ID, corresponding to the video data from the video-on-demand request, and then search the database or file server for the video data corresponding to that identification information. After the corresponding video data is found, the audio data and the image data in the video data may be compressed in a set format; for example, the audio data is encoded with AAC (Advanced Audio Coding) and the image data is encoded with H.264 (a digital video compression format). The audio code stream corresponding to the compressed audio data and the image code stream corresponding to the compressed image data are then sent to the video networking server through the corresponding video networking data channel, and the video networking server sends the received audio and image code streams to the video networking terminal through the corresponding video networking data channel.
After the video networking terminal receives the code streams, the audio code stream and the image code stream may be placed in a video buffer, from which an audio decoder subsequently reads the audio data and an image decoder reads the image data.
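As a rough illustration of this receive-side buffering (a sketch, not the patent's implementation), the audio and image code streams can be held in separate thread-safe buffers for the two decoders to read; the packet-type dispatch interface and all names here are assumptions for illustration:

```python
import queue

# Buffers for the received code streams; each decoder reads from its own queue.
audio_buffer = queue.Queue()   # holds encoded audio frames (e.g. AAC)
image_buffer = queue.Queue()   # holds encoded image frames (e.g. H.264)

def on_stream_packet(packet_type, payload):
    """Dispatch a received code-stream packet into the matching decoder buffer."""
    if packet_type == "audio":
        audio_buffer.put(payload)
    elif packet_type == "image":
        image_buffer.put(payload)
    else:
        raise ValueError(f"unknown packet type: {packet_type}")
```

`queue.Queue` is thread-safe, so a receiving thread can fill the buffers while the decoder threads drain them.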
Step 705, determining an audio playing duration for playing a frame of audio data according to the audio sampling rate of the audio data.
Step 706, determining a reference time interval according to a decoding time length and an audio playing time length corresponding to a frame of audio data.
Step 707, generating the synchronous reference clock according to the reference time interval.
In the embodiment of the present invention, the synchronous reference clock may be determined before the audio decoder and the image decoder read data from the buffer. The audio playing duration of one frame of audio data may be determined according to the audio sampling rate of the audio data; for example, if the audio sampling rate of the on-demand server is 44100 Hz, the playing duration of one AAC audio frame is 1024 × 1000 / 44100 ≈ 23.21 ms. A reference time interval is then determined according to the decoding duration and the audio playing duration of one frame of audio data, where the sum of the two may be used as the reference time interval; for example, if the decoding duration of the video networking terminal is 7 ms, the reference time interval may be approximately 30 ms. The synchronous reference clock is generated according to the reference time interval, where the time interval between two adjacent synchronous reference times in the clock is the reference time interval; for example, if the reference time interval is 30 ms, the first synchronous reference time of the synchronous reference clock is 0 ms, the second is 30 ms, the third is 60 ms, and so on.
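The clock construction in steps 705-707 can be sketched in Python as follows; the function names are illustrative, and the 1024-samples-per-AAC-frame constant and the 44100 Hz / 7 ms figures come from the example in the text:

```python
AAC_SAMPLES_PER_FRAME = 1024  # one AAC frame carries 1024 PCM samples

def audio_frame_duration_ms(sample_rate_hz):
    """Playing duration of one AAC audio frame, in milliseconds."""
    return AAC_SAMPLES_PER_FRAME * 1000 / sample_rate_hz  # 44100 Hz -> ~23.21 ms

def reference_interval_ms(sample_rate_hz, decode_ms):
    """Reference time interval = decoding duration + audio playing duration."""
    return decode_ms + audio_frame_duration_ms(sample_rate_hz)  # 7 ms -> ~30.21 ms

def sync_reference_times(interval_ms, count):
    """First `count` synchronous reference times: 0, T, 2T, ..."""
    return [i * interval_ms for i in range(count)]
```

With the text's figures, the interval of ~30.21 ms is rounded to 30 ms, giving reference times 0 ms, 30 ms, 60 ms, and so on.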
Step 708, determining an image playing duration for playing a frame of image data according to the image sampling rate of the image data.
And step 709, synchronously playing the audio data and the image data according to the synchronous reference clock and the image playing duration.
The audio decoder and the image decoder may then read the corresponding data according to the synchronous reference clock. At the first synchronous reference time, the audio decoder acquires the first frame of audio data from the buffer, decodes it, and sends the decoded audio data to an audio playing module, such as a sound card, which then plays the first frame of audio data. At the same synchronous reference time, the image decoder may also read the first frame of image data from the buffer, decode it, and send the decoded image data to an image playing module, such as a video card, which then plays the first frame of image data.
In the subsequent process, after the audio decoder determines that the audio playing module has played one frame of audio data, it may read the next frame of audio data from the buffer. When the audio decoder starts to read the next frame of audio data, whether the image decoder is allowed to read the next frame of image data can be determined according to the corresponding synchronous reference time and the image playing duration of one frame of image data. Therefore, the embodiment of the present invention may first determine the image playing duration of one frame of image data according to the image sampling rate of the image data; for example, at a frame rate of 25 fps, the playing duration of each frame is 1000 / 25 = 40 ms. The audio data and the image data are then played synchronously according to the synchronous reference clock and the image playing duration; that is, whether to read the next frame of image data is judged according to the synchronous reference clock and the image playing duration, so as to ensure that the audio data and the image data are played synchronously.
The synchronous playing of the audio data and the image data according to the synchronous reference clock and the image playing duration may specifically be implemented by substeps 91-94:
Substep 91, when the audio decoder reads the Nth frame of audio data, determining the synchronous reference time according to the synchronous reference clock.
Substep 92, judging whether the Mth frame of image data is allowed to be read according to the synchronous reference time and the image playing duration.
Substep 93, if it is determined that reading is allowed, calling the image decoder to read the Mth frame of image data.
Substep 94, if it is determined that reading is not allowed, performing the step of determining the synchronous reference time according to the synchronous reference clock when the audio decoder reads the (N+1)th frame of audio data.
In the subsequent playing process, when the audio decoder reads the Nth frame of audio data, the synchronous reference time on the synchronous reference clock corresponding to that moment can be determined, and whether the Mth frame of image data is allowed to be read is then judged according to the synchronous reference time and the image playing duration, where N and M are integers greater than 1. On one hand, the audio decoder continues to read and decode the Nth frame of audio data and sends the decoded audio data to the audio playing device for playing; on the other hand, when it is determined that the Mth frame of image data is allowed to be read, the image decoder acquires the Mth frame of image data from the buffer, decodes it, and sends the decoded frame to the image player, which plays it after the (M-1)th frame has finished playing. When it is determined that reading the Mth frame of image data is not allowed, the image decoder does not read it; instead, whether the Mth frame of image data is allowed to be read is judged again when the audio decoder reads the (N+1)th frame of audio data. Playing the audio data and the image data synchronously in this way ensures the synchronization of sound and image.
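The per-frame process described above can be sketched as a simplified single-threaded loop; real decoders would run concurrently, and all names, the 30 ms / 40 ms / 15 ms figures, and the callback interface are illustrative assumptions:

```python
def playback_loop(audio_frames, image_frames, interval_ms, image_duration_ms,
                  threshold_ms, play_audio, play_image):
    """Drive playback from the audio side: reading the Nth audio frame fixes the
    synchronous reference time at (N-1) reference intervals; the Mth image frame
    is read only when M * image_duration - sync_time is below the threshold."""
    m = 1  # index (1-based) of the next image frame to read
    for n, audio_frame in enumerate(audio_frames, start=1):
        sync_ref_ms = (n - 1) * interval_ms  # synchronous reference time
        play_audio(audio_frame)
        # Read any image frames that have become due at this reference time.
        while (m <= len(image_frames)
               and m * image_duration_ms - sync_ref_ms < threshold_ms):
            play_image(image_frames[m - 1])
            m += 1
```

With 30 ms audio intervals, 40 ms image frames, and a 15 ms threshold, the image frames naturally interleave behind the audio frames that pace them.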
Whether the Mth frame of image data is allowed to be read can be judged by the following substeps:
Substep S1, calculating the product of the image playing duration and M, and determining the difference between the product and the synchronous reference time;
Substep S2, judging whether the difference is smaller than a threshold;
Substep S3, if the difference is smaller than the threshold, allowing the Mth frame of image data to be read;
Substep S4, if the difference is not smaller than the threshold, not allowing the Mth frame of image data to be read.
The threshold can be set according to requirements.
In the embodiment of the present invention, the video networking terminal can interact with the on-demand server to acquire the corresponding video data: after acquiring an on-demand instruction, the video networking terminal generates a corresponding video-on-demand request and sends it to the video networking server, so that the video networking server acquires the terminal identifier of a virtual terminal in the on-demand server according to the request; the terminal then receives the terminal identifier returned by the video networking server, establishes a video networking data channel with the virtual terminal corresponding to the identifier, and receives the video data, including audio data and image data, over that channel. During playback, the video networking terminal determines a synchronous reference clock according to the audio sampling rate of the audio data and plays the audio data and the image data synchronously according to that clock, which effectively solves the problem of image and sound being out of synchronization.
Secondly, in the embodiment of the present invention, when the audio data and the image data are played synchronously according to the synchronous reference clock and the image playing duration, the synchronous reference time is determined according to the synchronous reference clock when the audio decoder reads the Nth frame of audio data, and whether the Mth frame of image data is allowed to be read is judged according to the synchronous reference time and the image playing duration. If reading is allowed, the image decoder is called to read the Mth frame of image data; if reading is not allowed, the step of determining the synchronous reference time according to the synchronous reference clock is performed when the audio decoder reads the (N+1)th frame of audio data. This further ensures the synchronous playing of the audio data and the image data.
It should be noted that, for simplicity of description, the method embodiments are described as a series of acts or combination of acts, but those skilled in the art will recognize that the present invention is not limited by the illustrated order of acts, as some steps may occur in other orders or concurrently in accordance with the embodiments of the present invention. Further, those skilled in the art will appreciate that the embodiments described in the specification are presently preferred and that no particular act is required to implement the invention.
The embodiment of the invention also provides a data processing device, which is applied to the video networking terminal in the video networking video system, wherein the video networking video system comprises the video networking terminal, the video networking server and the on-demand server.
Referring to fig. 8, a block diagram of a data processing apparatus according to an embodiment of the present invention is shown, which may specifically include: a request generating module 801, a request sending module 802, a channel establishing module 803, a data receiving module 804, a clock determining module 805 and a synchronous playing module 806.
A request generating module 801, configured to obtain a video-on-demand instruction, and generate a corresponding video-on-demand request according to the video-on-demand instruction;
a request sending module 802, configured to send the video-on-demand request to a video networking server, so that the video networking server obtains the terminal identifier of a virtual terminal in the on-demand server according to the video-on-demand request;
a channel establishing module 803, configured to receive the terminal identifier returned by the video networking server, and establish a video networking data channel with the virtual terminal corresponding to the terminal identifier;
a data receiving module 804, configured to receive video data through the video networking data channel, where the video data is sent by the virtual terminal according to the video-on-demand request, and the video data includes audio data and image data;
a clock determining module 805, configured to determine a synchronous reference clock according to an audio sampling rate of the audio data;
and a synchronous playing module 806, configured to synchronously play the audio data and the image data according to the synchronous reference clock.
Referring to FIG. 9, a block diagram of another data processing apparatus embodiment of the present invention is shown.
In an optional embodiment of the present invention, the clock determining module 805 is specifically configured to determine an audio playing duration for playing a frame of audio data according to an audio sampling rate of the audio data; determining a reference time interval according to the decoding time length and the audio playing time length corresponding to one frame of audio data; and generating the synchronous reference clock according to the reference time interval.
In an optional embodiment of the present invention, the synchronized playing module 806 includes: a time length determining sub-module 8061 and a data synchronous playing sub-module 8062, wherein,
a time length determining submodule 8061, configured to determine an image playing time length for playing one frame of image data according to the image sampling rate of the image data;
and a data synchronous playing submodule 8062, configured to synchronously play the audio data and the image data according to the synchronous reference clock and the image playing time length.
The data synchronous playing submodule 8062 is configured to determine the synchronous reference time according to the synchronous reference clock when the audio decoder reads the Nth frame of audio data; judge whether the Mth frame of image data is allowed to be read according to the synchronous reference time and the image playing duration; if it is determined that reading is allowed, call the image decoder to read the Mth frame of image data; and if it is determined that reading is not allowed, perform the step of determining the synchronous reference time according to the synchronous reference clock when the audio decoder reads the (N+1)th frame of audio data; where N and M are integers greater than 1.
The data synchronous playing submodule 8062 is further configured to calculate the product of the image playing duration and M, and determine the difference between the product and the synchronous reference time; judge whether the difference is smaller than a threshold; if the difference is smaller than the threshold, allow the Mth frame of image data to be read; and if the difference is not smaller than the threshold, not allow the Mth frame of image data to be read.
In the embodiment of the present invention, the video networking terminal can interact with the on-demand server to acquire the corresponding video data: after acquiring an on-demand instruction, the video networking terminal generates a corresponding video-on-demand request and sends it to the video networking server, so that the video networking server acquires the terminal identifier of a virtual terminal in the on-demand server according to the request; the terminal then receives the terminal identifier returned by the video networking server, establishes a video networking data channel with the virtual terminal corresponding to the identifier, and receives the video data, including audio data and image data, over that channel. During playback, the video networking terminal determines a synchronous reference clock according to the audio sampling rate of the audio data and plays the audio data and the image data synchronously according to that clock, which effectively solves the problem of image and sound being out of synchronization.
For the device embodiment, since it is basically similar to the method embodiment, the description is simple, and for the relevant points, refer to the partial description of the method embodiment.
The embodiments in the present specification are described in a progressive manner, each embodiment focuses on differences from other embodiments, and the same and similar parts among the embodiments are referred to each other.
As will be appreciated by one skilled in the art, embodiments of the present invention may be provided as a method, apparatus, or computer program product. Accordingly, embodiments of the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, embodiments of the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
Embodiments of the present invention are described with reference to flowchart illustrations and/or block diagrams of methods, terminal devices (systems), and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing terminal to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing terminal, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing terminal to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing terminal to cause a series of operational steps to be performed on the computer or other programmable terminal to produce a computer implemented process such that the instructions which execute on the computer or other programmable terminal provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
While preferred embodiments of the present invention have been described, additional variations and modifications of these embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. Therefore, it is intended that the appended claims be interpreted as including preferred embodiments and all such alterations and modifications as fall within the scope of the embodiments of the invention.
Finally, it should also be noted that, herein, relational terms such as first and second, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or terminal that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or terminal. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other like elements in a process, method, article, or terminal that comprises the element.
The data processing method and the data processing apparatus provided by the present invention are described in detail above, and the principle and the implementation of the present invention are explained in the present document by applying specific examples, and the description of the above embodiments is only used to help understanding the method and the core idea of the present invention; meanwhile, for a person skilled in the art, according to the idea of the present invention, there may be variations in the specific embodiments and the application scope, and in summary, the content of the present specification should not be construed as a limitation to the present invention.

Claims (10)

1. A data processing method is characterized in that the method is applied to a video network video system, the video network video system comprises a video network terminal, a video network server and an on-demand server, and the method comprises the following steps:
a video network terminal acquires a video-on-demand instruction and generates a corresponding video-on-demand request according to the video-on-demand instruction;
sending the video-on-demand request to a video networking server, so that the video networking server acquires a terminal identifier of a virtual terminal in an on-demand server according to the video-on-demand request;
receiving the terminal identifier returned by the video networking server, and establishing a video networking data channel with the virtual terminal corresponding to the terminal identifier;
receiving video data through the video networking data channel, wherein the video data are sent by the virtual terminal according to the video on demand request, and the video data comprise audio data and image data;
determining a synchronous reference clock according to the audio sampling rate of the audio data;
and synchronously playing the audio data and the image data according to the synchronous reference clock.
2. The method of claim 1, wherein the step of determining a synchronous reference clock based on the audio sampling rate of the audio data comprises:
determining the audio playing time length for playing a frame of audio data according to the audio sampling rate of the audio data;
determining a reference time interval according to the decoding time length and the audio playing time length corresponding to one frame of audio data;
and generating the synchronous reference clock according to the reference time interval.
3. The method of claim 1, wherein said synchronously playing the audio data and the image data according to the synchronous reference clock comprises:
determining the image playing time length for playing one frame of image data according to the image sampling rate of the image data;
and synchronously playing the audio data and the image data according to the synchronous reference clock and the image playing time length.
4. The method of claim 3, wherein the step of synchronously playing the audio data and the image data according to the synchronous reference clock and the image playing duration comprises:
when an audio decoder reads the Nth frame of audio data, determining a synchronous reference time according to the synchronous reference clock;
judging whether the Mth frame of image data is allowed to be read according to the synchronous reference time and the image playing duration;
if it is determined that the Mth frame of image data is allowed to be read, calling an image decoder to read the Mth frame of image data;
if it is determined that the Mth frame of image data is not allowed to be read, performing the step of determining the synchronous reference time according to the synchronous reference clock when the audio decoder reads the (N+1)th frame of audio data;
wherein N and M are integers greater than 1.
5. The method according to claim 4, wherein the judging whether the Mth frame of image data is allowed to be read according to the synchronous reference time and the image playing duration comprises:
calculating the product of the image playing time length and M, and determining the difference value of the product and the synchronous reference time;
judging whether the difference value is smaller than a threshold value;
if the difference is smaller than the threshold, allowing the Mth frame of image data to be read;
and if the difference is not smaller than the threshold, not allowing the Mth frame of image data to be read.
6. A data processing apparatus, applied to a video networking terminal in a video networking video system, wherein the video networking video system comprises the video networking terminal, a video networking server, and an on-demand server; the apparatus comprises:
a request generation module, configured to acquire a video-on-demand instruction and generate a corresponding video-on-demand request according to the video-on-demand instruction;
a request sending module, configured to send the video-on-demand request to the video networking server, so that the video networking server obtains, according to the video-on-demand request, a terminal identifier of a virtual terminal in the on-demand server;
a channel establishing module, configured to receive the terminal identifier returned by the video networking server and establish a video networking data channel with the virtual terminal corresponding to the terminal identifier;
a data receiving module, configured to receive video data through the video networking data channel, the video data being sent by the virtual terminal according to the video-on-demand request and comprising audio data and image data;
a clock determining module, configured to determine a synchronous reference clock according to an audio sampling rate of the audio data;
and a synchronous playing module, configured to synchronously play the audio data and the image data according to the synchronous reference clock.
7. The apparatus of claim 6, wherein
the clock determining module is specifically configured to: determine, according to the audio sampling rate of the audio data, an audio playing duration for playing one frame of audio data; determine a reference time interval according to the decoding duration corresponding to one frame of audio data and the audio playing duration; and generate the synchronous reference clock according to the reference time interval.
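A minimal sketch of the clock derivation in claim 7, under stated assumptions: the function name and units are hypothetical, and 1024 samples per audio frame (typical of AAC) is only an illustrative value, not something the patent specifies:

```python
def reference_interval_ms(sample_rate_hz, samples_per_frame, decode_ms):
    """Reference time interval driving the synchronous reference clock (claim 7).

    The audio playing duration of one frame follows directly from the audio
    sampling rate; adding the per-frame decoding duration yields the tick
    interval at which the synchronous reference clock advances.
    """
    audio_play_ms = samples_per_frame / sample_rate_hz * 1000.0
    return decode_ms + audio_play_ms
```

For example, at 48 kHz with 1024-sample frames and a 2 ms decode time, the clock would tick roughly every 23.3 ms.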
8. The apparatus of claim 6, wherein the synchronous playing module comprises:
a duration determining submodule, configured to determine, according to an image sampling rate of the image data, an image playing duration for playing one frame of image data;
and a data synchronous playing submodule, configured to synchronously play the audio data and the image data according to the synchronous reference clock and the image playing duration.
9. The apparatus of claim 8, wherein
the data synchronous playing submodule is configured to: when an audio decoder reads the Nth frame of audio data, determine a synchronous reference time according to the synchronous reference clock; judge, according to the synchronous reference time and the image playing duration, whether the Mth frame of image data is allowed to be read; if the Mth frame of image data is determined to be allowed to be read, call an image decoder to read the Mth frame of image data; if the Mth frame of image data is determined not to be allowed to be read, execute, when the audio decoder reads the (N+1)th frame of audio data, the step of determining the synchronous reference time according to the synchronous reference clock; wherein N and M are integers greater than 1.
10. The apparatus of claim 9, wherein
the data synchronous playing submodule is configured to: calculate the product of the image playing duration and M, and determine the difference between the product and the synchronous reference time; judge whether the difference is smaller than a threshold; if the difference is smaller than the threshold, allow the Mth frame of image data to be read; and if the difference is not smaller than the threshold, not allow the Mth frame of image data to be read.
CN201810870810.0A 2018-08-02 2018-08-02 Data processing method and device Pending CN110798725A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810870810.0A CN110798725A (en) 2018-08-02 2018-08-02 Data processing method and device

Publications (1)

Publication Number Publication Date
CN110798725A 2020-02-14

Family

ID=69425833

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810870810.0A Pending CN110798725A (en) 2018-08-02 2018-08-02 Data processing method and device

Country Status (1)

Country Link
CN (1) CN110798725A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112202644A (en) * 2020-10-12 2021-01-08 中国人民解放军国防科技大学 Collaborative network measurement method and system oriented to hybrid programmable network environment

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1284817A (en) * 2000-05-13 2001-02-21 深圳市天圣电脑有限公司 Transmission system and method for web site video request
US20060277316A1 (en) * 2005-05-12 2006-12-07 Yunchuan Wang Internet protocol television
CN102196302A (en) * 2011-05-19 2011-09-21 广东星海数字家庭产业技术研究院有限公司 Digital television middleware-based video-on-demand method and system
CN103905879A (en) * 2014-03-13 2014-07-02 北京奇艺世纪科技有限公司 Video data and audio data synchronized playing method and device and equipment
CN106550282A (en) * 2015-09-17 2017-03-29 北京视联动力国际信息技术有限公司 A kind of player method and system of video data
CN108307212A (en) * 2017-01-11 2018-07-20 北京视联动力国际信息技术有限公司 A kind of file order method and device

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Liu Lixia, Bian Jinsong, Zhang Li, Mu Sen: "Implementation of Audio and Video Synchronization Based on FFMPEG Decoding", Computer Engineering and Design *

Similar Documents

Publication Publication Date Title
CN109788314B (en) Method and device for transmitting video stream data
CN111107299A (en) Method and device for synthesizing multi-channel video
CN110324580B (en) Monitoring video playing method and device based on video network
CN110769310B (en) Video processing method and device based on video network
CN110049273B (en) Video networking-based conference recording method and transfer server
CN108965930B (en) Video data processing method and device
CN111147859A (en) Video processing method and device
CN111131760B (en) Video recording method and device
CN110149305B (en) Video network-based multi-party audio and video playing method and transfer server
CN110769179B (en) Audio and video data stream processing method and system
CN111131743A (en) Video call method and device based on browser, electronic equipment and storage medium
CN109743284B (en) Video processing method and system based on video network
CN110769297A (en) Audio and video data processing method and system
CN110611639A (en) Audio data processing method and device for streaming media conference
CN110392275B (en) Sharing method and device for manuscript demonstration and video networking soft terminal
CN109640016B (en) Method and device for realizing rapid recording in video networking conference
CN111654659A (en) Conference control method and device
CN111447396A (en) Audio and video transmission method and device, electronic equipment and storage medium
CN111246153A (en) Video conference establishing method and device, electronic equipment and readable storage medium
CN111131788A (en) Monitoring resource state detection method and device and computer readable storage medium
CN110798725A (en) Data processing method and device
CN110795008B (en) Picture transmission method and device and computer readable storage medium
CN110536148B (en) Live broadcasting method and equipment based on video networking
CN110087020B (en) Method and system for realizing video networking conference by iOS equipment
CN110691214B (en) Data processing method and device for business object

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20200214