CN110769179A - Audio and video data stream processing method and system

Info

Publication number
CN110769179A
Authority
CN
China
Prior art keywords
audio
video data
video
data stream
original
Prior art date
Legal status
Granted
Application number
CN201810828920.0A
Other languages
Chinese (zh)
Other versions
CN110769179B (en)
Inventor
沈世国
杨乌拉
庞晓强
王艳辉
Current Assignee
Visionvera Information Technology Co Ltd
Original Assignee
Visionvera Information Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Visionvera Information Technology Co Ltd
Priority to CN201810828920.0A
Publication of CN110769179A
Application granted
Publication of CN110769179B
Legal status: Active
Anticipated expiration

Classifications

    • H04N21/4307 Synchronising the rendering of multiple content streams or additional data on devices, e.g. synchronisation of audio on a mobile phone with the video output on the TV screen
    • H04N21/439 Processing of audio elementary streams
    • H04N21/44 Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs
    • H04N21/8547 Content authoring involving timestamps for synchronizing content
    • H04N5/9265 Transformation of the television signal for recording, e.g. modulation, frequency changing; inverse transformation for playback by pulse code modulation with processing of the sound signal

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Computer Security & Cryptography (AREA)
  • Data Exchanges In Wide-Area Networks (AREA)

Abstract

The embodiment of the invention provides a method and a system for processing audio and video data streams. The method comprises the following steps: a video networking terminal sends request information for an audio and video data stream to an Ethernet server; the video networking terminal receives the audio and video data stream returned by the Ethernet server according to the number of the audio and video data stream carried in the request information; the video networking terminal decodes the video data stream in the audio and video data stream through a built-in first application programming interface to obtain original video data, and decodes the audio data stream in the audio and video data stream through a built-in second application programming interface to obtain original audio data; and the video networking terminal synchronously plays the original video data and the original audio data according to the time stamps in the original video data and the time stamps in the original audio data. The embodiment of the invention thus enables a video networking terminal in the video network to play an audio and video data stream collected by an Ethernet terminal.

Description

Audio and video data stream processing method and system
Technical Field
The invention relates to the technical field of video networking, and in particular to a method for processing audio and video data streams and a system for processing audio and video data streams.
Background
The video network is a dedicated network, built on Ethernet hardware with a dedicated protocol, for high-speed transmission of high-definition video; it is a more advanced form of the Internet and a real-time network.
Currently, an Ethernet terminal in the Ethernet can acquire an audio and video data stream in real time, transmit the acquired stream to other Ethernet terminals in the Ethernet, and have it displayed on those terminals; this is the live broadcast function of the Ethernet terminal. However, the Ethernet terminal cannot transmit the audio and video data stream acquired in real time into the video network so that the stream is displayed on a video networking terminal in the video network.
Disclosure of Invention
In view of the above problems, embodiments of the present invention are proposed to provide a method for processing an audio and video data stream, and a corresponding system for processing an audio and video data stream, that overcome or at least partially solve the above problems.
In order to solve the above problems, an embodiment of the present invention discloses a method for processing audio and video data streams. The method is applied to a video network and an Ethernet, where the video network includes a video networking terminal, the Ethernet includes an Ethernet server and an Ethernet terminal, and the Ethernet server is connected to the video networking terminal and the Ethernet terminal respectively. The method comprises the following steps: the video networking terminal sends request information for an audio and video data stream to the Ethernet server, the request information including the number of the audio and video data stream; the video networking terminal receives the audio and video data stream returned by the Ethernet server according to the number of the audio and video data stream in the request information, where the audio and video data stream originates from the Ethernet terminal; the video networking terminal decodes the video data stream in the audio and video data stream through a built-in first application programming interface to obtain original video data, and decodes the audio data stream in the audio and video data stream through a built-in second application programming interface to obtain original audio data; and the video networking terminal synchronously plays the original video data and the original audio data according to the time stamps in the original video data and the time stamps in the original audio data.
Optionally, the step in which the video networking terminal decodes the video data stream in the audio and video data stream through a built-in first application programming interface to obtain original video data, and decodes the audio data stream in the audio and video data stream through a built-in second application programming interface to obtain original audio data, includes: the video networking terminal decodes the video data stream in H.264 format in the audio and video data stream through the built-in first application programming interface to obtain original video data in the luminance and chrominance (YUV) format, and decodes the audio data stream in advanced audio coding (AAC) format in the audio and video data stream through the built-in second application programming interface to obtain original audio data in pulse code modulation (PCM) format.
Optionally, before the video networking terminal decodes the video data stream in H.264 format in the audio and video data stream through the built-in first application programming interface to obtain original video data in YUV format, and decodes the audio data stream in AAC format in the audio and video data stream through the built-in second application programming interface to obtain original audio data in PCM format, the method further includes: the video networking terminal demultiplexes the audio and video data stream to obtain the video data stream in H.264 format and the audio data stream in AAC format.
Optionally, the step in which the video networking terminal synchronously plays the original video data and the original audio data according to the time stamps in the original video data and the time stamps in the original audio data includes: the video networking terminal determines timestamp pairs having a corresponding relationship according to the time stamps in the original video data and the time stamps in the original audio data; the video networking terminal determines, according to each timestamp pair, an original video data frame and an original audio data frame that have a synchronous relationship; and the video networking terminal renders the original video data frame of each synchronized pair through a built-in graphics program interface, and plays the original audio data frame of each synchronized pair through a built-in sound program interface.
Optionally, before the video networking terminal sends the request information for the audio and video data stream to the Ethernet server, the method further includes: the video networking terminal establishes a network connection with the Ethernet server and maintains a long connection with the Ethernet server through a socket interface.
The embodiment of the invention also discloses a system for processing audio and video data streams. The system is applied to a video network and an Ethernet, where the video network includes a video networking terminal, the Ethernet includes an Ethernet server and an Ethernet terminal, and the Ethernet server is connected to the video networking terminal and the Ethernet terminal respectively. The video networking terminal includes: a request module, configured to send request information for an audio and video data stream to the Ethernet server, the request information including the number of the audio and video data stream; a receiving module, configured to receive the audio and video data stream returned by the Ethernet server according to the number of the audio and video data stream in the request information, where the audio and video data stream originates from the Ethernet terminal; a decoding module, configured to decode the video data stream in the audio and video data stream through a built-in first application programming interface to obtain original video data, and decode the audio data stream in the audio and video data stream through a built-in second application programming interface to obtain original audio data; and a playing module, configured to synchronously play the original video data and the original audio data according to the time stamps in the original video data and the time stamps in the original audio data.
Optionally, the decoding module is configured to decode the video data stream in H.264 format in the audio and video data stream through the built-in first application programming interface to obtain original video data in the luminance and chrominance (YUV) format, and to decode the audio data stream in advanced audio coding (AAC) format in the audio and video data stream through the built-in second application programming interface to obtain original audio data in pulse code modulation (PCM) format.
Optionally, the video networking terminal further includes: a demultiplexing module, configured to, before the decoding module decodes the video data stream in H.264 format to obtain the original video data in YUV format and decodes the audio data stream in AAC format to obtain the original audio data in PCM format, demultiplex the audio and video data stream to obtain the video data stream in H.264 format and the audio data stream in AAC format.
Optionally, the playing module includes: a timestamp pair determining module, configured to determine timestamp pairs having a corresponding relationship according to the time stamps in the original video data and the time stamps in the original audio data; a synchronous data determining module, configured to determine, according to each timestamp pair, an original video data frame and an original audio data frame that have a synchronous relationship; and a synchronous data playing module, configured to render the original video data frame of each synchronized pair through a built-in graphics program interface, and to play the original audio data frame of each synchronized pair through a built-in sound program interface.
Optionally, the video networking terminal further includes: a connection module, configured to establish a network connection with the Ethernet server before the request module sends the request information for the audio and video data stream to the Ethernet server, and to maintain a long connection with the Ethernet server through a socket interface.
The embodiment of the invention has the following advantages:
the embodiment of the invention is applied to the video network and the Ethernet, wherein the video network comprises a video network terminal, the Ethernet comprises an Ethernet server and an Ethernet terminal, and the Ethernet server is respectively connected with the video network terminal and the Ethernet terminal.
In the embodiment of the invention, the video networking terminal sends request information for an audio and video data stream to the Ethernet server, the request information including the number of the audio and video data stream to be played. The video networking terminal receives the audio and video data stream returned by the Ethernet server according to the number in the request information; the stream originates from the Ethernet terminal, which acquires it in the Ethernet. The video networking terminal decodes the video data stream in the audio and video data stream through a built-in first application programming interface to obtain original video data, and decodes the audio data stream in the audio and video data stream through a built-in second application programming interface to obtain original audio data. The video networking terminal then synchronously plays the original video data and the original audio data according to the time stamps in the original video data and the time stamps in the original audio data.
The embodiment of the invention combines the characteristics of the Ethernet and the characteristics of the video network: the video networking terminal in the video network requests playback of the audio and video data stream collected by the Ethernet terminal by sending request information for the audio and video data stream to the Ethernet server, the Ethernet server returns the audio and video data stream collected by the Ethernet terminal to the video networking terminal according to the number of the audio and video data stream in the request information, and the video networking terminal decodes and plays the stream. In this way, a video networking terminal in the video network can play the audio and video data stream from an Ethernet terminal.
Drawings
FIG. 1 is a schematic networking diagram of a video network of the present invention;
FIG. 2 is a schematic diagram of a hardware structure of a node server according to the present invention;
FIG. 3 is a schematic diagram of a hardware structure of an access switch of the present invention;
FIG. 4 is a schematic diagram of a hardware structure of an Ethernet protocol conversion gateway according to the present invention;
FIG. 5 is a flowchart illustrating steps of an embodiment of a method for processing audio and video data streams according to the present invention;
FIG. 6 is a flowchart illustrating a method for implementing a live broadcast function of an IOS device based on video networking according to the present invention;
FIG. 7 is a block diagram of an embodiment of the audio and video data stream processing system according to the present invention.
Detailed Description
In order to make the aforementioned objects, features and advantages of the present invention comprehensible, embodiments accompanied with figures are described in further detail below.
The video network is an important milestone in network development. It is a real-time network that can realize real-time transmission of high-definition video, and it pushes many Internet applications toward high-definition video and high-definition face-to-face interaction.
The video network adopts real-time high-definition video switching technology and can integrate dozens of required services, such as video, voice, pictures, text, communication and data, on one network platform and system platform, for example high-definition video conferencing, video surveillance, intelligent monitoring and analysis, emergency command, digital broadcast television, time-shifted television, network teaching, live broadcast, VOD on demand, television mail, personal video recording (PVR), intranet (self-operated) channels, intelligent video broadcast control and information distribution, and realizes high-definition-quality video playback through a television or a computer.
To better understand the embodiments of the present invention, the video network is introduced below.
some of the technologies applied in the video networking are as follows:
network Technology (Network Technology)
The network technology innovation of video networking improves on traditional Ethernet to cope with the potentially enormous video traffic on the network. Unlike pure network packet switching or network circuit switching, video networking technology uses packet switching to meet the demands of streaming media (streaming here means continuous broadcasting: a data transmission technique that turns received data into a stable, continuous stream and sends it out continuously, so that the sound heard or the image seen by the user is smooth, and the user can start browsing on the screen before the whole file has been transmitted). The video networking technology has the flexibility, simplicity and low cost of packet switching as well as the quality and security guarantees of circuit switching, achieving seamless integration of whole-network switched virtual circuits and the data format.
Switching Technology (Switching Technology)
The video network adopts the two advantages of Ethernet, asynchrony and packet switching, and eliminates the drawbacks of Ethernet on the premise of full compatibility. It offers end-to-end seamless connectivity across the whole network, reaches the user terminal directly, and directly carries IP data packets. User data requires no format conversion anywhere across the network. The video network is a more advanced form of Ethernet and a real-time switching platform; it can realize whole-network, large-scale, real-time transmission of high-definition video that the existing Internet cannot, and pushes many network video applications toward high definition and unification.
Server Technology (Server Technology)
The server technology of the video networking and unified video platform differs from traditional server technology. Its streaming media transmission is built on a connection-oriented basis, its data processing capability is independent of traffic and communication time, and a single network layer can carry both signaling and data transmission. For voice and video services, streaming media processing on the video networking and unified video platform is much simpler than data processing, and its efficiency is improved by more than a hundred times over a traditional server.
Storage Technology (Storage Technology)
To handle ultra-large-capacity and ultra-high-traffic media content, the ultra-high-speed storage technology of the unified video platform adopts the most advanced real-time operating system. The program information in a server instruction is mapped to a specific hard disk space, and the media content no longer passes through the server but is sent directly and instantly to the user terminal, so the typical user waiting time is less than 0.2 second. The optimized sector layout greatly reduces the mechanical head-seeking movement of the hard disk; resource consumption is only 20% of that of an IP Internet system of the same grade, yet the concurrent throughput generated is 3 times that of a traditional hard disk array, and overall efficiency is improved by more than 10 times.
Network Security Technology (Network Security Technology)
The structural design of the video network eliminates, at the structural level, the network security problems that trouble the Internet, by means such as independent permission control for each service and complete isolation of devices and user data. It generally needs no antivirus programs or firewalls, avoids attacks by hackers and viruses, and provides users with a structurally worry-free secure network.
Service Innovation Technology (Service Innovation Technology)
The unified video platform integrates services with transmission: whether for a single user, a private-network user, or a network aggregate, only one automatic connection is needed. The user terminal, set-top box, or PC connects directly to the unified video platform to obtain a rich variety of multimedia video services. The unified video platform adopts a menu-style configuration table instead of traditional complex application programming, so complex applications can be realized with very little code, enabling unlimited new service innovation.
Networking of the video network is as follows:
the video network is a centralized control network structure, and the network can be a tree network, a star network, a ring network and the like, but on the basis of the centralized control node, the whole network is controlled by the centralized control node in the network.
As shown in fig. 1, the video network is divided into an access network and a metropolitan network.
The devices of the access network part can be mainly classified into 3 types: node server, access switch, terminal (including various set-top boxes, coding boards, memories, etc.). The node server is connected to an access switch, which may be connected to a plurality of terminals and may be connected to an ethernet network.
The node server is a node which plays a centralized control function in the access network and can control the access switch and the terminal. The node server can be directly connected with the access switch or directly connected with the terminal.
Similarly, devices of the metropolitan network portion may also be classified into 3 types: a metropolitan area server, a node switch and a node server. The metro server is connected to a node switch, which may be connected to a plurality of node servers.
The node server here is the same node server as in the access network part; that is, the node server belongs both to the access network part and to the metropolitan area network part.
The metropolitan area server is a node which plays a centralized control function in the metropolitan area network and can control a node switch and a node server. The metropolitan area server can be directly connected with the node switch or directly connected with the node server.
Therefore, the whole video network is a network structure with layered centralized control, and the network controlled by the node server and the metropolitan area server can be in various structures such as tree, star and ring.
The access network part can form a unified video platform (circled part), and a plurality of unified video platforms can form a video network; each unified video platform may be interconnected via metropolitan area and wide area video networking.
1. Video networking device classification
1.1 devices in the video network of the embodiment of the present invention can be mainly classified into 3 types: servers, switches (including ethernet gateways), terminals (including various set-top boxes, code boards, memories, etc.). The video network as a whole can be divided into a metropolitan area network (or national network, global network, etc.) and an access network.
1.2 wherein the devices of the access network part can be mainly classified into 3 types: node servers, access switches (including ethernet gateways), terminals (including various set-top boxes, code boards, memories, etc.).
The specific hardware structure of each access network device is as follows:
a node server:
as shown in fig. 2, the system mainly includes a network interface module 201, a switching engine module 202, a CPU module 203, and a disk array module 204.
Packets coming in from the network interface module 201, the CPU module 203, and the disk array module 204 all enter the switching engine module 202. The switching engine module 202 looks up the incoming packet in the address table 205 to obtain the packet's direction information, and stores the packet in the queue of the corresponding packet buffer 206 according to that direction information; if the queue of the packet buffer 206 is nearly full, the packet is discarded. The switching engine module 202 polls all packet buffer queues and forwards from a queue if the following conditions are met: 1) the port send buffer is not full; 2) the queue's packet counter is greater than zero. The disk array module 204 mainly implements control over the hard disks, including initialization, read-write, and other operations; the CPU module 203 is mainly responsible for protocol processing with the access switches and terminals (not shown in the figure), for configuring the address table 205 (including a downlink protocol packet address table, an uplink protocol packet address table, and a data packet address table), and for configuring the disk array module 204.
The access switch:
as shown in fig. 3, the network interface module (downstream network interface module 301, upstream network interface module 302), the switching engine module 303, and the CPU module 304 are mainly included.
A packet (uplink data) arriving from the downlink network interface module 301 enters the packet detection module 305. The packet detection module 305 checks whether the destination address (DA), source address (SA), packet type, and packet length of the packet meet the requirements; if so, it allocates a corresponding stream identifier (stream-id) and passes the packet to the switching engine module 303, otherwise the packet is discarded. A packet (downlink data) arriving from the uplink network interface module 302 enters the switching engine module 303, as does a data packet coming from the CPU module 304. The switching engine module 303 looks up the incoming packet in the address table 306 to obtain the packet's direction information. If the packet entering the switching engine module 303 is going from a downlink network interface to an uplink network interface, it is stored in the queue of the corresponding packet buffer 307 in association with its stream-id; if that queue is nearly full, the packet is discarded. If the packet entering the switching engine module 303 is not going from a downlink network interface to an uplink network interface, it is stored in the queue of the corresponding packet buffer 307 according to its direction information; if that queue is nearly full, the packet is discarded.
The switching engine module 303 polls all packet buffer queues, which in this embodiment of the present invention is divided into two cases:
if the queue is from the downlink network interface to the uplink network interface, the following conditions are met for forwarding: 1) the port send buffer is not full; 2) the queued packet counter is greater than zero; 3) and obtaining the token generated by the code rate control module.
If the queue is not from the downlink network interface to the uplink network interface, the following conditions are met for forwarding: 1) the port send buffer is not full; 2) the queue packet counter is greater than zero.
The rate control module 308 is configured by the CPU module 304 and, at programmable intervals, generates tokens for all packet buffer queues going from downlink network interfaces to uplink network interfaces, in order to control the rate of uplink forwarding.
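The polling and forwarding conditions described above can be illustrated with a short sketch. This is not the patent's implementation; the type names (PacketQueue, RateControl) and the token accounting are assumptions used only to make the three conditions explicit: the port send buffer must not be full, the queue packet counter must be greater than zero, and a queue running from a downlink network interface to an uplink network interface additionally needs a token from the rate control module.

```swift
// Sketch of the forwarding poll; PacketQueue and RateControl are hypothetical names.
struct PacketQueue {
    var packetCounter: Int          // number of packets waiting in this queue
    var isDownToUp: Bool            // true if the queue goes from a downlink to an uplink interface
    var portSendBufferFull: Bool    // state of the destination port's send buffer
}

final class RateControl {
    private var tokens = 0
    // Tokens are generated at programmable intervals configured by the CPU module.
    func addToken() { tokens += 1 }
    func takeToken() -> Bool {
        guard tokens > 0 else { return false }
        tokens -= 1
        return true
    }
}

func mayForward(from queue: PacketQueue, rateControl: RateControl) -> Bool {
    guard !queue.portSendBufferFull else { return false }  // 1) port send buffer is not full
    guard queue.packetCounter > 0 else { return false }    // 2) queue packet counter is greater than zero
    if queue.isDownToUp {
        return rateControl.takeToken()                     // 3) token obtained from the rate control module
    }
    return true
}
```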
The CPU module 304 is mainly responsible for protocol processing with the node server, configuration of the address table 306, and configuration of the code rate control module 308.
Ethernet protocol conversion gateway
As shown in fig. 4, the apparatus mainly includes a network interface module (a downlink network interface module 401 and an uplink network interface module 402), a switching engine module 403, a CPU module 404, a packet detection module 405, a rate control module 408, an address table 406, a packet buffer 407, a MAC adding module 409, and a MAC deleting module 410.
A data packet arriving from the downlink network interface module 401 enters the packet detection module 405. The packet detection module 405 checks whether the Ethernet MAC DA, Ethernet MAC SA, Ethernet length or frame type, video networking destination address DA, video networking source address SA, video networking packet type, and packet length of the packet meet the requirements; if so, a corresponding stream identifier (stream-id) is allocated, the MAC deletion module 410 strips the Ethernet MAC DA, MAC SA, and length or frame type (2 bytes), and the packet enters the corresponding receive buffer; otherwise, the packet is discarded.
The downlink network interface module 401 monitors the send buffer of its port; if a packet is present, it obtains the Ethernet MAC DA of the corresponding terminal according to the packet's video networking destination address DA, prepends the terminal's Ethernet MAC DA, the MAC SA of the Ethernet protocol conversion gateway, and the Ethernet length or frame type, and sends the packet.
The other modules in the ethernet protocol gateway function similarly to the access switch.
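For illustration only, the MAC deletion and MAC addition steps described above can be sketched as follows. The helper names are assumptions; the sketch simply strips or prepends the 14-byte Ethernet header (MAC DA 6 bytes, MAC SA 6 bytes, length or frame type 2 bytes) around the video networking packet.

```swift
import Foundation

let ethernetHeaderLength = 14   // MAC DA (6) + MAC SA (6) + length or frame type (2)

/// Strip the Ethernet MAC header from an uplink frame, leaving the video networking packet.
func stripMACHeader(from frame: Data) -> Data? {
    guard frame.count > ethernetHeaderLength else { return nil }
    return Data(frame.dropFirst(ethernetHeaderLength))
}

/// Prepend an Ethernet MAC header to a downlink video networking packet.
/// `terminalMAC` is assumed to have been looked up from the packet's video
/// networking destination address DA; `gatewayMAC` is the gateway's own MAC SA.
func addMACHeader(to packet: Data, terminalMAC: Data, gatewayMAC: Data, etherType: UInt16) -> Data {
    var frame = Data()
    frame.append(terminalMAC)                 // Ethernet MAC DA of the terminal
    frame.append(gatewayMAC)                  // MAC SA of the Ethernet protocol conversion gateway
    frame.append(UInt8(etherType >> 8))       // length or frame type, high byte
    frame.append(UInt8(etherType & 0xFF))     // length or frame type, low byte
    frame.append(packet)                      // the video networking packet itself
    return frame
}
```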
A terminal:
the system mainly comprises a network interface module, a service processing module and a CPU module; for example, the set-top box mainly comprises a network interface module, a video and audio coding and decoding engine module and a CPU module; the coding board mainly comprises a network interface module, a video and audio coding engine module and a CPU module; the memory mainly comprises a network interface module, a CPU module and a disk array module.
1.3 devices of the metropolitan area network part can be mainly classified into 3 types: node server, node exchanger, metropolitan area server. The node switch mainly comprises a network interface module, a switching engine module and a CPU module; the metropolitan area server mainly comprises a network interface module, a switching engine module and a CPU module.
2. Video networking packet definition
2.1 Access network packet definition
The data packet of the access network mainly comprises the following parts: destination Address (DA), Source Address (SA), reserved bytes, payload (pdu), CRC.
As shown in the following table, the data packet of the access network mainly includes the following parts:
DA | SA | Reserved | Payload | CRC
the Destination Address (DA) is composed of 8 bytes (byte), the first byte represents the type of the data packet (e.g. various protocol packets, multicast data packets, unicast data packets, etc.), there are at most 256 possibilities, the second byte to the sixth byte are metropolitan area network addresses, and the seventh byte and the eighth byte are access network addresses.
The Source Address (SA) is also composed of 8 bytes (byte), defined as the same as the Destination Address (DA).
The reserved byte consists of 2 bytes.
The length of the payload depends on the type of the datagram: it is 64 bytes if the datagram is one of the various protocol packets, and 1056 bytes if the datagram is a unicast data packet, although it is not limited to these 2 types.
The CRC consists of 4 bytes and is calculated in accordance with the standard ethernet CRC algorithm.
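As an illustration of the access network packet layout described above, the following sketch parses the fixed fields (DA 8 bytes, SA 8 bytes, 2 reserved bytes, payload, 4-byte CRC). It is not taken from the patent; the struct and function names are assumptions, and CRC verification is omitted.

```swift
import Foundation

struct AccessNetworkPacket {
    var packetType: UInt8        // first byte of DA: protocol / multicast / unicast packet, up to 256 kinds
    var daMetroAddress: Data     // DA bytes 2-6: metropolitan area network address
    var daAccessAddress: Data    // DA bytes 7-8: access network address
    var sourceAddress: Data      // SA, 8 bytes, defined the same way as DA
    var reserved: Data           // 2 reserved bytes
    var payload: Data            // PDU
    var crc: UInt32              // standard Ethernet CRC over the packet
}

func parseAccessPacket(_ raw: Data) -> AccessNetworkPacket? {
    // 8 (DA) + 8 (SA) + 2 (reserved) + 4 (CRC) = 22 bytes of fixed overhead.
    guard raw.count >= 22 else { return nil }
    let bytes = [UInt8](raw)
    let crcStart = bytes.count - 4
    let crc = bytes[crcStart...].reduce(UInt32(0)) { ($0 << 8) | UInt32($1) }
    return AccessNetworkPacket(
        packetType: bytes[0],
        daMetroAddress: Data(bytes[1...5]),
        daAccessAddress: Data(bytes[6...7]),
        sourceAddress: Data(bytes[8...15]),
        reserved: Data(bytes[16...17]),
        payload: Data(bytes[18..<crcStart]),
        crc: crc
    )
}
```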
2.2 metropolitan area network packet definition
The topology of a metropolitan area network is a graph, and there may be 2 or even more connections between two devices; that is, there may be more than 2 connections between a node switch and a node server, or between a node switch and another node switch. However, the metropolitan area network address of a metropolitan area network device is unique, so in order to accurately describe the connection relationships between metropolitan area network devices, the embodiment of the present invention introduces a parameter: a label, used to uniquely describe a metropolitan area network device.
In this specification, the definition of the label is similar to that of a label in Multi-Protocol Label Switching (MPLS). Assuming that there are two connections between a device A and a device B, a packet going from device A to device B has 2 possible labels, and a packet going from device B to device A also has 2 possible labels. Labels are divided into incoming labels and outgoing labels: assuming the label of a packet entering device A (the incoming label) is 0x0000, the label of the packet when it leaves device A (the outgoing label) may become 0x0001. The network-access process of the metropolitan area network is a process under centralized control; that is, both address allocation and label allocation in the metropolitan area network are dominated by the metropolitan area server, and the node switches and node servers execute passively. This differs from label allocation in MPLS, where labels are the result of mutual negotiation between the switch and the server.
As shown in the following table, the data packet of the metro network mainly includes the following parts:
DA | SA | Reserved | Label | Payload | CRC
That is: destination address (DA), source address (SA), reserved bytes (Reserved), label, payload (PDU), and CRC. The format of the label may be defined as follows: the label is 32 bits, with the upper 16 bits reserved and only the lower 16 bits used, and it sits between the reserved bytes and the payload of the packet.
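A corresponding sketch of the metropolitan area network packet, again an assumption for illustration only, adds the label field between the reserved bytes and the payload; the upper 16 bits of the 32-bit label are reserved and the lower 16 bits carry the label value.

```swift
import Foundation

// Sketch only: same field order as the access network packet, with the label
// inserted between the reserved bytes and the payload.
struct MetroNetworkPacket {
    var destinationAddress: Data   // DA, 8 bytes
    var sourceAddress: Data        // SA, 8 bytes
    var reserved: Data             // 2 reserved bytes
    var label: UInt32              // 32 bits: upper 16 bits reserved, lower 16 bits used
    var payload: Data              // PDU
    var crc: UInt32                // 4-byte CRC

    /// The usable part of the label; an incoming label (e.g. 0x0000) may be
    /// rewritten to an outgoing label (e.g. 0x0001) as the packet crosses a device.
    var labelValue: UInt16 { UInt16(truncatingIfNeeded: label) }
}
```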
Based on the above characteristics of the video network, one of the core concepts of the embodiments of the present invention is proposed: following the protocol of the Ethernet and the protocol of the video network, a video networking terminal in the video network can synchronously play an audio and video data stream that originates from an Ethernet terminal in the Ethernet.
Referring to fig. 5, a flowchart illustrating steps of an embodiment of a method for processing audio and video data streams according to the present invention is shown, where the method may be applied to a video network and an ethernet network, where the video network includes a video network terminal, the ethernet network includes an ethernet server and an ethernet terminal, and the ethernet server is connected to the video network terminal and the ethernet terminal, respectively, and the method may specifically include the following steps:
step 501, the video networking terminal sends request information of audio and video data streams to the Ethernet server.
In the embodiment of the present invention, the video network terminal may be an intelligent terminal, such as a smart phone, a tablet computer, and the like, which is accessed to the video network. The Ethernet server can be a streaming media server which is a bridge and a link of audio and video data transmission service between the video network and the Ethernet, realizes seamless fusion of the video network service and the Ethernet service, can safely access various audio and video resources in the Ethernet to the video network, and can convert different audio and video streams of a video conference, a monitoring image, a digital television and the like in the video network into audio and video data supporting a standard Ethernet protocol and output the audio and video data to the Ethernet.
In the embodiment of the present invention, the request information for the audio and video data stream may include the number of the audio and video data stream, where the number may be a unique number corresponding to the audio and video data stream, or a unique number corresponding to the Ethernet terminal from which the audio and video data stream originates. Whether the number corresponds to the audio and video data stream, to the Ethernet terminal, or to something else, its function is to identify a unique audio and video data stream.
In a preferred embodiment of the present invention, before the video networking terminal sends the request information for the audio and video data stream to the Ethernet server, the video networking terminal establishes a network connection with the Ethernet server and maintains a long connection with the Ethernet server through a socket interface. The video networking terminal may log in to the streaming media server by means of user name and password verification, and after logging in may keep a long connection with the streaming media server through the socket interface. A long connection is a link mode in which multiple data packets can be transmitted continuously over one connection; while the connection is held, if no data packet is being transmitted, both sides need to send link detection packets.
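The login, long connection, and request steps can be sketched as follows. The host, port, message layout ("LOGIN", "REQUEST <number>", "PING"), and the use of Apple's Network framework are assumptions for illustration only; they are not the actual protocol between the video networking terminal and the streaming media server.

```swift
import Foundation
import Network

final class StreamingServerLink {
    private let connection: NWConnection
    private let queue = DispatchQueue(label: "videonet.link")

    init(host: String, port: UInt16) {
        connection = NWConnection(host: NWEndpoint.Host(host),
                                  port: NWEndpoint.Port(rawValue: port)!,
                                  using: .tcp)
    }

    /// Establish the network connection and keep it as a long connection.
    func start() {
        connection.stateUpdateHandler = { state in
            print("link state: \(state)")
        }
        connection.start(queue: queue)
    }

    /// Log in with a user name and password (verified by the streaming media server).
    func login(user: String, password: String) {
        send("LOGIN \(user) \(password)")          // assumed message layout
    }

    /// Request the audio and video data stream identified by its number.
    func requestStream(number: String) {
        send("REQUEST \(number)")                  // assumed message layout
    }

    /// While no data packet is transmitted, both sides send link detection packets
    /// so that the long connection is kept alive.
    func sendLinkDetection() {
        send("PING")                               // assumed message layout
    }

    private func send(_ text: String) {
        connection.send(content: Data(text.utf8),
                        completion: .contentProcessed({ error in
            if let error = error { print("send failed: \(error)") }
        }))
    }
}
```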
Step 502, the video networking terminal receives the audio and video data stream returned by the ethernet server according to the serial number of the audio and video data stream in the request information.
In the embodiment of the present invention, the audio and video data stream returned by the Ethernet server may be an audio and video data stream acquired in real time by the Ethernet terminal. When acquiring the audio and video data stream in real time, the Ethernet terminal may acquire the audio data stream and the video data stream separately in real time and then combine them to form the audio and video data stream.
In a preferred embodiment of the present invention, when the Ethernet terminal acquires the audio and video data stream in real time, it may do so through a preset audio and video data capture class and a preset audio and video data acquisition device class. The audio and video data capture class is used to capture the audio and video data stream through the audio and video data acquisition devices of the Ethernet terminal, and the audio and video data acquisition device class is used to obtain attribute information of the audio and video data acquisition devices of the Ethernet terminal. Taking an Ethernet terminal that is a smartphone running the IOS system as an example, in the Ethernet terminal the preset audio and video data capture class is AVCaptureSession, the preset audio and video data acquisition device class is AVCaptureDevice, and the audio and video data acquisition devices may include a camera, a microphone, and the like. The Ethernet terminal can acquire the original audio and video data stream through AVCaptureSession and AVCaptureDevice.
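A minimal capture sketch for the IOS Ethernet terminal, using AVCaptureSession and AVCaptureDevice as named above, might look like the following. Preset selection, error handling, and the subsequent H.264/AAC encoding and stream pushing are omitted or only hinted at in comments; this is illustrative, not the terminal's actual code.

```swift
import AVFoundation
import CoreMedia

final class AVCapturePipeline: NSObject,
    AVCaptureVideoDataOutputSampleBufferDelegate, AVCaptureAudioDataOutputSampleBufferDelegate {

    let session = AVCaptureSession()
    private let outputQueue = DispatchQueue(label: "capture.output")

    func configure() throws {
        session.beginConfiguration()

        // Video input: the camera, obtained through AVCaptureDevice.
        if let camera = AVCaptureDevice.default(for: .video) {
            let videoInput = try AVCaptureDeviceInput(device: camera)
            if session.canAddInput(videoInput) { session.addInput(videoInput) }
        }
        // Audio input: the microphone.
        if let microphone = AVCaptureDevice.default(for: .audio) {
            let audioInput = try AVCaptureDeviceInput(device: microphone)
            if session.canAddInput(audioInput) { session.addInput(audioInput) }
        }

        // Raw video and audio frames are delivered to this object, where they
        // would be encoded (H.264 / AAC) and combined into the audio and video
        // data stream that is pushed to the streaming media server.
        let videoOutput = AVCaptureVideoDataOutput()
        videoOutput.setSampleBufferDelegate(self, queue: outputQueue)
        if session.canAddOutput(videoOutput) { session.addOutput(videoOutput) }

        let audioOutput = AVCaptureAudioDataOutput()
        audioOutput.setSampleBufferDelegate(self, queue: outputQueue)
        if session.canAddOutput(audioOutput) { session.addOutput(audioOutput) }

        session.commitConfiguration()
        session.startRunning()
    }

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        // Each sample buffer carries a presentation timestamp that is kept with
        // the frame so the receiver can synchronize audio and video later.
        _ = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)
    }
}
```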
Step 503, the video networking terminal performs a decoding operation on the video data stream in the audio and video data stream through the built-in first application programming interface to obtain original video data, and performs a decoding operation on the audio data stream in the audio and video data stream through the built-in second application programming interface to obtain original audio data.
In the embodiment of the invention, after the video networking terminal receives the audio and video data stream, the audio and video data stream is decoded in the video networking terminal through an application programming interface built in the system to obtain original audio and video data.
In a preferred embodiment of the present invention, in the audio and video data stream received by the video networking terminal, the video data stream is in H.264 format and the audio data stream is in Advanced Audio Coding (AAC) format. For the video data stream in H.264 format, the video networking terminal decodes it through the built-in first application programming interface to obtain original video data in the luminance and chrominance (YUV) format. For example, the first application programming interface may be Video Toolbox (an application programming interface for decoding video data in the IOS system, implemented as a set of functions written in the C language); in the YUV format, "Y" represents luminance, while "U" and "V" represent chrominance. For the audio data stream in AAC format, the video networking terminal decodes it through the built-in second application programming interface to obtain original audio data in pulse code modulation (PCM) format. For example, the second application programming interface may be Audio Toolbox (an application programming interface for decoding audio data in the IOS system, likewise a set of functions written in the C language).
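A sketch of the video decoding branch using Video Toolbox is given below. It assumes a CMVideoFormatDescription has already been created from the H.264 parameter sets, requests a planar YUV output format, and returns each decoded image buffer together with its presentation timestamp; the AAC-to-PCM branch through Audio Toolbox is omitted. The class and method names are illustrative, not the terminal's actual code.

```swift
import VideoToolbox
import CoreMedia
import CoreVideo

final class H264Decoder {
    private var session: VTDecompressionSession?

    /// Create the decompression session, asking for a planar YUV (luminance + chrominance) output format.
    func makeSession(formatDescription: CMVideoFormatDescription) -> Bool {
        let attributes = [
            kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange
        ] as CFDictionary
        let status = VTDecompressionSessionCreate(
            allocator: kCFAllocatorDefault,
            formatDescription: formatDescription,
            decoderSpecification: nil,
            imageBufferAttributes: attributes,
            outputCallback: nil,                 // nil here so the block-based decode call below can be used
            decompressionSessionOut: &session)
        return status == noErr
    }

    /// Decode one H.264 sample; the handler receives the raw YUV image buffer
    /// and its presentation timestamp, later used for audio/video synchronization.
    func decode(_ sampleBuffer: CMSampleBuffer,
                handler: @escaping (CVImageBuffer?, CMTime) -> Void) {
        guard let session = session else { return }
        _ = VTDecompressionSessionDecodeFrame(
            session,
            sampleBuffer: sampleBuffer,
            flags: [],
            infoFlagsOut: nil,
            outputHandler: { _, _, imageBuffer, presentationTimeStamp, _ in
                handler(imageBuffer, presentationTimeStamp)
            })
    }
}
```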
In a preferred embodiment of the present invention, the audio and video data stream received by the video networking terminal is typically carried in multiplexed data packets. Therefore, before the video networking terminal decodes the audio and video data stream, the data packets encapsulating the stream need to be demultiplexed. For example, the video networking terminal demultiplexes the data packets to obtain the video data stream in H.264 format and the audio data stream in AAC format.
And step 504, the video network terminal synchronously plays the original video data and the original audio data according to the time stamp in the original video data and the time stamp in the original audio data.
In the embodiment of the present invention, to ensure that the audio data and the video data can be played synchronously and to avoid audio and video getting out of step, a timestamp is added to each frame of the original audio data and to each frame of the original video data collected by the Ethernet terminal. The timestamp represents the playback time information of the original audio data and the original video data.
After obtaining the original audio data and the original video data, the video networking terminal may use their timestamps to play them synchronously. Specifically, the video networking terminal determines timestamp pairs having a corresponding relationship according to the timestamps in the original video data and the timestamps in the original audio data; a timestamp pair having a corresponding relationship is two timestamps, one in the original video data and one in the original audio data, that represent the same playback time. According to each timestamp pair, the video networking terminal determines an original video data frame and an original audio data frame that have a synchronous relationship. The video networking terminal then renders the original video data frame of the synchronized pair through a built-in graphics program interface, and plays the original audio data frame of the synchronized pair through a built-in sound program interface. For example, if the timestamp of video frame vz1 in the original video data V1 and the timestamp of audio frame az1 in the original audio data a1 form a timestamp pair having a corresponding relationship, then video frame vz1 and audio frame az1 are an original video data frame and an original audio data frame having a synchronous relationship. The video networking terminal can render video frame vz1 through the built-in OpenGL (a professional graphics program interface with a cross-programming-language, cross-platform programming interface specification) and play audio frame az1 through OpenAL (a cross-platform sound effect application programming interface).
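The timestamp pairing can be sketched as follows. The tolerance value, the buffering policy, and the closure names standing in for the OpenGL rendering path and the OpenAL playback path are assumptions for illustration only.

```swift
import Foundation
import CoreMedia
import CoreVideo

struct VideoFrame { let pixelBuffer: CVImageBuffer; let timestamp: CMTime }
struct AudioFrame { let pcm: Data; let timestamp: CMTime }

final class AVSynchronizer {
    private var pendingVideo: [VideoFrame] = []
    private var pendingAudio: [AudioFrame] = []
    private let tolerance = CMTime(value: 1, timescale: 100)   // 10 ms, an assumed tolerance

    // Stand-ins for the built-in graphics program interface (e.g. OpenGL) and
    // the built-in sound program interface (e.g. OpenAL).
    var renderVideo: (VideoFrame) -> Void = { _ in }
    var playAudio: (AudioFrame) -> Void = { _ in }

    func enqueue(_ frame: VideoFrame) { pendingVideo.append(frame); drain() }
    func enqueue(_ frame: AudioFrame) { pendingAudio.append(frame); drain() }

    /// Pair frames whose timestamps correspond, then emit both frames of each pair together.
    private func drain() {
        while let video = pendingVideo.first, let audio = pendingAudio.first {
            let delta = CMTimeAbsoluteValue(CMTimeSubtract(video.timestamp, audio.timestamp))
            if delta <= tolerance {
                renderVideo(video)              // render the video frame of the synchronized pair
                playAudio(audio)                // play the audio frame of the synchronized pair
                pendingVideo.removeFirst()
                pendingAudio.removeFirst()
            } else if video.timestamp < audio.timestamp {
                pendingVideo.removeFirst()      // video frame has no matching audio yet; drop to catch up
            } else {
                pendingAudio.removeFirst()
            }
        }
    }
}
```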
Based on the above description of the embodiment of the method for processing audio and video data streams, a method for implementing a live broadcast function of an IOS device based on video networking is introduced below. As shown in fig. 6, the IOS device logs in to the streaming media server and keeps a long connection, and requests to watch the audio and video live broadcast through the live broadcast number. The streaming media server pulls the audio and video data stream and delivers it to the IOS device. The IOS device decodes the video data stream through Video Toolbox to obtain original video data frames, decodes the audio data stream through Audio Toolbox to obtain original audio data frames, renders the original video data frames, and plays the original audio data frames.
The embodiment of the invention is applied to the video network and the Ethernet, wherein the video network comprises a video network terminal, the Ethernet comprises an Ethernet server and an Ethernet terminal, and the Ethernet server is respectively connected with the video network terminal and the Ethernet terminal.
In the embodiment of the invention, the video networking terminal sends request information for an audio and video data stream to the Ethernet server, the request information including the number of the audio and video data stream to be played. The video networking terminal receives the audio and video data stream returned by the Ethernet server according to the number in the request information; the stream originates from the Ethernet terminal, which acquires it in the Ethernet. The video networking terminal decodes the video data stream in the audio and video data stream through a built-in first application programming interface to obtain original video data, and decodes the audio data stream in the audio and video data stream through a built-in second application programming interface to obtain original audio data. The video networking terminal then synchronously plays the original video data and the original audio data according to the time stamps in the original video data and the time stamps in the original audio data.
The embodiment of the invention combines the characteristics of the Ethernet and the characteristics of the video network: the video networking terminal in the video network requests playback of the audio and video data stream collected by the Ethernet terminal by sending request information for the audio and video data stream to the Ethernet server, the Ethernet server returns the audio and video data stream collected by the Ethernet terminal to the video networking terminal according to the number of the audio and video data stream in the request information, and the video networking terminal decodes and plays the stream. In this way, a video networking terminal in the video network can play the audio and video data stream from an Ethernet terminal.
It should be noted that, for simplicity of description, the method embodiments are described as a series of acts or combination of acts, but those skilled in the art will recognize that the present invention is not limited by the illustrated order of acts, as some steps may occur in other orders or concurrently in accordance with the embodiments of the present invention. Further, those skilled in the art will appreciate that the embodiments described in the specification are presently preferred and that no particular act is required to implement the invention.
Referring to fig. 7, a block diagram of an embodiment of a system for processing audio and video data streams according to the present invention is shown, where the system may be applied in a video network and an ethernet network, the video network includes a video network terminal, the ethernet network includes an ethernet server and an ethernet terminal, the ethernet server is connected to the video network terminal and the ethernet terminal, respectively, and the video network terminal in the system may specifically include the following modules:
a request module 701, configured to send request information of audio and video data streams to an ethernet server, where the request information includes a serial number of the audio and video data streams.
A receiving module 702, configured to receive the audio and video data stream returned by the ethernet server according to the serial number of the audio and video data stream in the request information, where the audio and video data stream originates from the ethernet terminal.
The decoding module 703 is configured to perform a decoding operation on the video data stream in the audio/video data stream through a built-in first application programming interface to obtain original video data, and perform a decoding operation on the audio data stream in the audio/video data stream through a built-in second application programming interface to obtain original audio data.
And a playing module 704, configured to play the original video data and the original audio data synchronously according to the time stamp in the original video data and the time stamp in the original audio data.
In a preferred embodiment of the present invention, the decoding module 703 is configured to decode the video data stream in H.264 format in the audio and video data stream through the built-in first application programming interface to obtain original video data in the luminance and chrominance (YUV) format, and to decode the audio data stream in advanced audio coding (AAC) format in the audio and video data stream through the built-in second application programming interface to obtain original audio data in pulse code modulation (PCM) format.
In a preferred embodiment of the present invention, the video networking terminal further includes: a demultiplexing module 705, configured to, before the decoding module 703 decodes the video data stream in H.264 format to obtain the original video data in YUV format and decodes the audio data stream in AAC format to obtain the original audio data in PCM format, demultiplex the audio and video data stream to obtain the video data stream in H.264 format and the audio data stream in AAC format.
In a preferred embodiment of the present invention, the playing module 704 includes: a timestamp pair determining module 7041, configured to determine timestamp pairs having a corresponding relationship according to the timestamps in the original video data and the timestamps in the original audio data; a synchronous data determining module 7042, configured to determine, according to each timestamp pair, an original video data frame and an original audio data frame that have a synchronous relationship; and a synchronous data playing module 7043, configured to render the original video data frame of each synchronized pair through a built-in graphics program interface, and to play the original audio data frame of each synchronized pair through a built-in sound program interface.
In a preferred embodiment of the present invention, the video network terminal further includes: a connection module 706, configured to establish a network connection with the ethernet server before the request module 701 sends the request information of the audio/video data stream to the ethernet server, and maintain a long connection with the ethernet server through the socket interface.
Because the system embodiment is substantially similar to the method embodiment, it is described relatively briefly; for relevant details, reference may be made to the corresponding parts of the method embodiment.
The embodiments in this specification are described in a progressive manner. Each embodiment focuses on its differences from the other embodiments, and for the same or similar parts among the embodiments, reference may be made to one another.
As will be appreciated by one skilled in the art, embodiments of the present invention may be provided as a method, apparatus, or computer program product. Accordingly, embodiments of the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, embodiments of the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
Embodiments of the present invention are described with reference to flowchart illustrations and/or block diagrams of methods, terminal devices (systems), and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing terminal to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing terminal, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing terminal to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing terminal to cause a series of operational steps to be performed on the computer or other programmable terminal to produce a computer implemented process such that the instructions which execute on the computer or other programmable terminal provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
While preferred embodiments of the present invention have been described, additional variations and modifications of these embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. Therefore, it is intended that the appended claims be interpreted as including preferred embodiments and all such alterations and modifications as fall within the scope of the embodiments of the invention.
Finally, it should also be noted that, herein, relational terms such as first and second, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or terminal that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or terminal. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other like elements in a process, method, article, or terminal that comprises the element.
The foregoing has described in detail the method and system for processing audio and video data streams provided by the present invention. Specific examples are used herein to illustrate the principles and embodiments of the present invention, and the description of the foregoing examples is intended only to help in understanding the method and its core ideas. Meanwhile, a person skilled in the art may, in accordance with the ideas of the present invention, make changes to the specific embodiments and the scope of application. In summary, the content of this specification should not be construed as limiting the present invention.

Claims (10)

1. A method for processing audio and video data streams, applied to a video network and an Ethernet, wherein the video network comprises a video networking terminal, the Ethernet comprises an Ethernet server and an Ethernet terminal, and the Ethernet server is connected with the video networking terminal and the Ethernet terminal respectively, the method comprising:
the video networking terminal sends request information of an audio and video data stream to the Ethernet server, wherein the request information comprises a serial number of the audio and video data stream;
the video networking terminal receives the audio and video data stream returned by the Ethernet server according to the serial number of the audio and video data stream in the request information, wherein the audio and video data stream originates from the Ethernet terminal;
the video networking terminal decodes the video data stream in the audio and video data stream through a built-in first application programming interface to obtain original video data, and decodes the audio data stream in the audio and video data stream through a built-in second application programming interface to obtain original audio data;
and the video networking terminal synchronously plays the original video data and the original audio data according to the time stamp in the original video data and the time stamp in the original audio data.
2. The method for processing audio and video data streams according to claim 1, wherein the step in which the video networking terminal decodes the video data stream in the audio and video data stream through the built-in first application programming interface to obtain the original video data, and decodes the audio data stream in the audio and video data stream through the built-in second application programming interface to obtain the original audio data comprises:
the video networking terminal decodes the video data stream in the H.264 format in the audio and video data stream through the built-in first application programming interface to obtain original video data in the luminance-chrominance (YUV) format, and decodes the audio data stream in the advanced audio coding (AAC) format in the audio and video data stream through the built-in second application programming interface to obtain original audio data in the pulse code modulation (PCM) format.
3. The method for processing audio and video data streams according to claim 2, wherein before the video networking terminal decodes the video data stream in the H.264 format in the audio and video data stream through the built-in first application programming interface to obtain the original video data in the luminance-chrominance (YUV) format and decodes the audio data stream in the advanced audio coding (AAC) format in the audio and video data stream through the built-in second application programming interface to obtain the original audio data in the pulse code modulation (PCM) format, the method further comprises:
the video networking terminal performs a demultiplexing operation on the audio and video data stream to obtain the video data stream in the H.264 format and the audio data stream in the AAC format.
4. The method for processing audio and video data streams according to claim 1, wherein the step in which the video networking terminal synchronously plays the original video data and the original audio data according to the time stamp in the original video data and the time stamp in the original audio data comprises:
the video networking terminal determines, according to the timestamp in the original video data and the timestamp in the original audio data, a timestamp pair having a corresponding relationship;
the video networking terminal determines, according to the timestamp pair, an original video data frame and an original audio data frame having a synchronous relationship;
and the video networking terminal renders, through a built-in graphics program interface, the original video data frame of the original video data frame and original audio data frame having the synchronous relationship, and plays, through a built-in sound program interface, the original audio data frame of the original video data frame and original audio data frame having the synchronous relationship.
5. The method for processing audio and video data streams according to claim 1, wherein before the video networking terminal sends the request information of the audio and video data stream to the Ethernet server, the method further comprises:
the video networking terminal establishes a network connection with the Ethernet server and maintains a long connection with the Ethernet server through a socket interface.
6. A system for processing audio and video data streams, applied to a video network and an Ethernet, wherein the video network comprises a video networking terminal, the Ethernet comprises an Ethernet server and an Ethernet terminal, and the Ethernet server is connected with the video networking terminal and the Ethernet terminal respectively, wherein the video networking terminal comprises:
a request module, configured to send request information of an audio and video data stream to the Ethernet server, wherein the request information comprises a serial number of the audio and video data stream;
a receiving module, configured to receive the audio and video data stream returned by the Ethernet server according to the serial number of the audio and video data stream in the request information, wherein the audio and video data stream originates from the Ethernet terminal;
a decoding module, configured to decode the video data stream in the audio and video data stream through a built-in first application programming interface to obtain original video data, and to decode the audio data stream in the audio and video data stream through a built-in second application programming interface to obtain original audio data;
and a playing module, configured to synchronously play the original video data and the original audio data according to the time stamp in the original video data and the time stamp in the original audio data.
7. The system for processing audio and video data streams according to claim 6, wherein the decoding module is configured to decode the video data stream in the H.264 format in the audio and video data stream through the built-in first application programming interface to obtain original video data in the luminance-chrominance (YUV) format, and to decode the audio data stream in the advanced audio coding (AAC) format in the audio and video data stream through the built-in second application programming interface to obtain original audio data in the pulse code modulation (PCM) format.
8. The system for processing audio and video data streams according to claim 7, wherein the video networking terminal further comprises:
a demultiplexing module, configured to, before the decoding module decodes the video data stream in the H.264 format in the audio and video data stream through the built-in first application programming interface to obtain the original video data in the luminance-chrominance (YUV) format and decodes the audio data stream in the advanced audio coding (AAC) format in the audio and video data stream through the built-in second application programming interface to obtain the original audio data in the pulse code modulation (PCM) format, perform a demultiplexing operation on the audio and video data stream to obtain the video data stream in the H.264 format and the audio data stream in the AAC format.
9. The system for processing audio and video data streams according to claim 6, wherein the playing module comprises:
a timestamp pair determining module, configured to determine, according to the timestamp in the original video data and the timestamp in the original audio data, a timestamp pair having a corresponding relationship;
a synchronous data determining module, configured to determine, according to the timestamp pair, an original video data frame and an original audio data frame having a synchronous relationship;
and a synchronous data playing module, configured to render, through a built-in graphics program interface, the original video data frame of the original video data frame and original audio data frame having the synchronous relationship, and to play, through a built-in sound program interface, the original audio data frame of the original video data frame and original audio data frame having the synchronous relationship.
10. The system for processing audio and video data streams according to claim 6, wherein the video networking terminal further comprises:
a connection module, configured to establish a network connection with the Ethernet server before the request module sends the request information of the audio and video data stream to the Ethernet server, and to maintain a long connection with the Ethernet server through a socket interface.
CN201810828920.0A 2018-07-25 2018-07-25 Audio and video data stream processing method and system Active CN110769179B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810828920.0A CN110769179B (en) 2018-07-25 2018-07-25 Audio and video data stream processing method and system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810828920.0A CN110769179B (en) 2018-07-25 2018-07-25 Audio and video data stream processing method and system

Publications (2)

Publication Number Publication Date
CN110769179A true CN110769179A (en) 2020-02-07
CN110769179B CN110769179B (en) 2022-07-08

Family

ID=69328100

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810828920.0A Active CN110769179B (en) 2018-07-25 2018-07-25 Audio and video data stream processing method and system

Country Status (1)

Country Link
CN (1) CN110769179B (en)

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7664057B1 (en) * 2004-07-13 2010-02-16 Cisco Technology, Inc. Audio-to-video synchronization system and method for packet-based network video conferencing
CN102447893A (en) * 2010-09-30 2012-05-09 北京沃安科技有限公司 Method and system for real-time acquisition and release of videos of mobile phone
CN103338386A (en) * 2013-07-10 2013-10-02 航天恒星科技有限公司 Audio and video synchronization method based on simplified timestamps
US20160005439A1 (en) * 2014-07-01 2016-01-07 Disney Enterprises, Inc. Systems and methods for networked media synchronization
CN106385525A (en) * 2016-09-07 2017-02-08 天脉聚源(北京)传媒科技有限公司 Video play method and device
CN107979563A * 2016-10-21 2018-05-01 北京视联动力国际信息技术有限公司 Information processing method and device based on video networking
CN107995069A * 2016-10-26 2018-05-04 北京视联动力国际信息技术有限公司 Method and apparatus for terminal video push

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111314646A (en) * 2020-02-27 2020-06-19 Oppo(重庆)智能科技有限公司 Image acquisition method, image acquisition device, terminal device and readable storage medium
CN111314646B (en) * 2020-02-27 2021-04-23 Oppo(重庆)智能科技有限公司 Image acquisition method, image acquisition device, terminal device and readable storage medium
CN111669625A (en) * 2020-06-12 2020-09-15 北京字节跳动网络技术有限公司 Processing method, device and equipment for shot file and storage medium
CN114189717A (en) * 2021-11-08 2022-03-15 陕西千山航空电子有限责任公司 Multimedia stream data synchronous playback system and method based on data fusion

Also Published As

Publication number Publication date
CN110769179B (en) 2022-07-08

Similar Documents

Publication Publication Date Title
CN109547728B (en) Recorded broadcast source conference entering and conference recorded broadcast method and system
CN109120879B (en) Video conference processing method and system
CN110475090B (en) Conference control method and system
CN109547731B (en) Video conference display method and system
CN109547163B (en) Method and device for controlling data transmission rate
CN109660816B (en) Information processing method and device
CN110572607A (en) Video conference method, system and device and storage medium
CN108881948B (en) Method and system for video inspection network polling monitoring video
CN110049273B (en) Video networking-based conference recording method and transfer server
CN109246135B (en) Method and system for acquiring streaming media data
CN108965930B (en) Video data processing method and device
CN111131754A (en) Control split screen method and device of conference management system
CN111147859A (en) Video processing method and device
CN108574816B (en) Video networking terminal and communication method and device based on video networking terminal
CN110769179B (en) Audio and video data stream processing method and system
CN110149305B (en) Video network-based multi-party audio and video playing method and transfer server
CN109905616B (en) Method and device for switching video pictures
CN109743284B (en) Video processing method and system based on video network
CN110769297A (en) Audio and video data processing method and system
CN110611639A (en) Audio data processing method and device for streaming media conference
CN111212255B (en) Monitoring resource obtaining method and device and computer readable storage medium
CN110049069B (en) Data acquisition method and device
CN110661749A (en) Video signal processing method and video networking terminal
CN110536148B (en) Live broadcasting method and equipment based on video networking
CN110149306B (en) Media data processing method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant