CN110958072A - Multi-node audio and video information synchronous sharing display method - Google Patents

Multi-node audio and video information synchronous sharing display method

Info

Publication number
CN110958072A
CN110958072A
Authority
CN
China
Prior art keywords
node
audio
terminal
data
video
Prior art date
Legal status
Granted
Application number
CN201911067949.2A
Other languages
Chinese (zh)
Other versions
CN110958072B (en)
Inventor
弓飞
黄小华
Current Assignee
Beijing Hangxing Machinery Manufacturing Co Ltd
Original Assignee
Beijing Hangxing Machinery Manufacturing Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Hangxing Machinery Manufacturing Co Ltd
Priority to CN201911067949.2A
Publication of CN110958072A
Application granted
Publication of CN110958072B
Legal status: Active
Anticipated expiration

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04J MULTIPLEX COMMUNICATION
    • H04J3/00 Time-division multiplex systems
    • H04J3/02 Details
    • H04J3/06 Synchronising arrangements
    • H04J3/0635 Clock or time synchronisation in a network
    • H04J3/0638 Clock or time synchronisation among nodes; Internode synchronisation
    • H04J3/0658 Clock or time synchronisation among packet nodes
    • H04J3/0661 Clock or time synchronisation among packet nodes using timestamps
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00 Details of television systems
    • H04N5/04 Synchronising

Abstract

The invention discloses a multi-node audio and video information synchronous sharing display method, which comprises the following steps: according to satellite time service data output by the satellite time service equipment, time synchronization is carried out on each terminal node in the topological network; after all the terminal nodes finish time synchronization, each terminal node starts synchronous acquisition and caching of audio and video data at a preset unified moment according to a communication protocol specification; when any terminal node receives the monitoring instruction, the cached audio and video data is forwarded to a monitoring computer; the monitoring computer stores all the forwarded audio and video data into a video buffer queue and an audio buffer queue in a classified manner; and sequentially decoding the audio data and the video data in the video buffer queue and the audio buffer queue according to the queue sequence, and outputting and playing the audio data and the video data. The invention realizes the synchronous sharing and display of the multi-node audio and video information.

Description

Multi-node audio and video information synchronous sharing display method
Technical Field
The invention relates to the technical field of communication, in particular to a multi-node audio and video information synchronous sharing display method.
Background
For a control system with strong real-time and high safety requirements, the real-time performance and determinism of inter-node communication are the key to system implementation. Message communication protocols based on event-triggered message synchronization and scheduling are commonly used in general network processing modules, but numerous studies have shown that time-triggered solutions have a greater advantage for high-reliability systems. A time-triggered architecture (TTA) system and an event-triggered architecture (ETA) system work on different principles: time triggering proceeds according to the progression of time, whereas an event-triggered control signal is generated by the occurrence of an event (such as an interrupt). The fundamental difference between ETA and TTA systems lies in the source of the control signal. In a TTA system, control always resides inside the distributed system, which is a physically closed, deterministic system; in an ETA system, the control signal may originate inside the terminal or in the environment outside the terminal system (for example, delivered through an interrupt mechanism), and the unpredictable environment causes uncertainty in the behavior of the terminal system. Time-triggered Ethernet (TTE), as a typical application of the TTA system, is characterized above all by introducing a time-triggering function on the basis of the IEEE 1588/AS6802 time synchronization protocols, upgrading conventional event-triggered Ethernet into a network that can be triggered at deterministic times: an accurate time synchronization rule is adopted and every node in the network communicates according to a unified time-triggering strategy, which ensures collision-free communication and achieves determinism.
In recent years, with the rapid development of automotive electronics, aerospace, medical equipment, transportation and other fields, the requirements for high stability, high speed and real-time performance in networks have become ever higher; in particular, modern military electronic warfare has an extremely urgent need for synchronous multi-angle situation perception and high-precision time synchronization control. Time-triggered Ethernet has a certain application base in the domestic industrial control field, but it has seen little application in multi-view audio and video scene display, such as multi-angle synchronous observation, cooperative military combat situation perception, and large-scene highly synchronized 3D imaging.
Disclosure of Invention
The technical problem solved by the invention is as follows: overcoming the defects of the prior art, the invention provides a multi-node audio and video information synchronous sharing display method that, based on a time-triggered network, uses a broadcast mode to realize synchronous sharing and display of the information of every terminal node in a chain or ring network topology.
The technical solution of the invention is as follows: the invention discloses a multi-node audio and video information synchronous sharing display method, which comprises the following steps:
according to satellite time service data output by the satellite time service equipment, time synchronization is carried out on each terminal node in the topological network; the topological network is a ring topological network or a chain topological network;
after all the terminal nodes finish time synchronization, each terminal node starts synchronous acquisition and caching of audio and video data at a preset unified moment according to a communication protocol specification;
when any terminal node receives the monitoring instruction, the cached audio and video data is forwarded to a monitoring computer;
the monitoring computer stores all the forwarded audio and video data into a video buffer queue and an audio buffer queue in a classified manner;
and sequentially decoding the audio data and the video data in the video buffer queue and the audio buffer queue according to the queue sequence, and outputting and playing the audio data and the video data.
Compared with the prior art, the invention has the advantages that:
(1) The invention discloses a multi-node audio and video information synchronous sharing and display method that, based on a time-triggered network, uses a broadcast mode to realize synchronous sharing and display of the information of every terminal node in a chain or ring network topology. A further characteristic of the method is that it has no center: any terminal node in the network can be connected to other networks and can forward the data of all terminal nodes in the network, which gives the method clear advantages in resource consumption, networking convenience and flexibility over the traditional switch-based point-to-point mode.
(2) The method selects high-precision satellite time service equipment as the global network time source for periodic time service and completes high-precision time synchronization of every terminal node in the network; the synchronization precision can reach the sub-nanosecond level.
(3) The method exchanges data in broadcast mode and, on the basis of a strict time-triggering mechanism combined with a reasonably planned time slot configuration rule, makes full use of the channel resources, avoids channel contention and collisions, and ensures that the data of every period can be received and transmitted normally and stably.
(4) The method adopts a uniform fixed-length message format and a uniform message coding and checking scheme so as to obtain a uniform transmission error rate and ensure reliable communication. In a time-triggered network every terminal node must generate its data stream in a consistent format; the time-triggered network frame is the basic information element of the network, and a uniform frame structure should be established according to the data characteristics of the practical application.
(5) Each terminal node comprises at least 3 standard Ethernet interfaces, 2 of which are used for networking and 1 of which is used as an expansion link (to which a monitoring computer can be connected to receive and display images, text, audio and other information), enabling real-time display of the information of any single terminal node in the network or synchronous display of the information of several terminal nodes. After receiving data through its network interface, the display and monitoring computer processes them according to the communication protocol frame and performs the corresponding display action according to the instruction; its configuration requirements are relatively low, and normal display is achieved without a dedicated module card or central controller.
Drawings
FIG. 1 is a diagram showing an architecture of a topology network system according to the present invention;
FIG. 2 is a schematic diagram of a terminal node structure according to the present invention;
FIG. 3 is a network data time slot configuration diagram according to the present invention.
Detailed Description
Example 1
In this embodiment, the method for synchronously sharing and displaying the audio and video information of multiple nodes includes the following steps:
and step S1, according to the satellite time service data output by the satellite time service equipment, time synchronization is carried out on each terminal node in the topological network.
The topological network is a ring topological network or a chain topological network.
Each terminal node can achieve time synchronization in the following way: satellite time service data output by the satellite time service equipment are received; master-slave identification is performed on each terminal node in the topological network according to the satellite time service data to determine the master node and the slave nodes; and the master node and the slave nodes are time-synchronized according to a preset time configuration table.
Further, the master node and the slave nodes may be determined as follows: each terminal node checks whether it has captured satellite time service data within a self-check period. When a terminal node is determined to have captured satellite time service data within the self-check period, its local time, local second signal and local clock frequency are corrected according to the satellite time service data, and after the correction is completed a time reference identifier is set for it, establishing it as the master node. Terminal nodes that have not captured satellite time service data within the self-check period (for example, 1 s) are identified as slave nodes. The local time can be corrected according to the timing information in the satellite time service data; the phase of the local second signal is corrected according to the phase difference between the reference second signal in the satellite time service data and the local second signal; and the output value of the voltage-controlled crystal oscillator regulation circuit is continuously corrected according to the frequency difference between the system clock frequency in the satellite time service data and the local clock frequency, so that the local clock frequency always follows the system clock frequency.
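As an illustration of the master/slave decision described above, the following Python sketch models the outcome of the self-check; the class and function names and the `gnss_locked` flag are assumptions made for the example, not part of the patent (the real logic runs in the FPGA of each terminal node).

```python
from dataclasses import dataclass

@dataclass
class TerminalNode:
    node_id: int
    gnss_locked: bool = False     # True if satellite time service data was captured
    role: str = "unknown"         # becomes "master" or "slave"
    time_reference: bool = False  # time reference identifier set on the master

def assign_roles(nodes):
    """Mark the node that captured satellite time service data within the
    self-check period (e.g. 1 s) as the master; all other nodes are slaves."""
    for node in nodes:
        if node.gnss_locked:
            # In the real device the local time, the local second (1 PPS) signal
            # and the voltage-controlled oscillator would be corrected here
            # before the time reference identifier is set.
            node.role = "master"
            node.time_reference = True
        else:
            node.role = "slave"
    return nodes

# Example: node 0 is connected to the satellite time service equipment.
nodes = [TerminalNode(i, gnss_locked=(i == 0)) for i in range(16)]
print([n.role for n in assign_roles(nodes)])
```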
Further, time synchronization between the master node and the slave nodes may be achieved as follows: according to the preset time configuration table, when the master node's time slot arrives, it executes the time synchronization algorithm of the standard IEEE 1588 protocol, performs one round of time synchronization with the two adjacent slave nodes on either side, and marks them as synchronized; when the time slot of a synchronized slave node arrives, that node performs one round of time synchronization with its downstream slave node, which is then also marked as synchronized; and this is repeated until the time synchronization of all slave nodes is completed.
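The hop-by-hop spread of synchronization around the ring can be modeled with a short sketch. The function below only computes the order in which nodes would become synchronized under the rule above (the master first, then one further hop from each newly synchronized node per round); it is an illustrative model, not the IEEE 1588 message exchange itself, and the node numbering is assumed.

```python
def propagate_sync(ring_node_ids, master_id):
    """Return the rounds in which ring nodes become synchronized, starting at
    the master and spreading one hop per round in both directions."""
    n = len(ring_node_ids)
    left = right = ring_node_ids.index(master_id)
    synced = {master_id}
    rounds = [[master_id]]
    while len(synced) < n:
        newly = []
        left, right = (left - 1) % n, (right + 1) % n
        for idx in (left, right):
            node = ring_node_ids[idx]
            if node not in synced:       # the two directions meet at the far side
                synced.add(node)
                newly.append(node)
        rounds.append(newly)
    return rounds

# Example: a 16-node ring with node 0 as master needs 8 rounds after the first.
print(propagate_sync(list(range(16)), master_id=0))
```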
Further, the preset time configuration table may be configured offline and distributed to each terminal node.
Step S2: after all the terminal nodes have completed time synchronization, each terminal node starts synchronized acquisition and caching of audio and video data at a preset unified moment according to the communication protocol specification.
The synchronous acquisition and caching of the audio and video data proceed as follows: at the same moment, every terminal node acquires audio and video data of equal duration from its corresponding audio and video acquisition equipment and caches them; when the time slot of any terminal node arrives, the audio and video data are forwarded to the adjacent terminal nodes, and a node stops forwarding when the node identifier in a received message equals its own identifier, thereby completing one whole-network broadcast transmission.
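The rule that terminates the whole-network broadcast can be expressed compactly. In the sketch below, the assumed frame layout (byte 0 = data type, byte 1 = originating node ID) follows Tables 1 and 2 of Example 2, and the `forward` callback stands in for the hardware transmit path toward the next node.

```python
def handle_frame(own_id, frame, forward):
    """Ring-broadcast forwarding rule: pass a frame on to the adjacent node
    unless its originating node ID equals this node's ID, which means the
    frame has already travelled the whole ring."""
    origin_id = frame[1]          # byte 1 carries the originating node ID
    if origin_id == own_id:
        return False              # frame came full circle: stop forwarding
    forward(frame)                # otherwise relay it to the neighbour
    return True

# Example: node 5 drops a frame that it originated itself.
frame = bytes([0x00, 0x05]) + bytes(641)               # a 643-byte video frame
print(handle_frame(5, frame, forward=lambda f: None))  # -> False
```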
Step S3: when any terminal node receives a monitoring instruction, it forwards the cached audio and video data to the monitoring computer.
In this embodiment, the monitoring computer may be connected to any terminal node, and send a monitoring instruction to the connected terminal node.
Step S4: the monitoring computer stores all the forwarded audio and video data into a video buffer queue and an audio buffer queue according to type.
Step S5: the audio data and video data in the audio and video buffer queues are decoded sequentially in queue order and output for playback.
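A minimal sketch of steps S4 and S5 on the monitoring computer is given below, assuming a plain UDP socket and Python `deque` objects for the two buffer queues; the port number and packet count are illustrative assumptions, and the decoding/playback stage itself is omitted.

```python
import socket
from collections import deque

VIDEO_TYPE, AUDIO_TYPE = 0x00, 0x01   # data type identifiers from Tables 1 and 2

def receive_and_classify(port=5000, max_packets=1000):
    """Receive UDP packets and append them to a video or an audio buffer queue
    according to the data type identifier in the first payload byte."""
    video_queue, audio_queue = deque(), deque()
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("", port))
    try:
        for _ in range(max_packets):
            packet, _addr = sock.recvfrom(2048)
            if packet and packet[0] == VIDEO_TYPE:
                video_queue.append(packet)
            elif packet and packet[0] == AUDIO_TYPE:
                audio_queue.append(packet)
            # instruction and timestamp traffic would be handled elsewhere
    finally:
        sock.close()
    return video_queue, audio_queue
```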
In the present embodiment, fig. 1 shows the architecture of the topology network system of the invention and fig. 2 shows the structure of a terminal node; the multi-node audio and video information synchronous sharing display method is implemented on the basis of this topology network system. As can be seen from fig. 1, the topology network system comprises: satellite time service equipment, at least three terminal nodes, at least three groups of audio and video acquisition equipment, and a monitoring computer. The satellite time service equipment is connected to every terminal node, the at least three groups of audio and video acquisition equipment are connected to the corresponding at least three terminal nodes, and the monitoring computer can be attached to any terminal node. Further, if the at least three terminal nodes are connected end to end in sequence, a ring topology network is constructed; if they are simply connected in sequence, a chain topology network is constructed.
Furthermore, the structure and function of every terminal node are identical. As shown in fig. 2, each terminal node comprises at least: 3 gigabit Ethernet interfaces (Ethernet PHY), 1 data interface (UART interface), a main control chip (FPGA), a voltage-controlled crystal oscillator regulation circuit and an audio and video acquisition interface. Two of the gigabit Ethernet interfaces are used to connect to other terminal nodes to form the ring topology network, and the remaining gigabit Ethernet interface is used to connect to a monitoring computer for output display and system control of the audio and video data; the data interface is used to connect to the satellite time service equipment.
Example 2
The following description uses, as an example, time-triggered synchronous audio and video display across 16 terminal nodes.
As mentioned above, the topology network system is composed of satellite time service equipment, terminal nodes, audio and video acquisition equipment and a monitoring computer. The audio and video signals collected by the acquisition equipment connected to the 16 terminal nodes can be displayed synchronously and in real time on the monitoring computer. The frame rate of the video signal is 15 fps and the picture size is 320 × 240 pixels; the audio signal is transmitted as a pulse density modulation (PDM) signal with a sampling rate of 1.2 MHz, a digital sampling filter on the monitoring computer transcodes the PDM stream into the standard PCM audio format, and 16-bit audio with a sampling rate of 40 kHz is finally output.
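As a quick plausibility check of this configuration (an illustrative calculation, not part of the patent), the payload bandwidth per node and for all 16 nodes can be estimated as follows; the total stays far below the capacity of the gigabit Ethernet links used for networking.

```python
# Payload-only estimate, ignoring UDP/IP/Ethernet header overhead.
nodes = 16
video_bps_per_node = 320 * 240 * 2 * 8 * 15   # RGB565, 15 fps  -> ~18.4 Mbit/s
audio_bps_per_node = 1_200_000                # 1-bit PDM at 1.2 MHz -> 1.2 Mbit/s
total_bps = nodes * (video_bps_per_node + audio_bps_per_node)
print(f"per node : {(video_bps_per_node + audio_bps_per_node) / 1e6:.1f} Mbit/s")
print(f"all nodes: {total_bps / 1e6:.1f} Mbit/s")   # ~314 Mbit/s in total
```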
The data transmitted over the network comprise video data, audio data, instructions, timestamp data and the like. The audio and video data collected by each terminal node are forwarded by the adjacent terminal nodes at both ends, so that every terminal node can receive the data of all other terminal nodes in the same loop. Network data can therefore be extracted conveniently from any terminal node, and monitoring terminal instructions can likewise be sent to every terminal node.
In order to ensure that the audio and video data of every terminal node can be transmitted to the monitoring computer without conflict and within the delay requirement, the TTE time-triggering mechanism is used: after the TTE whole-network time has been unified, a time slot configuration rule that meets the practical application requirements is formulated for sending the data. According to data type, video data, audio data and instructions are defined as TT data, and timestamp data are defined as BE data. To allow the monitoring terminal to receive the audio and video data of all terminal nodes within an acceptable delay, the video acquisition rate is set to 15 fps and one second is divided equally into 15 parts, each denoted Tx (x = 0, 1, 2, …, 14); the first frame of audio and video data of all terminal nodes is transmitted within T0, the second frame within T1, and so on until the fifteenth frame, so that the maximum data delay of the whole network is controlled within 67 ms, which is within the acceptable range. To ensure that the audio and video data of all terminal nodes are transmitted within each Tx period without collision, the Tx period is further divided into 16 slots, each denoted Txy (y = 0, 1, 2, …, 15): the audio and video information of the first terminal node is transmitted in the Tx0 slot, that of the second terminal node in the Tx1 slot, and so on until the audio and video information of all 16 terminal nodes has been transmitted, as shown in FIG. 3.
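The slot plan above can be expressed as a small calculation. The sketch below derives the transmit window of slot Txy for frame x and node y under the 15-frame / 16-node division just described; guard bands and the BE (timestamp) traffic windows are ignored, so this is a scheduling illustration only.

```python
def slot_window(frame_index, node_index, frame_rate=15, node_count=16):
    """Start and end time, in seconds from the start of a one-second cycle,
    of slot Txy: the second is split into 15 frame periods (x) and each
    period into 16 node slots (y)."""
    frame_period = 1.0 / frame_rate         # ~66.7 ms, the whole-network delay bound
    node_slot = frame_period / node_count   # ~4.17 ms per terminal node
    start = frame_index * frame_period + node_index * node_slot
    return start, start + node_slot

# Example: the second terminal node (y = 1) in the first frame period (x = 0).
print(slot_window(0, 1))   # -> (~0.00417 s, ~0.00833 s)
```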
All network data use the UDP protocol, and the data formats are defined as follows:
1. video data. The video data is in an RGB565 format, each frame of UDP message transmits video data of one line of pixel points, a line has 320 pixel points, and the data of each pixel point is two bytes, and the total number of the bytes is 640. The type, channel number, and line number are added before the pixel data, so that the video data packet length of one frame is 643 byte. The specific contents are shown in the following table 1:
TABLE 1
| Definition | Content | Length (bytes) | Remarks |
| Data type identifier | 0x00 | 1 | 0x00 denotes video data |
| Node ID | 0x00–0x0F | 1 | Starting at 0x00, 16 nodes in total |
| Video frame line number | 0x00–0xEF | 1 | Starting at 0x00, 240 lines in total |
| RGB data | — | 640 | One line of 320 RGB565 pixels, 2 bytes per pixel |
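A sketch of how one video line could be packed into the 643-byte payload of Table 1 is shown below; the helper name and the use of Python's `struct` module are illustrative only, since the real nodes assemble these packets in FPGA logic.

```python
import struct

VIDEO_TYPE = 0x00

def pack_video_line(node_id, line_number, rgb565_line):
    """Build one 643-byte video UDP payload per Table 1: 1 byte data type,
    1 byte node ID, 1 byte line number, 640 bytes of RGB565 pixel data
    (320 pixels x 2 bytes)."""
    if len(rgb565_line) != 640:
        raise ValueError("a video line must be exactly 640 bytes")
    return struct.pack("BBB", VIDEO_TYPE, node_id, line_number) + rgb565_line

# Example: line 0 of node 3 with dummy pixel data.
payload = pack_video_line(3, 0, bytes(640))
assert len(payload) == 643
```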
2. Audio data. The audio data are in PDM format with a sampling rate of 1.2 MHz, i.e. 150,000 bytes per second, divided equally into 150 shares; each UDP packet carries one share of audio data, i.e. 1000 bytes. The data type and node ID precede the audio data. The specific contents are shown in Table 2 below, followed by a packing sketch:
TABLE 2
| Definition | Content | Length (bytes) | Remarks |
| Data type identifier | 0x01 | 1 | 0x01 denotes audio data |
| Node ID | 0x00–0x0F | 1 | Starting at 0x00, 16 channels in total |
| PDM data | — | 1000 | One 1/150-second share of the 1.2 MHz PDM stream |
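Analogously, one 1000-byte PDM share could be packed into the 1002-byte payload of Table 2 as sketched below; again, the helper name is an assumption made for the example.

```python
import struct

AUDIO_TYPE = 0x01

def pack_audio_share(node_id, pdm_share):
    """Build one 1002-byte audio UDP payload per Table 2: 1 byte data type,
    1 byte node ID, 1000 bytes of raw PDM bit stream (1/150 of a second of
    the 1.2 MHz, 1-bit-per-sample signal)."""
    if len(pdm_share) != 1000:
        raise ValueError("an audio share must be exactly 1000 bytes")
    return struct.pack("BB", AUDIO_TYPE, node_id) + pdm_share

payload = pack_audio_share(7, bytes(1000))
assert len(payload) == 1002
```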
3. Instruction data format. The instruction data use a self-defined format: they are sent by the monitoring terminal computer to the node it is connected to and then forwarded to the node network in broadcast mode, and each node checks the node ID information in the message and executes the corresponding operation. The specific format is shown in Table 3 below:
TABLE 3
(Table 3 is provided as an image in the original publication and is not reproduced here.)
4. Timestamp data format. The timestamp data use a self-defined format, and timestamp messages between two nodes are exchanged in the manner of IEEE 1588 P2P messages. Because timestamp messages are exchanged only between two adjacent nodes, no channel number needs to be set. The specific format is shown in Table 4 below, followed by a worked example of the P2P delay calculation:
TABLE 4
(Table 4 is provided as an image in the original publication and is not reproduced here.)
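For reference, one IEEE 1588 peer-delay (P2P) exchange of the kind carried by these timestamp messages yields the mean link delay from four timestamps. The worked example below uses the standard peer-delay formula; the timestamp values are chosen purely for illustration.

```python
def peer_link_delay(t1, t2, t3, t4):
    """Mean link delay of one IEEE 1588 peer-delay exchange: the requester
    sends Pdelay_Req at t1, the neighbour receives it at t2 and replies at t3,
    and the requester receives the reply at t4 (all in the same time unit)."""
    return ((t4 - t1) - (t3 - t2)) / 2.0

# Example with nanosecond timestamps: 1.5 us link delay, 200 ns turnaround.
print(peer_link_delay(0, 1_500, 1_700, 3_200))   # -> 1500.0
```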
Each terminal node in the network operates according to the following main flow cycle:
1. and starting the computer. After power-on starting, each terminal node self-checks whether the terminal node is connected with the satellite time service equipment, the connected terminal node marks the terminal node as a 'main node', and the unconnected terminal node marks the terminal node as a 'slave node'; the main node executes a time service calibration program, calibrates the self clock and the satellite time service, adjusts the voltage-controlled crystal oscillator to enable the frequency of the local clock to be consistent with that of the time service clock, and sets the self active time to be the Xth time slot TxX according to the self node number X after the completion.
2. The master node enters a time service mode, clock synchronization messages are sent to the slave nodes at the two ends, clock synchronization calibration is carried out by adopting an IEEE1588V2 protocol, after the calibration is finished, the slave nodes at the two ends set corresponding active time slots according to the ID numbers of the slave nodes, and simultaneously, audio and video information starts to be collected and packaged, and the audio and video information is broadcasted and sent after the active time slots arrive.
3. The synchronized slave nodes continue to perform clock synchronization with the downstream slave nodes in the idle time period of the time slots of the synchronized slave nodes, and the timestamp data is not forwarded but only responded; so that all end nodes can complete one synchronization within 67 ms.
4. All terminal nodes firstly complete the packing and sending of TT data in self active time slots, complete the passing-through type forwarding of TT data and the acquisition and caching of self data in inactive time slots, and stop forwarding when the terminal nodes receive the TT data identified by self ID in each time slot, and at the moment, the TT data is sent to all terminal nodes in the network.
5. After receiving the connection request, the terminal nodes automatically forward TT data to the monitoring terminal, and the monitoring terminal synchronously displays video pictures of 16 terminal nodes after decoding audio and video data through application software and can select audio data of 1 terminal node for playing. The monitored terminal node broadcasts and sends to the terminal node network when receiving the monitoring terminal instruction message.
The main flow of audio and video decoding and displaying is as follows:
1. and after receiving the data, the UDP carries out data packet caching according to the data type identifier.
2. The video decoding thread judges according to the buffer amount of the video data, and pushes the frame of complete image data to a UI (user interface) display thread to finish the display of a frame of picture after the buffer of the frame of complete image data is finished;
3. and the audio data receiving and caching thread judges the node identification of the current audio to be played according to the current software control requirement, and screens and caches the corresponding PDM audio data according to the node identification.
4. And the audio decoding thread judges according to the receiving buffer amount of the PDM audio data, and when the buffer amount of the PDM data is more than 2Kbyte, the PDM digital sampling filtering is started to generate PCM data and store the PCM data in a PCM buffer area.
5. When the PCM cache data volume in the audio decoding thread is more than 10Kbyte, pushing the PCM cache to a PCM playing thread for playing; and simultaneously deleting corresponding data by the PCM buffer queue.
6. When the application program instruction is issued, the monitored terminal node receives the instruction and then forwards the corresponding instruction after waiting for the arrival of the self active time slot.
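To illustrate the PDM-to-PCM step of items 4 and 5 (the 1.2 MHz 1-bit PDM stream decimated by a factor of 30 down to 40 kHz 16-bit PCM), the following sketch uses a plain boxcar average; a real design would use a proper multistage digital sampling filter (e.g. CIC followed by FIR), so this only shows the data flow under that simplifying assumption.

```python
import numpy as np

def pdm_to_pcm(pdm_bytes, decimation=30):
    """Unpack a 1-bit PDM stream, average each group of `decimation` bits
    (boxcar filter) and scale the pulse density to signed 16-bit PCM."""
    bits = np.unpackbits(np.frombuffer(pdm_bytes, dtype=np.uint8))
    usable = (len(bits) // decimation) * decimation
    groups = bits[:usable].reshape(-1, decimation).astype(np.float64)
    density = groups.mean(axis=1)                      # 0.0 .. 1.0 pulse density
    return ((density - 0.5) * 2.0 * 32767.0).astype(np.int16)

# 2 Kbyte of PDM data (the buffering threshold above) gives 546 PCM samples,
# i.e. about 13.6 ms of audio at 40 kHz.
print(len(pdm_to_pcm(bytes(2048))))
```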
Those skilled in the art will appreciate that those matters not described in detail in the present specification are well known in the art.

Claims (10)

1. A multi-node audio and video information synchronous sharing display method is characterized by comprising the following steps:
according to satellite time service data output by the satellite time service equipment, time synchronization is carried out on each terminal node in the topological network; the topological network is a ring topological network or a chain topological network;
after all the terminal nodes finish time synchronization, each terminal node starts synchronous acquisition and caching of audio and video data at a preset unified moment according to a communication protocol specification;
when any terminal node receives the monitoring instruction, the cached audio and video data is forwarded to a monitoring computer;
the monitoring computer stores all the forwarded audio and video data into a video buffer queue and an audio buffer queue in a classified manner;
and sequentially decoding the audio data and the video data in the video buffer queue and the audio buffer queue according to the queue sequence, and outputting and playing the audio data and the video data.
2. The multi-node audio and video information synchronous sharing display method according to claim 1, wherein time synchronization is performed on each terminal node in the topology network according to satellite time service data output by the satellite time service equipment, and the method comprises the following steps:
receiving satellite time service data output by the satellite time service equipment;
according to the satellite time service data, performing master-slave identification on each terminal node in the topological network, and determining a master node and a slave node;
and carrying out time synchronization on the master node and the slave nodes according to a preset time configuration table.
3. The multi-node audio and video information synchronous sharing display method according to claim 2, wherein the master-slave identification is performed on each terminal node in the topological network according to the satellite time service data, and the determination of the master node and the slave node comprises the following steps:
judging whether a terminal node captures satellite time service data or not in a self-checking period;
when it is determined that a certain terminal node captures satellite time service data in a self-checking period, correcting local time, local second signals and local clock frequency of the certain terminal node according to the satellite time service data, and identifying the certain terminal node as a main node after the correction is completed;
and when other terminal nodes except the certain terminal node are determined not to capture satellite time service data in the self-test period, identifying the other terminal nodes except the certain terminal node as slave nodes.
4. The multi-node audio and video information synchronous sharing display method according to claim 3, wherein the time synchronization of the master node and the slave node is performed according to a preset time configuration table, and the method comprises the following steps:
according to a preset time configuration table, when the time slot of a master node arrives, the master node performs 1 time of time synchronization with two adjacent slave nodes at two ends, and the master node is identified as synchronized;
when the time slot of the synchronized slave node arrives, the synchronized slave node performs 1 time of time synchronization with the downstream slave node and is marked as synchronized;
and repeating the execution until the time synchronization of all the slave nodes is completed.
5. The multi-node audio and video information synchronous sharing display method according to claim 1, wherein the synchronous acquisition and caching process of the audio and video data is as follows:
each terminal node simultaneously acquires audio and video data with equal time length from corresponding audio and video acquisition equipment at the same time and caches the audio and video data;
and when the time slot of any terminal node arrives, forwarding the audio and video data to an adjacent terminal node, and stopping forwarding until the received message node identifier is the identifier of the terminal node, thereby completing one-time whole-network broadcast transmission.
6. The multi-node audio-video information synchronous sharing display method according to claim 1, further comprising:
and sequentially connecting at least three terminal nodes end to construct a ring topology network.
7. The multi-node audio-video information synchronous sharing display method according to claim 1, further comprising:
and connecting at least three terminal nodes in sequence to construct the chain type topological network.
8. The multi-node audio-video information synchronous sharing display method according to claim 6 or 7, further comprising:
connecting the satellite time service equipment with each terminal node;
and respectively connecting at least three groups of audio and video acquisition equipment with at least three corresponding terminal nodes.
9. The multi-node audio-video information synchronous sharing display method according to claim 2, further comprising: and configuring the preset time configuration table in an off-line manner, and transferring the preset time configuration table to each terminal node.
10. The multi-node audio-video information synchronous sharing display method according to claim 1, further comprising:
and connecting the monitoring computer with any terminal node, and sending a monitoring instruction to the connected terminal node.
CN201911067949.2A 2019-11-04 2019-11-04 Multi-node audio and video information synchronous sharing display method Active CN110958072B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911067949.2A CN110958072B (en) 2019-11-04 2019-11-04 Multi-node audio and video information synchronous sharing display method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911067949.2A CN110958072B (en) 2019-11-04 2019-11-04 Multi-node audio and video information synchronous sharing display method

Publications (2)

Publication Number Publication Date
CN110958072A 2020-04-03
CN110958072B 2021-11-05

Family

ID=69976528

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911067949.2A Active CN110958072B (en) 2019-11-04 2019-11-04 Multi-node audio and video information synchronous sharing display method

Country Status (1)

Country Link
CN (1) CN110958072B (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101938606A (en) * 2009-07-03 2011-01-05 北京大学 Method, system and device for propelling multimedia data
CN102868939A (en) * 2012-09-10 2013-01-09 杭州电子科技大学 Method for synchronizing audio/video data in real-time video monitoring system
CN103237191A (en) * 2013-04-16 2013-08-07 成都飞视美视频技术有限公司 Method for synchronously pushing audios and videos in video conference
CN103237255A (en) * 2013-04-24 2013-08-07 南京龙渊微电子科技有限公司 Multi-thread audio and video synchronization control method and system
CN103905833A (en) * 2013-07-12 2014-07-02 吉首大学 Distributed network video data mining and collecting system based on cloud calculation
CN106850466A (en) * 2017-02-22 2017-06-13 电子科技大学 The retransmission method and device of packet in a kind of time-triggered network


Also Published As

Publication number Publication date
CN110958072B (en) 2021-11-05


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant