CN103237255A - Multi-thread audio and video synchronization control method and system - Google Patents

Multi-thread audio and video synchronization control method and system

Info

Publication number
CN103237255A
Authority
CN
China
Prior art keywords
video
audio
thread
output
frequency
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
CN 201310144225
Other languages
Chinese (zh)
Inventor
陈勇
王卫东
吴少校
乔崇
祁云嵩
徐钊
孟凡伟
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
NANJING LONGYUAN MICROELECTRONIC TECHNOLOGY Co Ltd
Original Assignee
NANJING LONGYUAN MICROELECTRONIC TECHNOLOGY Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by NANJING LONGYUAN MICROELECTRONIC TECHNOLOGY Co Ltd filed Critical NANJING LONGYUAN MICROELECTRONIC TECHNOLOGY Co Ltd
Priority to CN 201310144225 priority Critical patent/CN103237255A/en
Publication of CN103237255A publication Critical patent/CN103237255A/en
Withdrawn legal-status Critical Current

Landscapes

  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
  • Compression Or Coding Systems Of Tv Signals (AREA)

Abstract

The invention relates to a multi-thread audio and video synchronization control method and system. The method comprises the steps of: creating a demultiplexing thread, audio and video decoding threads, and audio and video synchronous output threads; starting the demultiplexing thread to demultiplex a transport stream or program stream, inserting the audio and video elementary streams at the tails of the audio and video elementary stream queues, and activating the audio and video decoding threads; starting the audio and video decoding threads to decode the bit streams in the audio and video elementary stream queues, and inserting the time information together with the decoded audio data and image data into the audio and video output queues; starting the audio and video synchronous output threads, and selecting and outputting the data that matches the output time according to the difference between the decoded timestamp and the local system clock; and, according to the difference between the video and audio timestamps, repeating or skipping video frames during playback so that the audio and video are output synchronously. The multi-thread audio and video synchronization control method and system do not require a large memory space for buffering data, have high operating efficiency and low implementation complexity, and are applicable to embedded operating systems.

Description

Multi-thread audio and video synchronization control method and system
Technical field
The invention belongs to the technical field of digital audio and video, and relates to a multi-thread audio and video synchronization control method and system.
Background Art
Currently popular video coding standards such as MPEG-1, MPEG-2, MPEG-4 and H.26x all adopt the traditional hybrid coding structure. This structure applies temporal and spatial prediction, transform, quantization and entropy coding, and encodes video frames into different frame types: intra-predicted frames (I frames), predicted frames (P frames) and bi-directionally predicted frames (B frames). As a result, video decoding speed varies with the video content while the playback frame rate is fixed, which causes a mismatch between decoding speed and display speed. Moreover, audio and video are coded and transmitted separately but must be output synchronously during playback; without a proper control method, the audio and video outputs easily fall out of synchronization.
To address this problem, the prior art generally uses large buffers to hold the coded data and the decoded images and audio frames. However, because the picture content is unpredictable, choosing the buffer size is difficult: if the buffer is too small, the data overflows; if it is too large, memory space is wasted.
Summary of the invention
In order to solve the above problems in the prior art, an object of the present invention is to provide a multi-thread audio and video synchronization control method. The method uses operating system semaphores, condition variables, the Program Clock Reference (PCR) defined at the system layer, and the Presentation Time Stamp (PTS), so that audio and video are output strictly at the moments given by their PTS values with reference to the same local System Time Clock (STC), thereby achieving synchronized audio and video playback.
Another object of the present invention is to provide a multi-thread audio and video synchronization control system.
The object of the present invention is achieved through the following technical solution: a multi-thread audio and video synchronization control method comprising the following steps. Step 1: create five threads, namely a demultiplexing thread, an audio decoding thread, a video decoding thread, an audio synchronous output thread and a video synchronous output thread. Step 2: start the demultiplexing thread to demultiplex the system-layer transport stream or program stream, splitting it into an audio elementary stream, a video elementary stream and time information; insert the audio elementary stream at the tail of the audio elementary stream queue and the video elementary stream at the tail of the video elementary stream queue, and activate the decoding threads; if the audio elementary stream queue or the video elementary stream queue overflows, the demultiplexing thread suspends and waits. Step 3: start the audio decoding thread and the video decoding thread to decode the bit streams in the audio elementary stream queue and the video elementary stream queue respectively, insert the time information together with the decoded audio data and image data at the tails of the audio output queue and the video output queue respectively, and activate the audio output thread or the video output thread; if the audio output queue or the video output queue overflows, the corresponding decoding thread suspends and waits. Step 4: start the audio synchronous output thread and the video synchronous output thread; according to the difference between the decoded timestamp and the local system clock, select and output the data that matches the output time; and according to the difference between the video and audio timestamps, repeat or skip video frames during playback so that audio and video are output synchronously.
In the above control method, activating the decoding threads in step 2 is accomplished by performing a P operation on a semaphore.
In the above control method, activating the audio output thread or the video output thread in step 3 is accomplished by signaling a condition variable.
In the above control method, the audio decoding process of step 3 is as follows: (a) the audio decoding thread blocks on the audio semaphore and starts decoding after the demultiplexing thread activates the semaphore; (b) the audio elementary stream at the head of the audio elementary stream queue is decoded; (c) the decoded frame of audio data is inserted at the tail of the audio output queue, and the audio output condition variable is signaled; (d) the timestamp value from the audio elementary stream queue is saved in the corresponding node of the audio output queue; the audio output queue judges whether it overflows by its length, and if the audio output queue overflows, the audio decoding thread suspends and waits. The video decoding process of step 3 is as follows: (a) the video decoding thread blocks on the video semaphore and starts decoding after the demultiplexing thread activates the semaphore; (b) the video elementary stream at the head of the video elementary stream queue is decoded; (c) the decoded frame of image data is inserted at the tail of the video output queue, and the video output condition variable is signaled; (d) the timestamp value from the video elementary stream queue is saved in the corresponding node of the video output queue; the video output queue judges whether it overflows by its length, and if the video output queue overflows, the video decoding thread suspends and waits.
In the above control method, the audio output process of step 4 is as follows: (a) the audio output thread blocks on the audio output condition variable; (b) after the audio decoding thread signals the condition variable, the local system clock at the current moment is calculated; (c) the queue node whose timestamp value is closest to the local system clock value is searched for in the audio output queue, and the audio data of that queue node is output. The video output process of step 4 is as follows: (a) the video output thread blocks on the video output condition variable; (b) after the video decoding thread signals the condition variable, the local system clock at the current moment is calculated; (c) the queue node whose timestamp value is closest to the local system clock value is searched for in the video output queue, and the difference between the video timestamp value and the audio timestamp value is calculated; if the video timestamp value is greater than the audio timestamp value, the image in that queue node is displayed repeatedly; if the video timestamp value is less than the audio timestamp value, that queue node is skipped.
In the above control method, the synchronization between the threads of step 2 and step 3 uses a semaphore mechanism: PV operations on the semaphore coordinate the synchronization between the demultiplexing thread and the decoding threads.
In the above control method, the synchronization between the threads of step 3 and step 4 uses condition variables: signaling the condition variables coordinates the synchronization between the decoding threads and the output threads.
Another object of the present invention is achieved through the following technical solution: a multi-thread audio and video synchronization control system comprising a demultiplexing module, an audio decoder, a video decoder, an audio synchronous output module, a video synchronous output module, an audio elementary stream queue for storing the audio elementary stream, a video elementary stream queue for storing the video elementary stream, an audio output queue for storing the decoded audio data, and a video output queue for storing the decoded video data. The demultiplexing module demultiplexes the system-layer transport stream or program stream, splits it into the audio elementary stream and the video elementary stream, inserts them into the audio elementary stream queue and the video elementary stream queue respectively, and obtains the corresponding timestamps. The audio decoder and the video decoder decode the data in the audio elementary stream queue and the video elementary stream queue respectively, and insert the decoded audio data and video data into the audio output queue and the video output queue respectively. The audio output module and the video output module select and output the data that matches the output time according to the difference between the timestamp of the decoded data and the local system clock, and repeat or skip video frames during playback according to the difference between the video and audio timestamps.
By adopting the above technical solution, the present invention not only effectively achieves synchronized output of audio and video, but also does not need to buffer large amounts of coded video data and audio/video data, which greatly reduces the use of system resources and makes the invention suitable for embedded platforms. In addition, the software architecture uses parallel processing, so the operating efficiency is high and the implementation complexity is low.
 
Description of drawings
Fig. 1 is the multi-thread control flow chart of the present invention;
Fig. 2 is the system block diagram of the present invention.
 
Embodiment
The present invention is described in further detail below with reference to the embodiments and the accompanying drawings, but the embodiments of the present invention are not limited thereto.
The multi-thread audio and video synchronization control method of the present invention is developed and implemented on a platform running an embedded Linux operating system; the operating system processes the audio and video transport stream or program stream in parallel with multiple threads. As shown in Fig. 1, the specific implementation process comprises the following steps.
Step 1: create five threads, namely a demultiplexing thread, an audio decoding thread, a video decoding thread, an audio synchronous output thread and a video synchronous output thread.
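The patent text contains no source code; the following is a minimal sketch, assuming an embedded Linux target with POSIX threads, of how the five threads of step 1 might be created. The thread entry functions are empty placeholders, not the patent's implementation.

```c
/* Illustrative only: create the five worker threads (compile with -pthread). */
#include <pthread.h>
#include <stdio.h>

static void *demux_thread(void *arg)        { (void)arg; return NULL; }
static void *audio_decode_thread(void *arg) { (void)arg; return NULL; }
static void *video_decode_thread(void *arg) { (void)arg; return NULL; }
static void *audio_output_thread(void *arg) { (void)arg; return NULL; }
static void *video_output_thread(void *arg) { (void)arg; return NULL; }

int main(void)
{
    pthread_t tid[5];
    void *(*entry[5])(void *) = {
        demux_thread, audio_decode_thread, video_decode_thread,
        audio_output_thread, video_output_thread
    };

    for (int i = 0; i < 5; i++) {
        if (pthread_create(&tid[i], NULL, entry[i], NULL) != 0) {
            perror("pthread_create");
            return 1;
        }
    }
    for (int i = 0; i < 5; i++)
        pthread_join(tid[i], NULL);
    return 0;
}
```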
Step 2: start the demultiplexing thread to demultiplex the system-layer transport stream or program stream, splitting it into an audio elementary stream, a video elementary stream and time information; insert the audio elementary stream at the tail of the audio elementary stream queue and the video elementary stream at the tail of the video elementary stream queue, and activate the decoding threads; if the audio elementary stream queue or the video elementary stream queue overflows, the demultiplexing thread suspends and waits.
For a transport stream, the audio elementary stream and the video elementary stream described in this step are distinguished by their PID values (Packet Identifiers). First, the packets with PID 0x00 are parsed; the Program Association Table (PAT) in these packets gives the PID of the Program Map Table (PMT) of each program. To decode a particular program, the PMT corresponding to that program is parsed to obtain the PIDs of the program's audio elementary stream and video elementary stream. The packets with the corresponding PIDs are then received, and the data is reassembled into PES (Packetized Elementary Stream, an elementary stream with packet headers) packets before being inserted into the elementary stream queues. For a program stream, the audio and video elementary streams are distinguished by the stream ID field in each PES packet.
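As an illustration of the PID-based routing just described, a minimal sketch (not from the patent; the function names and calling convention are assumptions) that extracts the 13-bit PID from one 188-byte transport stream packet and dispatches it; the PAT/PMT parsing that yields pmt_pid, audio_pid and video_pid is left as a stub:

```c
#include <stdint.h>

#define PAT_PID 0x0000

/* Hypothetical stubs standing in for the real demultiplexer actions. */
static void parse_pat_or_pmt(const uint8_t *pkt)   { (void)pkt; }
static void append_to_audio_es(const uint8_t *pkt) { (void)pkt; }
static void append_to_video_es(const uint8_t *pkt) { (void)pkt; }

/* pkt points at one 188-byte TS packet; the 13-bit PID sits in the
 * low 5 bits of byte 1 and all of byte 2. */
static uint16_t ts_pid(const uint8_t *pkt)
{
    return (uint16_t)(((pkt[1] & 0x1F) << 8) | pkt[2]);
}

static void route_ts_packet(const uint8_t *pkt, uint16_t pmt_pid,
                            uint16_t audio_pid, uint16_t video_pid)
{
    uint16_t pid = ts_pid(pkt);

    if (pid == PAT_PID || pid == pmt_pid)
        parse_pat_or_pmt(pkt);     /* learn the PMT PID and the A/V PIDs */
    else if (pid == audio_pid)
        append_to_audio_es(pkt);   /* reassemble into audio PES packets  */
    else if (pid == video_pid)
        append_to_video_es(pkt);   /* reassemble into video PES packets  */
    /* packets with any other PID are ignored in this sketch */
}
```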
The time information described in this step refers to the presentation timestamp PTS and the Program Clock Reference PCR in the PES packets; the parsed PTS value is saved in the corresponding queue node. After inserting the elementary stream of a complete PES packet into the elementary stream queue, the demultiplexing thread performs a P operation on the semaphore (i.e., requesting the semaphore resource) to activate the decoding thread. The audio elementary stream queue and the video elementary stream queue judge whether they overflow by their length; if a queue overflows, the demultiplexing thread suspends and waits.
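A minimal sketch of the queue handling this paragraph describes, assuming POSIX semaphores and condition variables; the structure names, the 32-entry overflow threshold and the push helper are illustrative, not taken from the patent. The decoding thread is assumed to signal not_full after removing a node (not shown).

```c
#include <pthread.h>
#include <semaphore.h>
#include <stdint.h>
#include <stdlib.h>
#include <string.h>

#define ES_QUEUE_MAX 32              /* assumed overflow threshold */

typedef struct es_node {
    uint8_t        *data;            /* elementary stream data of one PES packet */
    size_t          size;
    int64_t         pts;             /* PTS parsed from the PES header (90 kHz)  */
    struct es_node *next;
} es_node_t;

typedef struct {
    es_node_t      *head, *tail;
    int             length;
    pthread_mutex_t lock;
    pthread_cond_t  not_full;        /* the demux thread suspends here on overflow  */
    sem_t           ready;           /* activates the corresponding decoding thread */
} es_queue_t;

/* Called by the demultiplexing thread once a complete PES packet is assembled. */
void es_queue_push(es_queue_t *q, const uint8_t *data, size_t size, int64_t pts)
{
    es_node_t *node = malloc(sizeof(*node));
    if (!node)
        return;
    node->data = malloc(size);
    if (!node->data) {
        free(node);
        return;
    }
    memcpy(node->data, data, size);
    node->size = size;
    node->pts  = pts;
    node->next = NULL;

    pthread_mutex_lock(&q->lock);
    while (q->length >= ES_QUEUE_MAX)              /* queue would overflow:      */
        pthread_cond_wait(&q->not_full, &q->lock); /* suspend and wait for space */

    if (q->tail)                                   /* insert at the queue tail   */
        q->tail->next = node;
    else
        q->head = node;
    q->tail = node;
    q->length++;
    pthread_mutex_unlock(&q->lock);

    sem_post(&q->ready);                           /* activate the decoding thread */
}
```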
Step 3: start the audio decoding thread and the video decoding thread to decode the bit streams in the audio elementary stream queue and the video elementary stream queue respectively, insert the time information together with the decoded audio data and image data at the tails of the audio output queue and the video output queue respectively, and activate the audio output thread or the video output thread; if the audio output queue or the video output queue overflows, the corresponding decoding thread suspends and waits. The decoded audio data is PCM (Pulse Code Modulation) data.
The audio decoding process of this step is as follows: (a) the audio decoding thread blocks on the audio semaphore and starts decoding after the demultiplexing thread activates the semaphore; (b) the audio elementary stream at the head of the audio elementary stream queue is decoded; (c) the decoded frame of audio PCM data is inserted at the tail of the audio output queue, and the audio output condition variable is signaled; (d) the PTS value from the audio elementary stream queue is saved in the corresponding node of the audio output queue. The audio output queue judges whether it overflows by its length; if the audio output queue overflows, the audio decoding thread suspends and waits.
The video decoding process of this step is as follows: (a) the video decoding thread blocks on the video semaphore and starts decoding after the demultiplexing thread activates the semaphore; (b) the video elementary stream at the head of the video elementary stream queue is decoded; (c) the decoded frame of image data is inserted at the tail of the video output queue, and the video output condition variable is signaled; (d) the PTS value from the video elementary stream queue is saved in the corresponding node of the video output queue. The video output queue judges whether it overflows by its length; if the video output queue overflows, the video decoding thread suspends and waits.
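A control-flow sketch of the audio decoding loop just described; the video decoding thread follows the same pattern with the video queues and the video output condition variable. Every type and helper here is a hypothetical stand-in, and the actual decoder is abstracted behind audio_decode().

```c
#include <semaphore.h>
#include <pthread.h>
#include <stdint.h>
#include <stddef.h>

typedef struct { uint8_t *data; size_t size; int64_t pts; } es_unit_t;
typedef struct { int16_t *pcm; size_t samples; int64_t pts; } pcm_frame_t;

/* Hypothetical helpers assumed to exist elsewhere in the player. */
extern sem_t          audio_es_ready;        /* posted by the demux thread        */
extern pthread_cond_t audio_out_cond;        /* wakes the audio output thread     */
extern int  audio_es_pop_head(es_unit_t *out);             /* head of ES queue    */
extern int  audio_decode(const es_unit_t *in, pcm_frame_t *out);
extern void audio_out_push_tail(const pcm_frame_t *frame); /* suspends when full  */

void *audio_decode_thread(void *arg)
{
    (void)arg;
    for (;;) {
        /* Block on the audio semaphore until the demux thread activates it. */
        sem_wait(&audio_es_ready);

        es_unit_t es;
        if (audio_es_pop_head(&es) != 0)
            continue;                       /* nothing to decode */

        pcm_frame_t frame;
        if (audio_decode(&es, &frame) != 0)
            continue;                       /* corrupt unit: skip it */
        frame.pts = es.pts;                 /* carry the PTS into the output queue */

        /* Insert at the output queue tail; suspends internally on overflow. */
        audio_out_push_tail(&frame);

        /* Activate the audio output thread via its condition variable. */
        pthread_cond_signal(&audio_out_cond);
    }
    return NULL;
}
```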
Step 4: start the audio synchronous output thread and the video synchronous output thread; according to the difference between the decoded timestamp and the local system clock, select and output the data that matches the output time; and according to the difference between the video and audio timestamps, repeat or skip video frames during playback so that audio and video are output synchronously.
The audio output process of this step is realized by the following steps: (a) the audio output thread blocks on the audio output condition variable; (b) after the audio decoding thread signals the condition variable, the local System Time Clock (STC) at the current moment is calculated; (c) the queue node whose PTS value is closest to the STC value is searched for in the audio output queue, and the audio PCM data of that queue node is output.
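A minimal sketch of step (c) of the audio output process, assuming 90 kHz PTS/STC units and a singly linked output queue; get_stc_90khz() and render_audio() are hypothetical placeholders for the platform's clock source and audio device, not APIs named by the patent.

```c
#include <stdint.h>
#include <stddef.h>

typedef struct out_node {
    void            *payload;        /* decoded PCM audio for this node */
    int64_t          pts;            /* presentation timestamp (90 kHz) */
    struct out_node *next;
} out_node_t;

/* Hypothetical stand-ins for the player's clock source and audio sink. */
extern int64_t get_stc_90khz(void);          /* local STC recovered from the PCR */
extern void    render_audio(const void *pcm);

static int64_t pts_dist(int64_t a, int64_t b)
{
    return (a > b) ? (a - b) : (b - a);
}

/* Find the queue node whose PTS is closest to the current STC value. */
static out_node_t *closest_to_stc(out_node_t *head, int64_t stc)
{
    out_node_t *best = head;
    for (out_node_t *n = head; n != NULL; n = n->next)
        if (pts_dist(n->pts, stc) < pts_dist(best->pts, stc))
            best = n;
    return best;
}

/* One pass of the audio output thread after its condition variable fires. */
void audio_output_once(out_node_t *audio_queue_head)
{
    if (audio_queue_head == NULL)
        return;
    int64_t stc = get_stc_90khz();
    out_node_t *node = closest_to_stc(audio_queue_head, stc);
    render_audio(node->payload);     /* output the PCM data of the chosen node */
}
```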
The video output process of this step is realized by the following steps: (a) the video output thread blocks on the video output condition variable; (b) after the video decoding thread signals the condition variable, the local System Time Clock (STC) at the current moment is calculated; (c) the queue node whose PTS value is closest to the STC value is searched for in the video output queue, and the difference between the video PTS and the audio PTS is calculated; if the video PTS value is greater than the audio PTS value, the image in that queue node is displayed repeatedly; if the video PTS value is less than the audio PTS value, that queue node is skipped.
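The repeat/skip decision in step (c) reduces to comparing the video PTS against the audio PTS, with the audio acting as the synchronization master. A small sketch of that decision, with the enum and function names chosen for the example:

```c
#include <stdint.h>

typedef enum {
    VIDEO_SHOW,     /* timestamps match: display the frame once       */
    VIDEO_REPEAT,   /* video ahead of audio: display this frame again */
    VIDEO_SKIP      /* video behind audio: skip this queue node       */
} video_action_t;

video_action_t video_sync_action(int64_t video_pts, int64_t audio_pts)
{
    if (video_pts > audio_pts)
        return VIDEO_REPEAT;   /* repeat the image in this queue node */
    if (video_pts < audio_pts)
        return VIDEO_SKIP;     /* skip this queue node */
    return VIDEO_SHOW;
}
```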
The synchronization between the threads of step 2 and step 3 uses a semaphore mechanism: PV operations on the semaphore coordinate the synchronization between the demultiplexing thread and the decoding threads, where the PV operation consists of the P operation (incrementing the semaphore) and the V operation (decrementing the semaphore; the thread is suspended if the semaphore is zero). Steps 2 and 3 buffer the elementary stream data and the decoded data in queues: data is inserted at the tail of a queue and decoded from its head. The synchronization between the threads of step 3 and step 4 uses condition variables, that is, the corresponding output thread is activated by signaling its condition variable.
 
The present invention uses a multi-thread audio and video synchronization control system to implement the above control method. As shown in Fig. 2, the control system of the present invention comprises a demultiplexing module, an audio decoder, a video decoder, an audio synchronous output module, a video synchronous output module, an audio elementary stream queue for storing the audio elementary stream, a video elementary stream queue for storing the video elementary stream, an audio output queue for storing the decoded audio data, and a video output queue for storing the decoded video data. The demultiplexing module demultiplexes the system-layer transport stream or program stream, splits it into the audio elementary stream and the video elementary stream, inserts them into the audio elementary stream queue and the video elementary stream queue respectively, and obtains the corresponding timestamps. The audio decoder and the video decoder decode the data in the audio elementary stream queue and the video elementary stream queue respectively, and insert the decoded audio data and video data into the audio output queue and the video output queue respectively. The audio output module and the video output module select and output the data that matches the output time according to the difference between the timestamp of the decoded data and the local system clock, and repeat or skip video frames during playback according to the difference between the video and audio timestamps, so that audio and video are output synchronously. The decoded audio data is PCM data.
The above embodiment is a preferred implementation of the present invention, but the implementations of the present invention are not limited to the above embodiment. Any other change, modification, substitution, combination or simplification that does not depart from the spirit and principle of the present invention shall be an equivalent replacement and shall fall within the protection scope of the present invention.

Claims (10)

1. A multi-thread audio and video synchronization control method, characterized by comprising the following steps:
Step 1: create five threads, namely a demultiplexing thread, an audio decoding thread, a video decoding thread, an audio synchronous output thread and a video synchronous output thread;
Step 2: start the demultiplexing thread to demultiplex the system-layer transport stream or program stream, splitting it into an audio elementary stream, a video elementary stream and time information; insert the audio elementary stream at the tail of the audio elementary stream queue and the video elementary stream at the tail of the video elementary stream queue, and activate the decoding threads; if the audio elementary stream queue or the video elementary stream queue overflows, the demultiplexing thread suspends and waits;
Step 3: start the audio decoding thread and the video decoding thread to decode the bit streams in the audio elementary stream queue and the video elementary stream queue respectively, insert the time information together with the decoded audio data and image data at the tails of the audio output queue and the video output queue respectively, and activate the audio output thread or the video output thread; if the audio output queue or the video output queue overflows, the corresponding decoding thread suspends and waits;
Step 4: start the audio synchronous output thread and the video synchronous output thread; according to the difference between the decoded timestamp and the local system clock, select and output the data that matches the output time; and according to the difference between the video and audio timestamps, repeat or skip video frames during playback so that audio and video are output synchronously.
2. The multi-thread audio and video synchronization control method according to claim 1, characterized in that the time information described in step 2 comprises timestamps.
3. The multi-thread audio and video synchronization control method according to claim 1, characterized in that activating the decoding threads in step 2 is accomplished by performing a P operation on a semaphore.
4. The multi-thread audio and video synchronization control method according to claim 1, characterized in that activating the audio output thread or the video output thread in step 3 is accomplished by signaling a condition variable.
5. The multi-thread audio and video synchronization control method according to claim 1, characterized in that:
the audio decoding process of step 3 is as follows: (a) the audio decoding thread blocks on the audio semaphore and starts decoding after the demultiplexing thread activates the semaphore; (b) the audio elementary stream at the head of the audio elementary stream queue is decoded; (c) the decoded frame of audio data is inserted at the tail of the audio output queue, and the audio output condition variable is signaled; (d) the timestamp value from the audio elementary stream queue is saved in the corresponding node of the audio output queue; the audio output queue judges whether it overflows by its length, and if the audio output queue overflows, the audio decoding thread suspends and waits;
the video decoding process of step 3 is as follows: (a) the video decoding thread blocks on the video semaphore and starts decoding after the demultiplexing thread activates the semaphore; (b) the video elementary stream at the head of the video elementary stream queue is decoded; (c) the decoded frame of image data is inserted at the tail of the video output queue, and the video output condition variable is signaled; (d) the timestamp value from the video elementary stream queue is saved in the corresponding node of the video output queue; the video output queue judges whether it overflows by its length, and if the video output queue overflows, the video decoding thread suspends and waits.
6. The multi-thread audio and video synchronization control method according to claim 1, characterized in that the audio output process of step 4 is as follows: (a) the audio output thread blocks on the audio output condition variable; (b) after the audio decoding thread signals the condition variable, the local system clock at the current moment is calculated; (c) the queue node whose timestamp value is closest to the local system clock value is searched for in the audio output queue, and the audio data of that queue node is output;
the video output process of step 4 is as follows: (a) the video output thread blocks on the video output condition variable; (b) after the video decoding thread signals the condition variable, the local system clock at the current moment is calculated; (c) the queue node whose timestamp value is closest to the local system clock value is searched for in the video output queue, and the difference between the video timestamp value and the audio timestamp value is calculated; if the video timestamp value is greater than the audio timestamp value, the image in that queue node is displayed repeatedly; if the video timestamp value is less than the audio timestamp value, that queue node is skipped.
7. The multi-thread audio and video synchronization control method according to claim 1, characterized in that the synchronization between the threads of step 2 and step 3 uses a semaphore mechanism, in which PV operations on the semaphore coordinate the synchronization between the demultiplexing thread and the decoding threads.
8. The multi-thread audio and video synchronization control method according to claim 1, characterized in that the synchronization between the threads of step 3 and step 4 uses condition variables, in which signaling the condition variables coordinates the synchronization between the decoding threads and the output threads.
9. A multi-thread audio and video synchronization control system, characterized by comprising a demultiplexing module, an audio decoder, a video decoder, an audio synchronous output module, a video synchronous output module, an audio elementary stream queue for storing the audio elementary stream, a video elementary stream queue for storing the video elementary stream, an audio output queue for storing the decoded audio data, and a video output queue for storing the decoded video data; wherein the demultiplexing module demultiplexes the system-layer transport stream or program stream, splits it into the audio elementary stream and the video elementary stream, inserts them into the audio elementary stream queue and the video elementary stream queue respectively, and obtains the corresponding timestamps; the audio decoder and the video decoder decode the data in the audio elementary stream queue and the video elementary stream queue respectively, and insert the decoded audio data and video data into the audio output queue and the video output queue respectively; and the audio output module and the video output module select and output the data that matches the output time according to the difference between the timestamp of the decoded data and the local system clock, and repeat or skip video frames during playback according to the difference between the video and audio timestamps.
10. The multi-thread audio and video synchronization control system according to claim 9, characterized in that the decoded audio data is pulse code modulation (PCM) data.
CN 201310144225 2013-04-24 2013-04-24 Multi-thread audio and video synchronization control method and system Withdrawn CN103237255A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN 201310144225 CN103237255A (en) 2013-04-24 2013-04-24 Multi-thread audio and video synchronization control method and system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN 201310144225 CN103237255A (en) 2013-04-24 2013-04-24 Multi-thread audio and video synchronization control method and system

Publications (1)

Publication Number Publication Date
CN103237255A true CN103237255A (en) 2013-08-07

Family

ID=48885266

Family Applications (1)

Application Number Title Priority Date Filing Date
CN 201310144225 Withdrawn CN103237255A (en) 2013-04-24 2013-04-24 Multi-thread audio and video synchronization control method and system

Country Status (1)

Country Link
CN (1) CN103237255A (en)


Cited By (24)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104159142A (en) * 2014-08-06 2014-11-19 乐视网信息技术(北京)股份有限公司 Video soft decoding method and device of equipment
CN104702914A (en) * 2015-01-14 2015-06-10 汉柏科技有限公司 Monitored video data processing method and system
CN105338426A (en) * 2015-10-30 2016-02-17 北京数码视讯科技股份有限公司 Correction method for time stamp in transport stream and device thereof
CN106658065A (en) * 2015-10-30 2017-05-10 中兴通讯股份有限公司 Audio and video synchronization method, device and system
CN106658065B (en) * 2015-10-30 2021-10-22 中兴通讯股份有限公司 Audio and video synchronization method, device and system
CN106685895A (en) * 2015-11-09 2017-05-17 中国科学院声学研究所 Multi-input parameter cooperative media processing device supporting user intervention
CN106685895B (en) * 2015-11-09 2019-08-20 中国科学院声学研究所 A kind of multi input parameter collaboration media processor for supporting user intervention
CN107506176B (en) * 2016-06-14 2019-07-12 华为技术有限公司 A kind of method and apparatus of determining decoding task
CN107506176A (en) * 2016-06-14 2017-12-22 华为技术有限公司 A kind of method and apparatus for determining decoding task
US10886948B2 (en) 2016-06-14 2021-01-05 Huawei Technologies Co., Ltd. Method for determining a decoding task and apparatus
CN108319438B (en) * 2017-01-16 2019-05-17 视联动力信息技术股份有限公司 A kind of method and apparatus of audio data collecting
CN108319438A (en) * 2017-01-16 2018-07-24 北京视联动力国际信息技术有限公司 A kind of method and apparatus of audio data collecting
CN108337248A (en) * 2017-01-20 2018-07-27 韩华泰科株式会社 Media playback and media serving device
CN108337248B (en) * 2017-01-20 2021-03-30 韩华泰科株式会社 Media playback apparatus and media service apparatus
CN108322811A (en) * 2018-02-26 2018-07-24 宝鸡文理学院 A kind of synchronous method in piano video teaching and system
CN110636359A (en) * 2018-06-21 2019-12-31 杭州海康威视数字技术股份有限公司 Method and device for synchronously playing audio and video
CN110636359B (en) * 2018-06-21 2021-11-23 杭州海康威视数字技术股份有限公司 Method and device for synchronously playing audio and video
CN111131874A (en) * 2018-11-01 2020-05-08 珠海格力电器股份有限公司 Method and equipment for solving problem of H.256 code stream random access point playing jam
CN110958072A (en) * 2019-11-04 2020-04-03 北京航星机器制造有限公司 Multi-node audio and video information synchronous sharing display method
WO2021217435A1 (en) * 2020-04-28 2021-11-04 青岛海信传媒网络技术有限公司 Streaming media synchronization method and display device
CN114073098A (en) * 2020-04-28 2022-02-18 青岛海信传媒网络技术有限公司 Streaming media synchronization method and display device
CN114073098B (en) * 2020-04-28 2023-04-25 Vidaa(荷兰)国际控股有限公司 Streaming media synchronization method and display device
CN111526466A (en) * 2020-04-30 2020-08-11 成都千立网络科技有限公司 Real-time audio signal processing method for sound amplification system
CN113055711A (en) * 2021-02-22 2021-06-29 迅雷计算机(深圳)有限公司 Audio and video synchronization detection method and detection system thereof

Similar Documents

Publication Publication Date Title
CN103237255A (en) Multi-thread audio and video synchronization control method and system
CN101984672B (en) Method and device for multi-thread video and audio synchronous control
US10827208B2 (en) Transmitting method, receiving method, transmitting device and receiving device
US9628771B2 (en) Transmitter and receiver for transmitting and receiving multimedia content, and reproduction method therefor
JP6184408B2 (en) Receiving apparatus and receiving method thereof
EP2757795B1 (en) Video multiplexing apparatus, video multiplexing method, multiplexed video decoding apparatus, and multiplexed video decoding method
US10129587B2 (en) Fast switching of synchronized media using time-stamp management
US10187646B2 (en) Encoding device, encoding method, transmission device, decoding device, decoding method, and reception device
US11272196B2 (en) Coding apparatus, coding method, transmission apparatus, and reception apparatus
CN102761776A (en) Video and audio synchronizing method of P2PVoD (peer-to-peer video on demand) system based on SVC (scalable video coding)
US11722714B2 (en) Transmitting method, receiving method, transmitting device and receiving device
CN103458271A (en) Audio-video file splicing method and audio-video file splicing device
CN103686203A (en) Video transcoding method and device
CN106470291A (en) Recover in the interruption in time synchronized from audio/video decoder
US20140112395A1 (en) Method and apparatus for decoder buffering in hybrid coded video system
EP3179728A1 (en) Transmission method, reception method, transmission device, and reception device
US8428444B2 (en) Video server and seamless playback method
KR20140053777A (en) Method and apparatus for decoder buffering in hybrid coded video system
JP4967402B2 (en) Multiplexed stream conversion apparatus and method

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C04 Withdrawal of patent application after publication (patent law 2001)
WW01 Invention patent application withdrawn after publication

Application publication date: 20130807