CN104618786A - Audio/video synchronization method and device - Google Patents
- Publication number
- CN104618786A (application number CN201410808969.1A)
- Authority
- CN
- China
- Prior art keywords
- video
- timestamp
- data
- audio
- video data
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/4302—Content synchronisation processes, e.g. decoder synchronisation
- H04N21/4305—Synchronising client clock from received content stream, e.g. locking decoder clock with encoder clock, extraction of the PCR packets
- H04N21/44—Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs
- H04N21/4405—Processing of video elementary streams involving video stream decryption
- H04N21/80—Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
- H04N21/85—Assembly of content; Generation of multimedia applications
- H04N21/854—Content authoring
- H04N21/8547—Content authoring involving timestamps for synchronizing content
Abstract
The invention relates to an audio/video synchronization method and device. The method comprises the following steps: decoding the received audio data packets and video data packets respectively to obtain the timestamps of the audio data and of the video data; obtaining the timestamp of the currently played video data and the timestamp of the currently played audio data; calculating the playback time difference between the current audio and video from these two timestamps; deriving a regulation rate from that time difference; and, according to the regulation rate, adjusting the video playback speed or stopping the audio so that the audio data and video data play synchronously. When the sending end's network stalls, the video and audio played at the receiving end no longer stall together, which improves the fluency of audio playback. The method and device are particularly suited to synchronizing live-broadcast audio and video in scenes where audio and video are captured, encoded, and transmitted separately.
Description
Technical field
The present invention relates to the field of network information transmission, and in particular to an audio/video synchronization method and device.
Background art
With the development of network technology, users frequently interact online, for example in online video conferences or online video chats. This requires capturing the image data and audio data of a session user, transmitting this audio/video data to other users, and playing the captured audio/video data at those users' locations.
In the traditional capture-and-transmit approach, audio and video are multiplexed at the sending end, the multiplexed data is transmitted over a single channel, and the receiving end demultiplexes the data for playback. When the sending end's network stalls, the video and sound played by the recipient stall at the same time.
Summary of the invention
Accordingly, to address the problem in traditional audio/video playback that a stall in the sending end's network causes the audio and video played by the recipient to stall simultaneously, it is necessary to provide an audio/video synchronization method and device that keep audio and video from stalling together and improve the fluency of audio playback.
An audio/video synchronization method comprises the following steps:
decoding the received audio data packets and video data packets respectively to obtain the timestamps of the audio data and of the video data;
obtaining the timestamp of the currently played video data and the timestamp of the currently played audio data;
calculating the playback time difference between the current audio and video according to the timestamp of the currently played video data and the timestamp of the currently played audio data;
obtaining a regulation rate according to the playback time difference of the current audio and video;
adjusting the video playback speed or controlling the audio to stop playing according to the regulation rate, so that the audio data and video data play synchronously.
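As a rough illustration only (not the claimed implementation), the steps above can be sketched as one receiver-side pass. The linear regulation function and the base value A = 5 ms follow the embodiments later in the description; all names are hypothetical:

```python
# Illustrative sketch only: names and the base value A are assumptions.
A_MS = 5.0  # base value A of the linear regulation function, assumed 5 ms

def sync_step(video_ts_ms, audio_ts_ms):
    """One pass of the claimed steps: drift TC, regulation rate ACC, action."""
    tc = video_ts_ms - audio_ts_ms           # playback time difference TC = TV - TA
    acc = A_MS + A_MS * tc / 1000.0          # regulation rate, linear in TC
    if acc > 0:
        return "reduce_video_speed", acc     # video ahead of audio
    if acc < 0:
        return "increase_video_speed", acc   # video behind audio
    return "no_change", acc

print(sync_step(8000, 10000))   # video 2 s behind -> ('increase_video_speed', -5.0)
```

Note that with this formula ACC crosses zero when TC = -1000 ms, i.e. the video must lag by a full second before the speed-up branch is taken.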
An audio/video synchronization device comprises:
a decoding module, configured to decode the received audio data packets and video data packets respectively to obtain the timestamps of the audio data and of the video data;
a timestamp acquisition module, configured to obtain the timestamp of the currently played video data and the timestamp of the currently played audio data;
a time difference calculation module, configured to calculate the playback time difference between the current audio and video according to the timestamp of the currently played video data and the timestamp of the currently played audio data;
a regulation rate acquisition module, configured to obtain a regulation rate according to the playback time difference of the current audio and video;
an adjustment module, configured to adjust the video playback speed or control the audio to stop playing according to the regulation rate, so that the audio data and video data play synchronously.
With the above audio/video synchronization method and device, the playback time difference between the current audio and video is calculated from the timestamps of the currently played video data and audio data, a regulation rate is derived from that difference, and the video playback speed is adjusted or the audio is stopped according to the regulation rate so that the audio data and video data play synchronously. When the sending end's network stalls, the audio and video played by the recipient no longer stall at the same time, which improves the fluency of audio playback. This is especially useful for live-broadcast synchronization in scenes where audio and video are captured, encoded, and transmitted separately: even if the sending end's network becomes congested and drops packets, synchronization recovers once the network returns to normal, and the fluency of sound playback is easier to guarantee.
Brief description of the drawings
Fig. 1 is a schematic diagram of the application environment of the audio/video synchronization method and device in an embodiment;
Fig. 2 is a flowchart of the audio/video synchronization method in an embodiment;
Fig. 3 is a flowchart of the audio/video synchronization method in another embodiment;
Fig. 4 is a schematic structural diagram of the audio/video synchronization device in an embodiment;
Fig. 5 is a schematic structural diagram of the audio/video synchronization device in another embodiment;
Fig. 6 is a schematic structural diagram of the audio/video synchronization device in yet another embodiment.
Detailed description of the embodiments
To make the objects, technical solutions, and advantages of the present invention clearer, the present invention is further described in detail below with reference to the drawings and embodiments. It should be understood that the specific embodiments described herein are intended only to explain the present invention, not to limit it.
It will be understood that the terms "first", "second", and the like used herein may describe various elements, but these elements are not limited by these terms; the terms serve only to distinguish one element from another. For example, without departing from the scope of the invention, a first terminal could be called a second terminal and, similarly, a second terminal could be called a first terminal. The first terminal and the second terminal are both terminals, but they are not the same terminal.
Fig. 1 is a schematic diagram of the application environment of the audio/video synchronization method and device in an embodiment. As shown in Fig. 1, the application environment comprises a first terminal 110, an audio server 120, a video server 130, and a second terminal 140. The first terminal 110 and the second terminal 140 may each be a desktop computer, notebook computer, tablet computer, personal digital assistant, smartphone, and so on. There may be one or more of each; the number is not limited here and is shown only as an illustration. The first terminal 110 acts as the sending end of the audio/video data and the second terminal 140 as the receiving end. A first user is at the first terminal 110 and a second user is at the second terminal 140.
The first terminal 110 captures the first user's video data and audio data together with their corresponding timestamps, encodes the captured video data and its corresponding timestamps and packs them into video data packets sent to the video server 130, and encodes the captured audio data and its corresponding timestamps and packs them into audio data packets sent to the audio server 120.
The audio server 120 forwards the audio data packets to the second terminal 140, and the video server 130 forwards the video data packets to the second user at the second terminal 140.
After receiving the audio data packets and video data packets, the second terminal 140 decodes each to obtain the audio data and video data with their corresponding timestamps, calculates the playback time difference between audio and video from the timestamps of the currently playing video data and audio data, derives a regulation rate from that difference, and, according to the regulation rate, adjusts the playback speed of the video data or stops the audio data from playing, so that the audio data and video data play synchronously.
By adjusting the playback speed of the video data, the second terminal 140 keeps the video data and audio data playing in sync. If the network of the first terminal 110 where the first user is located becomes congested and drops packets, playback of the video data and audio data can be resynchronized promptly after the network recovers, and the fluency of audio playback is improved. Applied to live broadcasting in which audio and video are transmitted separately, this audio/video synchronization method better guarantees that the audio/video data play synchronously.
In addition, in other application environments, the audio server 120 and the video server 130 may be merged into a single server, with the audio data and video data still transmitted separately.
Fig. 2 is a flowchart of the audio/video synchronization method in an embodiment, applied in the application environment of Fig. 1. As shown in Fig. 2, the audio/video synchronization method comprises the following steps:
Step 202: decode the received audio data packets and video data packets respectively to obtain the timestamps of the audio data and of the video data.
Specifically, the received audio data packets and video data packets have been encoded. Decoding the received audio data packets yields the audio data and its corresponding timestamps; decoding the received video data packets yields the video data and its corresponding timestamps. Audio data packets may be decoded in timestamp order. After decoding the video data packets, the order of the video frames may be rearranged to match the original capture order.
Step 204: obtain the timestamp of the currently played video data and the timestamp of the currently played audio data.
Specifically, while video data is playing, the timestamp of the currently played video data is obtained and the timestamp of the currently played audio data is queried.
Step 206: calculate the playback time difference between the current audio and video according to the timestamp of the currently played video data and the timestamp of the currently played audio data.
Specifically, the playback time difference is obtained by subtracting the timestamp of the currently played audio data from the timestamp of the currently played video data.
Step 208: obtain a regulation rate according to the playback time difference of the current audio and video.
In one embodiment, the regulation rate may be a linear function of the difference between the timestamp of the currently played video data and the timestamp of the currently played audio data.
Specifically, the playback speed of the player is determined by the rate at which decoded image data is submitted to the rendering module. This rate depends on a time interval TP; ideally TP equals 1000 / average frame rate, so if the average frame rate is 20 frames per second, TP is 50 milliseconds.
The playback time difference is calculated as TC = TV - TA, where TA is the timestamp of the audio data and TV is the timestamp of the video data. The regulation rate is a linear function of the difference between the timestamp of the currently played video data and the timestamp of the currently played audio data, for example ACC = A + A*TC/1000, where ACC is the regulation rate and A is a base value whose size can be set as required, for example 5 milliseconds. When TC is small its influence on ACC is small, and when TC is large its influence on ACC is large.
The time interval TR actually submitted to the rendering module is then calculated as TR = TP + ACC - TW, where TW is the time consumed by other processing (such as frame fetching and frame data decoding).
When ACC is positive, the video is playing ahead of the audio: TR is greater than TP - TW and the video plays slightly below the predetermined frame rate. When ACC is negative, the video is playing behind the audio: TR is less than TP - TW and the video plays slightly above the predetermined frame rate. The predetermined frame rate is the preset play frame rate.
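A small numeric check ties the three quantities together, under the reading (adopted here for consistency with the adjustment cases of step 210) that the rendering interval is TR = TP + ACC - TW; all names and parameter values are illustrative:

```python
def regulation(avg_fps, tv_ms, ta_ms, a_ms=5.0, tw_ms=5.0):
    """Compute TP, ACC, TR and the effective play frame rate (illustrative)."""
    tp = 1000.0 / avg_fps            # ideal submission interval TP, ms
    tc = tv_ms - ta_ms               # playback drift TC = TV - TA
    acc = a_ms + a_ms * tc / 1000.0  # linear regulation rate ACC
    tr = tp + acc - tw_ms            # interval actually submitted to rendering
    eff_fps = 1000.0 / (tr + tw_ms)  # resulting play frame rate
    return tp, acc, tr, eff_fps

# 20 fps, video 2 s behind audio: ACC = -5 ms, TR = 50 - 5 - 5 = 40 ms,
# so frames are submitted every 45 ms in total and the video catches up.
tp, acc, tr, fps = regulation(20, 8000, 10000)
print(tp, acc, tr, round(fps, 1))  # 50.0 -5.0 40.0 22.2
```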
Step 210: adjust the video playback speed according to the regulation rate, or control the audio to stop playing, so that the audio data and video data play synchronously.
In one embodiment, step 210 comprises: the regulation rate is a linear function of the difference between the timestamp of the currently played video data and the timestamp of the currently played audio data; if the regulation rate is positive, the playback speed of the video data is reduced so that the audio data and video data play synchronously; if the regulation rate is negative, the playback speed of the video data is increased so that the audio data and video data play synchronously; and if the regulation rate is negative and its absolute value is greater than a preset value, the audio is controlled to stop playing so that the audio data and video data play synchronously.
Specifically, reducing the playback speed of the video data may mean subtracting the absolute value of the regulation rate from the current video playback speed to obtain the new playback speed, and increasing it may mean adding the absolute value of the regulation rate to the current video playback speed.
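The three cases of step 210 can be sketched as follows; the unit of the playback rate and the preset value are assumptions, since the description does not fix them:

```python
def adjust_playback_rate(current_rate, acc, preset=1000.0):
    """Apply the three cases of step 210; returns (new video rate, pause_audio)."""
    if acc > 0:                            # video ahead of audio: slow it down
        return current_rate - abs(acc), False
    if acc < 0 and abs(acc) > preset:      # far behind: pause the audio instead
        return current_rate, True
    if acc < 0:                            # moderately behind: speed it up
        return current_rate + abs(acc), False
    return current_rate, False             # in sync: leave the rate unchanged

print(adjust_playback_rate(30.0, 5.0))    # (25.0, False)
```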
With the above audio/video synchronization method, the playback time difference between the current audio and video is calculated from the timestamps of the currently played video data and audio data, a regulation rate is derived from that difference, and the video playback speed is adjusted or the audio is stopped according to the regulation rate so that the audio data and video data play synchronously. When the sending end's network stalls, the audio and video played by the recipient no longer stall at the same time, which improves the fluency of audio playback. This is especially useful for live-broadcast synchronization in scenes where audio and video are captured, encoded, and transmitted separately: even if the sending end's network becomes congested and drops packets, synchronization recovers once the network returns to normal, and the fluency of sound playback is easier to guarantee.
In one embodiment, the above audio/video synchronization method further comprises: if the absolute value of the playback time difference of the current audio and video falls within a preset range, it is judged that the current audio and video playback is synchronous.
Specifically, the preset range can be set as required.
If the playback time difference of the current audio and video is less than or equal to a predetermined upper limit, a regulation rate is obtained from the timestamps of the current audio and video playback.
If the playback time difference of the current audio and video is greater than the predetermined upper limit, the audio and video playback cannot be synchronized.
Specifically, the predetermined upper limit can be set as required, for example to 3 minutes. A difference exceeding this upper limit indicates an abnormality in the capture or transmission logic, in which case synchronization is not appropriate.
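The resulting three-way classification can be sketched as follows; the preset range of 50 ms is an assumed example, while the 3-minute (180 000 ms) upper limit comes from the description:

```python
def sync_state(tc_ms, preset_range_ms=50.0, upper_limit_ms=180000.0):
    """Classify the current playback drift TC (thresholds are illustrative)."""
    if abs(tc_ms) <= preset_range_ms:
        return "synchronized"   # within the preset range: no regulation needed
    if abs(tc_ms) <= upper_limit_ms:
        return "regulate"       # derive a regulation rate from the timestamps
    return "abnormal"           # beyond the upper limit: capture/transport fault

print(sync_state(-40000.0))     # regulate
```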
Fig. 3 is a flowchart of the audio/video synchronization method in another embodiment, applied in the application environment of Fig. 1. As shown in Fig. 3, the audio/video synchronization method comprises:
Step 302: mark the captured audio data and video data with their corresponding timestamps.
In one embodiment, step 302 comprises: marking the first captured frame of video data with the corresponding system timestamp; each time a further frame of video data is captured after the first frame, computing the average frame rate of the video data captured so far, incrementing the timestamp of the video data according to this average frame rate, and obtaining the difference between the incremented timestamp and the current system timestamp. If this difference is within an allowed error range, the incremented timestamp is used as the timestamp of the video data; if it is not, the timestamp of the video data is corrected according to the current system timestamp.
Specifically, the average frame rate may be obtained by counting the data frames output by the encoder each second and applying a weighted average, for example B1 = (B0*a + A) >> 3, where B1 is the updated average frame rate, B0 is the average frame rate before the update, a is a preset parameter that can be set as required (for example 7 or 10), and A is the new sample value, i.e. the frame count for the newly captured video. The initial frame rate can be a user-set value, and the current average frame rate is updated once per second thereafter.
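The weighted average B1 = (B0*a + A) >> 3 can be checked directly; note that the shift by 3 divides by 8, so the weights sum to exactly 1 when a = 7:

```python
def update_avg_fps(b0, sample, a=7):
    """B1 = (B0*a + A) >> 3: fixed-point weighted average of the frame rate."""
    return (b0 * a + sample) >> 3

# previous average 20 fps, new one-second sample of 28 frames:
print(update_avg_fps(20, 28))  # (20*7 + 28) >> 3 = 168 >> 3 = 21
```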
In one embodiment, incrementing the timestamp of the video data according to the average frame rate specifically comprises: calculating the time value of each increment from the average frame rate, the timestamp of the current video frame being the timestamp of the previous video frame plus this time value. For example, the time value is 1000/B, where B is the average frame rate, so T1 = T0 + 1000/B, where T1 is the timestamp of the current video frame and T0 is the timestamp of the previous video frame.
In one embodiment, when the difference between the incremented timestamp and the current system timestamp is not within the allowed error range, correcting the timestamp of the video data according to the current system timestamp comprises: taking half of the difference between the current system timestamp and the incremented timestamp as a correction value, and adding this correction value to the incremented timestamp to obtain the corrected timestamp.
Specifically, the allowed error range can be set as required; for example, the incremented timestamp falling more than one third of the time value, or more than one half of the time value, below the current system time may be treated as outside the range.
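The increment-and-correct rule of these embodiments can be sketched as follows; the default error allowance of one third of the time value is one of the examples given above, and all times are assumed to be milliseconds on one clock:

```python
def next_video_ts(prev_ts, avg_fps, now_ms, err_allow_ms=None):
    """T1 = T0 + 1000/B; if too far from the system clock, pull halfway toward it."""
    step = 1000.0 / avg_fps           # time value of each increment
    if err_allow_ms is None:
        err_allow_ms = step / 3.0     # allowed error range (assumed default)
    ts = prev_ts + step               # incremented timestamp
    if abs(now_ms - ts) > err_allow_ms:
        ts += (now_ms - ts) / 2.0     # correction value: half the difference
    return ts

# 20 fps: increment gives 1050 ms, but the clock reads 1150 ms, so the
# corrected timestamp lands at the midpoint, 1100 ms.
print(next_video_ts(1000.0, 20, 1150.0))  # 1100.0
```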
Step 304: encode the audio data and its corresponding timestamps, and pack the encoded audio data and corresponding timestamps into audio data packets.
Specifically, the audio data can be encoded in timestamp order, and the subsequent decoding of the audio data is also performed in timestamp order.
Step 306: encode the video data and its corresponding timestamps, and pack the encoded video data and the timestamps carried out of the encoder into video data packets.
Specifically, the video data is a group of images. The video encoder may reorder the encoded image frame data when encoding the video data, and the video decoder restores the order of the reordered image frame data. The timestamp the video encoder carries out with an encoded frame may differ from that frame's original timestamp, but the timestamp the video decoder carries out after decoding the encoded video data is identical to the original timestamp of the video data; that is, whatever encoding and decoding scheme is used, a video frame's timestamp before encoding and after decoding are the same.
Video data may be encoded with delay or with zero delay.
Step 308: send the video data packets and the audio data packets.
Specifically, when zero-delay encoding is used, the video data can be sent directly once encoded and packed into video data packets, and the audio data can be sent directly once encoded and packed into audio data packets.
In one embodiment, the above audio/video synchronization method further comprises: calculating the delay time consumed by encoding the video data, and buffering the audio data packets for this delay time before sending them.
When delayed encoding is used for the video data, the delay time consumed by video encoding needs to be calculated, and the audio data packets are buffered for this delay time before being sent, to keep the audio and video data consistent at the transmission starting point. The video encoding delay varies greatly with the configured encoding parameters, from 0 seconds up to tens of seconds. It can be measured as the difference between the time the first encoded frame is received and the time the corresponding raw captured video image was received.
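A minimal sketch of measuring the encoding delay and buffering audio by that amount, under the assumption that all times are milliseconds on a single clock and that the helper names are hypothetical:

```python
def encoder_delay_ms(raw_frame_in_ms, first_coded_frame_out_ms):
    """Video encoding delay: time between feeding the first raw image to the
    encoder and receiving the first encoded frame back."""
    return first_coded_frame_out_ms - raw_frame_in_ms

def audio_send_time_ms(packet_ready_ms, delay_ms):
    """Hold each audio packet for the measured encoder delay before sending."""
    return packet_ready_ms + delay_ms

d = encoder_delay_ms(100.0, 500.0)   # encoder emitted its first frame 400 ms late
print(audio_send_time_ms(1000.0, d)) # 1400.0
```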
Step 310: decode the received audio data packets and video data packets respectively to obtain the timestamps of the audio data and of the video data.
Specifically, the received audio data packets and video data packets have been encoded. Decoding the received audio data packets yields the audio data and its corresponding timestamps; decoding the received video data packets yields the video data and its corresponding timestamps. Audio data packets may be decoded in timestamp order. After decoding the video data packets, the order of the video frames may be rearranged to match the original capture order.
Step 312: obtain the timestamp of the currently played video data and the timestamp of the currently played audio data.
Specifically, while video data is playing, the timestamp of the currently played video data is obtained and the timestamp of the currently played audio data is queried.
Step 314: calculate the playback time difference between the current audio and video according to the timestamp of the currently played video data and the timestamp of the currently played audio data.
Specifically, the playback time difference is obtained by subtracting the timestamp of the currently played audio data from the timestamp of the currently played video data.
Step 316: obtain a regulation rate according to the playback time difference of the current audio and video.
In one embodiment, the regulation rate may be a linear function of the difference between the timestamp of the currently played video data and the timestamp of the currently played audio data.
Specifically, the playback speed of the player is determined by the rate at which decoded image data is submitted to the rendering module. This rate depends on a time interval TP; ideally TP equals 1000 / average frame rate, so if the average frame rate is 20 frames per second, TP is 50 milliseconds.
The playback time difference is calculated as TC = TV - TA, where TA is the timestamp of the audio data and TV is the timestamp of the video data. The regulation rate is a linear function of the difference between the timestamp of the currently played video data and the timestamp of the currently played audio data, for example ACC = A + A*TC/1000, where ACC is the regulation rate and A is a base value whose size can be set as required, for example 5 milliseconds. When TC is small its influence on ACC is small, and when TC is large its influence on ACC is large.
The time interval TR actually submitted to the rendering module is then calculated as TR = TP + ACC - TW, where TW is the time consumed by other processing (such as frame fetching and frame data decoding).
When ACC is positive, the video is playing ahead of the audio: TR is greater than TP - TW and the video plays slightly below the predetermined frame rate. When ACC is negative, the video is playing behind the audio: TR is less than TP - TW and the video plays slightly above the predetermined frame rate. The predetermined frame rate is the preset play frame rate.
Step 318: adjust the video playback speed according to the regulation rate, or control the audio to stop playing, so that the audio data and video data play synchronously.
In one embodiment, step 318 comprises: the regulation rate is a linear function of the difference between the timestamp of the currently played video data and the timestamp of the currently played audio data; if the regulation rate is positive, the playback speed of the video data is reduced so that the audio data and video data play synchronously; if the regulation rate is negative, the playback speed of the video data is increased so that the audio data and video data play synchronously; and if the regulation rate is negative and its absolute value is greater than a preset value, the audio is controlled to stop playing so that the audio data and video data play synchronously.
Specifically, reducing the playback speed of the video data may mean subtracting the absolute value of the regulation rate from the current video playback speed to obtain the new playback speed, and increasing it may mean adding the absolute value of the regulation rate to the current video playback speed.
With the above audio/video synchronization method, the time difference of the current audio/video playback is computed from the timestamps of the currently playing video data and audio data, an adjustment speed is obtained from that time difference, and the video playback speed is adjusted (or the audio paused) accordingly, so that the audio data and video data play synchronously. When the sender's network momentarily stutters, the audio and video played at the receiver do not stutter together, improving the fluency of audio playback. This applies especially to audio/video synchronization in live streaming scenarios where audio and video are captured, encoded, and transmitted separately: even when the sender's network becomes congested and drops packets, synchronization recovers once the network returns to normal, and audio fluency is easier to guarantee. In addition, by computing the delay consumed by video encoding and buffering the audio packets for that delay before sending, the audio and video data are kept consistent at the transmission starting point.
Fig. 4 is a structural diagram of an audio/video synchronization apparatus in one embodiment. The apparatus in Fig. 4 is applied in the environment of Fig. 1. As shown in Fig. 4, the audio/video synchronization apparatus comprises a decoding module 410, a timestamp acquisition module 420, a time difference computation module 430, an adjustment speed acquisition module 440, and an adjustment module 450. Wherein:
The decoding module 410 is configured to decode the received audio data and video data to obtain the timestamp of the audio data and the timestamp of the video data.
Specifically, the received audio packets and video packets have been encoded. The decoding module 410 decodes the received audio packets to obtain the audio data and its corresponding timestamps, and decodes the received video packets to obtain the video data and its corresponding timestamps. Audio packets may be decoded in timestamp order. After the video packets are decoded, the order of the video frames may be rearranged to match the original capture order.
The timestamp acquisition module 420 is configured to obtain the timestamp of the currently playing video data and the timestamp of the currently playing audio data.
Specifically, while video data is playing, the timestamp of the currently playing video data is obtained and the timestamp of the currently playing audio data is queried.
The time difference computation module 430 is configured to compute the time difference of the current audio/video playback from the timestamp of the currently playing video data and the timestamp of the currently playing audio data.
Specifically, the time difference of the current audio/video playback is obtained by subtracting the timestamp of the currently playing audio data from the timestamp of the currently playing video data.
The adjustment speed acquisition module 440 is configured to obtain the adjustment speed from the time difference of the current audio/video playback.
Specifically, the adjustment speed may be a linear function of the difference between the timestamp of the currently playing video data and the timestamp of the currently playing audio data.
Specifically, the playback rate of the player is determined by the rate at which decoded image data is submitted to the rendering module. That rate depends on a time interval TP; ideally TP equals 1000 / average frame rate, so if the average frame rate is 20 frames per second, TP is 50 milliseconds.
When computing the audio/video playback time difference TC = TV - TA, TA is the timestamp of the currently playing audio data and TV is the timestamp of the currently playing video data. The adjustment speed is a linear function of the difference between the timestamp of the currently playing video data and the timestamp of the currently playing audio data, for example ACC = A + A*TC/1000, where ACC is the adjustment speed and A is a base value that can be sized as needed, for example 5 milliseconds. When TC is small, its influence on ACC is small; when TC is large, its influence on ACC is large.
The interval TR at which frames are actually submitted to the rendering module is computed as TR = TC - ACC - TW, where TW is the time consumed by other processing (frame fetching, frame decoding, and so on).
When ACC is positive, the video data is playing faster than the audio; TR is then smaller than TC - TW, and the video plays slightly above the predetermined frame rate. When ACC is negative, the video data is playing slower than the audio; TR is then larger than TC - TW, and the video plays below the predetermined frame rate. The predetermined frame rate is the preset playback frame rate.
The adjustment module 450 is configured to adjust the video playback speed according to the adjustment speed, or to control the audio to stop playing, so that the audio data and the video data play synchronously.
The adjustment speed is a linear function of the difference between the timestamp of the currently playing video data and the timestamp of the currently playing audio data.
The adjustment module 450 is further configured to reduce the playback rate of the video data if the adjustment speed is positive, to increase the playback rate of the video data if the adjustment speed is negative, and to control the audio to stop playing if the adjustment speed is negative and its absolute value is greater than a preset value, in each case so that the audio data and video data play synchronously.
Specifically, reducing the playback rate of the video data may mean taking the current video playback rate minus the absolute value of the adjustment speed as the new playback rate, and increasing it may mean taking the current video playback rate plus the absolute value of the adjustment speed as the new playback rate.
With the above audio/video synchronization apparatus, the time difference of the current audio/video playback is computed from the timestamps of the currently playing video data and audio data, an adjustment speed is obtained from that time difference, and the video playback speed is adjusted (or the audio paused) accordingly, so that the audio data and video data play synchronously. When the sender's network momentarily stutters, the audio and video played at the receiver do not stutter together, improving the fluency of audio playback. This applies especially to audio/video synchronization in live streaming scenarios where audio and video are captured, encoded, and transmitted separately: even when the sender's network becomes congested and drops packets, synchronization recovers once the network returns to normal, and audio fluency is easier to guarantee.
Fig. 5 is a structural diagram of an audio/video synchronization apparatus in another embodiment, applied in the environment of Fig. 1. As shown in Fig. 5, this apparatus comprises the decoding module 410, timestamp acquisition module 420, time difference computation module 430, adjustment speed acquisition module 440, and adjustment module 450, and further comprises a judgment module 460. Wherein:
The judgment module 460 is configured to judge that the current audio/video playback is synchronized if the absolute value of the time difference of the current audio/video playback is within a preset range, and to indicate that the audio/video playback cannot be synchronized if the time difference of the current audio/video playback is greater than a predetermined upper limit.
The adjustment speed acquisition module 440 is further configured to obtain the adjustment speed from the timestamps of the current audio/video playback when the time difference of the current audio/video playback is less than or equal to the predetermined upper limit.
Specifically, the predetermined upper limit may be set as needed, for example 3 minutes. Exceeding it indicates an abnormality in the capture or transmission logic, in which case synchronization is not appropriate.
The above decoding module 410, timestamp acquisition module 420, time difference computation module 430, adjustment speed acquisition module 440, adjustment module 450, and judgment module 460 may be located at the receiving end.
Fig. 6 is a structural diagram of an audio/video synchronization apparatus in yet another embodiment, applied in the environment of Fig. 1. As shown in Fig. 6, this apparatus comprises the decoding module 410, timestamp acquisition module 420, time difference computation module 430, adjustment speed acquisition module 440, adjustment module 450, and judgment module 460, and further comprises a marking module 401, an audio encoding and packaging module 402, a video encoding and packaging module 403, a sending module 404, and a delay computation module 405.
The marking module 401 is configured to mark the captured audio data and video data with corresponding timestamps. Specifically, the marking module 401 is further configured to mark the first captured video frame with the corresponding system timestamp; for each video frame captured after the first, it computes the average frame rate of the captured video data, increments the video timestamp according to that average frame rate, and obtains the difference between the incremented timestamp and the current system timestamp. If that difference is within the allowed error range, the incremented timestamp is used as the timestamp of the video data; if it is not, the timestamp of the video data is corrected according to the current system timestamp.
Specifically, the average frame rate may be obtained by counting the data frames output by the encoder each second and applying a weighted average, for example B1 = (B0*a + A) >> 3, where B1 is the updated average frame rate, B0 is the average frame rate before the update, a is a preset parameter that can be set as needed (for example 7 or 10), and A is the new sample value, i.e. the frame rate newly measured from the captured video. The initial frame rate may be a user-set value, and the current average frame rate is then updated once per second.
Incrementing the video timestamp according to the average frame rate specifically comprises: computing the time value of each increment from the average frame rate, where the timestamp of the current video frame equals the timestamp of the previous video frame plus this time value. For example, with time value 1000/B, where B is the average frame rate, T1 = T0 + 1000/B, where T1 is the timestamp of the current video frame and T0 is the timestamp of the previous video frame.
In one embodiment, when the difference between the incremented timestamp and the current system timestamp is not within the allowed error range, correcting the timestamp of the video data according to the current system timestamp comprises: the marking module 401 takes half of the difference between the current system timestamp and the incremented timestamp as a correction value, and adds this correction value to the incremented timestamp to obtain the corrected timestamp.
Specifically, the allowed error range can be set as needed, for example the incremented timestamp being less than the current system timestamp minus one third of the time value, or less than the current system timestamp minus one half of the time value, and so on.
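The marking module's timestamping logic can be sketched as follows. The class name is hypothetical; the integer arithmetic mirrors the `>>3` weighted average above, and the symmetric one-third-of-a-frame error band is an assumption based on the example just given:

```python
class TimestampMarker:
    """Sketch of the marking module's video timestamping (hypothetical names)."""

    def __init__(self, first_ts_ms, avg_fps):
        self.ts = first_ts_ms    # timestamp of the previous frame (T0)
        self.avg_fps = avg_fps   # current average frame rate B

    def update_fps(self, new_fps, a=7):
        # Weighted average: B1 = (B0*a + A) >> 3 (a = 7 gives weights 7/8, 1/8).
        self.avg_fps = (self.avg_fps * a + new_fps) >> 3

    def mark(self, system_ts_ms):
        step = 1000 // self.avg_fps   # time value per frame, 1000/B
        ts = self.ts + step           # incremented timestamp: T1 = T0 + 1000/B
        diff = system_ts_ms - ts
        if abs(diff) > step // 3:     # assumed error band: one third of a frame
            ts += diff // 2           # correct by half the difference
        self.ts = ts
        return ts
```

For example, at 20 fps the step is 50 ms; a frame whose system timestamp drifts 100 ms ahead of the incremented value is pulled halfway back toward the system clock rather than snapped to it, smoothing out capture jitter.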
The audio encoding and packaging module 402 is configured to encode the audio data and its corresponding timestamps, and to package the encoded audio data and corresponding timestamps into audio packets. Specifically, the audio data may be encoded in timestamp order, and the subsequent decoding of the audio data is also carried out in timestamp order.
The video encoding and packaging module 403 is configured to encode the video data and its corresponding timestamps, and to package the encoded video data and the corresponding timestamps output after encoding into video packets.
Specifically, the video data is a group of pictures; the video encoder may reorder the encoded image frames, and the video decoder restores the original order after this rearrangement. The timestamp output by the video encoder after encoding may differ from the video data's original timestamp, but the timestamp output by the video decoder after decoding the encoded video data is identical to the original timestamp. That is, whichever encoding and decoding scheme is used, the video timestamp before encoding is identical to the timestamp after decoding.
Video encoding may use delayed encoding or zero-delay encoding.
The sending module 404 is configured to send the video packets and the audio packets.
Specifically, with zero-delay encoding, video data can be sent directly after being encoded and packaged into video packets, and audio data can be sent directly after being encoded and packaged into audio packets.
The delay computation module 405 is configured to compute the delay consumed by video encoding; the sending module 404 is further configured to buffer the audio packets for this delay before sending them.
With delayed encoding, the delay consumed by video encoding needs to be computed, and the audio packets are buffered for that delay before being sent, to ensure that the audio and video data are consistent at the transmission starting point. The video encoding delay varies greatly with the configured encoding parameters, from 0 seconds to tens of seconds. It may be measured as the difference between the time the first encoded frame is received and the time the corresponding raw captured video image is received.
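A minimal sketch of this buffering scheme, assuming a monotonic clock and a hypothetical `send_fn` callback; this is an illustration of the idea, not the patent's implementation:

```python
import collections
import time

class DelayedAudioSender:
    """Hold audio packets for the video encoder's measured delay so that
    audio and video start transmission from a consistent point."""

    def __init__(self, send_fn):
        self.send_fn = send_fn
        self.delay = None                 # encoder delay (seconds), unknown at first
        self.queue = collections.deque()  # (enqueue_time, packet) pairs
        self.first_raw_at = None

    def on_raw_video(self):
        # Remember when the first raw captured image arrived.
        if self.first_raw_at is None:
            self.first_raw_at = time.monotonic()

    def on_first_encoded_frame(self):
        # Delay = time of first coded frame - time of first raw image.
        self.delay = time.monotonic() - self.first_raw_at

    def send_audio(self, packet):
        self.queue.append((time.monotonic(), packet))
        self.flush()

    def flush(self):
        if self.delay is None:
            return  # delay not measured yet: keep buffering
        now = time.monotonic()
        while self.queue and now - self.queue[0][0] >= self.delay:
            self.send_fn(self.queue.popleft()[1])
```

Audio packets are queued until the measured encoding delay has elapsed for each, so a multi-second encoder lookahead does not let the audio stream run ahead of the video stream at the transmission starting point.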
Those of ordinary skill in the art will appreciate that all or part of the flows in the above embodiments can be implemented by a computer program instructing the relevant hardware. The program can be stored in a computer-readable storage medium, for example in the storage medium of a computer system, and executed by at least one processor of that system to realize the flows of the method embodiments above. The storage medium may be a magnetic disk, an optical disc, a read-only memory (ROM), a random access memory (RAM), or the like.
The above embodiments express only several implementations of the present invention, and their description is relatively concrete and detailed, but they are not therefore to be construed as limiting the scope of the claims. It should be pointed out that those of ordinary skill in the art can make variations and improvements without departing from the concept of the invention, all of which fall within the protection scope of the invention. The protection scope of this patent shall therefore be defined by the appended claims.
Claims (12)
1. An audio/video synchronization method, comprising the following steps:
decoding received audio packets and video packets to obtain the timestamp of the audio data and the timestamp of the video data;
obtaining the timestamp of the currently playing video data and the timestamp of the currently playing audio data;
computing the time difference of the current audio/video playback from the timestamp of the currently playing video data and the timestamp of the currently playing audio data;
obtaining an adjustment speed from the time difference of the current audio/video playback;
adjusting the video playback speed according to the adjustment speed, or controlling the audio to stop playing, so that the audio data and the video data play synchronously.
2. The method according to claim 1, characterized in that the method further comprises:
if the absolute value of the time difference of the current audio/video playback is within a preset range, judging that the current audio/video playback is synchronized;
if the absolute value of the time difference of the current audio/video playback is less than or equal to a predetermined upper limit, obtaining the adjustment speed from the timestamps of the current audio/video playback;
if the absolute value of the time difference of the current audio/video playback is greater than the predetermined upper limit, indicating that the audio/video playback cannot be synchronized.
3. The method according to claim 1 or 2, characterized in that the step of adjusting the video playback speed according to the adjustment speed or controlling the audio to stop playing, so that the audio data and video data play synchronously, comprises:
the adjustment speed being a linear function of the difference between the timestamp of the currently playing video data and the timestamp of the currently playing audio data;
if the adjustment speed is positive, reducing the playback rate of the video data so that the audio data and video data play synchronously;
if the adjustment speed is negative, increasing the playback rate of the video data so that the audio data and video data play synchronously;
if the adjustment speed is negative and its absolute value is greater than a preset value, controlling the audio to stop playing so that the audio data and video data play synchronously.
4. The method according to claim 1 or 2, characterized in that the method further comprises:
marking the captured audio data and video data with corresponding timestamps;
encoding the audio data and its corresponding timestamps, and packaging the encoded audio data and corresponding timestamps into audio packets;
encoding the video data and its corresponding timestamps, and packaging the encoded video data and the corresponding timestamps output after encoding into video packets;
sending the video packets and the audio packets.
5. The method according to claim 4, characterized in that the method further comprises:
computing the delay consumed by the video encoding;
buffering the audio packets for the delay before sending them.
6. The method according to claim 4, characterized in that the step of marking the captured audio data and video data with corresponding timestamps comprises:
marking the first captured video frame with the corresponding system timestamp; for each video frame captured after the first, computing the average frame rate of the captured video data, incrementing the video timestamp according to the average frame rate, and obtaining the difference between the incremented timestamp and the current system timestamp;
if the difference between the incremented timestamp and the current system timestamp is within an allowed error range, using the incremented timestamp as the timestamp of the video data;
if the difference between the incremented timestamp and the current system timestamp is not within the allowed error range, correcting the timestamp of the video data according to the current system timestamp.
7. An audio/video synchronization apparatus, characterized by comprising:
a decoding module, configured to decode received audio packets and video packets to obtain the timestamp of the audio data and the timestamp of the video data;
a timestamp acquisition module, configured to obtain the timestamp of the currently playing video data and the timestamp of the currently playing audio data;
a time difference computation module, configured to compute the time difference of the current audio/video playback from the timestamp of the currently playing video data and the timestamp of the currently playing audio data;
an adjustment speed acquisition module, configured to obtain an adjustment speed from the time difference of the current audio/video playback;
an adjustment module, configured to adjust the video playback speed according to the adjustment speed, or to control the audio to stop playing, so that the audio data and the video data play synchronously.
8. The apparatus according to claim 7, characterized in that the apparatus further comprises:
a judgment module, configured to judge that the current audio/video playback is synchronized if the absolute value of the time difference of the current audio/video playback is within a preset range, and to indicate that the audio/video playback cannot be synchronized if the absolute value of the time difference of the current audio/video playback is greater than a predetermined upper limit;
wherein the adjustment speed acquisition module is further configured to obtain the adjustment speed from the timestamps of the current audio/video playback when the absolute value of the time difference of the current audio/video playback is less than or equal to the predetermined upper limit.
9. The apparatus according to claim 7 or 8, characterized in that the adjustment speed is a linear function of the difference between the timestamp of the currently playing video data and the timestamp of the currently playing audio data;
the adjustment module being further configured to reduce the playback rate of the video data if the adjustment speed is positive, to increase the playback rate of the video data if the adjustment speed is negative, and to control the audio to stop playing if the adjustment speed is negative and its absolute value is greater than a preset value, in each case so that the audio data and video data play synchronously.
10. The apparatus according to claim 7 or 8, characterized in that the apparatus further comprises:
a marking module, configured to mark the captured audio data and video data with corresponding timestamps;
an audio encoding and packaging module, configured to encode the audio data and its corresponding timestamps, and to package the encoded audio data and corresponding timestamps into audio packets;
a video encoding and packaging module, configured to encode the video data and its corresponding timestamps, and to package the encoded video data and the corresponding timestamps output after encoding into video packets;
a sending module, configured to send the video packets and the audio packets.
11. The apparatus according to claim 10, characterized in that the apparatus further comprises:
a delay computation module, configured to compute the delay consumed by the video encoding;
wherein the sending module is further configured to buffer the audio packets for the delay before sending them.
12. The apparatus according to claim 10, characterized in that the marking module is further configured to mark the first captured video frame with the corresponding system timestamp; for each video frame captured after the first, to compute the average frame rate of the captured video data, increment the video timestamp according to the average frame rate, and obtain the difference between the incremented timestamp and the current system timestamp;
and, when the difference between the incremented timestamp and the current system timestamp is within the allowed error range, to use the incremented timestamp as the timestamp of the video data;
and, when the difference between the incremented timestamp and the current system timestamp is not within the allowed error range, to correct the timestamp of the video data according to the current system timestamp.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201410808969.1A CN104618786B (en) | 2014-12-22 | 2014-12-22 | Audio and video synchronization method and device |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201410808969.1A CN104618786B (en) | 2014-12-22 | 2014-12-22 | Audio and video synchronization method and device |
Publications (2)
Publication Number | Publication Date |
---|---|
CN104618786A true CN104618786A (en) | 2015-05-13 |
CN104618786B CN104618786B (en) | 2018-01-05 |
Family
ID=53153031
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201410808969.1A Active CN104618786B (en) | 2014-12-22 | 2014-12-22 | Audio and video synchronization method and device |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN104618786B (en) |
Cited By (47)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105049917A (en) * | 2015-07-06 | 2015-11-11 | 深圳Tcl数字技术有限公司 | Method and device for recording an audio and video synchronization timestamp |
CN105187862A (en) * | 2015-08-31 | 2015-12-23 | 北京暴风科技股份有限公司 | Distributed player flow control method and system |
CN105898534A (en) * | 2015-12-10 | 2016-08-24 | 乐视云计算有限公司 | Network video playing method and device |
CN106060610A (en) * | 2016-06-07 | 2016-10-26 | 微鲸科技有限公司 | Player synchronization system and method |
CN106331820A (en) * | 2015-06-29 | 2017-01-11 | 成都鼎桥通信技术有限公司 | Synchronous audio and video processing method and device |
CN106612452A (en) * | 2015-10-22 | 2017-05-03 | 深圳市中兴微电子技术有限公司 | Audio-video synchronizing method and device of set top box |
WO2017071670A1 (en) * | 2015-10-30 | 2017-05-04 | 中兴通讯股份有限公司 | Audio and video synchronization method, device and system |
CN106658133A (en) * | 2016-10-26 | 2017-05-10 | 广州市百果园网络科技有限公司 | Audio and video synchronous playing method and terminal |
CN106686438A (en) * | 2016-12-29 | 2017-05-17 | 北京奇艺世纪科技有限公司 | Cross-device audio/image synchronous playing method, equipment and system |
CN106792070A (en) * | 2016-12-19 | 2017-05-31 | 广东威创视讯科技股份有限公司 | A kind of audio, video data DMA transfer method and device |
CN107404599A (en) * | 2017-07-17 | 2017-11-28 | 歌尔股份有限公司 | Audio, video data synchronous method, apparatus and system |
CN107517400A (en) * | 2016-06-15 | 2017-12-26 | 成都鼎桥通信技术有限公司 | Flow media playing method and DST PLAYER |
CN107517401A (en) * | 2016-06-15 | 2017-12-26 | 成都鼎桥通信技术有限公司 | multimedia data playing method and device |
CN108495177A (en) * | 2018-03-30 | 2018-09-04 | 北京三体云联科技有限公司 | A kind of audio speed changing processing method and processing device |
CN109039994A (en) * | 2017-06-08 | 2018-12-18 | 中国移动通信集团甘肃有限公司 | A kind of method and apparatus calculating the audio and video asynchronous time difference |
CN109167890A (en) * | 2018-08-22 | 2019-01-08 | 青岛海信电器股份有限公司 | A kind of sound draws synchronous method and device and display equipment |
CN109167943A (en) * | 2018-08-01 | 2019-01-08 | 广州长嘉电子有限公司 | COAX interface TV ISDB-T signal processing method and system |
CN109168059A (en) * | 2018-10-17 | 2019-01-08 | 上海赛连信息科技有限公司 | A kind of labial synchronization method playing audio & video respectively on different devices |
CN109218794A (en) * | 2017-06-30 | 2019-01-15 | 全球能源互联网研究院 | Remote job guidance method and system |
CN109257642A (en) * | 2018-10-12 | 2019-01-22 | Oppo广东移动通信有限公司 | Video resource playback method, device, electronic equipment and storage medium |
CN109327724A (en) * | 2017-08-01 | 2019-02-12 | 成都鼎桥通信技术有限公司 | Audio and video synchronization playing method and device |
CN109819315A (en) * | 2019-03-21 | 2019-05-28 | 广州华多网络科技有限公司 | Playback method, device, terminal and the storage medium of HEVC video |
CN110111614A (en) * | 2019-03-14 | 2019-08-09 | 杭州笔声智能科技有限公司 | A kind of method and system that audio-video teaching implementation sound screen is synchronous |
CN110418183A (en) * | 2019-08-05 | 2019-11-05 | 北京字节跳动网络技术有限公司 | Audio and video synchronization method, device, electronic equipment and readable medium |
CN110545447A (en) * | 2019-07-31 | 2019-12-06 | 视联动力信息技术股份有限公司 | Audio and video synchronization method and device |
CN110661760A (en) * | 2018-06-29 | 2020-01-07 | 视联动力信息技术股份有限公司 | Data processing method and device |
CN110856009A (en) * | 2019-11-27 | 2020-02-28 | 广州华多网络科技有限公司 | Network karaoke system, audio and video playing method of network karaoke and related equipment |
CN111010589A (en) * | 2019-12-19 | 2020-04-14 | 腾讯科技(深圳)有限公司 | Live broadcast method, device, equipment and storage medium based on artificial intelligence |
CN111654736A (en) * | 2020-06-10 | 2020-09-11 | 北京百度网讯科技有限公司 | Method and device for determining audio and video synchronization error, electronic equipment and storage medium |
CN112243145A (en) * | 2019-07-18 | 2021-01-19 | 瑞昱半导体股份有限公司 | Audio and video synchronization method and audio and video processing device |
CN112423121A (en) * | 2020-08-11 | 2021-02-26 | 上海幻电信息科技有限公司 | Video test file generation method and device and player test method and device |
CN113490029A (en) * | 2021-06-21 | 2021-10-08 | 深圳Tcl新技术有限公司 | Video playing method, device, equipment and storage medium |
CN114095771A (en) * | 2021-09-06 | 2022-02-25 | 贵阳语玩科技有限公司 | Audio and video synchronization method, storage medium and electronic equipment |
CN114339326A (en) * | 2021-12-10 | 2022-04-12 | 北京拼响天空文化科技有限公司 | Sound and picture synchronization method, device and system based on video playing |
CN114339454A (en) * | 2022-03-11 | 2022-04-12 | 浙江大华技术股份有限公司 | Audio and video synchronization method and device, electronic device and storage medium |
CN114827696A (en) * | 2021-01-29 | 2022-07-29 | 华为技术有限公司 | Method for synchronously playing cross-device audio and video data and electronic device |
CN115052199A (en) * | 2022-06-20 | 2022-09-13 | 蔚来汽车科技(安徽)有限公司 | Vehicle-mounted film watching system and vehicle-mounted film watching method |
CN115102927A (en) * | 2022-04-29 | 2022-09-23 | 厦门立林科技有限公司 | SIP (Session initiation protocol) talkback method, system and storage device for keeping video clear |
CN115834921A (en) * | 2022-11-17 | 2023-03-21 | 北京奇艺世纪科技有限公司 | Video processing method, video processing apparatus, video processing server, storage medium, and program product |
CN115883859A (en) * | 2021-09-29 | 2023-03-31 | 中移(成都)信息通信科技有限公司 | Multimedia data processing method, electronic device and storage medium |
CN115914711A (en) * | 2022-09-28 | 2023-04-04 | 长视科技股份有限公司 | Audio and video playing method, device, equipment, medium and computer program product |
CN116017012A (en) * | 2022-11-28 | 2023-04-25 | 深圳创维-Rgb电子有限公司 | Multi-screen synchronization method, device, display equipment and computer readable storage medium |
CN116437134A (en) * | 2023-06-13 | 2023-07-14 | 中国人民解放军军事科学院系统工程研究院 | Method and device for detecting audio and video synchronicity |
US20230291811A1 (en) * | 2020-08-10 | 2023-09-14 | Beijing Xiaomi Mobile Software Co., Ltd. | Multimodal data transmission method and apparatus, and multimodal data processing method and apparatus |
CN117097936A (en) * | 2023-10-19 | 2023-11-21 | 天迈极光(福建)科技有限公司 | Audio and video synchronization method and device, electronic equipment and storage medium |
CN117478958A (en) * | 2023-09-19 | 2024-01-30 | 广州开得联软件技术有限公司 | Video playing method, device, electronic equipment and storage medium |
WO2024093490A1 (en) * | 2022-10-31 | 2024-05-10 | 抖音视界有限公司 | Method and apparatus for processing audio coding data packet |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN1554188A (en) * | 2001-10-31 | 2004-12-08 | 松下移动通信株式会社 | Time stamp value controller |
CN103414957A (en) * | 2013-07-30 | 2013-11-27 | 广东工业大学 | Method and device for synchronization of audio data and video data |
US20130342632A1 (en) * | 2012-06-25 | 2013-12-26 | Chi-Chung Su | Video conference apparatus and method for audio-video synchronization |
CN103546662A (en) * | 2013-09-23 | 2014-01-29 | 浙江工业大学 | Audio and video synchronizing method in network monitoring system |
- 2014-12-22 CN CN201410808969.1A patent/CN104618786B/en active Active
Cited By (74)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106331820B (en) * | 2015-06-29 | 2020-01-07 | 成都鼎桥通信技术有限公司 | Audio and video synchronization processing method and device |
CN106331820A (en) * | 2015-06-29 | 2017-01-11 | 成都鼎桥通信技术有限公司 | Synchronous audio and video processing method and device |
CN105049917B (en) * | 2015-07-06 | 2018-12-07 | 深圳Tcl数字技术有限公司 | The method and apparatus of recording audio/video synchronized timestamp |
CN105049917A (en) * | 2015-07-06 | 2015-11-11 | 深圳Tcl数字技术有限公司 | Method and device for recording an audio and video synchronization timestamp |
WO2017005066A1 (en) * | 2015-07-06 | 2017-01-12 | 深圳Tcl数字技术有限公司 | Method and apparatus for recording audio and video synchronization timestamp |
CN105187862A (en) * | 2015-08-31 | 2015-12-23 | 北京暴风科技股份有限公司 | Distributed player flow control method and system |
CN105187862B (en) * | 2015-08-31 | 2017-12-15 | 暴风集团股份有限公司 | Distributed player flow control method and system |
CN106612452A (en) * | 2015-10-22 | 2017-05-03 | 深圳市中兴微电子技术有限公司 | Audio-video synchronizing method and device of set top box |
CN106658065A (en) * | 2015-10-30 | 2017-05-10 | 中兴通讯股份有限公司 | Audio and video synchronization method, device and system |
WO2017071670A1 (en) * | 2015-10-30 | 2017-05-04 | 中兴通讯股份有限公司 | Audio and video synchronization method, device and system |
CN106658065B (en) * | 2015-10-30 | 2021-10-22 | 中兴通讯股份有限公司 | Audio and video synchronization method, device and system |
CN105898534A (en) * | 2015-12-10 | 2016-08-24 | 乐视云计算有限公司 | Network video playing method and device |
CN106060610A (en) * | 2016-06-07 | 2016-10-26 | 微鲸科技有限公司 | Player synchronization system and method |
CN107517401A (en) * | 2016-06-15 | 2017-12-26 | 成都鼎桥通信技术有限公司 | Multimedia data playing method and device |
CN107517400A (en) * | 2016-06-15 | 2017-12-26 | 成都鼎桥通信技术有限公司 | Streaming media playing method and streaming media player |
CN107517400B (en) * | 2016-06-15 | 2020-03-24 | 成都鼎桥通信技术有限公司 | Streaming media playing method and streaming media player |
CN106658133B (en) * | 2016-10-26 | 2020-04-14 | 广州市百果园网络科技有限公司 | Audio and video synchronous playing method and terminal |
CN106658133A (en) * | 2016-10-26 | 2017-05-10 | 广州市百果园网络科技有限公司 | Audio and video synchronous playing method and terminal |
CN106792070B (en) * | 2016-12-19 | 2020-06-23 | 广东威创视讯科技股份有限公司 | DMA transmission method and device for audio and video data |
CN106792070A (en) * | 2016-12-19 | 2017-05-31 | 广东威创视讯科技股份有限公司 | DMA transfer method and device for audio and video data |
CN106686438B (en) * | 2016-12-29 | 2019-12-13 | 北京奇艺世纪科技有限公司 | Method, device and system for synchronously playing audio and images across devices |
CN106686438A (en) * | 2016-12-29 | 2017-05-17 | 北京奇艺世纪科技有限公司 | Cross-device audio and image synchronous playing method, device and system |
CN109039994A (en) * | 2017-06-08 | 2018-12-18 | 中国移动通信集团甘肃有限公司 | Method and apparatus for calculating the asynchronous time difference between audio and video |
CN109039994B (en) * | 2017-06-08 | 2020-12-08 | 中国移动通信集团甘肃有限公司 | Method and equipment for calculating asynchronous time difference between audio and video |
CN109218794A (en) * | 2017-06-30 | 2019-01-15 | 全球能源互联网研究院 | Remote job guidance method and system |
CN107404599A (en) * | 2017-07-17 | 2017-11-28 | 歌尔股份有限公司 | Audio, video data synchronous method, apparatus and system |
CN107404599B (en) * | 2017-07-17 | 2020-05-19 | 歌尔股份有限公司 | Audio and video data synchronization method, device and system |
CN109327724A (en) * | 2017-08-01 | 2019-02-12 | 成都鼎桥通信技术有限公司 | Audio and video synchronization playing method and device |
CN109327724B (en) * | 2017-08-01 | 2021-08-31 | 成都鼎桥通信技术有限公司 | Audio and video synchronous playing method and device |
CN108495177B (en) * | 2018-03-30 | 2021-07-13 | 北京世纪好未来教育科技有限公司 | Audio frequency speed change processing method and device |
CN108495177A (en) * | 2018-03-30 | 2018-09-04 | 北京三体云联科技有限公司 | Audio speed change processing method and device |
CN110661760A (en) * | 2018-06-29 | 2020-01-07 | 视联动力信息技术股份有限公司 | Data processing method and device |
CN109167943A (en) * | 2018-08-01 | 2019-01-08 | 广州长嘉电子有限公司 | COAX interface TV ISDB-T signal processing method and system |
CN109167890A (en) * | 2018-08-22 | 2019-01-08 | 青岛海信电器股份有限公司 | Sound and picture synchronization method and device, and display device |
CN109257642A (en) * | 2018-10-12 | 2019-01-22 | Oppo广东移动通信有限公司 | Video resource playback method, device, electronic equipment and storage medium |
CN113286184B (en) * | 2018-10-17 | 2024-01-30 | 上海赛连信息科技有限公司 | Lip synchronization method for respectively playing audio and video on different devices |
CN109168059A (en) * | 2018-10-17 | 2019-01-08 | 上海赛连信息科技有限公司 | Lip synchronization method for respectively playing audio and video on different devices |
CN113286184A (en) * | 2018-10-17 | 2021-08-20 | 上海赛连信息科技有限公司 | Lip sound synchronization method for respectively playing audio and video on different devices |
CN109168059B (en) * | 2018-10-17 | 2021-06-18 | 上海赛连信息科技有限公司 | Lip sound synchronization method for respectively playing audio and video on different devices |
CN110111614A (en) * | 2019-03-14 | 2019-08-09 | 杭州笔声智能科技有限公司 | Method and system for achieving sound and screen synchronization in audio-video teaching |
CN109819315A (en) * | 2019-03-21 | 2019-05-28 | 广州华多网络科技有限公司 | Playback method, device, terminal and storage medium for HEVC video |
CN112243145A (en) * | 2019-07-18 | 2021-01-19 | 瑞昱半导体股份有限公司 | Audio and video synchronization method and audio and video processing device |
CN110545447A (en) * | 2019-07-31 | 2019-12-06 | 视联动力信息技术股份有限公司 | Audio and video synchronization method and device |
CN110418183A (en) * | 2019-08-05 | 2019-11-05 | 北京字节跳动网络技术有限公司 | Audio and video synchronization method, device, electronic equipment and readable medium |
CN110856009B (en) * | 2019-11-27 | 2021-02-26 | 广州华多网络科技有限公司 | Network karaoke system, audio and video playing method of network karaoke and related equipment |
CN110856009A (en) * | 2019-11-27 | 2020-02-28 | 广州华多网络科技有限公司 | Network karaoke system, audio and video playing method of network karaoke and related equipment |
CN111010589B (en) * | 2019-12-19 | 2022-02-25 | 腾讯科技(深圳)有限公司 | Live broadcast method, device, equipment and storage medium based on artificial intelligence |
CN111010589A (en) * | 2019-12-19 | 2020-04-14 | 腾讯科技(深圳)有限公司 | Live broadcast method, device, equipment and storage medium based on artificial intelligence |
CN111654736A (en) * | 2020-06-10 | 2020-09-11 | 北京百度网讯科技有限公司 | Method and device for determining audio and video synchronization error, electronic equipment and storage medium |
US20230291811A1 (en) * | 2020-08-10 | 2023-09-14 | Beijing Xiaomi Mobile Software Co., Ltd. | Multimodal data transmission method and apparatus, and multimodal data processing method and apparatus |
US12003604B2 (en) * | 2020-08-10 | 2024-06-04 | Beijing Xiaomi Mobile Software Co., Ltd. | Multimodal data transmission method and apparatus, and multimodal data processing method and apparatus |
CN112423121A (en) * | 2020-08-11 | 2021-02-26 | 上海幻电信息科技有限公司 | Video test file generation method and device and player test method and device |
CN114827696A (en) * | 2021-01-29 | 2022-07-29 | 华为技术有限公司 | Method for synchronously playing cross-device audio and video data and electronic device |
CN114827696B (en) * | 2021-01-29 | 2023-06-27 | 华为技术有限公司 | Method for synchronously playing cross-device audio and video data, and electronic device |
CN113490029A (en) * | 2021-06-21 | 2021-10-08 | 深圳Tcl新技术有限公司 | Video playing method, device, equipment and storage medium |
CN114095771A (en) * | 2021-09-06 | 2022-02-25 | 贵阳语玩科技有限公司 | Audio and video synchronization method, storage medium and electronic equipment |
CN114095771B (en) * | 2021-09-06 | 2024-04-02 | 贵阳语玩科技有限公司 | Audio and video synchronization method, storage medium and electronic equipment |
CN115883859A (en) * | 2021-09-29 | 2023-03-31 | 中移(成都)信息通信科技有限公司 | Multimedia data processing method, electronic device and storage medium |
CN114339326A (en) * | 2021-12-10 | 2022-04-12 | 北京拼响天空文化科技有限公司 | Sound and picture synchronization method, device and system based on video playing |
CN114339326B (en) * | 2021-12-10 | 2023-06-27 | 北京拼响天空文化科技有限公司 | Sound and picture synchronization method, device and system based on video playing |
CN114339454A (en) * | 2022-03-11 | 2022-04-12 | 浙江大华技术股份有限公司 | Audio and video synchronization method and device, electronic device and storage medium |
CN115102927A (en) * | 2022-04-29 | 2022-09-23 | 厦门立林科技有限公司 | SIP (Session Initiation Protocol) intercom method, system and storage device for keeping video clear |
CN115102927B (en) * | 2022-04-29 | 2023-10-27 | 厦门立林科技有限公司 | SIP intercom method, system and storage device for keeping video clear |
CN115052199A (en) * | 2022-06-20 | 2022-09-13 | 蔚来汽车科技(安徽)有限公司 | Vehicle-mounted film watching system and vehicle-mounted film watching method |
CN115914711A (en) * | 2022-09-28 | 2023-04-04 | 长视科技股份有限公司 | Audio and video playing method, device, equipment, medium and computer program product |
WO2024093490A1 (en) * | 2022-10-31 | 2024-05-10 | 抖音视界有限公司 | Method and apparatus for processing audio coding data packet |
CN115834921A (en) * | 2022-11-17 | 2023-03-21 | 北京奇艺世纪科技有限公司 | Video processing method, video processing apparatus, video processing server, storage medium, and program product |
CN116017012A (en) * | 2022-11-28 | 2023-04-25 | 深圳创维-Rgb电子有限公司 | Multi-screen synchronization method, device, display equipment and computer readable storage medium |
CN116437134B (en) * | 2023-06-13 | 2023-09-22 | 中国人民解放军军事科学院系统工程研究院 | Method and device for detecting audio and video synchronicity |
CN116437134A (en) * | 2023-06-13 | 2023-07-14 | 中国人民解放军军事科学院系统工程研究院 | Method and device for detecting audio and video synchronicity |
CN117478958A (en) * | 2023-09-19 | 2024-01-30 | 广州开得联软件技术有限公司 | Video playing method, device, electronic equipment and storage medium |
CN117478958B (en) * | 2023-09-19 | 2024-05-31 | 广州开得联软件技术有限公司 | Video playing method, device, electronic equipment and storage medium |
CN117097936A (en) * | 2023-10-19 | 2023-11-21 | 天迈极光(福建)科技有限公司 | Audio and video synchronization method and device, electronic equipment and storage medium |
CN117097936B (en) * | 2023-10-19 | 2024-02-06 | 天迈极光(福建)科技有限公司 | Audio and video synchronization method and device, electronic equipment and storage medium |
Also Published As
Publication number | Publication date |
---|---|
CN104618786B (en) | 2018-01-05 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN104618786A (en) | Audio/video synchronization method and device | |
CN106686438B (en) | Method, device and system for synchronously playing audio and images across devices | |
CN101212679B (en) | AV synchronization method and device for switching multi-channel audio streams during playing of AVI files | |
CN105657524A (en) | Seamless video switching method | |
CN107566918A (en) | A kind of low delay under video distribution scene takes the neutrel extraction of root | |
CN108259964B (en) | Video playing rate adjusting method and system | |
CN101710997A (en) | MPEG-2 (Moving Picture Experts Group-2) system based method and system for realizing video and audio synchronization | |
CN101009824A (en) | A network transfer method for audio/video data | |
CN103888813A (en) | Audio and video synchronization realization method and system | |
CN108810656B (en) | Real-time live broadcast TS (transport stream) jitter removal processing method and processing system | |
CN106792154B (en) | Frame skipping synchronization system of video player and control method thereof | |
US11438645B2 (en) | Media information processing method, related device, and computer storage medium | |
CN105791735A (en) | Method and system for dynamically adjusting video call code streams | |
CN104683823A (en) | Multi-screen linked audio and video synchronizing system | |
WO2016008131A1 (en) | Techniques for separately playing audio and video data in local networks | |
CN101449584B (en) | Video processing | |
CN104333795A (en) | Real-time video bitstream play speed control method independent of timestamp | |
CN113225585A (en) | Video definition switching method and device, electronic equipment and storage medium | |
CN109040818B (en) | Audio and video synchronization method, storage medium, electronic equipment and system during live broadcasting | |
CN103929682B (en) | Method and device for setting key frames in video live broadcast system | |
CN105430453A (en) | Media data acquisition method, media terminal and online music teaching system | |
EP2643977A1 (en) | Method and apparatus for processing a video signal | |
CN113207040A (en) | Data processing method, device and system for video remote quick playback | |
CN117255236A (en) | Audio and video synchronization method for digital visual intercom | |
CN105376595A (en) | Video mixing encoding system and method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
GR01 | Patent grant |