CN109168059B - Lip sound synchronization method for respectively playing audio and video on different devices

Lip sound synchronization method for respectively playing audio and video on different devices

Info

Publication number
CN109168059B
CN109168059B
Authority
CN
China
Prior art keywords
video
audio
playing device
playing
sending end
Prior art date
Legal status
Active
Application number
CN201811210525.2A
Other languages
Chinese (zh)
Other versions
CN109168059A (en)
Inventor
范圣冲
高新媛
白刚
Current Assignee
Shanghai Sailian Information Technology Co ltd
Original Assignee
Shanghai Sailian Information Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Shanghai Sailian Information Technology Co ltd
Priority to CN202110568007.3A (CN113286184B)
Priority to CN201811210525.2A (CN109168059B)
Publication of CN109168059A
Application granted
Publication of CN109168059B
Legal status: Active

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41 Structure of client; Structure of client peripherals
    • H04N21/4104 Peripherals receiving signals from specially adapted client devices
    • H04N21/4122 Peripherals receiving signals from specially adapted client devices: additional display device, e.g. video projector
    • H04N21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/4302 Content synchronisation processes, e.g. decoder synchronisation
    • H04N21/4307 Synchronising the rendering of multiple content streams or additional data on devices, e.g. synchronisation of audio on a mobile phone with the video output on the TV screen
    • H04N21/436 Interfacing a local distribution network, e.g. communicating with another STB or one or more peripheral devices inside the home
    • H04N21/4363 Adapting the video stream to a specific local network, e.g. a Bluetooth® network
    • H04N21/43632 Adapting the video stream to a specific local network involving a wired protocol, e.g. IEEE 1394
    • H04N21/43635 HDMI
    • H04N21/43637 Adapting the video stream to a specific local network involving a wireless protocol, e.g. Bluetooth, RF or wireless LAN [IEEE 802.11]
    • H04N21/80 Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N21/85 Assembly of content; Generation of multimedia applications
    • H04N21/854 Content authoring
    • H04N21/8547 Content authoring involving timestamps for synchronizing content
    • H04N7/00 Television systems
    • H04N7/14 Systems for two-way working
    • H04N7/15 Conference systems

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Computer Security & Cryptography (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
  • Data Exchanges In Wide-Area Networks (AREA)

Abstract

The invention discloses a lip synchronization method for playing audio and video separately on different devices, the different devices comprising an audio playing device and a video playing device. The method includes sending encoded audio and video from a sending end to the audio playing device and the video playing device, the sending end adding timestamp information to the data packets it sends, and the audio playing device and the video playing device each using a synchronization mechanism to play audio and video information carrying the same timestamp in synchrony. The lip-sync playback mechanism designed by the invention can cope with 99% of network-jitter conditions and ensures that audio and video played on different devices remain synchronized even when network quality is unstable.

Description

Lip sound synchronization method for respectively playing audio and video on different devices
Technical Field
The invention relates to the technical field of multimedia, and in particular to a lip synchronization method for playing audio and video separately on different devices.
Background
In a video conference scenario, each participant terminal receives video and audio streams from the other participants and displays and plays them through a local image display device (such as a display screen) and a local sound playing device (such as a loudspeaker). Because the audio and video streams are received and played by the same device, the corresponding audio and video images are lip-synchronized during playback; that is, the mouth movements of a speaking participant in the video image match the corresponding sound.
Without intervention, however, the instability of the IP network may cause the playout times of the sound and the video to drift apart, so that the user perceives unsynchronized sound and video, i.e. a loss of lip sync.
For example, in some specific scenarios (such as those of Figs. 1-2), audio and video sometimes need to be received and played on different devices. For instance, the video image may be received by a video terminal in a conference room while the audio is received by a wireless microphone and loudspeaker paired with that terminal. Alternatively, the intelligent all-in-one terminal NE60, an audio and video communication device suited to a desktop or a small conference room with a convenient, easy-to-operate touch screen, can be paired with an ME terminal so that sound is played by the audio hardware of the NE60 on the desktop while the video image is output through a television screen connected to the ME terminal. In these scenarios, sound and video are received, played and displayed by different devices. Because two independent devices receive the audio and video streams over an IP network, and because the IP network is unstable, the sound and the video may arrive at the different devices at different times. If each device simply plays the stream it receives as it arrives, audio and video fall out of sync: the displayed mouth movements of a speaking participant no longer match the corresponding sound.
The invention solves the problem of playing audio and video synchronously when sound and video data are transmitted over an IP network to different devices.
Disclosure of Invention
The invention provides a lip synchronization method for playing audio and video separately on different devices, the different devices comprising an audio playing device and a video playing device: a sending end sends encoded audio and video to the audio playing device and the video playing device and adds timestamp information to the data packets it sends, and the audio playing device and the video playing device each use a synchronization mechanism to play audio and video information carrying the same timestamp in synchrony.
Furthermore, the audio playing device and the video playing device each maintain a buffer queue, for sound and for video respectively, and begin playing the buffered data packets through the synchronization mechanism only after a certain number of packets have been buffered.
Furthermore, the method also comprises setting the audio playing device as the master side of the synchronization mechanism: the audio playing device plays sound data from its local buffer queue at a uniform rate and, at the same time, periodically sends synchronization messages to the video playing device, synchronizing to it the capture timestamp of the sound data currently being played; the video playing device controls the length of its video buffer queue according to the received timestamp and paces playback accordingly, so that sound and video remain synchronized.
Further, each time the audio playing device has played the data of one time period, it sends a synchronization message to the video playing device; after receiving the synchronization message, the video playing device returns a confirmation message to the sending end, the confirmation message including the current sending timestamp received from the audio playing device. When the sending end receives the confirmation message, it subtracts the system time recorded at transmission from the current system time at reception, thereby determining the round-trip delay value Δ on the current network.
At each weighting period, the sending end applies weighting processing to the round-trip delay value Δ and sends the result to the audio playing device; the audio playing device adds the currently weighted round-trip delay value Δ to the capture timestamp and sends it to the video playing device, and the video playing device synchronizes video playback according to this timestamp.
Further, the weighting processing of the round-trip delay value Δ comprises dividing the round-trip delay value Δ by 2 to obtain a one-way delay value and applying a filtering algorithm to the one-way delay value.
Further, the filtering algorithm comprises taking a weighted average of the one-way delay values over a period of time to obtain the filtered one-way delay value.
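As an illustration of this filtering step, a minimal sketch is given below (in Python; the window length and the linearly increasing weights are assumptions, since the method only requires a weighted average over a period of time). It turns successive round-trip measurements into a smoothed one-way delay estimate:

    from collections import deque

    class OneWayDelayFilter:
        """Weighted average of recent one-way delay samples (illustrative sketch)."""

        def __init__(self, window: int = 50):
            # Keep only the most recent `window` one-way delay samples, in milliseconds.
            self.samples = deque(maxlen=window)

        def update(self, rtt_ms: float) -> float:
            one_way = rtt_ms / 2.0  # divide the round-trip delay by 2 to get a one-way delay
            self.samples.append(one_way)
            # Newer samples receive larger weights so the estimate tracks recent network state.
            weights = range(1, len(self.samples) + 1)
            return sum(w * s for w, s in zip(weights, self.samples)) / sum(weights)

The value returned by update would serve as the filtered one-way delay, i.e. the delay compensation applied later in the detailed embodiment.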
Preferably or optionally, the method further includes setting the video playing device as the active side of the synchronization mechanism: the video playing device measures the audio-video desynchronization offset, filters it, and compares it with a preset threshold; when the offset exceeds the threshold, the offset value is sent to the audio playing device, which lengthens its buffer queue by a corresponding amount according to the offset.
Without intervention, the instability of the IP network may cause the playout times of sound and video to drift apart, so that the user perceives unsynchronized sound and video, i.e. a loss of lip sync. The lip-sync playback mechanism designed according to this method can cope with 99% of network-jitter conditions and ensures that audio and video on different devices are played synchronously even when the network quality is unstable.
Drawings
The invention will be further understood from the following description in conjunction with the accompanying drawings. The components in the figures are not necessarily to scale, emphasis instead being placed upon illustrating the principles of the embodiments. In the drawings, like reference numerals designate corresponding parts throughout the different views.
FIG. 1 is a schematic diagram of an application scenario of the present invention;
FIG. 2 is a schematic diagram of an application scenario two of the present invention;
FIG. 3 is a schematic diagram of the synchronization mechanism of the present invention in which the audio playing device sends synchronization messages to the video playing device;
FIG. 4 is a schematic diagram of the synchronization mechanism of the present invention in which the video playing device sends a synchronization message to the audio playing device.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is further described in detail below with reference to embodiments thereof; it should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention. Other systems, methods, and/or features of the present embodiments will become apparent to those skilled in the art upon review of the following detailed description. It is intended that all such additional systems, methods, features and advantages be included within this description, be within the scope of the invention, and be protected by the accompanying claims. Additional features of the disclosed embodiments are described in, and will be apparent from, the detailed description that follows.
Example one:
the invention provides a lip sound synchronization method for respectively playing audio and video on different devices, wherein the different devices comprise an audio playing device and a video playing device,
and the audio playing device and the video playing device respectively use a synchronization mechanism to synchronously play audio and video information with the same timestamp.
Furthermore, the audio playing device and the video playing device each maintain a buffer queue, for sound and for video respectively, and begin playing the buffered data packets through the synchronization mechanism only after a certain number of packets have been buffered.
Furthermore, the method also comprises setting the audio playing device as the master side of the synchronization mechanism: the audio playing device plays sound data from its local buffer queue at a uniform rate and, at the same time, periodically sends synchronization messages to the video playing device, synchronizing to it the capture timestamp of the sound data currently being played; the video playing device controls the length of its video buffer queue according to the received timestamp and paces playback accordingly, so that sound and video remain synchronized.
Further, each time the audio playing device has played the data of one time period, it sends a synchronization message to the video playing device; after receiving the synchronization message, the video playing device returns a confirmation message to the sending end, the confirmation message including the current sending timestamp received from the audio playing device. When the sending end receives the confirmation message, it subtracts the system time recorded at transmission from the current system time at reception, thereby determining the round-trip delay value Δ on the current network.
At each weighting period, the sending end applies weighting processing to the round-trip delay value Δ and sends the result to the audio playing device; the audio playing device adds the currently weighted round-trip delay value Δ to the capture timestamp and sends it to the video playing device, and the video playing device synchronizes video playback according to this timestamp.
Further, the weighting processing of the round-trip delay value Δ comprises dividing the round-trip delay value Δ by 2 to obtain a one-way delay value and applying a filtering algorithm to the one-way delay value. The filtering algorithm comprises taking a weighted average of the one-way delay values over a period of time to obtain the filtered one-way delay value.
Of course, this filtering method is only an example; other filtering methods may be substituted for it, but experiments have shown the method recommended here to perform better in this embodiment.
Example two:
the invention provides a lip sound synchronization method for respectively playing audio and video on different devices, wherein the different devices comprise an audio playing device and a video playing device,
and the audio playing device and the video playing device respectively use a synchronization mechanism to synchronously play audio and video information with the same timestamp.
Furthermore, the audio playing device and the video playing device respectively maintain a buffer queue for sound and video, and play the data packets through the synchronization mechanism after a certain number of data packets are buffered.
In this embodiment, the method further includes setting the video playing device as the active side of the synchronization mechanism: the video playing device measures the audio-video desynchronization offset, filters it, and compares it with a preset threshold; when the offset exceeds the threshold, the offset value is sent to the audio playing device, which lengthens its buffer queue by a corresponding amount according to the offset. The filtering method is similar to that of the previous embodiment, or other filtering methods common in the art may be adopted; details are not repeated here.
Example three:
in this embodiment, in order to solve the problem of lip-sound asynchronism, timestamp information needs to be added to the data packets sent by the audio/video code stream at the sending end of the audio/video code stream, that is, each sent audio/video data packet carries timestamp information when the audio/video data packet is collected, and the audio data packet and the video data packet which are collected at the same time Tx carry the same timestamps which are TSx.
At the receiving end, because the audio and the video are received and played by different devices, the two receiving devices each extract the capture timestamps from their own audio or video packets. To cope with playback desynchronization caused by network jitter, the two receiving devices each maintain a buffer queue, one for sound and one for video, whose purpose is to reduce the influence of network jitter by trading a certain amount of delay for smooth, synchronized playback. After receiving its stream, a receiving device does not play it immediately and unconditionally; it first places the packets in the buffer queue and only starts playing them, through the synchronization mechanism, after a certain number of packets have been buffered.
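A minimal sketch of such a receive-side buffer follows (in Python; the pre-buffering threshold of 10 packets is an assumption, the embodiment says only "a certain number of data packets"):

    from collections import deque

    class JitterBuffer:
        """Per-device receive buffer: hold packets until a threshold is reached,
        then release them for playout through the synchronization mechanism."""

        def __init__(self, prebuffer_packets: int = 10):
            self.prebuffer = prebuffer_packets
            self.queue = deque()
            self.started = False

        def push(self, packet) -> None:
            self.queue.append(packet)
            if not self.started and len(self.queue) >= self.prebuffer:
                self.started = True  # only now may the synchronization mechanism start playout

        def pop_for_playout(self):
            # Returns the next packet to play, or None while still pre-buffering or after underrun.
            if self.started and self.queue:
                return self.queue.popleft()
            return None

Each receiving device (audio and video) would hold one such buffer, trading a small fixed delay for smooth, synchronized playback.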
The invention sets the audio playing device A1 as the master of the synchronization mechanism. By default it plays sound data from its local buffer queue at a uniform rate and, at the same time, periodically (every 20 ms) sends a synchronization message to the video playing device V1, synchronizing to V1 the capture timestamp of the sound data currently being played. The video playing device controls the length of its video buffer queue according to the received timestamp and paces playback accordingly, thereby keeping sound and video synchronized (Fig. 3).
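As an illustration of how the video side can act on such a synchronization message, the sketch below releases for display every buffered frame whose capture timestamp does not exceed the audio position (in Python; this release policy is an assumption, the embodiment states only that the video device controls its buffer queue length according to the received timestamp):

    AUDIO_SYNC_PERIOD_MS = 20  # synchronization period used in this embodiment

    def on_sync_message(video_queue, audio_capture_ts: int) -> list:
        """Video-side handling of a sync message from the audio master (sketch).

        video_queue is a deque of buffered video packets ordered by capture_ts.
        Frames at or before the audio position are due for display now; frames
        after it stay buffered until a later synchronization message."""
        ready = []
        while video_queue and video_queue[0].capture_ts <= audio_capture_ts:
            ready.append(video_queue.popleft())
        return ready  # render these; the remainder keeps waiting in the queue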
Since the synchronization messages between A1 and V1 are also transmitted over the IP network, they are subject to a certain systematic delay as well as a random delay caused by network jitter. The following method is therefore used to eliminate the error introduced by the delay of the synchronization messages:
Every time the audio playing device A1 has played 20 ms of data, it sends a synchronization message to the video playing device V1. The packet of the synchronization message carries the local system timestamp LT1 of A1 at the moment this packet is transmitted, together with the original capture timestamp TS1 of the audio data currently being played locally. The video playing device V1 returns an acknowledgement message immediately after receiving the synchronization message; the acknowledgement packet carries the transmission timestamp LT1 received from A1. When the sender of the synchronization message receives the acknowledgement, it subtracts the system time LT1 recorded at transmission from the current system time LT2 at reception, thereby determining the round-trip delay value Δ between A1 and V1 on the current network. The round-trip delay value Δ is divided by 2 to obtain a one-way delay value. As the one-way delay value of each packet is obtained, the values are passed through a filtering algorithm, for example a weighted average of the one-way delay values over a period of time, and the weighted average is used as the delay compensation Δ1. Once the delay value Δ1 is available, the next time A1 sends a synchronization message to V1, the sound capture timestamp in the packet is changed from its original value TS2 to TS2 + Δ1. The video playing device V1 therefore receives the corrected sound timestamp TS2 + Δ1 and synchronizes video playback against it. The transmission delay of an IP network sometimes jitters strongly, so a single measured one-way delay value may change frequently; however, because every synchronization packet within the period contributes to the delay estimate, the filtered delay data achieves a good balance of responsiveness and accuracy.
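To make the message exchange concrete, a hedged sketch of the LT1/LT2 round-trip measurement and of the TS2 + Δ1 correction follows (in Python; the dictionary-based message format and the function names are illustrative assumptions, not taken from the patent):

    import time

    def now_ms() -> int:
        return int(time.monotonic() * 1000)

    # On the audio device A1, once per 20 ms of played audio:
    def make_sync_message(audio_capture_ts: int, delay_compensation_ms: float) -> dict:
        # LT1 = local system time at transmission; TS = capture timestamp of the audio
        # currently being played, corrected by the filtered one-way delay Δ1.
        return {"LT1": now_ms(), "TS": audio_capture_ts + delay_compensation_ms}

    # On the video device V1, immediately upon receiving a sync message:
    def make_ack(sync_msg: dict) -> dict:
        return {"LT1": sync_msg["LT1"]}  # echo the transmission timestamp back unchanged

    # Back at the sender of the sync message, when the acknowledgement arrives:
    def round_trip_delay(ack: dict) -> int:
        return now_ms() - ack["LT1"]     # LT2 - LT1 = round-trip delay Δ

The Δ returned by round_trip_delay would then be halved and filtered, for example with the OneWayDelayFilter sketched earlier, to yield the compensation Δ1 that make_sync_message applies to the next capture timestamp.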
Normally, video image data occupies much more bandwidth than audio data (i.e. video packets are much larger than audio packets), so during transmission the audio data usually travels faster than the image data, is received earlier by the receiving device, and starts playing earlier. In extreme cases, for example when video packets are transmitted too slowly and the delay grows too large, the video device V1 may play out all of the video data buffered in its queue before any new data arrives, leaving the queue completely empty. At that point the one-way synchronization messages sent from the audio playing device A1 can no longer guarantee audio-video synchronization. It is therefore necessary for the video playing device V1 to measure the audio-video desynchronization offset and filter it to remove jitter interference. If the offset is found to be excessive over a longer period of time, the offset value is sent to the audio playing device A1. A1 then lengthens its buffer queue accordingly, which in effect lets more audio packets wait in the buffer queue until the video data has been successfully received, thereby ensuring final audio-video synchronization (Fig. 4).
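As a sketch of this fallback path on the video side (in Python; the 100 ms threshold and the message format are assumptions, the embodiment speaks only of a filtered deviation compared with a preset threshold):

    DESYNC_THRESHOLD_MS = 100  # assumed preset threshold

    def check_reverse_adjustment(filtered_offset_ms: float, send_to_audio_device) -> None:
        """Report a persistent audio-video offset back to the audio device (sketch).

        filtered_offset_ms is the filtered measure of how far video playback lags
        behind audio. When it exceeds the threshold, the offset is sent to the audio
        device, which grows its buffer queue by a matching amount so that audio
        packets wait for the video data to arrive."""
        if filtered_offset_ms > DESYNC_THRESHOLD_MS:
            send_to_audio_device({"type": "grow_buffer", "extra_ms": filtered_offset_ms})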
Although the invention has been described above with reference to various embodiments, it should be understood that many changes and modifications may be made without departing from the scope of the invention. That is, the methods, systems or devices discussed above are examples. Various configurations may omit, substitute, or add various procedures or components as appropriate. For example, in alternative configurations, the methods may be performed in an order different from that described, and/or various stages may be added, omitted, and/or combined. Also, features described with respect to certain configurations may be combined in various other configurations, and different aspects and elements of the configurations may be combined in a similar manner. Furthermore, as technology develops, many of the elements described here are merely examples and do not limit the scope of the disclosure or the claims.
Specific details are given in the description to provide a thorough understanding of the exemplary configurations including implementations. However, configurations may be practiced without these specific details, for example, well-known circuits, processes, algorithms, structures, and techniques have been shown without unnecessary detail in order to avoid obscuring the configurations. This description provides example configurations only, and does not limit the scope, applicability, or configuration of the claims. Rather, the foregoing description of the configurations will provide those skilled in the art with an enabling description for implementing the described techniques. Various changes may be made in the function and arrangement of elements without departing from the spirit or scope of the disclosure.
Further, although the operations may be described as a sequential process, many of them can be performed in parallel or concurrently. In addition, the order of the operations may be rearranged, and a process may include other steps. Furthermore, examples of the methods may be implemented by hardware, software, firmware, middleware, code, hardware description languages, or any combination thereof. When implemented in software, firmware, middleware or code, the program code or code segments that perform the necessary tasks may be stored in a non-transitory computer-readable medium such as a storage medium, and the described tasks are performed by a processor.
It is intended that the foregoing detailed description be regarded as illustrative rather than limiting, and that it be understood that it is the following claims, including all equivalents, that are intended to define the spirit and scope of this invention. The above examples are to be construed as merely illustrative and not limitative of the remainder of the disclosure. After reading the description of the invention, the skilled person can make various changes or modifications to the invention, and these equivalent changes and modifications also fall into the scope of the invention defined by the claims.

Claims (4)

1. Lip synchronization method for playing audio and video separately on different devices, said different devices comprising an audio playing device and a video playing device, characterized in that said method comprises,
sending, by a sending end, encoded audio and video to the audio playing device and the video playing device, the sending end adding timestamp information to the data packets it sends, and the audio playing device and the video playing device each using a synchronization mechanism to synchronously play audio and video information carrying the same timestamp;
the audio playing device and the video playing device each maintaining a buffer queue, for sound and for video respectively, and playing the sound and the video through the synchronization mechanism after a certain number of data packets have been buffered;
the audio playing device can be set as an active side in a synchronization mechanism, uniformly plays sound data from a local buffer queue, simultaneously periodically sends a synchronization message to the video display device according to a time period, synchronizes a collection time stamp of the sound data currently being played to the video display device, and the video display device controls the buffer queue length of the video data to play according to the received time stamp, so that the synchronization of sound and video is ensured;
the audio playing device sends a synchronization message to the video playing device each time it has played the data of one time period, and the video playing device returns a confirmation message to the sending end after receiving the synchronization message, the confirmation message comprising the current sending timestamp received from the audio playing device; when the confirmation message is received by the sending end, the sending end subtracts the system time at transmission from the currently received system time to determine the round-trip delay value Δ on the current network;
the sending end performs weighting processing on the round-trip delay value Δ at each weighting period and then sends it to the audio playing device, the audio playing device adds the currently weighted round-trip delay value Δ to the capture timestamp and then sends it to the video playing device, and the video playing device plays the video synchronously according to this timestamp.
2. The method as claimed in claim 1, wherein the weighting processing of the round-trip delay value Δ comprises dividing the round-trip delay value Δ by 2 to obtain a one-way delay value and applying a filtering algorithm to the one-way delay value.
3. The method of claim 2, wherein the filtering algorithm comprises taking a weighted average of the one-way delay values over a period of time to obtain the filtered one-way delay value.
4. Lip synchronization method for playing audio and video separately on different devices, said different devices comprising an audio playing device and a video playing device, characterized in that said method comprises,
sending, by a sending end, encoded audio and video to the audio playing device and the video playing device, the sending end adding timestamp information to the data packets it sends, and the audio playing device and the video playing device each using a synchronization mechanism to synchronously play audio and video information carrying the same timestamp;
the audio playing device and the video playing device each maintaining a buffer queue, for sound and for video respectively, and playing the sound and the video through the synchronization mechanism after a certain number of data packets have been buffered;
the audio playing device sends a synchronization message to the video playing device each time it has played the data of one time period, and the video playing device returns a confirmation message to the sending end after receiving the synchronization message, the confirmation message comprising the current sending timestamp received from the audio playing device; when the confirmation message is received by the sending end, the sending end subtracts the system time at transmission from the currently received system time to determine the round-trip delay value Δ on the current network;
the sending end performs weighting processing on the round-trip delay value Δ at each weighting period and then sends it to the audio playing device, the audio playing device adds the currently weighted round-trip delay value Δ to the capture timestamp and then sends the timestamp to the video playing device, and the video playing device plays the video synchronously according to this timestamp; the method further comprising setting the video playing device as an active side of the synchronization mechanism, collecting the audio-video desynchronization offset at the video playing device, filtering the offset and comparing it with a preset threshold, sending the offset value to the audio playing device when the offset is greater than the preset threshold, and increasing the buffer queue of the audio playing device by a corresponding length according to the offset value.
CN201811210525.2A 2018-10-17 2018-10-17 Lip sound synchronization method for respectively playing audio and video on different devices Active CN109168059B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202110568007.3A CN113286184B (en) 2018-10-17 2018-10-17 Lip synchronization method for respectively playing audio and video on different devices
CN201811210525.2A CN109168059B (en) 2018-10-17 2018-10-17 Lip sound synchronization method for respectively playing audio and video on different devices

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811210525.2A CN109168059B (en) 2018-10-17 2018-10-17 Lip sound synchronization method for respectively playing audio and video on different devices

Related Child Applications (1)

Application Number Title Priority Date Filing Date
CN202110568007.3A Division CN113286184B (en) 2018-10-17 2018-10-17 Lip synchronization method for respectively playing audio and video on different devices

Publications (2)

Publication Number Publication Date
CN109168059A CN109168059A (en) 2019-01-08
CN109168059B true CN109168059B (en) 2021-06-18

Family

ID=64878546

Family Applications (2)

Application Number Title Priority Date Filing Date
CN202110568007.3A Active CN113286184B (en) 2018-10-17 2018-10-17 Lip synchronization method for respectively playing audio and video on different devices
CN201811210525.2A Active CN109168059B (en) 2018-10-17 2018-10-17 Lip sound synchronization method for respectively playing audio and video on different devices

Family Applications Before (1)

Application Number Title Priority Date Filing Date
CN202110568007.3A Active CN113286184B (en) 2018-10-17 2018-10-17 Lip synchronization method for respectively playing audio and video on different devices

Country Status (1)

Country Link
CN (2) CN113286184B (en)

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109819303B (en) * 2019-03-06 2021-04-23 Oppo广东移动通信有限公司 Data output method and related equipment
US12095582B2 (en) * 2020-02-07 2024-09-17 Microsoft Technology Licensing, Llc Latency compensation for synchronously sharing video content within web conferencing sessions
CN114827696B (en) * 2021-01-29 2023-06-27 华为技术有限公司 Method for synchronously playing audio and video data of cross-equipment and electronic equipment
CN114124631B (en) * 2021-11-15 2023-10-27 四川九洲空管科技有限责任公司 Processing method suitable for audio synchronous control between embedded equipment of aircraft cabin
CN114173208A (en) * 2021-11-30 2022-03-11 广州番禺巨大汽车音响设备有限公司 Audio and video playing control method and device of sound box system based on HDMI (high-definition multimedia interface)
CN114554270A (en) * 2022-02-28 2022-05-27 维沃移动通信有限公司 Audio and video playing method and device
CN114827681B (en) * 2022-04-24 2024-03-22 咪咕视讯科技有限公司 Video synchronization method, device, electronic equipment, terminal equipment and storage medium
US11856275B1 (en) * 2022-10-19 2023-12-26 For Eyes Ug (Haftungsbeschraenkt) Video reproduction system and media reproduction system and method of synchronized reproducing of a video data stream of an audio-visual data stream and computer-readable storage medium

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103024517A (en) * 2012-12-17 2013-04-03 四川九洲电器集团有限责任公司 Method for synchronously playing streaming media audios and videos based on parallel processing
CN104618786A (en) * 2014-12-22 2015-05-13 深圳市腾讯计算机系统有限公司 Audio/video synchronization method and device
CN104853239A (en) * 2015-04-27 2015-08-19 浙江生辉照明有限公司 Audio and video synchronous playback control method and system
CN106658135A (en) * 2016-12-28 2017-05-10 北京奇艺世纪科技有限公司 Audio and video playing method and device
CN106792073A (en) * 2016-12-29 2017-05-31 北京奇艺世纪科技有限公司 Method, playback equipment and system that the audio, video data of striding equipment is synchronously played
CN107801080A (en) * 2017-11-10 2018-03-13 普联技术有限公司 A kind of audio and video synchronization method, device and equipment
CN108377406A (en) * 2018-04-24 2018-08-07 青岛海信电器股份有限公司 A kind of adjustment sound draws the method and device of synchronization

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7843974B2 (en) * 2005-06-30 2010-11-30 Nokia Corporation Audio and video synchronization
CN101212690B (en) * 2006-12-26 2011-04-20 中兴通讯股份有限公司 Method for testing lip synchronization for multimedia audio/video stream
CN100579238C (en) * 2008-02-22 2010-01-06 上海华平信息技术股份有限公司 Synchronous playing method for audio and video buffer
CN103269448A (en) * 2013-05-24 2013-08-28 浙江工商大学 Method for achieving synchronization of audio and video on the basis of RTP/RTCP feedback early-warning algorithm
CN103905880A (en) * 2014-03-13 2014-07-02 北京奇艺世纪科技有限公司 Playing method of audio data and video data, smart television set and mobile equipment
CN103905878A (en) * 2014-03-13 2014-07-02 北京奇艺世纪科技有限公司 Video data and audio data synchronized playing method and device and equipment
TWI548278B (en) * 2014-03-25 2016-09-01 鴻海精密工業股份有限公司 Audio/video synchronization device and audio/video synchronization method
CN104735470B (en) * 2015-02-11 2018-06-19 海信集团有限公司 A kind of streaming media data transmission method and device
US9532099B2 (en) * 2015-03-24 2016-12-27 Intel Corporation Distributed media stream synchronization control
US10015103B2 (en) * 2016-05-12 2018-07-03 Getgo, Inc. Interactivity driven error correction for audio communication in lossy packet-switched networks
CN106791271B (en) * 2016-12-02 2019-08-13 福建星网智慧科技股份有限公司 A kind of audio and video synchronization method

Also Published As

Publication number Publication date
CN113286184B (en) 2024-01-30
CN109168059A (en) 2019-01-08
CN113286184A (en) 2021-08-20

Similar Documents

Publication Publication Date Title
CN109168059B (en) Lip sound synchronization method for respectively playing audio and video on different devices
CN106686438B (en) method, device and system for synchronously playing audio images across equipment
AU2022252735B2 (en) Method and apparatus for synchronizing applications' consumption of remote data
CN112351294A (en) Method and system for frame synchronization among multiple machine positions of cloud director
EP2538689A1 (en) Adaptive media delay matching
US10362173B2 (en) Web real-time communication from an audiovisual file
KR102519381B1 (en) Method and apparatus for synchronously switching audio and video streams
CN112291498B (en) Audio and video data transmission method and device and storage medium
CN109379619A (en) Sound draws synchronous method and device
CN107438990B (en) Method and apparatus for delivering timing information
US10382810B2 (en) Method and device for implementing synchronous playing
CN101137066B (en) Multimedia data flow synchronous control method and device
CN112995720B (en) Audio and video synchronization method and device
JP2015012557A (en) Video audio processor, video audio processing system, video audio synchronization method, and program
CN114095771B (en) Audio and video synchronization method, storage medium and electronic equipment
JP5186094B2 (en) Communication terminal, multimedia playback control method, and program
CN116112720A (en) Ultra-high-definition audio and video synchronization system based on PTP network synchronization
CN113645491A (en) Method for realizing real-time synchronous playing of multiple live broadcast playing ends
JP2011087074A (en) Output controller of remote conversation system, method thereof, and computer executable program
JP2015046708A (en) Communication system, communication method, transmission-side synchronous signal distribution device, transmission-side synchronous control device, reception-side synchronous signal distribution device, reception-side synchronous control device and program
JP7552900B2 (en) Communication system performing synchronous control, synchronous control method thereof, receiving server, and synchronous control program
CN115174978B (en) Sound and picture synchronization method for 3D digital person and electronic equipment
JP2010219783A (en) Communication terminal, communication method, and computer program
JP2010183237A (en) System and method of content synchronous reproduction, reproduction terminal, method of controlling the same, and control program
CN115209199A (en) Media data processing method and device, terminal equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant