CN109963186B - Video and audio synchronization device and video and audio data synchronization method - Google Patents

Video and audio synchronization device and video and audio data synchronization method

Info

Publication number
CN109963186B
Authority
CN
China
Prior art keywords
audio
video
image data
processor
data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201810076267.7A
Other languages
Chinese (zh)
Other versions
CN109963186A (en)
Inventor
陈佑易
杨昀林
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Aten International Co Ltd
Original Assignee
Aten International Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Aten International Co Ltd filed Critical Aten International Co Ltd
Publication of CN109963186A publication Critical patent/CN109963186A/en
Application granted granted Critical
Publication of CN109963186B publication Critical patent/CN109963186B/en

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/4302Content synchronisation processes, e.g. decoder synchronisation
    • H04N21/4307Synchronising the rendering of multiple content streams or additional data on devices, e.g. synchronisation of audio on a mobile phone with the video output on the TV screen

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
  • Television Receiver Circuits (AREA)

Abstract

The invention provides an audio-video synchronization device and an audio-video data synchronization method. The audio-video synchronization device comprises a receiver, a transmitter, and a processor coupled to the receiver and the transmitter. The processor receives a start instruction at a first calibration time and, in response to the start instruction, outputs feature image data to the transmitter. The transmitter transmits the received feature image data to the receiver, so that the processor receives the feature image data at a second calibration time. The processor then adjusts an audio data delay parameter according to the time difference between the first calibration time and the second calibration time.

Description

Video and audio synchronization device and video and audio data synchronization method
Technical Field
The invention relates to a device and a method for processing audio-visual data; more particularly, the present invention relates to an audio-video synchronization device and an audio-video data synchronization method.
Background
An audio-video signal comprises a video signal and an audio signal. Generally, after the audio-video signal is processed by an audio-video device, the video signal is transmitted to a display device to output images, and the audio signal is transmitted to an audio playback device to output sound. During this processing, the output video and audio can fall out of synchronization because of limited hardware resources, differing processing complexity, and other factors. Because video signal processing is typically far more complex than audio signal processing, the video signal takes longer to process, and the output of the audio signal must therefore be delayed to keep the two in step. The solutions provided by the prior art mostly rely on manual setting or require an additional external device, which is inconvenient to operate; manual setting is also prone to human error. The existing video-audio synchronization techniques therefore still need improvement.
Disclosure of Invention
An objective of the present invention is to provide an audio-video synchronization device and an audio-video data synchronization method that alleviate the problem of audio-video data being played out of synchronization.
The video and audio synchronization device comprises a receiver, a transmitter, and a processor. The processor is coupled to the receiver and the transmitter. The processor receives a start instruction at a first calibration time and outputs feature image data to the transmitter in response to the start instruction. The transmitter transmits the received feature image data to the receiver, so that the processor receives the feature image data at a second calibration time. The processor adjusts an audio data delay parameter according to the time difference between the first calibration time and the second calibration time.
In one embodiment, the apparatus further includes a controller coupled to the processor. The controller provides the start command to the processor at the first calibration time, and the processor provides an end command to the controller at the second calibration time. The controller calculates the time difference in response to the end command and informs the processor to adjust the audio data delay parameter.
In one embodiment, the processor includes a feature image generation unit that outputs the feature image data in response to the start instruction.
In one embodiment, the processor includes an image data receiving unit. The feature image data is transmitted through the transmitter and the receiver and is received by the image data receiving unit, and the image data receiving unit outputs the end command to the controller.
In one embodiment, the receiver is an HDMI receiver and the transmitter is an HDMI transmitter.
In one embodiment, the feature image data is one of reduced image data, enlarged image data, split image data, and composite image data.
The video and audio data synchronization method comprises the following steps: receiving a start instruction at a first calibration time; outputting feature image data in response to the start instruction; receiving the feature image data at a second calibration time; calculating the time difference between the first calibration time and the second calibration time; and adjusting an audio data delay parameter according to the time difference.
In one embodiment, before the start command is received at the first calibration time, the method further comprises: coupling an audio-video data output end of the audio-video device to an audio-video data receiving end of the audio-video device.
In one embodiment, the method further comprises: disconnecting the audio-video data output end from the audio-video data receiving end; receiving original audio-video data; separating the original audio-video data into original image data and original sound data; processing the original image data according to an image processing mode; and synchronously outputting the processed original image data and the original sound data according to the adjusted audio data delay parameter.
In one embodiment, the image processing mode is one of a reduced image mode, an enlarged image mode, a split image mode, and a composite image mode.
The audio-video synchronization device and the audio-video data synchronization method described above can thus alleviate the problem of audio-video data being played out of synchronization.
Drawings
Fig. 1 is a schematic diagram of an audio-video synchronization device according to an embodiment of the present invention.
Fig. 2 is a schematic diagram of an audio-video synchronization device according to another embodiment of the invention.
Fig. 3 is a flowchart of an embodiment of the audio-video data synchronization method according to the present invention.
Description of the main element symbols:
10 audio-video synchronization device
110 receiver
120 transmitter
130 processor
132 feature image generating unit
134 image data receiving unit
140 controller
S101, S103, S105, S107, S109, S111, S201, S203, S205, S207, S209
Detailed Description
The invention provides an audio-video synchronization device that detects the audio-video delay by looping its transmitter back to its receiver. The audio-video synchronization device may be any device having a digital video transmission interface; the interface may be, for example, a High Definition Multimedia Interface (HDMI), but is not limited thereto.
Fig. 1 is a schematic diagram of an audio-video synchronization device 10 according to an embodiment of the present invention. As shown in fig. 1, the audio-video synchronization device 10 includes a receiver 110, a transmitter 120, and a processor 130. The processor 130 is coupled to the receiver 110 and the transmitter 120. In one embodiment, the receiver 110 is an HDMI receiver and the transmitter 120 is an HDMI transmitter. The processor 130 receives the start command at the first calibration time and outputs the feature image data to the transmitter 120 in response to the start command. For example, a control module (not shown) in the processor 130 generates the start command and records the first calibration time, and the processor 130 outputs the feature image data to the transmitter 120 according to the start command.
The feature image data serves as reference data for calibration and simulates the time consumed in processing the image data provided by a signal source (not shown). In one embodiment, the feature image data may be reduced image data. In other embodiments, the feature image data may instead be enlarged image data, split image data, or composite image data.
As shown in fig. 1, the transmitter 120 transmits the received feature image data to the receiver 110, so that the processor 130 receives the feature image data at the second calibration time. In other words, the transmitter 120 transmits the feature image data back to the processor 130 via the receiver 110, and the processor 130 measures the second calibration time. For example, the control module in the processor 130 records the first calibration time and then records the second calibration time when the receiver 110 returns the feature image data. The processor 130 adjusts the audio data delay parameter according to the time difference between the first calibration time and the second calibration time. By looping the output feature image data from the transmitter 120 back to the receiver 110, the image delay is detected and video-audio synchronization is achieved, while the inconvenience of repeated manual adjustment is avoided and the quality of use is improved.
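The loopback measurement described above can be summarized in a short sketch. The following C fragment is only an illustration of the idea, not part of the disclosed device: monotonic_ms, hdmi_tx_send, and hdmi_rx_wait are hypothetical stand-ins for the control module's timer, the transmitter 120, and the receiver 110. In this form, one pass directly yields the audio data delay parameter for whatever feature image data was sent.

    #include <stdint.h>

    /* Hypothetical hooks standing in for the control module's timer,
     * the transmitter 120 and the receiver 110; not a real driver API. */
    extern uint64_t monotonic_ms(void);
    extern void     hdmi_tx_send(const void *frame, int size);
    extern void     hdmi_rx_wait(void *frame, int size);

    static uint32_t audio_delay_ms;   /* the audio data delay parameter */

    /* One calibration pass: send the feature image data out through the
     * transmitter, wait for it to return through the receiver, and take the
     * elapsed time as the new audio data delay parameter. */
    void calibrate_audio_delay(const void *feature_image, int size, void *scratch)
    {
        uint64_t t1 = monotonic_ms();        /* first calibration time            */
        hdmi_tx_send(feature_image, size);   /* TX output looped back to RX input */
        hdmi_rx_wait(scratch, size);         /* feature image data comes back     */
        uint64_t t2 = monotonic_ms();        /* second calibration time           */

        audio_delay_ms = (uint32_t)(t2 - t1);   /* time difference -> delay parameter */
    }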
Fig. 2 is a schematic diagram of an audio-video synchronization device 10 according to another embodiment of the present invention. As shown in fig. 2, the audio-video synchronization device 10 includes a receiver 110, a transmitter 120, a processor 130, and a controller 140. The processor 130 is coupled to the transmitter 120 and the receiver 110, and the controller 140 is coupled to the processor 130. In addition, the processor 130 includes a feature image generating unit 132 and an image data receiving unit 134. The controller 140 provides a start instruction to the processor 130 at a first calibration time. The feature image generating unit 132 outputs the feature image data in response to the start instruction. The transmitter 120 transmits the feature image data back to the processor 130 via the receiver 110, so that the processor 130 provides an end command to the controller 140 at a second calibration time. Specifically, the feature image data transmitted through the transmitter 120 and the receiver 110 is received by the image data receiving unit 134 at the second calibration time, and the image data receiving unit 134 then outputs the end command to the controller 140. The controller 140 calculates the time difference between the first calibration time and the second calibration time in response to the end command and notifies the processor 130 to adjust the audio data delay parameter.
For example, the controller 140 records the first calibration time, records the second calibration time when it receives the end command, and then calculates the time difference between the two, thereby obtaining the time consumed by the audio-video synchronization device 10 itself for a given video-data processing requirement; the controller 140 reports the time difference to the processor 130, which adjusts the audio data delay parameter accordingly. As in the previous embodiment, looping the output feature image data from the transmitter 120 back to the receiver 110 detects the image delay, achieves video-audio synchronization, and avoids the inconvenience of repeated manual adjustment. In a preferred embodiment, the feature image data includes several types of image data (reduced, enlarged, split, composite, and so on), so that the different delays produced by image processing of different complexities can all be measured.
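A minimal sketch of the controller-side bookkeeping in this embodiment follows. It assumes a hypothetical command channel between the controller 140 and the processor 130; controller_clock_ms, send_cmd_to_processor, wait_cmd_from_processor, and processor_notify_delay are illustrative names only, not part of the disclosure.

    #include <stdint.h>

    enum cmd { CMD_START = 1, CMD_END = 2 };

    /* Hypothetical command channel between controller 140 and processor 130. */
    extern uint64_t controller_clock_ms(void);
    extern void     send_cmd_to_processor(enum cmd c);    /* issues the start command */
    extern enum cmd wait_cmd_from_processor(void);        /* blocks until a command   */
    extern void     processor_notify_delay(uint32_t ms);  /* adjust audio delay param */

    /* Controller 140: stamp both calibration times and report the difference. */
    void controller_run_calibration(void)
    {
        uint64_t t_start = controller_clock_ms();   /* first calibration time */
        send_cmd_to_processor(CMD_START);           /* feature image goes out via the transmitter */

        while (wait_cmd_from_processor() != CMD_END)
            ;                                       /* end command: image returned via the receiver */
        uint64_t t_end = controller_clock_ms();     /* second calibration time */

        processor_notify_delay((uint32_t)(t_end - t_start));
    }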
Fig. 3 is a flowchart of an embodiment of the audio-video data synchronization method according to the present invention. As shown in fig. 3, the audio-video data synchronization method includes a calibration calculation stage and an original audio-video data processing stage. The calibration calculation stage corresponds to the operation of the audio-video synchronization device 10 of fig. 1 or fig. 2 and includes steps S101, S103, S105, S107, S109, and S111. The original audio-video data processing stage includes steps S201, S203, S205, S207, and S209.
As shown in fig. 3, in step S101: a start command is received at the first calibration time; for example, the processor receives the start command provided by the controller at the first calibration time. In step S103: the feature image data is output in response to the start command; for example, after the processor receives the start command, the feature image generating unit outputs the feature image data. In step S105: the feature image data is received at the second calibration time; for example, the feature image data is transmitted through the transmitter and the receiver and is received by the image data receiving unit of the processor at the second calibration time. In step S107: the time difference between the first calibration time and the second calibration time is calculated, and in step S109: the audio data delay parameter is adjusted according to the time difference. For example, the image data receiving unit outputs an end command to the controller at the second calibration time, and the controller then calculates the time difference in response to the end command and informs the processor to adjust the audio data delay parameter.
In step S111, it is confirmed whether the adjustment of the audio data delay parameter is complete. If the adjustment is complete, the original audio-video data processing stage is entered; otherwise, steps S101 to S109 are repeated. For example, the feature image data includes different types of image data, such as reduced, enlarged, split, or composite image data, so that the different delays produced by image processing of different complexities can be obtained: after the audio-video synchronization device has calculated the calibration time difference with the reduced image data and adjusted the corresponding audio data delay parameter, the process is repeated with the enlarged image data, and so on, until the audio data delay parameters for all types of image data have been adjusted. It should be added that, before the start command is received at the first calibration time, the method may further include coupling the audio-video data output end of the device to the audio-video data receiving end. For example, the transmitter 120 of the audio-video synchronization device 10 shown in fig. 1 is connected to the receiver 110, so that the output path of the transmitter 120 is looped back to the input path of the receiver 110.
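The repetition governed by step S111 amounts to an outer loop over the feature-image types. The sketch below assumes a hypothetical helper calibrate_one_mode that performs steps S101 to S109 for a single type and returns the measured time difference; both the helper and the mode list are illustrative, not the actual implementation.

    #include <stdint.h>

    enum img_mode { MODE_REDUCED, MODE_ENLARGED, MODE_SPLIT, MODE_COMPOSITE, MODE_COUNT };

    /* Hypothetical helper: performs steps S101 to S109 with the feature image
     * data of one type and returns the measured time difference in ms. */
    extern uint32_t calibrate_one_mode(enum img_mode mode);

    static uint32_t audio_delay_ms[MODE_COUNT];   /* one delay parameter per mode */

    /* Steps S101 to S111: repeat the calibration until every mode is covered. */
    void calibrate_all_modes(void)
    {
        for (int m = 0; m < MODE_COUNT; ++m)                          /* S111: all done? */
            audio_delay_ms[m] = calibrate_one_mode((enum img_mode)m); /* S101..S109      */
    }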
As shown in fig. 3, in step S201: the audio-video data output end and the audio-video data receiving end are disconnected once the adjustment is complete; for example, the path connecting the transmitter 120 to the receiver 110 of the audio-video synchronization device 10 shown in fig. 1 is broken. In step S203: the original audio-video data is received. The original audio-video data is provided by a data source device (e.g., a multimedia data providing device such as a computer, a DVD player, or an optical disc drive). In step S205: the original audio-video data is separated into original image data and original sound data. In step S207: the original image data is processed according to an image processing mode; for example, the original image data is processed by the processor. In one embodiment, the image processing mode may be a reduced image mode. In other embodiments, the image processing mode may instead be an enlarged image mode, a split image mode, or a composite image mode.
In step S209: the processed original image data and the original sound data are output synchronously. For example, after the original image data is processed in the reduced image mode, the time delay of the reduced image data is already known from the calibration calculation stage, so the audio data delay parameter can be applied automatically and the processed original image data and the original sound data are output in synchronization. Video-audio synchronization is thereby achieved.
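For the playback stage, the only remaining work is to hold the audio back by the delay calibrated for the active image processing mode. The fragment below is a sketch under that assumption; demux, process_image, output_video, and queue_audio_with_delay are hypothetical placeholders for the device's actual pipeline.

    #include <stddef.h>
    #include <stdint.h>

    enum img_mode { MODE_REDUCED, MODE_ENLARGED, MODE_SPLIT, MODE_COMPOSITE, MODE_COUNT };

    extern uint32_t audio_delay_ms[MODE_COUNT];   /* filled in during the calibration stage */

    /* Hypothetical pipeline hooks for steps S203 to S209. */
    extern void demux(const void *av, void **video, void **audio);       /* S205 */
    extern void process_image(void *video, enum img_mode mode);          /* S207 */
    extern void output_video(const void *video);                         /* S209 */
    extern void queue_audio_with_delay(const void *audio, uint32_t ms);  /* S209 */

    /* Steps S203 to S209: split the original audio-video data, process the image,
     * and release the audio late by the calibrated amount so both arrive together. */
    void play(const void *original_av, enum img_mode mode)
    {
        void *video = NULL, *audio = NULL;

        demux(original_av, &video, &audio);                    /* S205 */
        process_image(video, mode);                            /* S207 */

        output_video(video);                                   /* S209 */
        queue_audio_with_delay(audio, audio_delay_ms[mode]);   /* S209 */
    }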
The present invention has been described in relation to the above embodiments, which are only exemplary of the implementation of the present invention. It must be noted that the disclosed embodiments do not limit the scope of the invention. On the contrary, the intention is to cover all modifications, equivalents, and alternatives falling within the spirit and scope of the invention as defined by the appended claims.

Claims (10)

1. An audio-video synchronization apparatus, comprising:
a receiver;
a transmitter; and
a processor, coupled to the receiver and the transmitter, for receiving a start command at a first calibration time and outputting a feature image data to the transmitter in response to the start command, the transmitter transmitting the received feature image data to the receiver such that the processor receives the feature image data at a second calibration time, the processor adjusting an audio data delay parameter according to a time difference between the first calibration time and the second calibration time;
the output path of the transmitter is connected to the input path of the receiver.
2. The audio-video synchronization apparatus of claim 1, further comprising a controller coupled to the processor, the controller providing the start command to the processor at the first calibration time, the processor providing an end command to the controller at the second calibration time, the controller calculating the time difference in response to the end command and informing the processor to adjust the audio data delay parameter.
3. The audio-video synchronization apparatus of claim 1, wherein the processor comprises a feature image generating unit, the feature image generating unit outputting the feature image data in response to the start command.
4. The audio-video synchronization apparatus of claim 2, wherein the processor comprises an image data receiving unit, the feature image data is transmitted through the transmitter and the receiver and received by the image data receiving unit, and the image data receiving unit outputs the end command to the controller.
5. The audio-video synchronization apparatus of claim 1, wherein the receiver is an HDMI receiver and the transmitter is an HDMI transmitter.
6. The audio-video synchronization apparatus of claim 1, wherein the feature image data is one of reduced image data, enlarged image data, split image data, and composite image data.
7. An audio-video data synchronization method, adapted to an audio-video synchronization device comprising a receiver, a transmitter and a processor, the method comprising the following steps:
the processor receives a start instruction at a first calibration time;
the processor outputs feature image data through the transmitter in response to the start instruction;
the processor receives the feature image data through the receiver at a second calibration time;
the processor calculates a time difference between the first calibration time and the second calibration time; and
the processor adjusts an audio data delay parameter according to the time difference;
the output path of the transmitter is connected to the input path of the receiver.
8. The method of claim 7, further comprising, before the start instruction is received at the first calibration time: coupling an audio-video data output end of the audio-video synchronization device to an audio-video data receiving end of the audio-video synchronization device.
9. The video audio data synchronization method of claim 8, further comprising:
disconnecting the audio-video data output end from the audio-video data receiving end;
receiving original video and audio data;
separating the original audio-video data into an original image data and an original sound data;
processing the original image data according to an image processing mode; and
synchronously outputting the processed original image data and the original sound data according to the adjusted audio data delay parameter.
10. The method of claim 9, wherein the image processing mode is one of a reduced image mode, an enlarged image mode, a split image mode, and a composite image mode.
CN201810076267.7A 2017-12-22 2018-01-26 Video and audio synchronization device and video and audio data synchronization method Active CN109963186B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
TW106145403A TWI672956B (en) 2017-12-22 2017-12-22 Apparatus and method for audio and video synchronization
TW106145403 2017-12-22

Publications (2)

Publication Number Publication Date
CN109963186A CN109963186A (en) 2019-07-02
CN109963186B (en) 2021-12-17

Family

ID=67023104

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810076267.7A Active CN109963186B (en) 2017-12-22 2018-01-26 Video and audio synchronization device and video and audio data synchronization method

Country Status (2)

Country Link
CN (1) CN109963186B (en)
TW (1) TWI672956B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20210028880A (en) 2019-09-05 2021-03-15 삼성전자주식회사 Display apparatus and method for controlling thereof

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004104730A (en) * 2002-09-13 2004-04-02 Hitachi Kokusai Electric Inc Delay time detecting method and av synchronization detecting method
CN1725864A (en) * 2004-07-23 2006-01-25 Lg电子株式会社 Video apparatus and method for controlling the same
CN101171838A (en) * 2005-04-28 2008-04-30 松下电器产业株式会社 Lip-sync correcting device and lip-sync correcting method
CN103188549A (en) * 2011-12-28 2013-07-03 宏碁股份有限公司 Video playing device and operation method thereof
JP2013143706A (en) * 2012-01-12 2013-07-22 Onkyo Corp Video audio processing device and program therefor
CN105704506A (en) * 2016-01-19 2016-06-22 北京流金岁月文化传播股份有限公司 Device and method for synchronizing audio and video coding labial sound
CN106686438A (en) * 2016-12-29 2017-05-17 北京奇艺世纪科技有限公司 Cross-device audio/image synchronous playing method, equipment and system

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO1997003508A1 (en) * 1995-07-13 1997-01-30 Sony Corporation Data transmission method, data transmission apparatus and data transmission system
WO2006008696A1 (en) * 2004-07-15 2006-01-26 Koninklijke Philips Electronics N.V. Measurement system for delay between two signals transmitted via two transmission paths
CN101079992B (en) * 2004-07-30 2010-10-06 联发科技股份有限公司 Simultaneously playing device of video/audio signals and its method
JP4182437B2 (en) * 2004-10-04 2008-11-19 ソニー株式会社 Audio video synchronization system and monitor device
CN101212588A (en) * 2006-12-29 2008-07-02 明基电通股份有限公司 Audio/video playing system and method capable of playing audio and video signals synchronously
TWI496455B (en) * 2013-04-10 2015-08-11 Wistron Corp Audio-video synchronizing device and method thereof
CN106997770B (en) * 2016-01-22 2023-01-03 鼎卓创意科技股份有限公司 Audio-video synchronous control method, audio-video synchronous control system and related electronic device

Also Published As

Publication number Publication date
CN109963186A (en) 2019-07-02
TW201929537A (en) 2019-07-16
TWI672956B (en) 2019-09-21

Similar Documents

Publication Publication Date Title
US8117330B2 (en) Information processing device for relaying streaming data
US8208069B2 (en) Audio processing apparatus, video processing apparatus, and method for controlling the same
US20140376873A1 (en) Video-audio processing device and video-audio processing method
JP2007533189A (en) Video / audio synchronization
JPWO2006025441A1 (en) Image processing apparatus, audio processing apparatus, image / audio supply apparatus, image / audio processing system, and image / audio synchronization method
JP2015130662A (en) Av apparatus and its control method
CN101394527A (en) Reception apparatus and method of controlling image output by reception apparatus
CN109963186B (en) Video and audio synchronization device and video and audio data synchronization method
CN204305260U (en) The television set of a kind of video and wireless sound box Audio Matching
WO2017141977A1 (en) Audio device and control method
KR100688981B1 (en) Media Player, Control Method Thereof And Media Play System Comprising Therof
US20120013721A1 (en) Audio/video reproduction system, hearing aid, and audio/video processing device
CN101282438A (en) Method for synchronization of image and sound as well as video apparatus using the same
US11134178B2 (en) Video signal output apparatus, control method, and recording medium
KR20120074700A (en) Audio apparatus and display apparatus having the same, apparatus and method for compensating lipsync with external device
KR20010018572A (en) Apparatus for synchronizing audio/video signal and method thereof
US20210184726A1 (en) Reception device and communication system
JP5506239B2 (en) Video processing apparatus, control method, and program
TWM568008U (en) Video signal conversion device
TWM568010U (en) Video signal conversion device
CN218450215U (en) HDMI one-in three-out frequency divider with audio Bluetooth function
US11347473B2 (en) Display device
KR100655000B1 (en) Television Receiver and Method for Auto Controlling Signal Syncronization
JP2013143706A (en) Video audio processing device and program therefor
CN112188181B (en) Image display device, stereoscopic image processing circuit and synchronization signal correction method thereof

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant