WO2008073083A1 - Method and system for subframe accurate synchronization - Google Patents

Method and system for subframe accurate synchronization

Info

Publication number
WO2008073083A1
WO2008073083A1 (PCT/US2006/047337)
Authority
WO
WIPO (PCT)
Prior art keywords
audio
occurrence
common event
mode
event
Prior art date
Application number
PCT/US2006/047337
Other languages
English (en)
French (fr)
Inventor
Ingo Tobias Doser
Ana Belen Benitez
Dong-Qing Zhang
Original Assignee
Thomson Licensing
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Thomson Licensing filed Critical Thomson Licensing
Priority to JP2009541270A (JP5031039B2)
Priority to EP06845263A (EP2095367A1)
Priority to CA2673100A (CA2673100C)
Priority to US12/312,362 (US8483540B2)
Priority to CN2006800565763A (CN101606203B)
Priority to PCT/US2006/047337 (WO2008073083A1)
Publication of WO2008073083A1

Classifications

    • G PHYSICS
    • G11 INFORMATION STORAGE
    • G11B INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B27/00 Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B27/10 Indexing; Addressing; Timing or synchronising; Measuring tape travel
    • G11B27/19 Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier
    • G11B27/28 Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier by using information signals recorded by the same method as the main recording
    • G PHYSICS
    • G11 INFORMATION STORAGE
    • G11B INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B27/00 Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B27/10 Indexing; Addressing; Timing or synchronising; Measuring tape travel
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20 Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23 Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/236 Assembling of a multiplex stream, e.g. transport stream, by combining a video stream with other content or additional data, e.g. inserting a URL [Uniform Resource Locator] into a video stream, multiplexing software data into a video stream; Remultiplexing of multiplex streams; Insertion of stuffing bits into the multiplex stream, e.g. to obtain a constant bit-rate; Assembling of a packetised elementary stream
    • H04N21/2368 Multiplexing of audio and video streams
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/4302 Content synchronisation processes, e.g. decoder synchronisation
    • H04N21/4307 Synchronising the rendering of multiple content streams or additional data on devices, e.g. synchronisation of audio on a mobile phone with the video output on the TV screen
    • H04N21/43072 Synchronising the rendering of multiple content streams or additional data on devices, e.g. synchronisation of audio on a mobile phone with the video output on the TV screen of multiple content streams on the same device
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/433 Content storage operation, e.g. storage operation in response to a pause request, caching operations
    • H04N21/4334 Recording operations
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/434 Disassembling of a multiplex stream, e.g. demultiplexing audio and video streams, extraction of additional data from a video stream; Remultiplexing of multiplex streams; Extraction or processing of SI; Disassembling of packetised elementary stream
    • H04N21/4341 Demultiplexing of audio and video streams
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00 Details of television systems
    • H04N5/76 Television signal recording
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00 Details of colour television systems
    • H04N9/79 Processing of colour television signals in connection with recording
    • H04N9/80 Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback
    • H04N9/802 Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback involving processing of the sound signal

Definitions

  • the present invention generally relates to synchronization of recording modes such as audio and image recording components, and more particularly to, for example, synchronizing clap slates in the movie and television industries with sub-frame accuracy.
  • a method, apparatus and system in accordance with various embodiments of the present invention address these and other deficiencies of the prior art by providing synchronization at a sub-frame accuracy between at least two recording modes.
  • a method for synchronizing two recording modes includes identifying a common event in the two recording modes, determining an occurrence of the common event in time for at least a higher accuracy one of the two recording modes, predicting an occurrence of the common event in a lower accuracy one of the two recording modes by determining a time when the common event occurred between frames in the lower accuracy one, and synchronizing the occurrence of the common event in the higher accuracy one to the lower accuracy one to provide sub-frame accuracy alignment between the two recording modes.
  • a method for synchronizing an audio recording and a video recording includes identifying a common event in the audio recording and the video recording, determining the location of the common event in the audio recording, associating the location of the event in the audio recording with a nearest frame of the occurrence of the event in the video recording, and if the event does not occur during a frame in the video recording, estimating a location between frames for the occurrence of the event, and adjusting the associated location of the audio recording by an amount equal to a difference between the occurrence of the nearest frame and the estimated location for the occurrence of the event.
  • the common event can include the closing of a clap slate and estimating the location between frames for the occurrence of the common event comprises calculating an angular speed of the closing of the clap slate and predicting a time when the clap slate has closed.
  • a system for synchronizing video and audio information in a video production includes a means for determining a nearest frame of the occurrence of the common event in a video mode of the video production, a means for determining the location of the common event in an audio mode of the video production and associating the location of the common event in the audio mode with the nearest frame of the occurrence of the common event in the video mode, a means for estimating the occurrence of the common event in the video mode by determining a location between frames when the common event occurred in the video mode, and a means for synchronizing the audio mode to the video mode.
  • the synchronizing means synchronizes the audio mode to the video mode by adjusting the associated location of the audio mode by an amount equal to a difference between the occurrence of the nearest frame and the estimated location for the occurrence of the event in the video mode.
  • the synchronizing means synchronizes the audio mode to the video mode by adding a correction time to the time of occurrence of the common event in the audio mode to designate a starting point and aligning the starting point to a nearest frame after the occurrence of the common event in the video mode.
  • FIG. 1 depicts two film sequences illustrating clap slates in a plurality of states
  • FIG. 2 depicts a time line illustrating a mismatch or error between an audio clap and a visual clap in a conventional technique
  • FIG. 3 depicts a time line illustrating synchronization between a new start point offset from an audio clap and a visual clap in accordance with an embodiment of the present invention
  • FIG. 4 depicts a high level block diagram of a system for synchronizing two recording modes in accordance with an embodiment of the present invention
  • FIG. 5 depicts a flow diagram of a method for synchronizing two recording modes in accordance with an embodiment of the present invention. It should be understood that the drawings are for purposes of illustrating the concepts of the invention and are not necessarily the only possible configuration for illustrating the invention. To facilitate understanding, identical reference numerals have been used, where possible, to designate identical elements that are common to the figures.
  • the present invention advantageously provides a method, apparatus and system for audio and image synchronization in, for example, movie production applications.
  • the present invention will be described primarily within the context of movie production, the specific embodiments of the present invention should not be treated as limiting the scope of the invention.
  • the concepts of the present invention can be advantageously applied in other synchronization techniques.
  • the concepts of the present invention can be implemented in film splicing, film recording, audio mixing, image mixing and the like.
  • Such concepts may include an indicator that provides an event in at least two modes (e.g., audio and visual modes). The indicator is then recognized in time for at least a higher accuracy mode. Then, the lower accuracy mode has a corresponding time extrapolated to predict the time when the event occurred between frames of the lower accuracy mode. The events in the two modes are then synchronized to provide sub-frame accuracy between the two modes.
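As a rough illustration of this scheme, the following Python sketch shifts an audio-located event onto the event time extrapolated between video frames; the helper name and the example times are assumptions for illustration, not values from the patent:

```python
# Minimal sketch of sub-frame synchronization between two recording
# modes. All values and names are illustrative assumptions, not the
# patent's reference implementation.

def subframe_shift(event_time_audio: float, event_time_video: float) -> float:
    """Shift (in seconds) to apply to the audio track so the event in the
    high-accuracy audio mode lines up with the event's estimated
    between-frame time in the lower-accuracy video mode."""
    return event_time_video - event_time_audio

# Audio analysis locates the clap at 10.5200 s; extrapolating the slate
# angles predicts the slate actually closed at 10.5295 s (between frames).
print(f"shift audio by {subframe_shift(10.5200, 10.5295) * 1000:.1f} ms")  # 9.5 ms
```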
  • processors can be provided through the use of dedicated hardware as well as hardware capable of executing software in association with appropriate software.
  • the functions can be provided by a single dedicated processor, by a single shared processor, or by a plurality of individual processors, some of which can be shared.
  • explicit use of the term "processor”, “module” or “controller” should not be construed to refer exclusively to hardware capable of executing software, and can implicitly include, without limitation, digital signal processor (“DSP”) hardware, read-only memory (“ROM”) for storing software, random access memory (“RAM”), and non-volatile storage.
  • a method, apparatus and system for synchronizing audio and image components in film production are disclosed.
  • the present invention describes a solution for achieving a more accurate synchronization of audio and video.
  • a slate time code is provided with the modality of higher accuracy (currently, audio), which is then aligned with a slate time code of the other modality (currently, video).
  • in FIG. 1, two example slate closing sequences 10 and 20 are illustratively depicted.
  • a first picture 12 shows a slate 30 at a 30 degree open position
  • a second picture 14 shows a slate 30 in a 15 degree open position
  • a third picture 16 shows a closed slate 30.
  • an audio clap occurs just at the exact time the third picture 16 is captured, if a constant angular speed of the clap slate (also referred to herein as a clapper) is assumed. Note that although the slate 30 can be in any position or orientation, the angular speed of the slate in the projected 2D image by the camera remains constant because of the linear perspective projection relationship.
  • a first picture 22 shows a slate 30 in a 50 degree open position
  • a second picture 24 shows a slate 30 with a 15 degree open position
  • a third picture shows a closed slate 30.
  • the audio clap does not occur at the time of the third picture's capture.
  • a time line of events shows sequence 20 along with an audio track 40 to demonstrate a time of occurrence 42 of the audio clap.
  • the slate time code of the video is determined based on visual clues provided in the video sequence.
  • Video as referred to herein relates to images, moving images and/or visual data.
  • the audio slate time code is corrected to appropriately align with the video time code of the closed slate. If the clapper closed in between two motion picture frames, then the audio time code is aligned not with the first visual frame where the slate is closed but with the actual time when the clap occurred. In FIG. 3, the exact moment in time does not have a picture time code associated with it because it happened between two picture frames. Therefore and in accordance with the present invention, the audio slate time code is corrected by determining a new start point 47 which is aligned with the time code (event time) of the first picture frame 49 that shows the closed slate.
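To make the start-point correction concrete, here is a small numeric sketch; all times are hypothetical, since the patent gives no example values:

```python
# Hypothetical numbers for the start-point correction described above.
frame_period = 1.0 / 24.0          # ~41.7 ms between picture frames

audio_clap_time = 10.5200          # clap as located on the audio track (s)
predicted_close = 10.5295          # clap predicted between two frames (s)
first_closed_frame = 10.5417       # time code of first frame showing the closed slate (s)

# New start point 47: offset the audio clap so that it aligns with the
# time code of the first closed-slate frame (49), not with the
# between-frame instant itself.
new_start_point = audio_clap_time + (first_closed_frame - predicted_close)
print(round(new_start_point, 4))   # 10.5322
```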
  • FIG. 4 depicts a high level block diagram of a system for synchronizing two recording modes in accordance with an embodiment of the present invention.
  • the system 100 of FIG. 4 illustratively comprises a stand-alone device that synchronizes two modes used in recording.
  • the system 100 can comprise a part of a mixing device, a recording device, a production device or any other device that needs to synchronize two recording modes.
  • the modes include audio and visual data.
  • the system 100 illustratively comprises a slate angle analysis block 110, a slate closing prediction block 126, a visual slate closing timecode block 120, a slate audio recognition block 134 and a slate angle storage block 122.
  • the slate angle analysis block illustratively comprises a slate image recognition block 112 and a slate angle calculation block 114.
  • the slate closing prediction block illustratively comprises an angular speed calculation block 128 and a closing moment prediction block 130.
  • motion picture data 102 is communicated to the slate angle analysis block 110.
  • motion picture time codes (visual timecode) 104 are communicated to the visual slate closing timecode block 120.
  • audio data 106 with audio time codes 108 are communicated to the slate audio recognition block 134.
  • the motion picture data 102 is received by the slate image recognition mechanism 112 of the slate angle analysis block 110.
  • the slate image recognition mechanism 112 analyzes the picture content 102 and determines a geometric shape that resembles a slate image.
  • the image recognition mechanism 112 can be implemented in software or hardware, or can alternatively be performed manually by a technician.
  • the recognition process can include identifying portions of the clap slate either automatically (using image recognition software) or manually.
  • the geometrical shape resembling the slate determined by the recognition mechanism 112 is further analyzed by a slate angle calculation block 114 to detect the angle of the clapper.
  • the slate angle analysis block 110 can include video recognition software (not shown) configured to identify the clapper and to determine slate angles during different frames.
  • the angle determination can be performed visually, for example, by applying a protractor on an image of the clapper. As such, a determination of a more precise instant (time) of when the clapper was closed can be made.
  • the identification of the clapper in an image or video sequence is easily accomplished because the clapper has distinctive markings and is usually prominently displayed in the video sequence.
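Purely as one possible realization of the automatic path (the text above leaves the recognition method open, down to manual measurement with a protractor), a sketch using OpenCV's probabilistic Hough transform is given below; the thresholds and the "two dominant lines" assumption are not taken from the patent:

```python
# Hypothetical sketch: estimate the clapper opening angle in one frame
# as the angle between the two dominant straight lines (assumed to be
# the slate body and the clapper arm). All thresholds are guesses.
import math
import cv2
import numpy as np

def clapper_angle_degrees(frame_bgr: np.ndarray):
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 50, 150)
    lines = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180, threshold=80,
                            minLineLength=100, maxLineGap=10)
    if lines is None or len(lines) < 2:
        return None
    orientations = []
    for x1, y1, x2, y2 in lines[:2, 0]:
        orientations.append(math.degrees(math.atan2(y2 - y1, x2 - x1)))
    return abs(orientations[0] - orientations[1])   # opening angle estimate
```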
  • One output signal 116 can comprise a "slate closed signal".
  • the slate closed signal 116 is used to identify a first picture frame having the clapper completely closed.
  • the slate closed signal 116 can be implemented as a boolean signal becoming "true” for the time periods for all picture frames with the slate closed and "false” for all other frame periods.
  • a second output signal 118 of the slate analysis block 110 can comprise a "current slate angle” signal.
  • the current slate angle signal 118 identifies the angle of the clapper for a current picture in, for example, degrees or radians.
  • the second output signal 118 is communicated to two subsequent blocks in parallel: the slate closing prediction block 126 and the slate angle storage block 122.
  • the output signal 116 (slate closed signal) is received by the visual slate closing timecode block 120.
  • a first time code during which the slate closed signal 116 becomes "true” is selected as a visual slate time code 136 which is then used for synchronization with audio.
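A minimal sketch of this selection step, assuming the slate closed signal and the frame time codes are available as plain Python sequences:

```python
# Pick the visual slate time code: the time code of the first frame for
# which the slate-closed signal becomes "true", as described above.
def visual_slate_timecode(slate_closed_signal, frame_timecodes):
    for closed, timecode in zip(slate_closed_signal, frame_timecodes):
        if closed:               # first frame with the clapper fully closed
            return timecode
    return None                  # clapper never closed in this sequence
```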
  • the slate angle storage of previous frames block 122 stores one or several previous clap slate angles to permit the slate closing prediction block 126 to make predictions on speed and position of the slate. In this way, angular speed can be calculated in the angular speed calculation block 128 of the slate closing prediction block 126 and a closing moment prediction can be made in the closing moment prediction block 130 of the slate closing prediction block 126.
  • angular velocity and angular acceleration can be considered in predicting the close time of the clapper.
  • a constant angular velocity is assumed.
  • the information stored regarding the slate angle of previous frames can be discarded from the slate angle storage block 122 after the slate closing is determined.
  • the slate closed signal 116 from the slate angle analysis block can be used for indicating when the storage information can be discarded, provided that the signal 116 is delayed by at least one picture frame period before being received by the slate angle storage block 122.
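One plausible way to realize the angle storage and its discard behaviour, sketched under the assumption that a history depth of two frames suffices:

```python
# Sketch of "slate angle storage of previous frames": keep a short
# history of slate angles for the prediction stage, and discard it
# once the slate closing has been determined.
from collections import deque

class SlateAngleStorage:
    def __init__(self, depth: int = 2):
        self.angles = deque(maxlen=depth)    # degrees, most recent last

    def push(self, angle_degrees: float) -> None:
        self.angles.append(angle_degrees)

    def last_two(self):
        """Return (Angle(N-2), Angle(N-1)), or None if history is short."""
        if len(self.angles) < 2:
            return None
        return self.angles[-2], self.angles[-1]

    def discard(self) -> None:   # e.g., on a delayed slate-closed signal
        self.angles.clear()
```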
  • the angular speed calculation block 128 uses the clap slate angle of a previous frame (N-1) and a clap slate angle of a frame before the previous frame (N-2) to determine an angular speed of the clap slate. As such, a constant angular speed is assumed. In one embodiment of the present invention, the angular speed calculation block 128 can determine angular speed according to Equation one (1), which follows:
  • AngularSpeed = [Angle(N-2) - Angle(N-1)] / FramePeriod (1)
  • AngularSpeed depicts the angular speed of the clap slate or clapper in degrees per second
  • Angle(N-1) depicts the angle in degrees of the clapper in the last frame before the clapper closes
  • Angle(N-2) depicts the angle in, for example, degrees of the clapper in the second to last frame before the clapper closes
  • FramePeriod depicts the period of time in seconds between two consecutive video frames (e.g., in the case of motion picture with 24 frames per second, it is 1/24 seconds).
  • the angular speed calculation block 128 can use the absolute value of the last angle (Angle(N-1)) and the calculated AngularSpeed to calculate the expected time of the actual closing of the clapper using Equation two (2), which follows:
  • CloseTime = Angle(N-1) / AngularSpeed (2)
  • the CloseTime is the time between the last clapper open picture frame and the time when the clapper actually closed.
  • One "CloseTime” has to be subtracted from a FramePeriod (e.g., 1/24 sec) to obtain a "CorrectionTime” (time difference information) 138.
  • This value is a positive value by definition because the clapper is closed before or at the time of the first picture frame with the closed clapper and can be characterized according to Equation three (3), which follows:
  • a Corrected Audio Slate Time Code 140 is calculated by adding, using for example an adder 132 or similar offset device, the "CorrectionTime" 138 determined, for example, using Equation (3), to the Audio Slate Time Code 142 from the slate audio recognition block 134.
  • the corrected audio slate timecode 140 synchronizes the audio track to the video track to provide the desired synchronization with sub-frame accuracy.
  • the corrected audio slate time code can be characterized according to Equation four (4), which follows:
  • CorrectedAudioSlateTimeCode = AudioSlateTimeCode + CorrectionTime (4)
  • the audio and video are actually synced to the frame time code that is closest to the clap slate.
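Putting Equations (1) through (4) together, the following snippet evaluates them end to end; the angle and time values are illustrative only, as the patent supplies the formulas but no example numbers:

```python
# Equations (1)-(4) evaluated end to end with illustrative values.
FRAME_PERIOD = 1.0 / 24.0      # seconds per frame for 24 fps material

angle_n2 = 15.0                # clapper angle two frames before closing (deg)
angle_n1 = 5.0                 # clapper angle in the last open frame (deg)

angular_speed = (angle_n2 - angle_n1) / FRAME_PERIOD   # Eq. (1): 240 deg/s
close_time = angle_n1 / angular_speed                  # Eq. (2): ~0.0208 s after the last open frame
correction_time = FRAME_PERIOD - close_time            # Eq. (3): ~0.0208 s, positive by construction

audio_slate_timecode = 10.5200                         # from the slate audio recognition (s)
corrected = audio_slate_timecode + correction_time     # Eq. (4): corrected audio slate time code
```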
  • the clap slate is recognized in the audio data by the audio recognition device 134. That is, the audio recognition device 134 can designate an audio slate time code 142 or instant that the clap slate occurred. This can be performed, for example in one embodiment, by employing an acoustic waveform analysis and selecting the largest (loudest) peak. Since in this case, the audio signal is more accurate (not restricted to the frame rate of the picture images), the audio event is employed as the reference. This reference is compared with the actual clap slate close in the video signal.
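A bare-bones version of this acoustic analysis might simply take the loudest sample as the clap instant; real detectors would look for a sharp transient, and `samples` and `sample_rate` are assumed inputs (e.g., decoded from a WAV file):

```python
# Sketch: locate the clap in the audio track as the largest-magnitude
# sample, and convert the sample index to seconds.
import numpy as np

def audio_slate_timecode(samples: np.ndarray, sample_rate: int) -> float:
    peak_index = int(np.argmax(np.abs(samples)))
    return peak_index / sample_rate    # seconds from the start of the track
```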
  • FIG. 5 depicts a flow diagram of a method for synchronizing two recording modes in accordance with an embodiment of the present invention.
  • the method of FIG. 5 begins at step 202, in which a common event is identified in at least two recording modes.
  • the two recording modes include an audio recording mode and a video recording mode.
  • the identified common event can include the closing of a clapper which provides a visual and an audio event. The method then proceeds to step 204.
  • the identified event (e.g., the clapper visual and audio) is recognized in time in at least the higher accuracy one of the two recording modes.
  • the higher accuracy recording mode includes the audio recording mode and the lower accuracy mode includes the video recording mode. That is, due to the frame rate restrictions imposed on film recording (e.g., 1/24 sec or 1/60 sec frame rate), the video recording mode is less accurate for identifying an event in time in the video.
  • recognizing the event in time for at least a higher accuracy one of the two recording modes includes determining when the clap slate is closed using audio recognition (e.g., acoustic waveform analysis). The method then proceeds to step 206.
  • the event is identified in the lower accuracy recording mode, in the embodiment described above, by determining a time when the event occurred between frames of the video recording mode. This can include calculating an angular speed of the slate closing and predicting a time when the clap slate has closed.
  • the event identification can include performing image recognition or audio recognition. The method then proceeds to step 208.
  • the event in the higher accuracy recording mode and the lower accuracy recording mode are synchronized.
  • the audio event is synchronized to a nearest frame in the video recording.
  • a correction time is determined as described above. The determined correction time is then added to or subtracted from the time of occurrence of the selected nearest frame to identify a point in time of the occurrence of the event in the video recording.
  • the modes are synchronized by adding a correction time to the time of occurrence of the common event in the high accuracy mode (e.g., the audio recording) to designate a starting point and aligning the starting point to a nearest frame after the occurrence of the common event in the lower accuracy mode (e.g., the video recording).
  • the method is then exited.
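As a final sketch of the alignment step, assuming frame times lie on a uniform grid k/fps from the start of the take (the grid-snapping guard is an added assumption, not stated in the patent):

```python
# Designate the starting point and align it to the first frame boundary
# at or after the event, per the synchronization step described above.
import math

def aligned_start(event_time_audio: float, correction_time: float,
                  fps: float = 24.0) -> float:
    frame_period = 1.0 / fps
    start_point = event_time_audio + correction_time
    frame_index = math.ceil(start_point / frame_period - 1e-9)  # snap to grid
    return frame_index * frame_period
```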

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Television Signal Processing For Recording (AREA)
  • Signal Processing For Digital Recording And Reproducing (AREA)
  • Indexing, Searching, Synchronizing, And The Amount Of Synchronization Travel Of Record Carriers (AREA)
PCT/US2006/047337 2006-12-12 2006-12-12 Method and system for subframe accurate synchronization WO2008073083A1 (en)

Priority Applications (6)

Application Number Priority Date Filing Date Title
JP2009541270A 2006-12-12 2006-12-12 Method and system for subframe accurate synchronization
EP06845263A EP2095367A1 (en) 2006-12-12 2006-12-12 Method and system for subframe accurate synchronization
CA2673100A CA2673100C (en) 2006-12-12 2006-12-12 Method and system for subframe accurate synchronization
US12/312,362 US8483540B2 (en) 2006-12-12 2006-12-12 Method and system for subframe accurate synchronization
CN2006800565763A 2006-12-12 2006-12-12 Method and system for subframe accurate synchronization
PCT/US2006/047337 WO2008073083A1 (en) 2006-12-12 2006-12-12 Method and system for subframe accurate synchronization

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/US2006/047337 WO2008073083A1 (en) 2006-12-12 2006-12-12 Method and system for subframe accurate synchronization

Publications (1)

Publication Number Publication Date
WO2008073083A1 (en) 2008-06-19

Family

ID=37891454

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2006/047337 WO2008073083A1 (en) 2006-12-12 2006-12-12 Method and system for subframe accurate synchronization

Country Status (6)

Country Link
US (1) US8483540B2 (en)
EP (1) EP2095367A1 (en)
JP (1) JP5031039B2 (ja)
CN (1) CN101606203B (zh)
CA (1) CA2673100C (en)
WO (1) WO2008073083A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2014053474A1 (en) * 2012-10-01 2014-04-10 Kehlet Korrektur Method and system for organising image recordings and sound recordings

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7929764B2 (en) * 2007-06-15 2011-04-19 Microsoft Corporation Identifying character information in media content
EP2545709A4 (en) * 2010-03-09 2014-04-02 Vijay Sathya SYSTEM, METHOD AND DEVICE FOR DETECTING THE REPRODUCTION OF AN EVENT AND USING THE BEST SUITED EVENT SOUND
US9558405B2 (en) * 2015-01-16 2017-01-31 Analogic Corporation Imaging based instrument event tracking
CN108156350A (zh) * 2018-03-09 2018-06-12 海宁瑞峰祥宇影视制作有限公司 Novel clapperboard

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0993615A (ja) * 1995-09-25 1997-04-04 Nippon Hoso Kyokai <Nhk> Method for measuring the time difference between video and audio
US5877842A (en) * 1997-03-14 1999-03-02 Daily Disc Licensing, Inc. Digital Dailies
JPH10285483A (ja) * 1997-04-03 1998-10-23 Nippon Hoso Kyokai <Nhk> Method and apparatus for measuring the time difference between television video and audio signals
EP1463301A1 (en) * 2003-03-19 2004-09-29 Thomson Licensing S.A. Method for identification of tokens in video sequences
CN100549987C (zh) * 2004-03-16 2009-10-14 无敌科技(西安)有限公司 MP3 playback device with multi-file synchronized playback function and method therefor

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2243969A (en) 1990-05-11 1991-11-13 British Broadcasting Corp Electronic clapperboard for television sound-vision synchronisation
WO1997037489A1 (en) * 1996-03-29 1997-10-09 Maestroworks, Inc. Telecine system
GB2326781A (en) * 1997-05-30 1998-12-30 British Broadcasting Corp Video-audio synchronization
GB2366110A (en) * 2000-06-23 2002-02-27 Ibm Synchronising audio and video.
EP1460835A1 (en) 2003-03-19 2004-09-22 Thomson Licensing S.A. Method for identification of tokens in video sequences
EP1465193A1 (en) * 2003-04-04 2004-10-06 Thomson Licensing S.A. Method for synchronizing audio and video streams
WO2005004470A1 (en) * 2003-07-01 2005-01-13 Lg Electronics Inc. Method and apparatus for testing lip-sync of digital television receiver

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2014053474A1 (en) * 2012-10-01 2014-04-10 Kehlet Korrektur Method and system for organising image recordings and sound recordings
US9612519B2 (en) 2012-10-01 2017-04-04 Praqo As Method and system for organising image recordings and sound recordings

Also Published As

Publication number Publication date
JP2010512711A (ja) 2010-04-22
CN101606203A (zh) 2009-12-16
US20100054696A1 (en) 2010-03-04
EP2095367A1 (en) 2009-09-02
US8483540B2 (en) 2013-07-09
CA2673100C (en) 2015-04-07
JP5031039B2 (ja) 2012-09-19
CA2673100A1 (en) 2008-06-19
CN101606203B (zh) 2012-06-27

Similar Documents

Publication Publication Date Title
US10075758B2 (en) Synchronizing an augmented reality video stream with a displayed video stream
CA2673100C (en) Method and system for subframe accurate synchronization
US9116001B2 (en) Adaptive estimation of frame time stamp latency
US9392322B2 (en) Method of visually synchronizing differing camera feeds with common subject
US10284888B2 (en) Multiple live HLS streams
EP2399240A1 (en) Horizontal gaze estimation for video conferencing
US7519845B2 (en) Software-based audio rendering
US20140029837A1 (en) Inertial sensor aided instant autofocus
KR20110058438A (ko) Presentation recording apparatus and method
JP4606318B2 (ja) Video metadata correction apparatus and program
US7738772B2 (en) Apparatus and method for synchronizing video data and audio data having different predetermined frame lengths
CN110089120B (zh) System and method for synchronized playback of media items on multiple remote devices
US8588310B2 (en) Method and apparatus for managing delivery of bits to a decoder
JP2007288269A (ja) Video receiving and reproducing apparatus
US20100259621A1 (en) Image Processing Apparatus, Image Processing Method and Storage Medium
US20100332888A1 (en) Deriving accurate media position information
CN113794814B (zh) Method, device and storage medium for controlling video image output
CN112860211B (zh) Method, device, terminal and storage medium for determining time delay
KR102040940B1 (ko) Time synchronization apparatus and method
CN114173082B (zh) Device for controlling video image output, camera equipment and conference system
CN115546876B (zh) Pupil tracking method and device
Denzler et al. Probabilistic integration of cues from multiple cameras
JP4368234B2 (ja) Measurement system for competitive swimming and time determination method
JP4046673B2 (ja) Apparatus and method for detecting intentional camera-movement scenes, and recording medium storing an intentional camera-movement scene detection program
JP2003052056A (ja) Video timing difference detection method, video correction method, video timing difference detection apparatus, video correction apparatus, and recording/reproducing apparatus and television receiver using them

Legal Events

Date Code Title Description
WWE Wipo information: entry into national phase

Ref document number: 200680056576.3

Country of ref document: CN

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 06845263

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 2500/DELNP/2009

Country of ref document: IN

WWE Wipo information: entry into national phase

Ref document number: 12312362

Country of ref document: US

WWE Wipo information: entry into national phase

Ref document number: 2673100

Country of ref document: CA

ENP Entry into the national phase

Ref document number: 2009541270

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

WWE Wipo information: entry into national phase

Ref document number: 2006845263

Country of ref document: EP