WO2004112391A1 - Recording apparatus for video data and audio data - Google Patents
Recording apparatus for video data and audio data
- Publication number
- WO2004112391A1 (PCT/JP2004/008053)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- audio
- time
- frame
- data
- video data
- Prior art date
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/04—Synchronising
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/76—Television signal recording
- H04N5/91—Television signal processing therefor
- H04N5/92—Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback
- H04N5/926—Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback by pulse code modulation
- H04N5/9265—Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback by pulse code modulation with processing of the sound signal
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/76—Television signal recording
- H04N5/91—Television signal processing therefor
- H04N5/92—Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/76—Television signal recording
- H04N5/78—Television signal recording using magnetic recording
- H04N5/782—Television signal recording using magnetic recording on tape
- H04N5/783—Adaptations for reproducing at a rate different from the recording rate
Definitions
- The present invention relates to an audio/video synchronization processing device, an audio/video synchronization processing method, and an audio/video recording device for synchronizing video data and audio data.
- In an AV (audio/video) recording device, such as one with an MPEG encoder at its input, the frame length (frame period) of the input video data and that of the input audio data generally differ.
- The fetching of audio data and video data is performed in frame units.
- FIG. 15 is a system configuration diagram of a conventional AV recording apparatus.
- This system includes a data control unit 2a receiving a control instruction from a host (HOST) 1a and a system encoder 3a.
- The data control unit 2a receives a control instruction from the host 1a at the audio/video control unit (AV-CTRL) 21a and, based on the time information from the timer (TIMER) 24a, controls the audio control unit 22a and the video control unit 26a.
- Hereinafter, the audio/video control unit is referred to as the AV control unit.
- The AV control unit 21a performs input control of the audio data (A-DATA) by issuing a control instruction to the audio control unit (A-CTRL) 22a.
- The input audio data is stored in the audio data memory (A-MEM) 23a.
- Likewise, the AV control unit 21a performs input control of the video data (V-DATA) by issuing a control instruction to the video control unit (V-CTRL) 26a.
- The input video data is stored in the video data memory (V-MEM) 25a.
- Based on the time information from the timer 24a, the data control unit 2a provides the system encoder 3a with audio data (A-PTS) and video data (V-PTS) to which a PTS (Presentation Time Stamp) has been added as time information.
- the system encoder 3a is controlled by a control instruction from the host 1a.
- The audio encoder (A-ENC) 31a encodes the audio data to which the PTS has been added by the data control unit 2a.
- The video encoder (V-ENC) 33a encodes the video data to which the PTS has been added by the data control unit 2a.
- the multiplexer (MPX) 32a multiplexes the data encoded by the audio encoder 31a and the video encoder 33a to generate a bit stream (BSD).
- AV recording devices including MPEG encoders often cannot change the frame period of video data and audio data due to hardware limitations.
- As a result, when a pause is requested and later released, a deviation of the audio data from the video data (an AV synchronization deviation) occurs.
- FIG. 16 is a diagram illustrating an example of the AV synchronization deviation when the pause and the pause release control are performed.
- Data capture control can be performed only on a frame basis, and even during a pause, the frame periods of the video data and the audio data (video_frame_time, audio_frame_time) cannot be changed.
- When a pause request (indicated by "P" in the figure) is received from the host 1a,
- the pause request is reflected in the data control unit 2a at the next frame break of the video data, at time t161.
- At time t161, the audio data is in the middle of its frame period, so the pause request is reflected only at the next audio frame break; as a result, a difference tp161 between the video data and the audio data is generated at the time of the pause.
- During the pause, the frame period of the video data and the frame period of the audio data remain unchanged, so tp161, the difference between the video data and the audio data at the time of the pause, persists uncorrected.
- When a pause release request (indicated as "P-RL" in the figure) is received from the host 1a, the data control unit 2a reflects it only when the input of video data n (VDn) is started, that is, at time t162.
- Thus, at the pause, a deviation tp161 in AV synchronization has already occurred.
- In addition, the difference tp162 from the pause release time t162 to the input start time of audio data n is generated as a difference between the audio data and the video data at the time of release; as a result, an AV synchronization deviation tp163, the sum of tp161 and tp162, exists after the pause release.
- This tp163 accumulates with each pause request and may eventually be perceived as an uncomfortable loss of synchronization.
Disclosure of the invention
- An object of the present invention is to provide an AV synchronization processing apparatus and method that do not cause an AV synchronization deviation in an AV recording apparatus in which the frame lengths of video data and audio data differ and cannot be changed from their initial values.
- The present invention has been made in consideration of the above problems. A first aspect of the present invention is an audio/video synchronization processing device that performs synchronization processing on video data and audio data each having a different predetermined frame length, and that comprises timer means;
- Storage means for storing a start time, a pause request time, and a pause release request time of each frame of the video data and the audio data measured by the timer means;
- Control means for deciding, based on the stored times, whether to delay the video data, delay the audio data, or delay neither.
- The audio delay time, which is the delay time of the audio data frame, is calculated with reference to the breaks of the video data frames.
- Based on these times, the control means decides whether to delay the video data or the audio data in frame units after the pause release request, or to delay neither.
- With this configuration, the delay time (audio delay time) of the audio data with respect to the video data at the time of a pause request is obtained, and the shift between the video data frames and the audio data frames during the pause is measured.
- Accordingly, even when pause release requests are issued repeatedly, the deviation of the audio data with respect to the video data is suppressed to less than one audio data frame; since the playback timing of the audio data is adjusted, the AV synchronization deviation can be greatly suppressed.
- a second aspect of the present invention is an audio / video recording apparatus that generates multiplexed data including video data and audio data having different predetermined frame lengths, and includes timer means,
- Storage means for storing a start time, a pause request time, and a pause release request time of each frame of the video data and audio data measured by the timer means;
- Synchronization control means for synchronizing the audio data after the pause release request on a frame basis, based on the start time of each frame of the video data and audio data, the pause request time, and the pause release request time; and
- Multiplexed data generating means for generating the multiplexed data by adding time information to the video data and the audio data synchronized by the synchronization control means.
- With this configuration, the delay time (audio delay time) of the audio data with respect to the video data at the time of a pause request is obtained, and the shift between the video data frames and the audio data frames during the subsequent pause is measured.
- Then, the playback timing of the audio data after the pause is released is adjusted so that the deviation of the audio data from the video data is kept within one audio data frame even when pause release requests are received; this makes it possible to generate multiplexed data with a significantly reduced AV synchronization deviation.
- FIG. 1 is a diagram showing the system configuration of an AV recording apparatus as one embodiment of the present invention.
- FIG. 2 is a flowchart showing processing when the AV control unit 21 receives a START request from the host 1.
- FIG. 3 is a timing chart for explaining the video PTS (V-PTS) and the audio PTS (A-PTS) generated according to the start of data input.
- FIG. 4 is a flowchart showing the process of adding a PTS when the data control unit 2 provides audio data to the system encoder 3.
- FIG. 5 is a flowchart showing a process of adding a PTS when the data control unit 2 provides video data to the system encoder 3.
- FIG. 6 is a flowchart showing the process performed by the AV control unit 21 based on a pause request from the host 1.
- FIG. 7 is a timing chart showing processing for a pause request.
- FIG. 8 is a flowchart showing the processing performed after the pause request processing from the host (processing during the pause).
- FIG. 9 is a diagram illustrating a method of measuring the frame shift time (f_count).
- FIG. 10 is a flowchart illustrating the process performed by the AV control unit 21 when a pause release request is issued from the host 1.
- FIG. 11 is a timing chart illustrating the method of calculating the audio correction time (a_diff) when the delay time is being measured at the pause release.
- FIG. 12 is a timing chart illustrating the method of calculating the audio correction time (a_diff) when the delay time is not being measured at the pause release.
- FIG. 13 is a diagram for explaining the process of eliminating the AV synchronization deviation by delaying the resumption of the video data input by one frame.
- FIG. 14 is a diagram for explaining the process of eliminating the AV synchronization deviation by delaying the resumption of the audio data input by one frame.
- FIG. 15 is a diagram showing the system configuration of a conventional AV recording apparatus.
- FIG. 16 is a timing chart showing the pause and pause release processing of the conventional AV recording apparatus.
- FIG. 1 shows an AV recording device which is an embodiment of the audio-video synchronization processing device according to the present invention.
- The AV recording apparatus shown in FIG. 1 has the same system configuration as the conventional AV recording apparatus shown in FIG. 15, but differs in the control performed by the AV control unit 21.
- The processing of the AV control unit (AV-CTRL) 21 is described below in the following order: processing based on a START request from the host (HOST) 1, processing at regular times, processing based on a pause request from the host 1, processing during a pause, processing based on a pause release request from the host 1, and processing for eliminating the AV synchronization deviation caused by the pause and pause release requests.
- FIG. 2 is a flowchart showing processing when the AV control unit 21 receives a START request from the host 1.
- The AV control unit 21 acquires time information from the timer 24 and stores it as STC_offset in a memory (not shown).
- the timer (TIMER) 24 is, for example, a timer that operates with a clock of 90 kHz.
- FIG. 2 shows the processing flow for the START request from the host in the data control unit 2.
- The AV control unit 21 waits for a frame break in the video data and, on detecting one (ST21), obtains time information from the timer 24 and holds it as STC_offset (ST22).
- FIG. 3 is a timing chart for explaining the video PTS (V-PTS) and the audio PTS (A-PTS) generated according to the start of data input.
- Upon receiving a START request from the host 1, the AV control unit 21 of the data control unit 2 starts inputting video data and audio data on a video frame basis, acquires the start time t31 from the timer 24, and holds it as STC_offset.
- Thereafter, the current time is obtained from the timer 24 at each frame break of the video data and the audio data, and the value obtained by subtracting the STC_offset (t31) held at START is output to the system encoder 3 as the PTS.
- At a break of a video data frame, the time t32 is acquired from the timer 24, and the system encoder 3 is notified of the PTS of the video data together with the video data.
- Likewise, at a break of an audio data frame, the time t33 is acquired from the timer 24, and the PTS of the audio data is notified to the system encoder 3 together with the audio data.
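The PTS generation described above can be sketched as follows. The 90 kHz clock matches the timer example in the text; the tick values and variable names are illustrative assumptions, not taken from the document.

```python
# Minimal sketch of the PTS generation described above: at each frame break
# the current timer value is read and STC_offset (held at START) is
# subtracted to form the Presentation Time Stamp.
TIMER_HZ = 90_000  # the document's example timer clock rate

def make_pts(current_time: int, stc_offset: int) -> int:
    # PTS = current timer value minus the STC_offset captured at START (t31)
    return current_time - stc_offset

stc_offset = 1_000                         # hypothetical t31 in timer ticks
video_pts = make_pts(4_003, stc_offset)    # hypothetical t32 (video break)
audio_pts = make_pts(3_880, stc_offset)    # hypothetical t33 (audio break)
print(video_pts, audio_pts)                # 3003 2880
```

Because both streams subtract the same STC_offset, PTS values from the video and audio paths share one time base, which is what lets the system encoder multiplex them in sync.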
- FIG. 4 is a flowchart showing a process of adding a PTS when the data control unit 2 provides audio data to the system encoder 3.
- The above processing is performed for each audio input frame during regular processing.
- FIG. 5 is a flowchart showing a process of adding a PTS when the data control unit 2 supplies video data to the system encoder 3.
- the AV control unit 21 detects a break in the frame of the video data (ST51)
- The AV control unit 21 then acquires and saves time information from the timer 24 (ST52).
- A video PTS is generated from the STC_offset stored at the time of START and the acquired time information (ST53).
- the video encoder 33 of the system encoder 3 is notified of the information obtained by adding the PTS information to the video frame data (ST54). The above processing is performed for each video input frame during regular processing.
- FIG. 6 is a flowchart showing processing performed by the AV control unit 21 based on a pause request from the host 1.
- The time information that the AV control unit 21 acquires from the timer 24 when a pause request is received from the host 1 is stored as pause_STC_offset.
- Upon receiving a pause request from the host, the AV control unit 21 waits for a frame break in the video data; on detecting one (ST61), it acquires the time information pause_STC_offset from the timer 24 (ST62). It then instructs the video control unit 26 to stop inputting video data (ST63) and, based on the time information from the timer 24, starts measuring the difference between the audio data and the video data (ST64).
- Next, the control waits for a break in the audio data frame.
- When a break in the audio frame is detected (ST65), the measurement of the time lag between the audio data and the video data is completed based on the time information from the timer 24 (ST66).
- The measured lag is stored as the audio delay time (a_delay) (ST67).
- Finally, the audio control unit is instructed to stop inputting audio data (ST68), which completes the processing for the pause request from the host 1.
- Fig. 7 is a timing chart showing the processing for the pause request shown in Fig. 6.
- Upon receiving a pause request from the host 1, the AV control unit 21 stops inputting video data at a video data frame boundary. The time t71 obtained from the timer 24 at that point is stored as pause_STC_offset. Then, when the next break in an audio data frame is detected after t71, the time t72 is acquired from the timer 24.
- The difference between time t72 and time t71 is stored as a_delay, and the input of audio data is temporarily stopped.
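The a_delay bookkeeping at a pause request can be sketched as follows; the 90 kHz tick values are assumptions chosen only to illustrate the arithmetic.

```python
def audio_delay_at_pause(t71: int, t72: int) -> int:
    # a_delay: time from the video-frame-aligned pause point (t71) to the
    # next audio frame break (t72), where audio input is actually stopped.
    return t72 - t71

# Illustrative 90 kHz tick values, not taken from the document:
t71 = 9_009    # pause request reflected at a video frame break
t72 = 10_080   # next audio frame break after t71
print(audio_delay_at_pause(t71, t72))  # 1071
```

Since audio input can only stop at an audio frame boundary, a_delay is always between zero and one audio frame; it is this per-pause residue that the later correction logic consumes.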
- During the pause, the frame shift time (f_count), which is the shift time between the frames of the audio data and the video data, is measured as described below.
- In FIG. 8, it is first determined whether a pause is currently in effect (ST81). If so, the control waits for a break between frames of the audio data (ST82), then obtains and saves time information from the timer 24 and starts measuring the frame shift time between the audio data and the video data (ST83). Next, it waits for a frame break in the video data; when one is detected (ST84), it obtains and saves the time information from the timer 24 and ends the measurement of the time lag between the audio data and the video data (ST85).
- The frame shift time (f_count) is computed from the measurement start time of ST83 and the measurement end time of ST85, and is written to memory (ST86).
- The above processing is repeated during the pause, and the measurement of the frame shift time (f_count) is continued.
- Because the memory in the AV control unit 21 is overwritten, f_count always indicates the latest time lag between the audio data and the video data during the pause.
- The reason f_count is constantly updated is that it is not possible to predict when a pause release request will be made, so the unit must always be prepared for one.
- FIG. 9 illustrates the method of measuring the frame shift time (f_count) described with the flowchart of FIG. 8.
- Upon detecting a break in the audio data, the AV control unit 21 obtains the time information t91 from the timer 24 and starts measuring the time difference between the audio data and the video data.
- Upon detecting the next break in the video data, the AV control unit 21 obtains the time information t92 from the timer 24; the time lag between the audio data and the video data is (t92 - t91).
- This measured time difference between the audio data and the video data is the frame shift time (f_count).
- This control is repeated, driven by the audio data, during the pause, and the latest frame shift time (f_count) is always stored.
- In the example of FIG. 9, the latest value of the frame shift time (f_count) is the difference (t96 - t95) between time t95 and time t96.
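The repeated f_count measurement of FIGS. 8 and 9 can be sketched as follows: each audio frame break starts a measurement that ends at the next video frame break, and only the latest value is kept. The break times and frame durations are illustrative assumptions.

```python
def latest_f_count(audio_breaks, video_breaks):
    """Keep only the most recent audio-break -> next-video-break interval,
    mirroring the overwrite of the AV control unit's memory during pause."""
    f_count = None
    for a in audio_breaks:
        later = [v for v in video_breaks if v >= a]
        if later:
            f_count = later[0] - a   # overwrite with the latest measurement
    return f_count

# Illustrative 90 kHz tick values during a pause (assumed, not from the text):
audio_breaks = [0, 2_880, 5_760]      # 32 ms audio frames
video_breaks = [1_001, 4_004, 7_007]  # ~33.4 ms video frames
print(latest_f_count(audio_breaks, video_breaks))  # 1247
```

Keeping only the newest interval is what makes the unit ready for a pause release request at any moment, as the text notes.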
- Based on the differences between the audio data and the video data measured at the pause request and during the pause, the AV control unit 21 determines whether to delay the resumption of audio data input, delay the resumption of video data input, or delay neither. The method of resolving the AV synchronization deviation is described below.
- FIG. 10 is a flowchart showing processing performed by the AV control unit 21 when a pause release request is issued from the host 1.
- In the flowchart of FIG. 10, a_diff represents the audio correction time, which is the difference between the audio data and the video data arising at the pause and at the pause release.
- In the flowchart of FIG. 10, total_audio_delay is the cumulative audio correction time, a variable that accumulates the shift of the audio data with respect to the video data; it is initialized to 0.
- The frame shift time (f_count) is updated at the timing of a video data frame break. Therefore, on the time axis, the interval from an audio data frame break to the next video data frame break corresponds to "measuring the time difference between audio data and video data" in the flowchart of FIG. 8.
- The horizontal arrows in FIG. 9 indicate the periods during which the time difference between the audio data and the video data is being measured; at other times it is not being measured.
- When a pause release request is received from the host 1, the AV control unit 21 waits for a frame break of the video data; on detecting one (ST101), it updates STC_offset (ST102).
- It then determines whether the delay time between the audio data and the video data is currently being measured (ST103). If it is being measured, the audio correction time (a_diff), the time difference between the audio data and the video data across the pause and pause release, is obtained based on equation (1) described later (ST104); if it is not being measured, the audio correction time (a_diff) is obtained based on equation (2) described later (ST105).
- The details of the audio correction time (a_diff) are described later; computed from the audio delay time a_delay at the pause and the frame shift time (f_count), it represents the deviation of the audio data from the video data that should be corrected in each pause process.
- When the audio correction time (a_diff) is positive, the audio data is delayed with respect to the video data.
- When the audio correction time (a_diff) is negative, the audio data is advanced with respect to the video data.
- The audio correction time (a_diff) obtained in step ST104 or ST105 is added to the cumulative audio correction time total_audio_delay (ST106).
- total_audio_delay, whose initial value is 0 at system startup, is cumulatively updated in step ST106 over the multiple pause processes that occur while the system is operating.
- The audio correction time (a_diff) is the deviation to be corrected in each individual pause process, whereas the cumulative audio correction time total_audio_delay is the cumulative value of a_diff; the latter is therefore the correction of the audio data with respect to the video data that should actually be applied.
- The processing from step ST107 onward determines, based on the value of total_audio_delay accumulated over the pause processes during system operation, whether the shift of the audio data with respect to the video data should be corrected and, if so, whether to delay the audio data or the video data.
- In step ST107, when the cumulative audio correction time total_audio_delay is negative, that is, when the audio data is advanced, the time of one video data frame is added to total_audio_delay (ST108), and the actual restart of the video data is then delayed by one frame.
- The process of delaying the resumption of the video data by one frame is realized by waiting to resume video data input until a frame break of the video data is detected (ST109).
- In step ST107, when total_audio_delay is not negative, that is, when the audio data is aligned or delayed, input is restarted without delaying the video data (ST110), and the process proceeds to step ST111.
- In step ST111, if the cumulative audio correction time total_audio_delay is longer than one audio data frame (audio_frame_time), the restart of the audio data must be delayed, so the process proceeds to step ST112 and beyond.
- In step ST112, the time of one audio data frame is subtracted from total_audio_delay (ST112), and the actual restart of the audio data is then delayed by one frame.
- The process of delaying the restart of the audio data by one frame is realized by waiting to restart audio data input until a frame break of the audio data is detected (ST113).
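Steps ST106 through ST113 can be sketched as a single bookkeeping function. The frame durations are assumed for illustration, and the ST111 comparison is read here as "one audio frame or more"; the original text only says the threshold is one audio frame.

```python
AUDIO_FRAME = 2_880   # assumed audio_frame_time in 90 kHz ticks (32 ms)
VIDEO_FRAME = 3_003   # assumed video_frame_time in 90 kHz ticks (~29.97 fps)

def on_pause_release(total_audio_delay: int, a_diff: int):
    """Return (action, new total_audio_delay) following steps ST106-ST113."""
    total_audio_delay += a_diff                       # ST106: accumulate
    if total_audio_delay < 0:                         # ST107: audio is ahead
        total_audio_delay += VIDEO_FRAME              # ST108
        return "delay_video_one_frame", total_audio_delay   # ST109
    if total_audio_delay >= AUDIO_FRAME:              # ST111: audio lags a frame
        total_audio_delay -= AUDIO_FRAME              # ST112
        return "delay_audio_one_frame", total_audio_delay   # ST113
    return "no_delay", total_audio_delay              # ST110 path

print(on_pause_release(0, -500))     # ('delay_video_one_frame', 2503)
print(on_pause_release(0, 100))      # ('no_delay', 100)
print(on_pause_release(2_503, 500))  # ('delay_audio_one_frame', 123)
```

Each release applies at most one whole-frame correction, so the residual deviation is carried forward in total_audio_delay rather than lost, matching the accumulation behavior the text describes.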
- FIG. 11 is a timing chart illustrating how the audio correction time (a_diff) is calculated when the delay time is being measured at the pause release.
- Because the pause release request arrived during the interval from an audio data frame break to the next video data frame break, that is, while the time difference was being measured,
- the audio correction time (a_diff) is calculated using the value of the frame shift time (f_count) obtained after the pause release request.
- The procedure performed in step ST104 of FIG. 10 to calculate the audio correction time (a_diff) will be described with reference to FIG. 11.
- Upon receiving the pause release request from the host 1, the AV control unit 21 obtains the time t111 from the timer 24 in accordance with the video frame period, and resets STC_offset based on the pause_STC_offset saved at the time of the pause request.
- a_delay is, as described above, the delay time of the audio data with respect to the video data at the time of the pause, calculated and held when the pause was requested.
- audio_frame_time is the frame period of the audio data.
- FIG. 12 is a timing chart illustrating how the audio correction time (a_diff) is calculated when the delay time is not being measured at the pause release.
- In this case, the audio correction time (a_diff) is calculated using the frame shift time (f_count) obtained before the pause release request.
- The procedure performed in step ST105 of FIG. 10 to calculate the audio correction time (a_diff) will be described with reference to FIG. 12.
- When the AV control unit 21 receives the pause release request from the host 1, it acquires the time t121 from the timer 24 in synchronization with the video frame period, and resets STC_offset based on the pause_STC_offset saved when the pause request was made.
- a_delay is, as described above, the audio delay time of the audio data with respect to the video data at the pause, calculated and held at the time of the pause.
- audio_frame_time is the frame period of the audio data.
- video_frame_time is the frame period of the video data.
- In this case, the audio correction time (a_diff) can be obtained by the following equation (2).
- a_diff = a_delay + f_count - audio_frame_time + video_frame_time ... (2)
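As a numeric check of equation (2), using illustrative 90 kHz tick values that are assumptions rather than figures from the document:

```python
def a_diff_eq2(a_delay, f_count, audio_frame_time, video_frame_time):
    # Equation (2): audio correction time when the time difference is NOT
    # being measured at the moment of the pause release request.
    return a_delay + f_count - audio_frame_time + video_frame_time

# Assumed values in 90 kHz ticks: a_delay and f_count as measured in the
# earlier sketches, 32 ms audio frames, ~29.97 fps video frames.
print(a_diff_eq2(a_delay=1_071, f_count=1_247,
                 audio_frame_time=2_880, video_frame_time=3_003))  # 2441
```

A positive result such as this one means the audio lags the video, so the pause release logic would route it toward the ST111/ST112 branch once the cumulative total reaches a full audio frame.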
- FIG. 13 is a diagram for explaining the process of eliminating the AV synchronization deviation by delaying the resumption of the video data input by one frame.
- According to the pause release flowchart of FIG. 10, the control for delaying the resumption of the video data input is performed when the cumulative audio correction time total_audio_delay is negative (ST107); the AV synchronization deviation is corrected (ST108), and the video data is delayed by one frame by waiting until a frame break of the video data is found (ST109).
- When the AV control unit 21 receives a pause release request from the host 1, it waits for a frame break in the video data; on detecting one (time t131), it calculates total_audio_delay according to the processing flow of FIG. 10. Since this value is negative here, the input of the video data is resumed after waiting one video frame (time t132).
- FIG. 14 is a diagram for explaining the process of eliminating the AV synchronization deviation by delaying the resumption of the audio data input by one frame.
- The control for delaying the restart of the audio data input is performed when the cumulative audio correction time total_audio_delay is one audio frame or more (ST111).
- It is realized by correcting the AV synchronization deviation (ST112) and delaying the restart of the audio data by one frame by waiting until a frame break of the audio data is found (ST113).
- When the AV control unit 21 receives a pause release request from the host 1, it waits for a frame break of the video data; on detecting one (time t141), since total_audio_delay exceeds one audio frame, the input of the audio data is restarted after waiting one audio frame (time t142). As is clear from steps ST107 and ST111 in FIG. 10, if total_audio_delay is positive but does not exceed one audio frame, neither the audio data input restart nor the video data input restart is delayed. In this case, the difference between the audio data and the video data caused by the current pause process remains accumulated in total_audio_delay.
- Since the cumulative audio correction time total_audio_delay (steps ST108 and ST112 in FIG. 10) does not necessarily become 0, the AV synchronization deviation is not completely eliminated.
- However, the cumulative audio correction time total_audio_delay always stays within one audio data frame during operation of the AV recording device; the residual deviation is therefore not perceived by the viewer, and the AV synchronization deviation is eliminated to a sufficient degree.
- the present invention is applicable to an apparatus that records or reproduces audio data and video data in synchronization.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Television Signal Processing For Recording (AREA)
- Signal Processing For Digital Recording And Reproducing (AREA)
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US10/559,419 US7738772B2 (en) | 2003-06-12 | 2004-06-03 | Apparatus and method for synchronizing video data and audio data having different predetermined frame lengths |
EP04735991A EP1633138B1 (en) | 2003-06-12 | 2004-06-03 | Device for recording video data and audio data |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2003-168019 | 2003-06-12 | ||
JP2003168019A JP4305065B2 (ja) | 2003-06-12 | 2003-06-12 | Av同期処理装置および方法ならびにav記録装置 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2004112391A1 true WO2004112391A1 (ja) | 2004-12-23 |
Family
ID=33549324
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2004/008053 WO2004112391A1 (ja) | 2003-06-12 | 2004-06-03 | 映像データと音声データの記録装置 |
Country Status (6)
Country | Link |
---|---|
US (1) | US7738772B2 (ja) |
EP (1) | EP1633138B1 (ja) |
JP (1) | JP4305065B2 (ja) |
KR (1) | KR101006593B1 (ja) |
CN (1) | CN100521766C (ja) |
WO (1) | WO2004112391A1 (ja) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110418183A (zh) * | 2019-08-05 | 2019-11-05 | 北京字节跳动网络技术有限公司 | 音视频同步方法、装置、电子设备及可读介质 |
Families Citing this family (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2005098854A1 (ja) * | 2004-04-06 | 2005-10-20 | Matsushita Electric Industrial Co., Ltd. | 音声再生装置、音声再生方法及びプログラム |
JP4560788B2 (ja) * | 2005-05-26 | 2010-10-13 | カシオ計算機株式会社 | カメラ装置及び録画装置並びにプログラム |
JP4665719B2 (ja) * | 2005-10-28 | 2011-04-06 | オムロン株式会社 | フィルタパラメータ設定装置、フィルタリング処理装置、フィルタパラメータ設定方法、作業時間計測システム、制御プログラム、および、記録媒体 |
TWI314017B (en) * | 2006-07-12 | 2009-08-21 | Quanta Comp Inc | System and method for synchronizing video frames and audio frames |
US20080263612A1 (en) * | 2007-04-18 | 2008-10-23 | Cooper J Carl | Audio Video Synchronization Stimulus and Measurement |
WO2010069375A1 (en) | 2008-12-17 | 2010-06-24 | Telefonaktiebolaget L M Ericsson (Publ) | Method and apparatus for measuring audiovisual synchronisation |
US8989280B2 (en) * | 2011-06-30 | 2015-03-24 | Cable Television Laboratories, Inc. | Frame identification |
WO2014087449A1 (en) * | 2012-12-04 | 2014-06-12 | Hitachi, Ltd. | Network device and method of controlling the network device |
US8913189B1 (en) * | 2013-03-08 | 2014-12-16 | Amazon Technologies, Inc. | Audio and video processing associated with visual events |
JP6358113B2 (ja) * | 2015-01-30 | 2018-07-18 | JVC Kenwood Corporation | Recording device and multiplexing method
CN105141869B (zh) * | 2015-08-19 | 2018-12-18 | Zhongshan Tianqi Intelligent Technology Co., Ltd. | Segmented video data processing method based on the Android system
AU2018266806B2 (en) | 2017-05-09 | 2022-10-13 | Echo360, Inc. | Methods and apparatus for ordered serial synchronization of multimedia streams upon sensor changes |
CN109040818B (zh) * | 2017-06-12 | 2021-04-27 | Wuhan Douyu Network Technology Co., Ltd. | Audio and video synchronization method for live streaming, storage medium, electronic device, and system
CN108965971B (zh) * | 2018-07-27 | 2021-05-14 | Beijing Sumavision Technologies Co., Ltd. | Multi-channel audio synchronization control method, control device, and electronic device
CN110225279B (zh) * | 2019-07-15 | 2022-08-16 | Beijing Xiaotang Technology Co., Ltd. | Video production system and video production method for a mobile terminal
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5502573A (en) | 1992-12-18 | 1996-03-26 | Sony Corporation | Apparatus for reproducing and decoding multiplexed data from a record medium with means for controlling data decoding as a function of synchronization errors |
JP2001266549A (ja) * | 2000-03-17 | 2001-09-28 | Sony Corporation | Information reproducing device, image display control method, and recording medium
EP1161095A2 (en) | 2000-05-31 | 2001-12-05 | Fujitsu Limited | Apparatus for reproduction of video and audio with suspension function |
Family Cites Families (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4703355A (en) * | 1985-09-16 | 1987-10-27 | Cooper J Carl | Audio to video timing equalizer method and apparatus |
US5675511A (en) * | 1995-12-21 | 1997-10-07 | Intel Corporation | Apparatus and method for event tagging for multiple audio, video, and data streams |
US6148135A (en) * | 1996-01-29 | 2000-11-14 | Mitsubishi Denki Kabushiki Kaisha | Video and audio reproducing device and video decoding device |
WO1997046027A1 (en) * | 1996-05-29 | 1997-12-04 | Sarnoff Corporation | Preserving synchronization of audio and video presentation |
JP3698376B2 (ja) * | 1996-08-19 | 2005-09-21 | Matsushita Electric Industrial Co., Ltd. | Synchronized playback device
WO1998021722A1 (fr) * | 1996-11-13 | 1998-05-22 | Matsushita Electric Industrial Co., Ltd. | Apparatus and method for generating a bitstream for a data-recording disc memory enabling continuous reproduction of several pieces of image data, and recording medium on which a program applied to the generating apparatus is recorded
US6262777B1 (en) * | 1996-11-15 | 2001-07-17 | Futuretel, Inc. | Method and apparatus for synchronizing edited audiovisual files |
US6262776B1 (en) * | 1996-12-13 | 2001-07-17 | Microsoft Corporation | System and method for maintaining synchronization between audio and video |
JP3094999B2 (ja) * | 1998-10-15 | 2000-10-03 | NEC Corporation | Audio/video synchronized playback device
US6583821B1 (en) * | 1999-07-16 | 2003-06-24 | Thomson Licensing S.A. | Synchronizing apparatus for a compressed audio/video signal receiver |
JP2003199045A (ja) * | 2001-12-26 | 2003-07-11 | Victor Co Of Japan Ltd | Method for generating an information recording signal, method for reproducing an information signal, method for transmitting an information signal, information recording signal generating device, information signal reproducing device, information signal transmitting device, information signal recording medium, and program for information signal transmission
US6850284B2 (en) * | 2002-08-27 | 2005-02-01 | Motorola, Inc. | Method and apparatus for decoding audio and video information |
- 2003
- 2003-06-12 JP JP2003168019A patent/JP4305065B2/ja not_active Expired - Fee Related
- 2004
- 2004-06-03 CN CNB2004800160768A patent/CN100521766C/zh not_active Expired - Fee Related
- 2004-06-03 EP EP04735991A patent/EP1633138B1/en not_active Expired - Fee Related
- 2004-06-03 WO PCT/JP2004/008053 patent/WO2004112391A1/ja active Application Filing
- 2004-06-03 KR KR1020057022680A patent/KR101006593B1/ko not_active IP Right Cessation
- 2004-06-03 US US10/559,419 patent/US7738772B2/en not_active Expired - Fee Related
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5502573A (en) | 1992-12-18 | 1996-03-26 | Sony Corporation | Apparatus for reproducing and decoding multiplexed data from a record medium with means for controlling data decoding as a function of synchronization errors |
JP2001266549A (ja) * | 2000-03-17 | 2001-09-28 | Sony Corporation | Information reproducing device, image display control method, and recording medium
EP1161095A2 (en) | 2000-05-31 | 2001-12-05 | Fujitsu Limited | Apparatus for reproduction of video and audio with suspension function |
JP2001346147A (ja) * | 2000-05-31 | 2001-12-14 | Fujitsu Ltd | Video/audio playback device and video/audio playback method
Non-Patent Citations (1)
Title |
---|
See also references of EP1633138A4 * |
Also Published As
Publication number | Publication date |
---|---|
EP1633138A1 (en) | 2006-03-08 |
JP2005006095A (ja) | 2005-01-06 |
KR20060010829A (ko) | 2006-02-02 |
CN100521766C (zh) | 2009-07-29 |
KR101006593B1 (ko) | 2011-01-07 |
US20060140280A1 (en) | 2006-06-29 |
US7738772B2 (en) | 2010-06-15 |
EP1633138B1 (en) | 2012-08-08 |
EP1633138A4 (en) | 2010-11-10 |
CN1802851A (zh) | 2006-07-12 |
JP4305065B2 (ja) | 2009-07-29 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2004112391A1 (ja) | Device for recording video data and audio data | |
JP3698376B2 (ja) | Synchronized playback device | |
JP4852094B2 (ja) | Playback device, mobile phone, and playback method | |
US8526501B2 (en) | Decoder and decoding method based on video and audio time information | |
RU2483470C2 (ru) | Method and device for processing video and audio data received in a decoding system | |
JPH09205618A (ja) | Moving picture and audio decompression playback device and moving picture/audio synchronization controller | |
JPH11355263A (ja) | Data decoding device and data decoding method | |
JP5024000B2 (ja) | Recording/playback device and time code restoration method | |
JP3100308B2 (ja) | Playback system for image and audio information | |
JP3350365B2 (ja) | Video synchronization signal correction device | |
JPH10200860A (ja) | Synchronized playback device for images and audio | |
JP2009290768A (ja) | Video processing device and video processing method | |
JP2008131591A (ja) | Lip-sync control device and lip-sync control method | |
JP2000308065A (ja) | Video transmission device | |
JPH10210483A (ja) | Video playback device and method | |
JP5310189B2 (ja) | Video encoder device and encoded-data output method used in the video encoder device | |
JP2006229484A (ja) | Image processing method and device | |
WO2006043768A1 (en) | Image processor and operating method thereof | |
KR20070056547 (ko) | Signal processing method and apparatus | |
JP3165661B2 (ja) | Audio-synchronized playback device | |
JP2003339023A (ja) | Video playback device | |
JP2001078195A (ja) | System encoding device | |
JP4894251B2 (ja) | Audio switching method and audio switching device | |
JPH11261968A (ja) | Image compression recording device with external input function | |
JP4932242B2 (ja) | Stream switching device and stream switching method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AK | Designated states |
Kind code of ref document: A1 Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW |
AL | Designated countries for regional patents |
Kind code of ref document: A1 Designated state(s): BW GH GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LU MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG |
121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
WWE | Wipo information: entry into national phase |
Ref document number: 2004735991 Country of ref document: EP |
WWE | Wipo information: entry into national phase |
Ref document number: 1020057022680 Country of ref document: KR |
ENP | Entry into the national phase |
Ref document number: 2006140280 Country of ref document: US Kind code of ref document: A1 |
WWE | Wipo information: entry into national phase |
Ref document number: 10559419 Country of ref document: US |
WWE | Wipo information: entry into national phase |
Ref document number: 20048160768 Country of ref document: CN |
WWP | Wipo information: published in national office |
Ref document number: 1020057022680 Country of ref document: KR |
WWP | Wipo information: published in national office |
Ref document number: 2004735991 Country of ref document: EP |
WWP | Wipo information: published in national office |
Ref document number: 10559419 Country of ref document: US |