WO2006064689A1 - Wireless communication system - Google Patents
Wireless communication system Download PDF Info
- Publication number
- WO2006064689A1, PCT/JP2005/022379 (JP2005022379W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- data
- video
- audio
- time
- reproduction
- Prior art date
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/60—Network structure or processes for video distribution between server and client or between remote clients; Control signalling between clients, server and network components; Transmission of management data between server and client, e.g. sending from server to client commands for recording incoming content stream; Communication details between server and client
- H04N21/61—Network physical structure; Signal processing
- H04N21/6156—Network physical structure; Signal processing specially adapted to the upstream path of the transmission network
- H04N21/6181—Network physical structure; Signal processing specially adapted to the upstream path of the transmission network involving transmission via a mobile phone network
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/20—Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
- H04N21/23—Processing of content or additional data; Elementary server operations; Server middleware
- H04N21/233—Processing of audio elementary streams
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/20—Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
- H04N21/23—Processing of content or additional data; Elementary server operations; Server middleware
- H04N21/234—Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/20—Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
- H04N21/23—Processing of content or additional data; Elementary server operations; Server middleware
- H04N21/242—Synchronization processes, e.g. processing of PCR [Program Clock References]
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/4302—Content synchronisation processes, e.g. decoder synchronisation
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/4302—Content synchronisation processes, e.g. decoder synchronisation
- H04N21/4307—Synchronising the rendering of multiple content streams or additional data on devices, e.g. synchronisation of audio on a mobile phone with the video output on the TV screen
- H04N21/43076—Synchronising the rendering of multiple content streams or additional data on devices, e.g. synchronisation of audio on a mobile phone with the video output on the TV screen of the same content streams on multiple devices, e.g. when family members are watching the same movie on different devices
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/436—Interfacing a local distribution network, e.g. communicating with another STB or one or more peripheral devices inside the home
- H04N21/4363—Adapting the video stream to a specific local network, e.g. a Bluetooth® network
- H04N21/43637—Adapting the video stream to a specific local network, e.g. a Bluetooth® network involving a wireless protocol, e.g. Bluetooth, RF or wireless LAN [IEEE 802.11]
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/439—Processing of audio elementary streams
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/44—Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/60—Network structure or processes for video distribution between server and client or between remote clients; Control signalling between clients, server and network components; Transmission of management data between server and client, e.g. sending from server to client commands for recording incoming content stream; Communication details between server and client
- H04N21/61—Network physical structure; Signal processing
- H04N21/6106—Network physical structure; Signal processing specially adapted to the downstream path of the transmission network
- H04N21/6131—Network physical structure; Signal processing specially adapted to the downstream path of the transmission network involving transmission via a mobile phone network
Definitions
- The present invention relates to a wireless communication system, and more particularly to a wireless communication system for synchronously reproducing the video data and audio data of the same program, and a synchronized playback method using the same.
- A wireless video/audio playback system has been proposed in which a server device that records moving image data (for example, TV programs), a display that reproduces video, and a speaker that reproduces audio are connected to one another via a wireless network; the server device transmits the moving image data, the video data contained in it is reproduced and output on the display, and the audio data is reproduced from the speaker.
- Patent Document 1 Japanese Patent Application Laid-Open No. 2004-193868.
- The wireless transmission/reception system described in Patent Document 1 provides a common reference time to all devices constituting the system and, before starting playback of a moving image, exchanges time-stamp request and response packets to measure the delay time required for data transmission; the playback times of video and audio are then determined from the result.
- Intentionally delaying the output timing of the video in this way can absorb the delay caused by wireless data transmission and reduce the difference between the playback times of video and audio. Although effective, this approach has the following problems.
- An object of the present invention is to solve the above-described problems and to provide a wireless communication system, comprising a server device that transmits moving image data, a display that reproduces and outputs video data, and a speaker that reproduces and outputs audio data, that can execute synchronized reproduction of the video data and audio data with higher accuracy than conventional techniques, together with a synchronized playback method using the same.
- A wireless communication system according to the present invention comprises: a server device that wirelessly transmits video data with time data and audio data with time data;
- a video playback device that wirelessly receives the video data with time data transmitted from the server device and reproduces and outputs the video of the video data; and
- an audio playback device that wirelessly receives the audio data with time data transmitted from the server device and reproduces and outputs the audio of the audio data.
- When the video playback device reproduces and outputs the video of the video data, it calculates estimated playback time information TT1 = T1 + t1 by adding the known data processing time t1, required to reproduce and output the video data, to the time data T1 of the video data, and wirelessly transmits TT1 to the server device.
- When the audio playback device reproduces and outputs the audio of the audio data, it calculates estimated playback time information TT2 = T2 + t2 by adding the known data processing time t2, required to reproduce and output the audio data, to the time data T2 of the audio data, and wirelessly transmits TT2 to the server device.
- After wirelessly receiving the estimated playback time information TT1 from the video playback device and the estimated playback time information TT2 from the audio playback device, the server device calculates estimated playback time difference information Δt between TT1 and TT2 and wirelessly transmits it to the video playback device and the audio playback device.
- After wirelessly receiving the estimated playback time difference information Δt, the video playback device controls, based on the received Δt, the time at which the video data is reproduced and output so that the video reproduced and output from the video playback device and the audio reproduced and output from the audio playback device are substantially synchronized.
- Likewise, after wirelessly receiving the estimated playback time difference information Δt from the server device, the audio playback device controls, based on the received Δt, the time at which the audio data is reproduced and output so that the audio reproduced and output from the audio playback device and the video reproduced and output from the video playback device are substantially synchronized.
- The sequence of operations, from the calculation and transmission of the estimated playback time information TT1 by the video playback device through the control of the video playback output time based on the wirelessly received estimated playback time difference information Δt, is repeatedly executed at predetermined time intervals.
- The video data is compression-encoded video data; the video playback device decompresses and decodes the wirelessly received video data, and the data processing time t1 includes the time for this decompression decoding.
- Similarly, the audio data is compression-encoded audio data; the audio playback device decompresses and decodes the wirelessly received audio data, and the data processing time t2 includes the time for this decompression decoding.
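The exchange described in the claims above can be sketched in a few lines. This is an illustrative model only, not the patent's implementation; the function names and the millisecond units are assumptions:

```python
# Illustrative sketch of the estimated-playback-time exchange: each playback
# device reports TT = T + t, and the server computes Δt = TT1 - TT2.

def estimated_playback_time(timestamp_ms: int, processing_time_ms: int) -> int:
    """TT = T + t: the time stamp plus the known, device-specific processing time."""
    return timestamp_ms + processing_time_ms

def server_time_difference(tt1_ms: int, tt2_ms: int) -> int:
    """Δt between the video (TT1) and audio (TT2) estimated playback times."""
    return tt1_ms - tt2_ms

# Video device: T1 = 5000 ms into the program; decode + output takes t1 = 120 ms.
tt1 = estimated_playback_time(5000, 120)
# Audio device: T2 = 5000 ms; decode + output takes t2 = 20 ms.
tt2 = estimated_playback_time(5000, 20)

dt = server_time_difference(tt1, tt2)  # Δt = 100 ms: video lags audio
print(dt)
```

Because t1 and t2 are known in advance for each device, no round-trip delay measurement is needed before playback starts, which is the key difference from the Patent Document 1 approach.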
- A synchronized playback method according to the present invention is a synchronized playback method using a wireless communication system comprising: a server device that wirelessly transmits video data with time data and audio data with time data;
- a video playback device that wirelessly receives the video data with time data transmitted from the server device and reproduces and outputs the video of the video data; and
- an audio playback device that wirelessly receives the audio data with time data transmitted from the server device and reproduces and outputs the audio of the audio data. The method comprises the steps of:
- when the video playback device reproduces and outputs the video of the video data, calculating estimated playback time information TT1 by adding the known data processing time t1, required to reproduce and output the video data, to the time data T1 of the video data, and wirelessly transmitting it to the server device;
- when the audio playback device reproduces and outputs the audio of the audio data, calculating estimated playback time information TT2 by adding the known data processing time t2, required to reproduce and output the audio data, to the time data T2 of the audio data, and wirelessly transmitting it to the server device;
- after the server device wirelessly receives the estimated playback time information TT1 from the video playback device and the estimated playback time information TT2 from the audio playback device, calculating estimated playback time difference information Δt between TT1 and TT2 and wirelessly transmitting it to the video playback device and the audio playback device;
- after the video playback device wirelessly receives the estimated playback time difference information Δt from the server device, controlling, based on the received Δt, the time at which the video data is reproduced and output so that the video reproduced and output from the video playback device and the audio reproduced and output from the audio playback device are substantially synchronized;
- after the audio playback device wirelessly receives the estimated playback time difference information Δt from the server device, controlling, based on the received Δt, the time at which the audio data is reproduced and output so that the audio reproduced and output from the audio playback device and the video reproduced and output from the video playback device are substantially synchronized; and
- periodically repeating, at predetermined time intervals, the sequence from the calculation and wireless transmission of the estimated playback time information TT1 by the video playback device through the control of the video playback output time based on the wirelessly received estimated playback time difference information Δt.
- The video data is compression-encoded video data; the video playback device decompresses and decodes the wirelessly received video data, and the data processing time t1 includes the time for this decompression decoding.
- The audio data is compression-encoded audio data; the audio playback device decompresses and decodes the wirelessly received audio data, and the data processing time t2 includes the time for this decompression decoding.
- According to the present invention, in a system comprising a server device that transmits moving image data, a display that reproduces and outputs video data, and a speaker that reproduces and outputs audio data, synchronized playback of the video data and audio data can be executed with higher accuracy than in the prior art.
- FIG. 1 is a block diagram showing a configuration of a wireless communication system according to an embodiment of the present invention.
- FIG. 2 is a block diagram showing a configuration of packet signals transmitted and received in the wireless communication system of FIG.
- FIG. 3 is a flowchart showing a synchronized playback output control process executed by the audio playback device 110 and the video playback device 120 of FIG.
- FIG. 4 is a sequence diagram showing a synchronized reproduction output control process executed in the wireless communication system of FIG. 1.
- FIG. 5 is a sequence diagram showing a synchronized playback output control process executed in a wireless communication system according to a modification of the present invention.
- FIG. 1 is a block diagram showing a configuration of a wireless communication system according to an embodiment of the present invention.
- The wireless communication system according to the present embodiment is applied to a wireless video/audio playback system comprising a server device 100 such as a DVD recorder or a personal computer, an audio playback device 110, and a video playback device 120, which are connected via a wireless network such as a wireless LAN.
- The server device 100 comprises a wireless transmission/reception circuit 101 to which an antenna 101A is connected, a system controller 102, a buffer memory 103, and a moving image data memory 104 such as a hard disk drive.
- The audio playback device 110 comprises a wireless transmission/reception circuit 111 to which an antenna 111A is connected, a system controller 112, a buffer memory 113, an audio decoder 114, a digital/analog converter (hereinafter referred to as "D/A converter") 115, a speaker 116, and a decoder controller 117.
- The system controller 112 controls the operation of the other circuit elements 111 to 115 of the audio playback device 110.
- The video playback device 120 comprises a wireless transmission/reception circuit 121 to which an antenna 121A is connected, a system controller 122, a buffer memory 123, a video decoder 124, a D/A converter 125, a display 126, and a decoder controller 127.
- The system controller 122 controls the operation of the other circuit elements 121 to 125 of the video playback device 120.
- The moving image data stored in advance in the moving image data memory 104 is selected by a user operation; it includes, for example, a plurality of program data, and each program data includes video data and audio data.
- The wireless transmission/reception circuit 101 has a wireless communication function compliant with a wireless network standard such as Bluetooth or wireless LAN; it digitally modulates a wireless carrier wave into a wireless signal according to the input moving image data using a predetermined digital modulation method, and then transmits the signal to the audio playback device 110 and the video playback device 120 via the antenna 101A.
- The moving image data recorded in the moving image data memory 104 is, for example, compression-encoded video/audio data (for example, moving image data conforming to the MPEG (Moving Picture Experts Group) method).
- However, the present invention is not limited to this, and the data may be encoded by other methods.
- FIG. 2 is a block diagram showing a configuration of packet signals transmitted and received in the wireless communication system of FIG.
- Information for transmission and reception in the wireless network (information such as a source address and a destination address) is added at the head of each packet signal.
- The packet signal is composed of video data parameter information 301, which includes an ID related to the video data, its compression encoding method, and related specification information; audio data parameter information 302, which includes an ID related to the audio data, its compression encoding method, and related specification information; and the subsequent video data packet 303 and audio data packet 304.
- The video data packet 303 includes time stamp information 311, which is the time information of the first data of the compression-encoded video data 313, measured with the start time of one program set to 0:00:00;
- key frame information 312, which is frame information of, for example, an I picture when the data is compressed in the MPEG format; and the video data (video content data) 313 itself.
- Similarly, the audio data packet 304 includes time stamp information 321, which is the time information of the first data of the compression-encoded audio data 322, measured with the start time of one program set to 0:00:00, and the compression-encoded audio data (audio content data) 322.
- That is, as shown in FIG. 2, the compression-encoded video data and audio data include time stamp information for determining their playback times.
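The packet layout of FIG. 2 can be sketched as a set of records. This is a hypothetical illustration only; the field names and types are assumptions, not taken from the patent:

```python
# Hypothetical sketch of the packet layout of FIG. 2 (numbers refer to the
# reference signs in the figure); field names are illustrative.
from dataclasses import dataclass

@dataclass
class VideoDataPacket:          # element 303
    time_stamp_ms: int          # 311: time of first video data, program start = 0:00:00
    key_frame_info: bytes       # 312: e.g. I-picture frame information (MPEG)
    video_content: bytes        # 313: compression-encoded video data

@dataclass
class AudioDataPacket:          # element 304
    time_stamp_ms: int          # 321: time of first audio data, program start = 0:00:00
    audio_content: bytes        # 322: compression-encoded audio data

@dataclass
class PacketSignal:
    video_params: dict          # 301: video ID, codec, specification info
    audio_params: dict          # 302: audio ID, codec, specification info
    video_packet: VideoDataPacket
    audio_packet: AudioDataPacket

pkt = PacketSignal(
    video_params={"id": 1, "codec": "MPEG"},
    audio_params={"id": 1, "codec": "MPEG"},
    video_packet=VideoDataPacket(0, b"I", b"\x00"),
    audio_packet=AudioDataPacket(0, b"\x00"),
)
print(pkt.video_packet.time_stamp_ms)
```

The point of the layout is that each elementary stream carries its own time stamp, so each playback device can compute its estimated playback time independently.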
- The audio playback device 110 receives the wireless signal including the moving image data wirelessly transmitted from the server device 100 via the antenna 111A and outputs it to the wireless transmission/reception circuit 111.
- The wireless transmission/reception circuit 111 has a wireless transmission/reception function similar to that of the wireless transmission/reception circuit 101: it receives the input wireless signal, demodulates it into moving image data using a digital demodulation method inverse to the above digital modulation method, and stores the data in the buffer memory 113 through the system controller 112.
- The audio decoder 114 extracts the compression-encoded audio data from the moving image data stored in the buffer memory 113, performs decompression decoding processing to convert it into decompressed audio data, and then outputs it to the D/A converter 115.
- The D/A converter 115 converts the input audio data into an analog audio signal and outputs it to the speaker 116, which reproduces it as audible sound.
- Similarly, the video playback device 120 receives the wireless signal including the moving image data wirelessly transmitted from the server device 100 via the antenna 121A and outputs it to the wireless transmission/reception circuit 121.
- The wireless transmission/reception circuit 121 has the same wireless transmission/reception function as the wireless transmission/reception circuit 101: it receives the input wireless signal, demodulates it into moving image data using a digital demodulation method inverse to the digital modulation method, and stores the data in the buffer memory 123 through the system controller 122.
- The video decoder 124 extracts the compression-encoded video data from the moving image data stored in the buffer memory 123, performs decompression decoding processing to convert it into decompressed video data, and then outputs it to the D/A converter 125.
- The D/A converter 125 converts the input video data into an analog video signal and outputs it to the display 126, which reproduces and displays the video.
- The buffer memories 113 and 123 serve to store data so that playback is not interrupted by transmission errors in the wireless communication.
- However, because the data to be reproduced is buffered in each of the buffer memories 113 and 123, there was a problem in that the moving image data transmitted from the server device 100 is not always output from the speaker 116 and the display 126 at the same time. To solve this problem, the following synchronized playback output control process is executed in the wireless communication system according to the present embodiment.
- FIG. 3 is a flowchart showing a synchronized playback output control process executed by the audio playback device 110 and the video playback device 120 of FIG.
- In step S1, a T3 timer that measures a time T3 is started. (As shown in the NO branch of step S10 in FIG. 3, the T3 timer is provided so that, within the predetermined time T3, decoding proceeds immediately without correcting the playback time.)
- In step S4, it is determined whether or not the wirelessly received data is time difference information (Δt). If YES, the process proceeds to step S5; if NO, that is, if it is moving image data, the process proceeds to step S7.
- In step S5, it is determined based on the received time difference information (Δt) whether or not the decoding time needs to be corrected. Here, for example, when |Δt| > 100 milliseconds, it is determined that the user can perceive the difference between video and audio as noticeable, and in step S6 the decoder 114 or 124 is controlled, based on the received time difference information (Δt), to delay the playback time of the corresponding data, after which the process returns to step S2. On the other hand, if NO in step S5, the process returns to step S2 without executing step S6.
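The step S5/S6 decision might be sketched as follows. The function names are hypothetical; only the 100 ms perceptibility threshold comes from the text:

```python
# Hypothetical sketch of the step S5/S6 decision: correct the decoder's playback
# time only when |Δt| exceeds the perceptibility threshold (100 ms in the text).

PERCEPTIBLE_MS = 100

def needs_correction(dt_ms: int) -> bool:
    """Step S5: is |Δt| large enough for the user to notice a lip-sync error?"""
    return abs(dt_ms) > PERCEPTIBLE_MS

def corrected_playback_delay(dt_ms: int) -> int:
    """Step S6 (illustrative): delay the earlier stream by |Δt| to realign it."""
    return abs(dt_ms) if needs_correction(dt_ms) else 0

print(needs_correction(40))            # small drift: leave the decoder alone
print(corrected_playback_delay(-150))  # one stream 150 ms early: delay it
```

Applying corrections only above a threshold avoids constantly perturbing the decoders for drifts the viewer cannot perceive.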
- In step S7, the moving image data is stored in the buffer memory 113 or 123, and in step S8 it is determined whether a certain amount of moving image data, such as 10 MB, has accumulated in the buffer memory 113 or 123. If YES, the process proceeds to step S9; if NO, it returns to step S2.
- In step S9, the data to be decoded is read from the buffer memory 113 or 123, and in step S10 it is determined whether the T3 timer, which counts the fixed time T3, has timed out. If YES, the process proceeds to step S11; if NO, it proceeds to step S14.
- In step S11, the time stamp information T1 or T2 is extracted from the read data.
- Here, t1 is the known data processing time, determinable in advance for the device, required from the video data extraction and decompression decoding processing by the video decoder 124 for a predetermined amount of compression-encoded video data, through the D/A conversion processing, to the playback output processing by the display 126.
- Similarly, t2 is the known data processing time, determinable in advance for the device, required from the audio data extraction and decompression decoding processing by the audio decoder 114 for a predetermined amount of compression-encoded audio data, through the D/A conversion processing by the D/A converter 115, to the playback output processing by the speaker 116.
- The known data processing times t1 and t2 are almost constant regardless of the surrounding radio wave environment, temperature, humidity, and so on.
- In step S13, the T3 timer is reset, and the process proceeds to step S14.
- In step S14, the data to be decompressed and decoded is transferred to the decoder 114 or 124 to execute the decompression decoding process, and then the process returns to step S2.
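As a rough illustration, the receive loop of FIG. 3 (steps S2 to S14) could be modeled like this. The queue-style input, the names, and the buffer-threshold handling are assumptions layered on the flowchart described above:

```python
# Hypothetical sketch of one pass through the FIG. 3 loop on a playback device.

BUFFER_THRESHOLD = 10 * 1024 * 1024  # step S8: e.g. 10 MB buffered before decoding

def handle_received(item, buffer: bytearray, t3_expired: bool):
    """Dispatch one received item; returns (action, payload) for the caller."""
    kind, payload = item
    if kind == "time_difference":                 # step S4 → S5/S6
        return ("correct_decoder", payload)       # payload is Δt
    buffer.extend(payload)                        # step S7: buffer moving image data
    if len(buffer) < BUFFER_THRESHOLD:            # step S8: not enough data yet
        return ("wait", None)
    if t3_expired:                                # step S10 → S11: report TT = T + t
        return ("send_estimated_time", None)
    return ("decode", None)                       # step S14: decompress and decode

buf = bytearray()
action, _ = handle_received(("data", b"\x00" * BUFFER_THRESHOLD), buf, False)
print(action)
```

The T3 timer gate means the estimated playback time is reported only periodically, matching the claim that the whole exchange repeats at predetermined time intervals.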
- FIG. 4 is a sequence diagram showing a synchronized playback output control process executed in the wireless communication system of FIG.
- Next, the synchronized playback output control process for synchronizing the video and audio will be described.
- First, the server device 100 extracts the moving image data to be played back (step 201) and wirelessly transmits it to the audio playback device 110 and the video playback device 120 (step 202).
- The audio playback device 110 wirelessly receives the moving image data transmitted from the server device 100 (step 204) and stores it in the buffer memory 113 (step 206). Similarly, the video playback device 120 wirelessly receives the moving image data transmitted from the server device 100 (step 203) and stores it in the buffer memory 123 (step 205).
- The audio decoder 114 of the audio playback device 110 extracts the compression-encoded audio data from the moving image data stored in the buffer memory 113 and performs decompression decoding processing on it, while notifying the decoder controller 117 of the time stamp information T2 included in the audio data to be decoded.
- To the notified time stamp information T2, the decoder controller 117 adds the known data processing time t2 required from reading the compression-encoded audio data out of the buffer memory 113, through the extraction, decompression decoding, and D/A conversion processing, to the playback output from the speaker 116, thereby calculating estimated playback time information TT2 = T2 + t2, which is wirelessly transmitted to the server device 100.
- Similarly, the video decoder 124 of the video playback device 120 extracts the compression-encoded video data from the moving image data stored in the buffer memory 123 and performs decompression decoding processing on it, while notifying the decoder controller 127 of the time stamp information T1 included in the video data to be decoded.
- To the notified time stamp information T1, the decoder controller 127 adds the known data processing time t1 required from reading the compression-encoded video data out of the buffer memory 123, through the extraction, decompression decoding, and D/A conversion processing, to the display output from the display 126, thereby calculating estimated playback time information TT1 = T1 + t1, and then wirelessly transmits the estimated playback time information TT1 from the system controller 122 via the wireless transmission/reception circuit 121 to the server device 100 (step 207).
- Video data extraction, decompression decoding, D/A conversion, and reproduction output processing are executed on the read video data (steps 209, 211, 213).
- The estimated reproduction time difference information Δt is wirelessly transmitted to the video playback device 120 and the audio playback device 110 (step 216).
- The system controller 112 of the audio playback device 110 controls the audio decoder 114 so that the time difference becomes small, in accordance with the wirelessly received estimated reproduction time difference information Δt (step 218).
- The decoder controller 127 controls the video decoder 124 so that the time difference becomes small, in accordance with the wirelessly received estimated reproduction time difference information Δt (step 217).
- Each of the decoder controllers 117 and 127 advances the operation clock of the audio decoder 114 or the video decoder 124 by several ppm, or omits the decoding of several packets of data, to control the playback time.
- Conversely, the playback time can be delayed by retarding the operation clock of the audio decoder 114 or the video decoder 124 by several ppm.
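The correction strategy in the two paragraphs above can be sketched as a simple sign-based controller: retard or advance the decoder clock by a few ppm depending on whether the device is ahead of or behind its peer, and skip whole packets when it lags badly. The function names, the 5 ppm magnitude, and the 10 ms packet duration are all illustrative assumptions; the patent only specifies "several ppm" and "several packets".

```python
def clock_correction_ppm(delta_t: float, max_ppm: float = 5.0) -> float:
    """Decoder clock correction in ppm.

    delta_t > 0 means this decoder is ahead of its peer, so retard the
    clock (negative ppm); delta_t < 0 means it is behind, so advance it.
    """
    if delta_t > 0:
        return -max_ppm
    if delta_t < 0:
        return max_ppm
    return 0.0

def packets_to_skip(delta_t: float, packet_duration: float = 0.010) -> int:
    """When the decoder lags, omit whole packets instead of trimming the
    clock. Returns how many packets to drop (0 if ahead or in sync)."""
    if delta_t >= 0:
        return 0
    return round(-delta_t / packet_duration)
```

A ppm-scale correction converges slowly but inaudibly; packet skipping converges immediately but is audible or visible, which is presumably why the patent offers both mechanisms.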
- The present invention can also be applied to a wireless communication system that reproduces audio with two or more left and right speakers, as in a home theater, or to a wireless communication system with a plurality of displays.
- Since the playback times of the video data and the audio data can be controlled, in the case of a wireless communication system including a plurality of speakers, a realistic surround effect can be produced by intentionally delaying the time at which sound is played from each speaker.
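One way the intentional per-speaker delays mentioned above could be chosen is from speaker-to-listener distances, so that sound from every speaker arrives at the listener simultaneously. The patent does not specify how the delays are derived; the distance-based rule, the function name, and the example geometry below are all assumptions.

```python
def surround_delays(speaker_distances_m, speed_of_sound=343.0):
    """Per-speaker playback delays (seconds): speakers closer to the
    listener are intentionally delayed relative to the farthest one, so
    all wavefronts arrive at the listening position at the same time."""
    farthest = max(speaker_distances_m)
    return [(farthest - d) / speed_of_sound for d in speaker_distances_m]

# Hypothetical layout: listener 2.0 m from the front pair, 1.0 m from the rears.
delays = surround_delays([2.0, 2.0, 1.0, 1.0])
```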
- In the above embodiment, the wireless communication system was described using moving image data including video data and audio data, but the present invention is not limited to this and can also be applied to an audio system that does not require video output, such as a mini-component system or a radio receiver.
- FIG. 5 is a sequence diagram showing a synchronized playback output control process executed in a radio communication system according to a modification of the present invention.
- The modification differs from the above embodiment in the following points.
- The server device 100 broadcasts a playback time recording request to the audio playback device 110 and the video playback device 120 at predetermined time intervals. When they receive the playback time recording request from the server device 100, the video playback device 120 and the audio playback device 110 calculate the estimated playback time information TT1 and TT2, respectively.
- the server device 100 wirelessly transmits a reproduction estimated time information transmission request to the audio reproduction device 110 and the video reproduction device 120 at predetermined time intervals.
- When the audio playback device 110 and the video playback device 120 receive the estimated reproduction time information transmission request from the server device 100, each wirelessly transmits the estimated reproduction time information (TT2 or TT1) calculated earlier to the server device 100.
- The server device 100 wirelessly transmits the reproduction time recording request, the estimated reproduction time information transmission request to the audio reproduction device 110, and the estimated reproduction time information transmission request to the video reproduction device 120 at intervals of, for example, 10 seconds.
- (3C) Server device 100 wirelessly transmits a playback time recording request to audio playback device 110 and video playback device 120 (step 301).
- (3C1) The video decoder 124 of the video playback device 120 extracts the compression-encoded video data from the moving image data stored in the buffer memory 123 and performs decompression decoding on it, while notifying the decoder controller 127 of the time stamp information T1 included in the video data being decoded.
- the system controller 122 receives the reproduction time recording request wirelessly transmitted from the server apparatus 100 via the antenna 121A and the wireless transmission / reception circuit 121, and instructs the decoder controller 127 to calculate the estimated reproduction time information TT1.
- When the decoder controller 127 receives the command instructing calculation of the estimated reproduction time information TT1, it calculates, for the time stamp information T1 notified from the video decoder 124, the estimated reproduction time information TT1 = T1 + t1 by adding the known data processing time t1 required to read the compression-encoded video data from the buffer memory 123, extract it, perform decompression decoding and D/A conversion, and display and output it on the display 126, and records TT1 in the temporary memory of the video playback device 120 (step 302).
- The audio decoder 114 of the audio playback device 110 extracts the compression-encoded audio data from the moving image data stored in the buffer memory 113 and performs decompression decoding on it, while notifying the decoder controller 117 of the time stamp information T2 included in the audio data being decoded.
- The system controller 112 receives the reproduction time recording request wirelessly transmitted from the server device 100 via the antenna 111A and the wireless transmission/reception circuit 111, and sends a command instructing the decoder controller 117 to calculate the estimated reproduction time information TT2.
- When the decoder controller 117 receives the command instructing calculation of the estimated reproduction time information TT2, it calculates, for the time stamp information T2 notified from the audio decoder 114, the estimated reproduction time information TT2 = T2 + t2 by adding the known data processing time t2 required to read the compression-encoded audio data from the buffer memory 113, extract it, perform decompression decoding and D/A conversion, and output it from the speaker 116, and records TT2 in the temporary memory of the audio playback device 110 (step 303).
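The record-then-report exchange of this modification (compute and cache TT on a recording request, report the cached value on a later transmission request) can be sketched as a small class. The class, attribute, and method names, and the numeric values, are all illustrative assumptions; only the TT = T + t rule and the two-phase sequence come from the text.

```python
class PlaybackDevice:
    """Sketch of a playback device in the modified protocol: it caches
    TT = T + t in temporary memory when a recording request arrives and
    returns the cached value when a transmission request arrives."""

    def __init__(self, processing_delay: float):
        self.processing_delay = processing_delay   # known t1 or t2
        self.current_timestamp = 0.0               # latest T from the decoder
        self.recorded_tt = None                    # "temporary memory"

    def on_record_request(self) -> None:
        # Steps 302/303: compute TT = T + t and store it.
        self.recorded_tt = self.current_timestamp + self.processing_delay

    def on_transmit_request(self) -> float:
        # Step 305 (and the audio-side counterpart): report the estimate.
        return self.recorded_tt

video = PlaybackDevice(processing_delay=0.040)   # hypothetical t1
audio = PlaybackDevice(processing_delay=0.015)   # hypothetical t2
video.current_timestamp = audio.current_timestamp = 60.0
video.on_record_request(); audio.on_record_request()   # broadcast step 301
delta_t = video.on_transmit_request() - audio.on_transmit_request()
```

Because both devices snapshot their estimates at (nearly) the same broadcast instant, the later, staggered polling does not distort the difference Δt.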
- server apparatus 100 wirelessly transmits a reproduction estimated time information transmission request to video reproduction apparatus 120 (step 304).
- (3D1) The system controller 122 of the video playback device 120 receives the estimated reproduction time information transmission request wirelessly transmitted from the server device 100 via the antenna 121A and the wireless transmission/reception circuit 121, and sends a command instructing the decoder controller 127 to transmit the estimated reproduction time information TT1.
- the decoder controller 127 wirelessly transmits the calculated reproduction estimated time information TT1 to the server apparatus 100 via the system controller 122 and the wireless transmission / reception circuit 121 (step 305).
- Video data extraction, decompression decoding, D/A conversion, and reproduction output processing are executed on the read video data (steps 209, 211, 213).
- server device 100 wirelessly transmits a reproduction estimated time information transmission request to audio reproduction device 110 (step 306).
- The system controller 112 of the audio playback device 110 receives the estimated reproduction time information transmission request wirelessly transmitted from the server device 100 via the antenna 111A and the wireless transmission/reception circuit 111, and sends a command instructing the decoder controller 117 to transmit the estimated reproduction time information TT2.
- the decoder controller 117 wirelessly transmits the calculated reproduction estimated time information TT2 to the server apparatus 100 via the system controller 112 and the wireless transmission / reception circuit 111 (step 208).
- Audio data extraction, decompression decoding, D/A conversion, and reproduction output processing are executed on the read audio data (steps 210, 212, 214).
- the server apparatus 100 broadcasts a reproduction time recording request to the audio reproduction apparatus 110 and the video reproduction apparatus 120.
- The present invention is not limited to this; the server device 100 may instead transmit the playback time recording request individually to the audio playback device 110 and the video playback device 120.
- In this case, the server device 100 records the difference between the times at which the reproduction time recording request was wirelessly transmitted to each device, and adds it to or subtracts it from the estimated reproduction time difference information Δt.
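Because individually transmitted recording requests reach the two devices at slightly different moments, the server above folds the send-time offset into Δt. A sketch of that correction, with all names and example values assumed:

```python
def compensated_delta_t(tt1: float, tt2: float,
                        sent_to_video: float, sent_to_audio: float) -> float:
    """Estimated playback time difference corrected for the offset between
    the moments the recording requests were sent to each device.

    tt1, tt2       -- estimates reported by the video and audio devices
    sent_to_video  -- server clock time the request was sent to the video device
    sent_to_audio  -- server clock time the request was sent to the audio device
    """
    request_offset = sent_to_video - sent_to_audio
    # Subtract the head start one device's snapshot got over the other's.
    return (tt1 - tt2) - request_offset
```

With a broadcast request the offset is zero and this reduces to the plain difference TT1 − TT2.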
- As described above, according to the present invention, in a wireless communication system comprising a server device that transmits moving image data, a video playback device that reproduces and outputs video data on a display, and an audio playback device that reproduces and outputs audio data from a speaker, synchronized playback of the video data and the audio data can be performed with higher accuracy than in the prior art.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Computer Networks & Wireless Communication (AREA)
- Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
- Mobile Radio Communication Systems (AREA)
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US11/793,009 US20080013512A1 (en) | 2004-12-16 | 2005-12-06 | Wireless Communication System |
JP2006548777A JPWO2006064689A1 (ja) | 2004-12-16 | 2005-12-06 | 無線通信システム |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2004-364056 | 2004-12-16 | ||
JP2004364056 | 2004-12-16 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2006064689A1 true WO2006064689A1 (ja) | 2006-06-22 |
Family
ID=36587746
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2005/022379 WO2006064689A1 (ja) | 2004-12-16 | 2005-12-06 | 無線通信システム |
Country Status (4)
Country | Link |
---|---|
US (1) | US20080013512A1 (ja) |
JP (1) | JPWO2006064689A1 (ja) |
CN (1) | CN101080926A (ja) |
WO (1) | WO2006064689A1 (ja) |
Families Citing this family (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9124769B2 (en) | 2008-10-31 | 2015-09-01 | The Nielsen Company (Us), Llc | Methods and apparatus to verify presentation of media content |
JP2011216178A (ja) * | 2010-03-18 | 2011-10-27 | Panasonic Corp | 再生装置、再生システム及びサーバ |
US9088818B2 (en) * | 2011-06-21 | 2015-07-21 | Harman International Industries, Incorporated | Adaptive media delay matching |
KR20180068069A (ko) * | 2016-12-13 | 2018-06-21 | 삼성전자주식회사 | 전자 장치 및 이의 제어 방법 |
CN111132087B (zh) * | 2018-10-31 | 2023-06-20 | 阿尔卑斯通信器件技术(上海)有限公司 | 通信装置、车载音频装置及车载音频系统 |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2002084261A (ja) * | 2000-09-07 | 2002-03-22 | Sony Corp | 無線受信装置及び方法 |
JP2002094950A (ja) * | 2000-09-13 | 2002-03-29 | Matsushita Electric Ind Co Ltd | 映像音声伝送装置および映像符号化装置および音声符号化装置および多重伝送装置 |
WO2003086003A1 (fr) * | 2002-04-04 | 2003-10-16 | Fujitsu Limited | Organe de commande de synchronisation et procede de commande dans un reseau radio |
JP2004193868A (ja) * | 2002-12-10 | 2004-07-08 | Alps Electric Co Ltd | 無線送受信システム及び無線送受信方法 |
JP2004254149A (ja) * | 2003-02-21 | 2004-09-09 | Nippon Telegr & Teleph Corp <Ntt> | データ伝送制御方法およびシステム |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3578069B2 (ja) * | 2000-09-13 | 2004-10-20 | 日本電気株式会社 | 長時間用画像・音声圧縮装置及びその方法 |
KR100782234B1 (ko) * | 2001-10-08 | 2007-12-05 | 엘지전자 주식회사 | 피브이알에서의 방송장애구간 자동처리 방법 |
EP1615433A4 (en) * | 2003-03-19 | 2010-05-26 | Panasonic Corp | DATA PROCESSING DEVICE |
WO2005109815A1 (ja) * | 2004-05-10 | 2005-11-17 | Fujitsu Limited | 通信装置、通信方法およびプログラム |
- 2005-12-06 JP JP2006548777A patent/JPWO2006064689A1/ja active Pending
- 2005-12-06 WO PCT/JP2005/022379 patent/WO2006064689A1/ja not_active Application Discontinuation
- 2005-12-06 CN CNA2005800431660A patent/CN101080926A/zh active Pending
- 2005-12-06 US US11/793,009 patent/US20080013512A1/en not_active Abandoned
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2010541354A (ja) * | 2007-09-28 | 2010-12-24 | トムソン ライセンシング | 他の装置に送信された受信ストリームを同期可能な通信技術 |
US9819839B2 (en) | 2012-10-30 | 2017-11-14 | Mitsubishi Electric Corporation | Audio/video reproduction system, video display device, and audio output device for synchronizing decoding of video frames by the video display device to decoding of audio frames by the audio output device |
JP2017528009A (ja) * | 2015-06-17 | 2017-09-21 | シャオミ・インコーポレイテッド | マルチメディアファイルを再生するための方法及び装置 |
US9961393B2 (en) | 2015-06-17 | 2018-05-01 | Xiaomi Inc. | Method and device for playing multimedia file |
JP2018502533A (ja) * | 2015-10-29 | 2018-01-25 | シャオミ・インコーポレイテッド | メディア同期方法、装置、プログラム及び記録媒体 |
Also Published As
Publication number | Publication date |
---|---|
CN101080926A (zh) | 2007-11-28 |
US20080013512A1 (en) | 2008-01-17 |
JPWO2006064689A1 (ja) | 2008-06-12 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2006064689A1 (ja) | 無線通信システム | |
JP4182437B2 (ja) | オーディオビデオ同期システム及びモニター装置 | |
US20080198930A1 (en) | Image information transmission system, image information transmitting apparatus, image information receiving apparatus, image information transmission method, image information transmitting method, and image information receiving method | |
US6862045B2 (en) | Moving image decoding and reproducing apparatus, moving image decoding and reproducing method, time control method, computer program product for decoding and reproducing moving image and multimedia information receiving apparatus | |
JP2006186580A (ja) | 再生装置およびデコード制御方法 | |
JP2007274607A (ja) | デジタル信号処理装置及びデータストリーム処理方法 | |
JP2007095163A (ja) | マルチメディア符号化データ分離伝送装置 | |
JP4359024B2 (ja) | 同期制御方法と装置およびそれを用いた同期再生装置およびテレビジョン受信装置 | |
JP4096915B2 (ja) | デジタル情報再生装置及び方法 | |
JP4564350B2 (ja) | 信号処理装置 | |
EP1533925A2 (en) | Synchronization of a stream data communication system | |
JP2006019888A (ja) | 記録装置および記録制御方法 | |
JP4823960B2 (ja) | 再生制御方法および受信装置 | |
JP2011004015A (ja) | 再生装置およびコンテンツ再生方法 | |
JP2004172864A (ja) | 字幕表示制御装置 | |
JP4953707B2 (ja) | デジタル放送受信機 | |
JP2014127213A (ja) | 同期再生制御装置及び同期再生制御方法 | |
JP2001245292A (ja) | データ受信装置、データ送受信システム及び記録媒体 | |
JP5857840B2 (ja) | エンコーダおよび制御方法 | |
JP6596363B2 (ja) | 時刻マッピング情報生成装置、同期再生システム、時刻マッピング情報生成方法及び時刻マッピング情報生成プログラム | |
JP3773892B2 (ja) | デジタル記録再生装置 | |
JP2008136027A (ja) | 映像音声再生システムと映像再生装置 | |
JP4127799B2 (ja) | 復号再生装置 | |
KR100527427B1 (ko) | 고출력 및 고음질의 오디오를 재생하는 동영상 재생장치및 방법 | |
JP2005267172A (ja) | コンテンツ受信システム、コンテンツ受信装置および方法、記録媒体、並びにプログラム |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AK | Designated states |
Kind code of ref document: A1 Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KM KN KP KR KZ LC LK LR LS LT LU LV LY MA MD MG MK MN MW MX MZ NA NG NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SM SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW |
|
AL | Designated countries for regional patents |
Kind code of ref document: A1 Designated state(s): GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LT LU LV MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
WWE | Wipo information: entry into national phase |
Ref document number: 2006548777 Country of ref document: JP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 200580043166.0 Country of ref document: CN |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
WWE | Wipo information: entry into national phase |
Ref document number: 11793009 Country of ref document: US |
|
WWP | Wipo information: published in national office |
Ref document number: 11793009 Country of ref document: US |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 05814214 Country of ref document: EP Kind code of ref document: A1 |
|
WWW | Wipo information: withdrawn in national office |
Ref document number: 5814214 Country of ref document: EP |