WO2009098669A2 - Audio streaming system and method for continuously synchronizing streamed audio data files - Google Patents

Audio streaming system and method for continuously synchronizing streamed audio data files

Info

Publication number
WO2009098669A2
Authority
WO
WIPO (PCT)
Prior art keywords
unit
master
audio data
slave unit
audio
Prior art date
Application number
PCT/IB2009/050501
Other languages
French (fr)
Other versions
WO2009098669A3 (en)
Inventor
Thomas J. Merritt
Winthrop L Saville
Anil Shetty
Andre Mccurdy
Parag Patel
Original Assignee
Nxp B.V.
Priority date
Filing date
Publication date
Application filed by Nxp B.V.
Publication of WO2009098669A2
Publication of WO2009098669A3

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04HBROADCAST COMMUNICATION
    • H04H40/00Arrangements specially adapted for receiving broadcast information
    • H04H40/18Arrangements characterised by circuits or components specially adapted for receiving
    • H04H40/27Arrangements characterised by circuits or components specially adapted for receiving specially adapted for broadcast systems covered by groups H04H20/53 - H04H20/95
    • H04H40/90Arrangements characterised by circuits or components specially adapted for receiving specially adapted for broadcast systems covered by groups H04H20/53 - H04H20/95 specially adapted for satellite broadcast receiving
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04HBROADCAST COMMUNICATION
    • H04H60/00Arrangements for broadcast applications with a direct linking to broadcast information or broadcast space-time; Broadcast-related systems
    • H04H60/09Arrangements for device control with a direct linkage to broadcast information or to broadcast space-time; Arrangements for control of broadcast-related services
    • H04H60/13Arrangements for device control affected by the broadcast information
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L65/00Network arrangements, protocols or services for supporting real-time applications in data packet communication
    • H04L65/60Network streaming of media packets
    • H04L65/61Network streaming of media packets for supporting one-way streaming services, e.g. Internet radio
    • H04L65/611Network streaming of media packets for supporting one-way streaming services, e.g. Internet radio for multicast or broadcast
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04JMULTIPLEX COMMUNICATION
    • H04J3/00Time-division multiplex systems
    • H04J3/02Details
    • H04J3/06Synchronising arrangements
    • H04J3/0635Clock or time synchronisation in a network
    • H04J3/0682Clock or time synchronisation in a network by delay compensation, e.g. by compensation of propagation delay or variations thereof, by ranging

Definitions

  • Audio streaming systems allow music and other audio data to be streamed to different remote locations.
  • the audio data can be stored at a central location, which eliminates the need to transfer the audio data and/or devices with the audio data to the different locations.
  • the audio data streamed to different remote locations can be the same audio data or different audio data.
  • Some audio streaming systems require the use of a personal computer (PC) as a base station to store and stream audio data to remote stations with speakers to play the streamed audio data.
  • some audio streaming systems include a stand-alone base station that can store and stream audio data to remote stations without the use of a PC.
  • the PC or the base station functions as an audio server to store the audio data and to provide the audio data as streamed data to any of the remote stations so that the streamed audio data can be played at the remote stations.
  • a popular feature for audio streaming systems is the ability to simultaneously play the same audio data file on different remote stations, as well as the base station.
  • This feature will be referred to herein as a broadcast mode, which requires the "playing" stations to be synchronized so that all the playing stations begin playing a particular audio data file at the same moment in time.
  • one or more of the playing stations may become out of synchronization with the other playing stations, especially if the audio data file is particularly long, e.g., longer than four minutes.
  • An audio streaming system and method for continuously synchronizing streamed audio data files uses timing information from a master unit of the system at a slave unit of the system to synchronize the timing of a streamed audio data file being played at the slave unit with the timing of the same streamed audio data file being played at the master unit as the streamed audio data file is being played at the slave unit. Since the streamed audio data file is synchronized at the slave unit as the streamed audio data file is being played, the out-of-synchronization issue for long audio data files during a broadcast mode is addressed.
  • An audio streaming system in accordance with an embodiment of the invention comprises a master unit and a slave unit.
  • the master unit is configured to receive and play streamed audio data files.
  • the master unit is further configured to send a broadcast play message to play a streamed audio data file at a particular time to initiate a broadcast mode.
  • the slave unit is configured to receive and play the streamed audio data files.
  • the slave unit is further configured to receive the broadcast play message from the master unit and timing information from the master unit as the streamed audio data file is simultaneously being played at the master unit and the slave unit.
  • the slave unit is further configured to synchronize the timing of the streamed audio data file being played at the slave unit with the timing of the streamed audio data file being played at the master unit using the timing information from the master unit as the streamed audio data file is being played at the slave unit.
  • a method for continuously synchronizing streamed audio data files during a broadcast mode comprises receiving a broadcast play message from a master unit of an audio streaming system at a slave unit of the audio streaming system to play a streamed audio data file at a particular time, receiving timing information from the master unit at the slave unit as the streamed audio data file is simultaneously being played at the master unit and the slave unit, and synchronizing the timing of the streamed audio data file being played at the slave unit with the timing of the streamed audio data file being played at the master unit using the timing information from the master unit as the streamed audio data file is being played at the slave unit.
  • FIG. 1 shows an audio streaming system in accordance with an embodiment of the invention.
  • Fig. 2 is a block diagram of an audio streaming base unit of the audio streaming system of Fig. 1 in accordance with an embodiment of the invention.
  • Fig. 3 is a block diagram of a streamed audio playing satellite unit of the audio streaming system of Fig. 1 in accordance with an embodiment of the invention.
  • Fig. 4 is a block diagram of the synchronizing digital audio player of the streamed audio playing satellite unit of Fig. 3 in accordance with an embodiment of the invention.
  • Fig. 5 is a flow diagram of a process of computing the relationship between master counter values and local counter values performed by a stream synchronizer of the synchronizing digital audio player of Fig. 4 in accordance with an embodiment of the invention.
  • Fig. 6 is a process flow diagram of a method for continuously synchronizing streamed audio data files during a broadcast mode in accordance with an embodiment of the invention.
  • the audio streaming system 100 includes an audio streaming base unit 102 and a number of streamed audio playing satellite units 104A-104E, which are connected to the audio streaming base unit.
  • the connections between the audio streaming base unit 102 and the streamed audio playing satellite units 104A-104E are wireless connections, such as Wi-Fi connections.
  • the connections between the audio streaming base unit 102 and the streamed audio playing satellite units 104A-104E may be wired connections.
  • the audio streaming system 100 may include any number of audio streaming base units and streamed audio playing satellite units.
  • the audio streaming base unit 102 is configured to stream audio data files, which are stored in the audio streaming base unit, to one or more of the streamed audio playing satellite units 104A-104E without the use of a personal computer (PC).
  • the audio streaming base unit 102 is a PC-independent audio streaming base unit.
  • the audio data files may be any digital audio segments, such as digital songs or segments of digital audio recordings.
  • the audio data files can be any type of audio files, such as Windows Media Audio (WMA) files, MPEG Layer-3 (MP3) files and Pulse Code Modulation (PCM) files.
  • the audio streaming base unit 102 is further configured to communicate with the streamed audio playing satellite units 104A-104E by transmitting and receiving messages.
  • the audio streaming system 100 may include a PC 106, which performs functions similar to the audio streaming base unit 102.
  • the audio streaming base unit 102 may be replaced by the PC 106.
  • the audio streaming base unit 102 and the PC 106 may both operate to share the audio data files stored in the audio streaming base unit and the PC.
  • the audio streaming base unit 102 may be configured to receive and play streamed audio data files from the PC 106.
  • the audio streaming base unit 102 may access the Internet through the PC 106.
  • the streamed audio playing satellite units 104A-104E are configured to receive and play the audio data files streamed from the audio streaming base unit 102 or the PC 106.
  • the streamed audio playing satellite units 104A-104E are also configured to transmit and receive messages to communicate with the audio streaming base unit 102.
  • the audio streaming system 100 uses at least one synchronization technique to synchronize a streamed audio data file being played on multiple units of the audio streaming base unit 102 and the streamed audio playing satellite units 104A-104E during a broadcast mode, where the same streamed audio data file is played on multiple units.
  • even if the streamed audio data file is extremely long, e.g., ten minutes or more, the streamed audio data file being played on different units of the audio streaming system 100 will be synchronized for the entire play duration of the audio data file.
  • one of the participating or playing units of the audio streaming system 100 will function as a master unit and the rest of the participating units will function as slave units.
  • the master unit is the participating unit with which the slave units are synchronized with respect to the streamed audio files being played on the participating units.
  • any of the participating units may elect to exit the broadcast mode, including the master unit, without terminating the broadcast mode.
  • the role of the master unit can be assigned to one of the slave units so that the former master unit can exit the broadcast mode without terminating the broadcast mode.
  • the audio streaming base unit 102 includes an antenna 202, a transmitter 204 and a receiver 206.
  • the transmitter 204 and the receiver 206 are connected to the antenna 202.
  • the transmitter 204 is configured to transmit outgoing signals, including messages and audio data, to the streamed audio playing satellite units 104A-104E and/or the PC 106 using the antenna 202.
  • the receiver 206 is configured to receive incoming signals, including messages and audio data, from the streamed audio playing satellite units 104A-104E and/or the PC 106 using the antenna 202.
  • the transmitter 204 and the receiver 206 are configured to transmit and receive signals using Wi-Fi technology.
  • the transmitter 204 and the receiver 206 may be configured to transmit and receive signals using other known technology.
  • the audio streaming base unit 102 also includes an FM tuner 208, a CD player 210, speakers 212, a speaker driver 214, a storage device 216, a processor 218, a server processor 220 and a synchronizing digital audio player 224.
  • the FM tuner 208 and CD player 210 are well known components that are commonly found in consumer audio products. Thus, these components are not described here in detail.
  • the FM tuner 208 allows the audio streaming base unit 102 to receive and play radio signals on the speakers 212 via the speaker driver 214.
  • the speaker driver 214 may include, among other components, a first-in first-out (FIFO) buffer and a digital-to-analog-converter (DAC), to receive digital audio data and output analog signals to drive the speakers 212 according to the received digital audio data.
  • the received radio signals may also be recorded in the storage device 216, which can be any type of a storage device, such as a computer hard drive.
  • the CD player 210 allows the audio streaming base unit 102 to play audio data from CDs (e.g., audio CDs, CD-Rs, CD-RWs and MP3 WMA-CDs) on the speakers 212 or to save the audio data from the CDs in the storage device 216.
  • the processor 218 is connected to various components of the audio streaming base unit 102 to control those components.
  • the processor 218 is connected to the transmitter 204 and the receiver 206 to control the transmission and reception of signals, including messages and audio data.
  • the processor 218 is also connected to the FM tuner 208 and the CD player 210 to control these components.
  • the processor 218 is also connected to the storage device 216 to access the data, including audio data files, stored in the storage device.
  • the processor 218 is also connected to the synchronizing digital audio player 224 to transmit and receive data to and from the synchronizing digital audio player.
  • the processor 218 may be a general-purpose digital processor such as a microprocessor or microcontroller. In other embodiments, the processor 218 may be a special-purpose processor such as a digital signal processor. In other embodiments, the processor 218 may be another type of controller or a field programmable gate array (FPGA).
  • the server processor 220 is connected to the processor 218, and thus, can communicate with the processor 218.
  • the server processor 220 is also connected to the storage device 216 to access the data, including audio data files, stored in the storage device.
  • the server processor 220, along with the storage device 216, functions as an audio server for the audio streaming system 100 to stream audio files to any of the streamed audio playing satellite units 104A-104E, as well as to the synchronizing digital audio player 224 of the audio streaming base unit 102.
  • the server processor 220 can be any type of a processor, such as a general-purpose digital processor, a digital signal processor, a controller or an FPGA.
  • although the audio streaming base unit 102 is shown and described as having two processors, the audio streaming base unit may include any number of processors in other embodiments.
  • the audio streaming base unit 102 further includes a clock 222 and the synchronizing digital audio player 224.
  • the clock 222 is configured to generate a clock signal, which is used by various components of the audio streaming base unit 102, including the synchronizing digital audio player 224.
  • the clock 222 may be a crystal oscillator that provides the clock signal at a predefined frequency.
  • the synchronizing digital audio player 224 is configured to play received digital audio files on the speakers 212 via the speaker driver 214.
  • the synchronizing digital audio player 224 is configured to synchronize an audio data file being played on the speakers 212 with a master unit if the audio streaming base unit 102 is functioning as a slave unit.
  • the audio streaming base unit 102 includes other conventional components, which are not described herein so as to not obscure the inventive features of the audio streaming base unit.
  • Fig. 3 shows some of the components of the streamed audio playing satellite unit 104A, which is representative of the other streamed audio playing satellite units 104B-104E, in accordance with an embodiment of the invention.
  • the streamed audio playing satellite unit 104A includes an antenna 302, a transmitter 304 and a receiver 306.
  • the transmitter 304 and the receiver 306 are connected to the antenna 302.
  • the transmitter 304 is configured to transmit outgoing signals, including messages, to the audio streaming base unit 102, the other streamed audio playing satellite units 104B-104E and/or the PC 106 using the antenna 302.
  • the receiver 306 is configured to receive incoming signals, including messages, from the audio streaming base unit 102, the other streamed audio playing satellite units 104B-104E and/or the PC 106 using the antenna 302.
  • the transmitter 304 and the receiver 306 are configured to transmit and receive signals using Wi-Fi technology.
  • the transmitter 304 and the receiver 306 may be configured to transmit and receive signals using other known technology.
  • the streamed audio playing satellite unit 104A also includes an FM tuner 308, speakers 312, a speaker driver 314, a processor 318, a clock 322 and a synchronizing digital audio player 324.
  • the FM tuner 308 allows the streamed audio playing satellite unit 104A to receive and play radio signals on the speakers 312 via the speaker driver 314.
  • the speaker driver 314 may include, among other components, a first-in first-out (FIFO) buffer and a digital-to-analog-converter (DAC), to receive digital audio data and output analog signals to drive the speakers 312 according to the received digital audio data.
  • the processor 318 is connected to various components of the streamed audio playing satellite unit 104A to control those components.
  • the processor 318 is connected to the transmitter 304 and the receiver 306 to control the transmission and reception of signals, including messages.
  • the processor 318 is also connected to the FM tuner 308 to control the FM tuner.
  • the processor 318 is also connected to the synchronizing digital audio player 324 to transmit and receive data to and from the synchronizing digital audio player.
  • the processor 318 may be a general-purpose digital processor such as a microprocessor or microcontroller. In other embodiments, the processor 318 may be a special- purpose processor such as a digital signal processor. In other embodiments, the processor 318 may be another type of controller or a field programmable gate array (FPGA).
  • the streamed audio playing satellite unit 104A further includes a clock 322 and the synchronizing digital audio player 324.
  • the clock 322 is configured to generate a clock signal, which is used by various components of the streamed audio playing satellite unit 104A, including the synchronizing digital audio player 324.
  • the clock 322 may be a crystal oscillator that provides the clock signal at a predefined frequency.
  • the synchronizing digital audio player 324 is configured to play received digital audio files on the speakers 312 via the speaker driver 314.
  • the synchronizing digital audio player 324 is similar to the synchronizing digital audio player 224 of the audio streaming base unit 102, and thus, during a broadcast mode, the synchronizing digital audio player 324 is configured to synchronize an audio data file being played on the speakers 312 with a master unit if the streamed audio playing satellite unit 104A is functioning as a slave unit. In an embodiment, the synchronizing digital audio player 324 is identical to the synchronizing digital audio player 224 of the audio streaming base unit 102.
  • the streamed audio playing satellite unit 104A includes other conventional components, which are not described herein so as to not obscure the inventive features of the streamed audio playing satellite unit 104A.
  • Fig. 4 shows the components of the synchronizing digital audio player 324, which is representative of the other synchronizing digital audio players in the streamed audio playing satellite units 104B-104E and the audio streaming base unit 102, in accordance with an embodiment of the invention.
  • the synchronizing digital audio player 324 includes a first-in first- out (FIFO) buffer 426, an audio decoder 428, a resampler 430, a counter 432 and a stream synchronizer 434.
  • the FIFO buffer 426 is configured to buffer a streamed audio data file received at the streamed audio playing satellite unit 104A.
  • the streamed audio data file is compressed data.
  • the FIFO buffer 426 is shown to be connected to the processor 318. However, in other embodiments, the FIFO buffer 426 may be connected directly to the receiver 306 to receive the streamed audio data file.
  • the audio decoder 428 is configured to receive the streamed audio data file from the FIFO buffer 426 and to decode the audio data file, which is transmitted to the resampler 430.
  • the resampler 430 is configured to selectively resample a block of the decoded audio data file from the audio decoder 428 to add or drop samples according to control signals from the stream synchronizer 434.
  • the resampled data is then transmitted to the speaker driver 314 to play the resampled data on the speakers 312.
  • the resampling of the decoded audio data allows the audio data file being played at the streamed audio playing satellite unit 104A during a broadcast mode to become synchronized with the same audio data file being played at a master unit, which can be the audio streaming base unit 102, one of the other streamed audio playing satellite units 104B-104E or the PC 106.
  • the counter 432 is configured to generate counter values using the clock signal from the clock 322.
  • the counter values are provided to the stream synchronizer 434.
  • the stream synchronizer 434 is configured to use these local counter values to synchronize the audio data file being played at the streamed audio playing satellite unit 104A with the same audio data file being played at a master unit according to a synchronization technique, which will sometimes be referred to herein as a self-regulating roundtrip delay technique.
  • in this synchronization technique, local and master counter values and roundtrip flight times are used to define a relationship between master counter values and local counter values, which in effect is the relationship between the master clock and the local clock. This relationship between master counter values and local counter values is then used for synchronization, as described in more detail below.
  • an initial local counter value from the local counter 432 is read.
  • the initial local counter value is sent to the master unit.
  • a copy of the initial local counter value and a master counter value from the master unit is received.
  • the master counter value is the counter value at the time when a request message containing the initial local counter value is received at the master unit.
  • the initial local counter value, the master counter value and a final local counter value are recorded.
  • the final local counter value is the local counter value at the time when the master value and the copy of the initial local counter value are received at the streamed audio playing satellite unit 104A.
  • the roundtrip flight time is the duration of time between transmission of a message containing the initial local counter value from a slave unit and receipt of a message containing the copy of the initial local counter value and the master counter value from the master unit.
  • estimated local counter value = (initial local counter value + final local counter value) / 2.
  • the local counter value that corresponds to the moment in time when the master counter value was read at the master unit is estimated using the above equation, which assumes that sending and receiving delays are symmetric.
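The counter exchange and midpoint estimate described above can be sketched as follows. This is an illustrative reconstruction, not code from the patent: `read_local_counter`, `send_to_master` and `receive_from_master` are hypothetical placeholders for the slave unit's counter read and its message exchange with the master unit.

```python
def estimate_local_at_master_read(read_local_counter, send_to_master,
                                  receive_from_master):
    """Estimate the local counter value at the moment the master unit read
    its counter, assuming symmetric send and receive delays."""
    initial_local = read_local_counter()
    send_to_master(initial_local)                    # request carries our counter
    echoed_local, master_counter = receive_from_master()
    final_local = read_local_counter()               # counter when reply arrives
    assert echoed_local == initial_local             # match reply to our request
    roundtrip = final_local - initial_local          # roundtrip flight time
    # Midpoint estimate: the master read its counter halfway through the trip.
    estimated_local = (initial_local + final_local) / 2
    return master_counter, estimated_local, roundtrip
```

Under the symmetry assumption, the estimate is off by at most half the roundtrip flight time, which is why shorter roundtrips yield better estimates.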
  • a new frequency factor and a new delay factor are calculated to determine the current relationship between master counter values and local counter values.
  • local counter value = (master counter value * frequency factor) + delay factor.
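Given two (master counter value, estimated local counter value) pairs collected by the exchange above, the frequency and delay factors can be derived by a simple linear fit. A minimal sketch with hypothetical function names; the patent does not specify how the factors are computed from the measurements:

```python
def fit_clock_relationship(m0, l0, m1, l1):
    """Solve local = master * frequency_factor + delay_factor from two
    (master counter, estimated local counter) measurement pairs."""
    frequency_factor = (l1 - l0) / (m1 - m0)   # relative clock rate
    delay_factor = l0 - m0 * frequency_factor  # fixed offset between counters
    return frequency_factor, delay_factor

def master_to_local(master_counter, frequency_factor, delay_factor):
    """Map a master counter value to the corresponding local counter value."""
    return master_counter * frequency_factor + delay_factor
```

For example, the pairs (1000, 2000) and (2000, 3001) give a frequency factor of 1.001 (the local clock runs 0.1% fast relative to the master) and a delay factor of 999.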
  • when a broadcast mode is initiated, the master unit sends the title of an audio data file to be played and the master counter value at which the master unit will begin to play the audio data file as a broadcast play message to at least one slave unit.
  • the stream synchronizer 434 will compute the current relationship between master counter value and local counter value. Using this computed relationship, the stream synchronizer 434 will transmit a control signal to the resampler 430 to synchronize the audio data being played at the streamed audio playing satellite unit 104A with the audio data being played at the master unit.
  • if the two clocks of the master unit and the streamed audio playing satellite unit 104A each have a maximum error of 50 parts per million (ppm), the two clocks have a maximum relative drift of 100 ppm, which means the two clocks will drift by a maximum of one sample for every ten thousand samples.
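The one-in-ten-thousand figure follows directly from the ppm arithmetic: two clocks each within 50 ppm of nominal can differ from each other by up to 100 ppm, i.e., one part in 10,000.

```python
# Worst-case relative drift of two clocks, each accurate to within 50 ppm.
max_clock_error_ppm = 50
max_relative_drift_ppm = 2 * max_clock_error_ppm   # clocks erring in opposite directions
samples_per_drifted_sample = 1_000_000 // max_relative_drift_ppm
print(samples_per_drifted_sample)  # 10000: at most one sample of drift per ten thousand samples
```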
  • the computed local counter value at which the next sample will be played is compared with the actual local counter value at which the next sample will be played.
  • the stream synchronizer 434 will transmit a control signal that instructs the resampler 430 to drop one sample during resampling of a block of audio data output (e.g., 1024) so that the block has one less sample (e.g., 1023) in order for the audio data file being played at the streamed audio playing satellite unit 104A to catch up to the audio data file being played at the master unit.
  • the stream synchronizer 434 will transmit a control signal that instructs the resampler 430 to add one sample during resampling of the block of audio data output so that the block has one more sample (e.g., 1025) in order for the audio data file being played at the streamed audio playing satellite unit 104A to slow down to match the audio data file being played at the master unit.
  • the stream synchronizer 434 will transmit a control signal that instructs the resampler 430 to not change the number of samples in the block of audio data output. In this fashion, the audio data files being simultaneously played at a master unit and one or more slave units can be synchronized as the audio data files are being played.
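The drop/add/keep decision above reduces to a single comparison per block. This is an illustrative sketch, assuming a one-sample adjustment per block as in the examples given; `resample_block` and its argument names are hypothetical, and a real resampler would interpolate rather than simply trim or repeat a sample:

```python
def resample_block(block, expected_local, actual_local):
    """Drop, add, or keep a sample so the slave tracks the master.

    expected_local: local counter value at which the next sample should play,
                    per the computed master-to-local counter relationship.
    actual_local:   local counter value at which it will actually play.
    """
    if actual_local > expected_local:
        # Slave is behind: drop one sample to catch up (e.g., 1024 -> 1023).
        return block[:-1]
    if actual_local < expected_local:
        # Slave is ahead: repeat one sample to slow down (e.g., 1024 -> 1025).
        return block + block[-1:]
    return block  # in sync: leave the block unchanged
```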
  • although the process of resampling has been described herein as dropping or adding one sample for a block of audio data, the resampling process may involve dropping or adding more than one sample for a block of audio data in other embodiments.
  • the synchronization process is performed periodically as a streamed audio data file is being played.
  • the audio data output may be synchronized at the slave units with the master unit every second as an audio data file is being simultaneously played at the slave units and the master unit.
  • the audio data output may be synchronized at the slave units with the master unit at a different interval or even at varying intervals.
  • the synchronization at a slave unit may be performed more frequently when the slave unit is initially powered up. However, as time passes, the frequency of the synchronization at the slave unit may slow down to minimize interruptions on the overall system.
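The varying-interval behavior described above amounts to a simple back-off schedule. A minimal sketch, where the interval values and the warm-up period are assumptions for illustration only:

```python
def next_sync_interval_s(seconds_since_power_up,
                         fast_interval=1.0, slow_interval=10.0,
                         warmup_s=60.0):
    """Return how long to wait before the next synchronization pass."""
    if seconds_since_power_up < warmup_s:
        return fast_interval   # sync often right after power-up to settle quickly
    return slow_interval       # steady state: sync less often to minimize interruptions
```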
  • the audio streaming base unit 102 and the streamed audio playing satellite units 104A-104E may be configured to use more than one technique for synchronization by the respective stream synchronizers.
  • the stream synchronizer of a slave unit may initially use the self-regulating roundtrip delay technique so that the synchronization can be performed quickly, e.g., after being powered up. Then, after some time has passed, the stream synchronizer of the slave unit may switch to a more accurate synchronization technique, such as a network time protocol (NTP) technique.
  • the stream synchronizer of the slave unit may switch to one or more other known techniques for synchronization.
  • any of the participating units of the audio streaming system during a broadcast mode may elect to exit without terminating the broadcast mode. That is, the remaining participating units will still be operating in the broadcast mode.
  • the election to exit the broadcast mode may involve sending a message to the master unit to inform the master unit of the election.
  • the master role of the master unit is assigned to another unit, such as one of the participating slave units, without terminating the broadcast mode. Reassigning the master role may involve exchanging messages between the current master unit and one or more potential master units of the audio streaming system 100.
  • the master role can be assigned to any unit of the audio streaming system 100 because this function is independent of the function of the streaming audio server, which may be performed by the server processor 220 and the storage device 216 of the audio streaming base unit 102 or the PC 106.
  • the designated audio server streams the audio data file to the participating units, regardless of whether these units are functioning as the master or slave units for the broadcast mode.
  • the role of the master unit is simply to provide play information of an audio data file, which is to be played during the broadcast mode, to the slave units and to play the streamed audio file from the audio server at the designated time.
  • the play information includes the title of the streamed audio data file to be played and the master counter value at which the master unit will play the streamed audio data file.
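The play information above can be pictured as a two-field message. The structure and field names below are illustrative assumptions, not a format disclosed by the patent:

```python
from dataclasses import dataclass

@dataclass
class BroadcastPlayMessage:
    """Hypothetical shape of the broadcast play message sent by the master unit."""
    title: str                  # title of the streamed audio data file to play
    start_master_counter: int   # master counter value at which play begins

# Example: master announces a file and the counter value at which it will start.
msg = BroadcastPlayMessage(title="example-song", start_master_counter=123456)
```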
  • the role of a slave unit is to play the streamed audio data files from the audio server in synchronization with the streamed audio data files being played at the master unit.
  • a method for continuously synchronizing streamed audio data files during a broadcast mode in accordance with an embodiment of the invention is described with reference to a process flow diagram of Fig. 6.
  • a broadcast play message from a master unit of an audio streaming system is received at a slave unit of the audio streaming system to play a streamed audio data file at a particular time.
  • timing information from the master unit is received at the slave unit as the streamed audio data file is simultaneously being played at the master unit and the slave unit.
  • the timing of the streamed audio data file being played at the slave unit is synchronized with the timing of the streamed audio data file being played at the master unit using the timing information from the master unit as the streamed audio data file is being played at the slave unit.

Abstract

An audio streaming system and method for continuously synchronizing streamed audio data files uses timing information from a master unit of the system at a slave unit of the system to synchronize the timing of a streamed audio data file being played at the slave unit with the timing of the same streamed audio data file being played at the master unit as the streamed audio data file is being played at the slave unit.

Description

AUDIO STREAMING SYSTEM AND METHOD FOR CONTINUOUSLY SYNCHRONIZING STREAMED AUDIO DATA FILES
Audio streaming systems allow music and other audio data to be streamed to different remote locations. Thus, the audio data can be stored at a central location, which eliminates the need to transfer the audio data and/or devices with the audio data to the different locations. Depending on the audio streaming system, the audio data streamed to different remote locations can be the same audio data or different audio data. Some audio streaming systems require the use of a personal computer (PC) as a base station to store and stream audio data to remote stations with speakers to play the streamed audio data. However, some audio streaming systems include a stand-alone base station that can store and stream audio data to remote stations without the use of a PC. Thus, depending on the audio streaming system, the PC or the base station functions as an audio server to store the audio data and to provide the audio data as streamed data to any of the remote stations so that the streamed audio data can be played at the remote stations.
A popular feature for audio streaming systems is the ability to simultaneously play the same audio data file on different remote stations, as well as the base station. This feature will be referred to herein as a broadcast mode, which requires the "playing" stations to be synchronized so that all the playing stations begin playing a particular audio data file at the same moment in time. However, during the play of the audio data file, one or more of the playing stations may become out of synchronization with the other playing stations, especially if the audio data file is particularly long, e.g., longer than four minutes. In view of the above concern, there is a need for an audio streaming system and method for synchronizing streamed audio data files so that even long audio data files do not become out of synchronization when being played on multiple stations of the audio streaming system. An audio streaming system and method for continuously synchronizing streamed audio data files uses timing information from a master unit of the system at a slave unit of the system to synchronize the timing of a streamed audio data file being played at the slave unit with the timing of the same streamed audio data file being played at the master unit as the streamed audio data file is being played at the slave unit. Since the streamed audio data file is synchronized at the slave unit as the streamed audio data file is being played, the out of synchronization issue for long audio data files during a broadcast mode is addressed.
An audio streaming system in accordance with an embodiment of the invention comprises a master unit and a slave unit. The master unit is configured to receive and play streamed audio data files. The master unit is further configured to send a broadcast play message to play a streamed audio data file at a particular time to initiate a broadcast mode. The slave unit is configured to receive and play the streamed audio data files. The slave unit is further configured to receive the broadcast play message from the master unit and timing information from the master unit as the streamed audio data file is simultaneously being played at the master unit and the slave unit. The slave unit is further configured to synchronize the timing of the streamed audio data file being played at the slave unit with the timing of the streamed audio data file being played at the master unit using the timing information from the master unit as the streamed audio data file is being played at the slave unit.
A method for continuously synchronizing streamed audio data files during a broadcast mode in accordance with an embodiment of the invention comprises receiving a broadcast play message from a master unit of an audio streaming system at a slave unit of the audio streaming system to play a streamed audio data file at a particular time, receiving timing information from the master unit at the slave unit as the streamed audio data file is simultaneously being played at the master unit and the slave unit, and synchronizing the timing of the streamed audio data file being played at the slave unit with the timing of the streamed audio data file being played at the master unit using the timing information from the master unit as the streamed audio data file is being played at the slave unit.
Other aspects and advantages of the present invention will become apparent from the following detailed description, taken in conjunction with the accompanying drawings, illustrated by way of example of the principles of the invention. Fig. 1 shows an audio streaming system in accordance with an embodiment of the invention.
Fig. 2 is a block diagram of an audio streaming base unit of the audio streaming system of Fig. 1 in accordance with an embodiment of the invention. Fig. 3 is a block diagram of a streamed audio playing satellite unit of the audio streaming system of Fig. 1 in accordance with an embodiment of the invention.
Fig. 4 is a synchronizing digital audio player of the streamed audio playing satellite unit of Fig. 3 in accordance with an embodiment of the invention. Fig. 5 is a flow diagram of a process of computing the relationship between master counter values and local counter values performed by a stream synchronizer of the synchronization digital audio player of Fig. 4 in accordance with an embodiment of the invention.
Fig. 6 is a process flow diagram of a method for continuously synchronizing streamed audio data files during a broadcast mode in accordance with an embodiment of the invention.
With reference to Fig. 1, an audio streaming system 100 in accordance with an embodiment of the invention is described. The audio streaming system 100 includes an audio streaming base unit 102 and a number of streamed audio playing satellite units 104A-104E, which are connected to the audio streaming base unit. In the illustrated embodiment, the connections between the audio streaming base unit 102 and the streamed audio playing satellite units 104A-104E are wireless connections, such as Wi-Fi connections. However, in other embodiments, the connections between the audio streaming base unit 102 and the streamed audio playing satellite units 104A-104E may be wired connections. Although only a single audio streaming base unit and five streamed audio playing satellite units are shown in Fig. 1, the audio streaming system 100 may include any number of audio streaming base units and streamed audio playing satellite units. The audio streaming base unit 102 is configured to stream audio data files, which are stored in the audio streaming base unit, to one or more of the streamed audio playing satellite units 104A-104E without the use of a personal computer (PC). Thus, the audio streaming base unit 102 is a PC-independent audio streaming base unit. The audio data files may be any digital audio segments, such as digital songs or segments of digital audio recordings. The audio data files can be any type of audio files, such as Windows Media Audio (WMA) files, MPEG Layer-3 (MP3) files and Pulse Code Modulation (PCM) files. The audio streaming base unit 102 is further configured to communicate with the streamed audio playing satellite units 104A-104E by transmitting and receiving messages. However, in some embodiments, the audio streaming system 100 may include a PC 106, which performs functions similar to the audio streaming base unit 102. Thus, in some embodiments, the audio streaming base unit 102 may be replaced by the PC 106.
In other embodiments, the audio streaming base unit 102 and the PC 106 may both operate to share the audio data files stored in the audio streaming base unit and the PC. In these embodiments, the audio streaming base unit 102 may be configured to receive and play streamed audio data files from the PC 106. In some embodiments, the audio streaming base unit 102 may access the Internet through the PC 106.
The streamed audio playing satellite units 104A-104E are configured to receive and play the audio data files streamed from the audio streaming base unit 102 or the PC 106. The streamed audio playing satellite units 104A-104E are also configured to transmit and receive messages to communicate with the audio streaming base unit 102.
As described in more detail below, the audio streaming system 100 uses at least one synchronization technique to synchronize a streamed audio data file being played on multiple units of the audio streaming base unit 102 and the streamed audio playing satellite units 104A-104E during a broadcast mode, where the same streamed audio data file is played on multiple units. Thus, even if the streamed audio data file is extremely long, e.g., ten minutes or more, the streamed audio data file being played on different units of the audio streaming system 100 will be synchronized for the entire play duration of the audio data file. During a broadcast mode, one of the participating or playing units of the audio streaming system 100 will function as a master unit and the rest of the participating units will function as slave units. As used herein, the master unit is the participating unit with which the slave units are synchronized with respect to the streamed audio files being played on the participating units. In an embodiment of the invention, any of the participating units may elect to exit the broadcast mode, including the master unit, without terminating the broadcast mode. Furthermore, during the broadcast mode, the role of the master unit can be assigned to one of the slave units so that the former master unit can exit the broadcast mode without terminating the broadcast mode.
Fig. 2 shows some of the components of the audio streaming base unit 102. As shown in Fig. 2, the audio streaming base unit 102 includes an antenna 202, a transmitter 204 and a receiver 206. The transmitter 204 and the receiver 206 are connected to the antenna 202. The transmitter 204 is configured to transmit outgoing signals, including messages and audio data, to the streamed audio playing satellite units 104A-104E and/or the PC 106 using the antenna 202. The receiver 206 is configured to receive incoming signals, including messages and audio data, from the streamed audio playing satellite units 104A-104E and/or the PC 106 using the antenna 202. In an embodiment, the transmitter 204 and the receiver 206 are configured to transmit and receive signals using Wi-Fi technology. However, in other embodiments, the transmitter 204 and the receiver 206 may be configured to transmit and receive signals using other known technology.
The audio streaming base unit 102 also includes an FM tuner 208, a CD player 210, speakers 212, a speaker driver 214, a storage device 216, a processor 218, a server processor 220 and a synchronizing digital audio player 224. The FM tuner 208 and CD player 210 are well known components that are commonly found in consumer audio products. Thus, these components are not described here in detail. The FM tuner 208 allows the audio streaming base unit 102 to receive and play radio signals on the speakers 212 via the speaker driver 214. The speaker driver 214 may include, among other components, a first-in first-out (FIFO) buffer and a digital-to-analog-converter (DAC), to receive digital audio data and output analog signals to drive the speakers 212 according to the received digital audio data. The received radio signals may also be recorded in the storage device 216, which can be any type of a storage device, such as a computer hard drive. The CD player 210 allows the audio streaming base unit 102 to play audio data from CDs (e.g., audio CDs, CD-Rs, CD-RWs and MP3 WMA-CDs) on the speakers 212 or to save the audio data from the CDs in the storage device 216.
The processor 218 is connected to various components of the audio streaming base unit 102 to control those components. The processor 218 is connected to the transmitter 204 and the receiver 206 to control the transmission and reception of signals, including messages and audio data. The processor 218 is also connected to the FM tuner 208 and the CD player 210 to control these components. The processor 218 is also connected to the storage device 216 to access the data, including audio data files, stored in the storage device. The processor 218 is also connected to the synchronizing digital audio player 224 to transmit and receive data to and from the synchronizing digital audio player. The processor 218 may be a general-purpose digital processor such as a microprocessor or microcontroller. In other embodiments, the processor 218 may be a special-purpose processor such as a digital signal processor. In other embodiments, the processor 218 may be another type of controller or a field programmable gate array (FPGA).
The server processor 220 is connected to the processor 218, and thus, can communicate with the processor 218. The server processor 220 is also connected to the storage device 216 to access the data, including audio data files, stored in the storage device. The server processor 220, along with the storage device 216, functions as an audio server for the audio streaming system 100 to stream audio files to any of the streamed audio playing satellite units 104A-104E, as well as to the synchronizing digital audio player 224 of the audio streaming base unit 102. Similar to the processor 218, the server processor 220 can be any type of a processor, such as a general-purpose digital processor, a digital signal processor, a controller or an FPGA. Although the audio streaming base unit 102 is shown and described as having two processors, the audio streaming base unit may include any number of processors in other embodiments.
As shown in Fig. 2, the audio streaming base unit 102 further includes a clock 222 and the synchronizing digital audio player 224. The clock 222 is configured to generate a clock signal, which is used by various components of the audio streaming base unit 102, including the synchronizing digital audio player 224. As an example, the clock 222 may be a crystal oscillator that provides the clock signal at a predefined frequency. The synchronizing digital audio player 224 is configured to play received digital audio files on the speakers 212 via the speaker driver 214. As described in more detail below, during a broadcast mode, the synchronizing digital audio player 224 is configured to synchronize an audio data file being played on the speakers 212 with a master unit if the audio streaming base unit 102 is functioning as a slave unit. The audio streaming base unit 102 includes other conventional components, which are not described herein so as to not obscure the inventive features of the audio streaming base unit.
Fig. 3 shows some of the components of the streamed audio playing satellite unit 104A, which is representative of the other streamed audio playing satellite units 104B-104E, in accordance with an embodiment of the invention. As shown in Fig. 3, the streamed audio playing satellite unit 104A includes an antenna 302, a transmitter 304 and a receiver 306. The transmitter 304 and the receiver 306 are connected to the antenna 302. The transmitter 304 is configured to transmit outgoing signals, including messages, to the audio streaming base unit 102, the other streamed audio playing satellite units 104B-104E and/or the PC 106 using the antenna 302. The receiver 306 is configured to receive incoming signals, including messages, from the audio streaming base unit 102, the other streamed audio playing satellite units 104B-104E and/or the PC 106 using the antenna 302. In an embodiment, the transmitter 304 and the receiver 306 are configured to transmit and receive signals using Wi-Fi technology. However, in other embodiments, the transmitter 304 and the receiver 306 may be configured to transmit and receive signals using other known technology.
The streamed audio playing satellite unit 104A also includes an FM tuner 308, speakers 312, a speaker driver 314, a processor 318, a clock 322 and a synchronizing digital audio player 324. The FM tuner 308 allows the streamed audio playing satellite unit 104A to receive and play radio signals on the speakers 312 via the speaker driver 314. The speaker driver 314 may include, among other components, a first-in first-out (FIFO) buffer and a digital-to-analog-converter (DAC), to receive digital audio data and output analog signals to drive the speakers 312 according to the received digital audio data.
The processor 318 is connected to various components of the streamed audio playing satellite unit 104A to control those components. The processor 318 is connected to the transmitter 304 and the receiver 306 to control the transmission and reception of signals, including messages. The processor 318 is also connected to the FM tuner 308 to control the FM tuner. The processor 318 is also connected to the synchronizing digital audio player 324 to transmit and receive data to and from the synchronizing digital audio player. The processor 318 may be a general-purpose digital processor such as a microprocessor or microcontroller. In other embodiments, the processor 318 may be a special-purpose processor such as a digital signal processor. In other embodiments, the processor 318 may be another type of controller or a field programmable gate array (FPGA). Although the streamed audio playing satellite unit 104A is shown and described as having a single processor, the streamed audio playing satellite unit 104A may include multiple processors in some embodiments.
As shown in Fig. 3, the streamed audio playing satellite unit 104A further includes a clock 322 and the synchronizing digital audio player 324. The clock 322 is configured to generate a clock signal, which is used by various components of the streamed audio playing satellite unit 104A, including the synchronizing digital audio player 324. As an example, the clock 322 may be a crystal oscillator that provides the clock signal at a predefined frequency. The synchronizing digital audio player 324 is configured to play received digital audio files on the speakers 312 via the speaker driver 314. The synchronizing digital audio player 324 is similar to the synchronizing digital audio player 224 of the audio streaming base unit 102, and thus, during a broadcast mode, the synchronizing digital audio player 324 is configured to synchronize an audio data file being played on the speakers 312 with a master unit if the streamed audio playing satellite unit 104A is functioning as a slave unit. In an embodiment, the synchronizing digital audio player 324 is identical to the synchronizing digital audio player 224 of the audio streaming base unit 102. The streamed audio playing satellite unit 104A includes other conventional components, which are not described herein so as to not obscure the inventive features of the streamed audio playing satellite unit 104A.
Fig. 4 shows the components of the synchronizing digital audio player 324, which is representative of the other synchronizing digital audio players in the streamed audio playing satellite units 104B-104E and the audio streaming base unit 102, in accordance with an embodiment of the invention. As shown in Fig. 4, the synchronizing digital audio player 324 includes a first-in first-out (FIFO) buffer 426, an audio decoder 428, a resampler 430, a counter 432 and a stream synchronizer 434. These components of the synchronizing digital audio player 324 may be implemented in any combination of software, hardware and/or firmware.
The FIFO buffer 426 is configured to buffer a streamed audio data file received at the streamed audio playing satellite unit 104A. In an embodiment, the streamed audio data file is received as compressed data. In Fig. 4, the FIFO buffer 426 is shown to be connected to the processor 318. However, in other embodiments, the FIFO buffer 426 may be connected directly to the receiver 306 to receive the streamed audio data file. The audio decoder 428 is configured to receive the streamed audio data file from the FIFO buffer 426 and to decode the audio data file, which is transmitted to the resampler 430. The resampler 430 is configured to selectively resample a block of the decoded audio data file from the audio decoder 428 to add or drop samples according to control signals from the stream synchronizer 434. The resampled data is then transmitted to the speaker driver 314 to play the resampled data on the speakers 312. The resampling of the decoded audio data allows the audio data file being played at the streamed audio playing satellite unit 104A during a broadcast mode to become synchronized with the same audio data file being played at a master unit, which can be the audio streaming base unit 102, one of the other streamed audio playing satellite units 104B-104E or the PC 106.
The counter 432 is configured to generate counter values using the clock signal from the clock 322. The counter values are provided to the stream synchronizer 434. In an embodiment, the stream synchronizer 434 is configured to use these local counter values to synchronize the audio data file being played at the streamed audio playing satellite unit 104A with the same audio data file being played at a master unit according to a synchronization technique, which will sometimes be referred to herein as a self-regulating roundtrip delay technique. In this synchronization technique, local and master counter values and roundtrip flight times are used to define a relationship between master counter values and local counter values, which in effect is the relationship between the master clock and the local clock. This relationship between master counter values and local counter values is then used for synchronization, as described in more detail below.
The process of computing the relationship between master counter values and local counter values by the stream synchronizer 434 in accordance with an embodiment of the invention is now described with reference to a process flow diagram of Fig. 5. At block 502, an initial local counter value from the local counter 432 is read. Next, at block 504, the initial local counter value is sent to the master unit. Next, at block 506, a copy of the initial local counter value and a master counter value from the master unit are received. The master counter value is the counter value at the time when a request message containing the initial local counter value is received at the master unit.
Next, at block 508, the initial local counter value, the master counter value and a final local counter value are recorded. The final local counter value is the local counter value at the time when the master counter value and the copy of the initial local counter value are received at the streamed audio playing satellite unit 104A. Next, at block 510, the roundtrip flight time from the streamed audio playing satellite unit 104A to the master unit and back is calculated using the following equation: roundtrip flight time = final local counter value - initial local counter value. As used herein, the roundtrip flight time is the duration of time between transmission of a message containing the initial local counter value from a slave unit and receipt of a message containing the copy of the initial local counter value and the master counter value from the master unit.
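The counter exchange of blocks 502-510 can be sketched as follows. This is a minimal illustration only: `read_local_counter` and `exchange_with_master` are hypothetical callables standing in for the counter 432 and the request/reply messaging, which the patent does not specify as an API.

```python
def measure_roundtrip(read_local_counter, exchange_with_master):
    """One pass of blocks 502-510 at the slave unit.

    read_local_counter: returns the slave's current counter value.
    exchange_with_master: sends the initial counter value to the master and
    returns (echoed initial value, master counter value) from the reply.
    """
    initial = read_local_counter()                  # block 502: read counter
    echoed, master = exchange_with_master(initial)  # blocks 504-506: exchange
    final = read_local_counter()                    # block 508: record arrival
    assert echoed == initial, "reply does not match our request"
    roundtrip = final - initial                     # block 510: flight time
    return initial, master, final, roundtrip
```

The echoed copy of the initial value lets the slave match each reply to the request that produced it before trusting the measurement.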
Next, at block 512, a determination is made whether the calculated roundtrip flight time is within an acceptable window. If not, then the process proceeds to block 514, where the results are disregarded. In some embodiments, if the calculated roundtrip flight time is significantly outside of the acceptable window, then the acceptable window may be expanded at block 514. The process then proceeds back to block 502, to read another initial local counter value. However, if the calculated roundtrip flight time is within the acceptable window, then the process proceeds to block 516.
At block 516, an estimated local counter value corresponding to the master counter value with respect to time is calculated based on the computed roundtrip flight time using the following equation: estimated local counter value = (initial local counter value + final local counter value)/2. In other words, the local counter value that corresponds to the moment in time when the master counter value was read at the master unit is estimated using the above equation, which assumes that sending and receiving delays are symmetric. Next, at block 518, a new frequency factor and a new delay factor are calculated to determine the current relationship between master counter values and local counter values. The new frequency factor and the new delay factor are computed using the current estimated local counter value, the corresponding current master counter value, previous estimated local counter values and corresponding previous master counter values using the following equation: local counter value = (master counter value * frequency factor) + delay factor. The above relationship between master counter value and local counter value can be used to synchronize the timing of an audio data file being played at the streamed audio playing satellite unit 104A with the timing of the same audio data file being played at the master unit.
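Blocks 516 and 518 amount to a midpoint estimate followed by a straight-line fit relating the two counters. The sketch below uses an ordinary least-squares fit over the accumulated (master, estimated local) pairs; the patent states the linear relationship but not a specific fitting method, so that choice is an assumption.

```python
def estimate_local_counter(initial, final):
    # Block 516: with symmetric send/receive delays, the master counter was
    # read at the midpoint of the roundtrip.
    return (initial + final) / 2

def fit_counter_relationship(master_values, local_estimates):
    # Block 518: least-squares fit of
    #   local counter value = master counter value * frequency_factor + delay_factor
    n = len(master_values)
    mean_m = sum(master_values) / n
    mean_l = sum(local_estimates) / n
    covariance = sum((m - mean_m) * (l - mean_l)
                     for m, l in zip(master_values, local_estimates))
    variance = sum((m - mean_m) ** 2 for m in master_values)
    frequency_factor = covariance / variance
    delay_factor = mean_l - frequency_factor * mean_m
    return frequency_factor, delay_factor
```

The frequency factor captures the relative rate of the two clocks (drift), while the delay factor captures their fixed offset.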
In an embodiment, when a broadcast mode is initiated, the master unit sends the title of an audio data file to be played and the master counter value at which the master unit will begin to play the audio data file as a broadcast play message to at least one slave unit. As the audio data file is being played on the master unit and at least one slave unit, e.g., the streamed audio playing satellite unit 104A, the stream synchronizer 434 will compute the current relationship between master counter value and local counter value. Using this computed relationship, the stream synchronizer 434 will transmit a control signal to the resampler 430 to synchronize the audio data being played at the streamed audio playing satellite unit 104A with the audio data being played at the master unit.
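At the slave, the master counter value carried in the broadcast play message can be mapped onto the local counter using the fitted relationship from block 518. A one-line sketch (the function and parameter names are illustrative, not from the patent):

```python
def local_play_counter(start_master_counter, frequency_factor, delay_factor):
    # Convert the master counter value named in the broadcast play message
    # into the slave's own counter domain:
    #   local = master * frequency_factor + delay_factor
    return start_master_counter * frequency_factor + delay_factor
```

The slave then begins playback when its local counter reaches this value.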
Assuming that the clocks of the master unit and the streamed audio playing satellite unit 104A have a maximum error of 50 parts per million (ppm) and the two clocks have a maximum relative drift of 100 ppm, the two clocks will drift by a maximum of one sample for every ten thousand samples. To correct for clock drift, the computed local counter value at which the next sample will be played is compared with the actual local counter value at which the next sample will be played. If this comparison indicates that the master unit is more than one sample ahead of the streamed audio playing satellite unit 104A, the stream synchronizer 434 will transmit a control signal that instructs the resampler 430 to drop one sample during resampling of a block of audio data output (e.g., 1024) so that the block has one less sample (e.g., 1023) in order for the audio data file being played at the streamed audio playing satellite unit 104A to catch up to the audio data file being played at the master unit. However, if the comparison indicates that the master unit is more than one sample behind the streamed audio playing satellite unit 104A, the stream synchronizer 434 will transmit a control signal that instructs the resampler 430 to add one sample during resampling of the block of audio data output so that the block has one more sample (e.g., 1025) in order for the audio data file being played at the streamed audio playing satellite unit 104A to slow down to match the audio data file being played at the master unit. Lastly, if the comparison indicates that the master unit is not more than one sample ahead of or behind the streamed audio playing satellite unit 104A, the stream synchronizer 434 will transmit a control signal that instructs the resampler 430 to not change the number of samples in the block of audio data output. 
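The per-block correction decision described above reduces to a three-way rule. A minimal sketch, with the one-sample threshold and the 1024-sample block sizes taken from the example in the text:

```python
def sample_adjustment(master_position, slave_position):
    """Per-block sample adjustment for the resampler.

    -1: drop one sample (e.g. output 1023 of 1024) so the slave catches up.
    +1: add one sample (e.g. output 1025) so the slave slows down.
     0: pass the block through unchanged.
    """
    lead = master_position - slave_position
    if lead > 1:      # master more than one sample ahead of the slave
        return -1
    if lead < -1:     # master more than one sample behind the slave
        return +1
    return 0
```

Because the maximum relative drift is on the order of one sample per ten thousand, a single-sample correction per block is enough to keep the units locked.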
In this fashion, the audio data files being simultaneously played at a master unit and one or more slave units can be synchronized as the audio data files are being played. Although the process of resampling has been described herein as dropping or adding one sample for a block of audio data, the resampling process may involve dropping or adding more than one sample for a block of audio data in other embodiments.
In an embodiment, the synchronization process is performed periodically as a streamed audio data file is being played. As an example, the audio data output may be synchronized at the slave units with the master unit every second as an audio data file is being simultaneously played at the slave units and the master unit. However, in other embodiments, the audio data output may be synchronized at the slave units with the master unit at a different interval or even at varying intervals. As an example, the synchronization at a slave unit may be performed more frequently when the slave unit is initially powered up. However, as time passes, the synchronization at the slave unit may be performed less frequently to minimize interruptions on the overall system. In some embodiments, when operating as slave units, the audio streaming base unit 102 and the streamed audio playing satellite units 104A-104E may be configured to use more than one technique for synchronization by the respective stream synchronizers. As an example, the stream synchronizer of a slave unit may initially use the self-regulating roundtrip delay technique so that the synchronization can be performed quickly, e.g., after being powered up. Then, after some time has passed, the stream synchronizer of the slave unit may switch to a more accurate technique for synchronization, such as the network time protocol (NTP) technique. In some embodiments, the stream synchronizer of the slave unit may switch to one or more other known techniques for synchronization. In an embodiment, any of the participating units of the audio streaming system during a broadcast mode may elect to exit without terminating the broadcast mode. That is, the remaining participating units will still be operating in the broadcast mode. The election to exit the broadcast mode may involve sending a message to the master unit to inform the master unit of the election.
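The varying synchronization interval described above — frequent right after power-up, then backing off — can be sketched as a simple generator. The specific numbers and the exponential back-off are illustrative assumptions; the patent only describes the general idea of varying intervals.

```python
def synchronization_intervals(initial_seconds=1.0, maximum_seconds=30.0, growth=2.0):
    # Yield successive delays between synchronization passes: short right
    # after power-up, then growing toward a ceiling to minimize
    # interruptions on the overall system.
    delay = initial_seconds
    while True:
        yield delay
        delay = min(delay * growth, maximum_seconds)
```

A slave unit could also use the switch to longer intervals as the point at which it changes to a more accurate synchronization technique.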
When the master unit elects to exit a broadcast mode, the master role of the master unit is assigned to another unit, such as one of the participating slave units, without terminating the broadcast mode. Reassigning the master role may involve exchanging messages between the current master unit and one or more candidate units of the audio streaming system 100. The master role can be assigned to any unit of the audio streaming system 100 because this function is independent of the function of the streaming audio server, which may be performed by the server processor 220 and the storage device 216 of the audio streaming base unit 102 or the PC 106. Thus, in a broadcast mode, the designated audio server streams the audio data file to the participating units, regardless of whether these units are functioning as the master or slave units for the broadcast mode. The role of the master unit is simply to provide play information of an audio data file, which is to be played during the broadcast mode, to the slave units and to play the streamed audio file from the audio server at the designated time. In an embodiment, the play information includes the title of the streamed audio data file to be played and the master counter value at which the master unit will play the streamed audio data file. The role of a slave unit is to play the streamed audio data files from the audio server in synchronization with the streamed audio data files being played at the master unit. A method for continuously synchronizing streamed audio data files during a broadcast mode in accordance with an embodiment of the invention is described with reference to a process flow diagram of Fig. 6. At block 602, a broadcast play message from a master unit of an audio streaming system is received at a slave unit of the audio streaming system to play a streamed audio data file at a particular time.
Next, at block 604, timing information from the master unit is received at the slave unit as the streamed audio data file is simultaneously being played at the master unit and the slave unit. Next, at block 606, the timing of the streamed audio data file being played at the slave unit is synchronized with the timing of the streamed audio data file being played at the master unit using the timing information from the master unit as the streamed audio data file is being played at the slave unit.
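The self-regulating roundtrip delay technique referenced above can be sketched as follows, assuming the network delay is symmetric. The function names and counter units are illustrative; the patent describes only that the slave relates its local counter values to the master's counter values using the master counter value captured when the master received the slave's request, together with the measured roundtrip time.

```python
def estimate_master_offset(local_send, local_recv, master_counter_at_receipt):
    """Estimate the offset between the slave's local counter and the master's
    counter. The slave timestamps a request on send and its reply on receipt;
    the master reports its counter value at the moment the request arrived.
    Assuming symmetric delay, that master value corresponds to the midpoint
    of the roundtrip."""
    roundtrip = local_recv - local_send
    # Local counter value estimated to correspond to the reported master value.
    estimated_local = local_send + roundtrip / 2
    # Offset converting local counter values into master counter values.
    return master_counter_at_receipt - estimated_local

def local_to_master(local_counter, offset):
    """Map a local counter reading onto the master's counter so playback
    position at the slave can be compared with, and corrected toward, the
    master's playback position."""
    return local_counter + offset
```

For example, if the slave sends at local count 1000, hears back at 1010, and the master reports count 5005 at receipt, the estimated offset is 4000 counts.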
Although specific embodiments of the invention have been described and illustrated, the invention is not to be limited to the specific forms or arrangements of parts so described and illustrated. The scope of the invention is to be defined by the claims appended hereto and their equivalents.

Claims

What is claimed is:
1. An audio streaming system comprising: a master unit configured to receive and play streamed audio data files, the master unit being further configured to send a broadcast play message to play a streamed audio data file at a particular time to initiate a broadcast mode; and a slave unit configured to receive and play the streamed audio data files, the slave unit being further configured to receive the broadcast play message from the master unit and timing information from the master unit as the streamed audio data file is simultaneously being played at the master unit and the slave unit, the slave unit being further configured to synchronize the timing of the streamed audio data file being played at the slave unit with the timing of the streamed audio data file being played at the master unit using the timing information from the master unit as the streamed audio data file is being played at the slave unit.
2. The system of claim 1 wherein the slave unit includes a stream synchronizer that is configured to calculate a roundtrip flight time between the slave unit and the master unit using the timing information from the master unit, wherein the timing information includes a master counter value corresponding to a receipt time of a request message from the slave unit.
3. The system of claim 2 wherein the stream synchronizer of the slave unit is further configured to determine a relationship between master counter values at the master unit and local counter values at the slave unit using the roundtrip flight time, the relationship being used for synchronization.
4. The system of claim 3 wherein the stream synchronizer of the slave unit is further configured to calculate an estimated local counter value that corresponds to the master counter value with respect to time, the estimated local counter value and the master counter value being used to define the relationship between the master counter values at the master unit and the local counter values at the slave unit.
5. The system of claim 1 wherein the slave unit includes a resampler configured to resample a block of the streamed audio data file to add at least one sample into the block or to drop at least one sample from the block for synchronization.
6. The system of claim 1 wherein the slave unit is configured to use a self-regulating roundtrip delay technique for synchronization and then subsequently use a network time protocol technique for synchronization.
7. The system of claim 1 wherein the slave unit is configured to elect to exit the broadcast mode without terminating the broadcast mode.
8. The system of claim 1 wherein the master unit is configured to elect to exit the broadcast mode without terminating the broadcast mode.
9. The system of claim 8 further comprising another slave unit configured to receive and play the streamed audio data files, the another slave unit being configured to serve a master role of the master unit when the master role is assigned to the another slave unit so that the another slave unit can function as a new master unit for the broadcast mode.
10. The system of claim 1 further comprising an audio server operatively connected to the master unit and the slave unit, the audio server being configured to provide the streamed audio data files to the master unit and the slave unit, the audio server being part of an audio streaming base unit of the audio streaming system or part of a personal computer of the audio streaming system.
11. A method for continuously synchronizing streamed audio data files during a broadcast mode, the method comprising: receiving a broadcast play message from a master unit of an audio streaming system at a slave unit of the audio streaming system to play a streamed audio data file at a particular time; receiving timing information from the master unit at the slave unit as the streamed audio data file is simultaneously being played at the master unit and the slave unit; and synchronizing the timing of the streamed audio data file being played at the slave unit with the timing of the streamed audio data file being played at the master unit using the timing information from the master unit as the streamed audio data file is being played at the slave unit.
12. The method of claim 11 further comprising calculating a roundtrip flight time between the slave unit and the master unit using the timing information from the master unit, wherein the timing information includes a master counter value corresponding to a receipt time of a request message from the slave unit.
13. The method of claim 12 further comprising determining a relationship between master counter values at the master unit and local counter values at the slave unit using the roundtrip flight time, the relationship being used for the synchronizing.
14. The method of claim 13 wherein the determining includes calculating an estimated local counter value that corresponds to the master counter value with respect to time, the estimated local counter value and the master counter value being used to define the relationship between the master counter values at the master unit and the local counter values at the slave unit.
15. The method of claim 11 wherein the synchronizing includes resampling a block of the streamed audio data file to add at least one sample into the block or to drop at least one sample from the block.
16. The method of claim 11 further comprising synchronizing other audio data files being played at the slave unit using a self-regulating roundtrip delay technique and then subsequently using a network time protocol technique.
17. The method of claim 11 further comprising electing to exit the broadcast mode by the slave unit without terminating the broadcast mode.
18. The method of claim 11 further comprising electing to exit the broadcast mode by the master unit without terminating the broadcast mode.
19. The method of claim 18 further comprising assigning a master role of the master unit to another participating unit of the audio streaming system so that the another participating unit can function as a new master unit for the broadcast mode.
20. The method of claim 11 further comprising receiving the streamed audio data file from an audio server at the slave unit, the audio server being part of an audio streaming base unit of the audio streaming system or part of a personal computer of the audio streaming system.
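The block resampling recited in claims 5 and 15 (adding samples to, or dropping samples from, a block so the slave stays in step with the master) could be sketched with simple linear interpolation, as below. All names are illustrative, and a production implementation would use a higher-quality resampler (e.g. polyphase or windowed-sinc).

```python
def resample_block(block, target_len):
    """Stretch or shrink one block of audio samples to target_len samples
    using linear interpolation, so the slave can add or drop samples to
    correct small timing drift relative to the master."""
    n = len(block)
    if target_len == n:
        return list(block)
    out = []
    for i in range(target_len):
        # Fractional position of output sample i within the input block.
        pos = i * (n - 1) / (target_len - 1) if target_len > 1 else 0.0
        lo = int(pos)
        hi = min(lo + 1, n - 1)
        frac = pos - lo
        out.append(block[lo] * (1 - frac) + block[hi] * frac)
    return out
```

Growing a block by one sample slows the slave's playback of that block slightly; shrinking it by one sample speeds it up, which is how the drift correction accumulates without audible discontinuities.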
PCT/IB2009/050501 2008-02-08 2009-02-07 Audio streaming system and method for continuously synchronizing streamed audio data files WO2009098669A2 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US2740908P 2008-02-08 2008-02-08
US61/027,409 2008-02-08

Publications (2)

Publication Number Publication Date
WO2009098669A2 (en) 2009-08-13
WO2009098669A3 (en) 2010-07-01

Family

ID=40952522

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2009/050501 WO2009098669A2 (en) 2008-02-08 2009-02-07 Audio streaming system and method for continuously synchronizing streamed audio data files

Country Status (1)

Country Link
WO (1) WO2009098669A2 (en)


Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1398931A1 (en) * 2002-09-06 2004-03-17 Sony International (Europe) GmbH Synchronous play-out of media data packets
US20060280182A1 (en) * 2005-04-22 2006-12-14 National Ict Australia Limited Method for transporting digital media


Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2632066A1 (en) * 2012-02-27 2013-08-28 OMS Software GmbH Method and devices for synchronising an output of machine-readable data
ITMI20121617A1 (en) * 2012-09-28 2014-03-29 St Microelectronics Srl METHOD AND SYSTEM FOR SIMULTANEOUS PLAYING OF AUDIO TRACKS FROM A PLURALITY OF DIGITAL DEVICES.
US9286382B2 (en) 2012-09-28 2016-03-15 Stmicroelectronics S.R.L. Method and system for simultaneous playback of audio tracks from a plurality of digital devices



Legal Events

NENP: Non-entry into the national phase (Ref country code: DE)
122: EP: PCT application non-entry in European phase (Ref document number: 09708839; Country of ref document: EP; Kind code of ref document: A2)