GB2616444A - System and method of clock synchronization in a wireless multichannel audio system (WMAS) - Google Patents



Publication number
GB2616444A
Authority
GB
United Kingdom
Legal status: Pending
Application number
GB2203234.6A
Other versions
GB202203234D0 (en)
Inventor
Wolberg Dan
Tal Nir
Shirazi Gadi
Current Assignee
Waves Audio Ltd
Original Assignee
Waves Audio Ltd
Priority date
Filing date
Publication date
Application filed by Waves Audio Ltd
Priority to GB2203234.6A
Publication of GB202203234D0
Priority to PCT/IL2023/050242
Publication of GB2616444A


Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W56/00Synchronisation arrangements
    • H04W56/001Synchronization between nodes
    • H04W56/0015Synchronization between nodes one node acting as a reference for the others
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04JMULTIPLEX COMMUNICATION
    • H04J3/00Time-division multiplex systems
    • H04J3/02Details
    • H04J3/06Synchronising arrangements
    • H04J3/062Synchronisation of signals having the same nominal but fluctuating bit rates, e.g. using buffers
    • H04J3/0632Synchronisation of packets and cells, e.g. transmission of voice via a packet network, circuit emulation service [CES]
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L7/00Arrangements for synchronising receiver with transmitter
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W84/00Network topologies
    • H04W84/02Hierarchically pre-organised networks, e.g. paging networks, cellular networks, WLAN [Wireless Local Area Network] or WLL [Wireless Local Loop]
    • H04W84/10Small scale networks; Flat hierarchical networks


Abstract

An apparatus for system-wide clock synchronisation in a wireless multi-channel audio system (WMAS) comprising a base station and a plurality of wireless audio devices. The base station comprises; a master clock source, a framer to generate frames containing audio data and related audio-clock timing derived from the master clock source, and a transmitter to transmit frames over the WMAS. Each of the plurality of wireless audio devices comprises; a local clock source, a receiver to receive frames from the base station over the WMAS; a frame synchronisation circuit to generate audio data and related timing from the received frames, and circuit to synchronise the local clock source to timing generated by the frame synchronisation circuit to generate a synchronised audio clock. Audio clocks in the base station and plurality of wireless audio devices are synchronised to the master clock source. Also provided is a similar apparatus wherein the base station and each of the plurality of audio devices comprise respective first and second pluralities of clocks. Also provided is a method of clock synchronisation for use in a WMAS and a microphone for use in a WMAS.

Description

Intellectual Property Office Application No. GB2203234.6 RTM Date: 1 September 2022. The following terms are registered trade marks and should be read as such wherever they occur in this document: Bluetooth, Wi-Fi. Intellectual Property Office is an operating name of the Patent Office. www.gov.uk/ipo
SYSTEM AND METHOD OF CLOCK SYNCHRONIZATION
IN A WIRELESS MULTICHANNEL AUDIO SYSTEM (WMAS)
FIELD OF THE DISCLOSURE
[0001] The subject matter disclosed herein relates to the field of communications and more particularly relates to systems and methods of clock synchronization in a multidevice bidirectional communication system such as a Wireless Multichannel Audio System (WMAS), also referred to as a wireless venue area network (WVAN).
BACKGROUND OF THE INVENTION
[0002] Wireless audio (and video) (A/V) equipment used for realtime production of audio-visual information, such as for entertainment or live events and conferences, is denoted by the term program making and special events (PMSE). Typically, the wireless A/V production equipment includes cameras, microphones, in-ear monitors (IEMs), conference systems, and mixing consoles. PMSE use cases can be diverse, with each commonly being used for a limited duration in a confined local geographical area. Typical live audio/video production setups require very low latency and very reliable transmissions to avoid failures and perceptible corruption of the media content.
[0003] Accurate synchronization is also important to minimize jitter among samples captured by multiple devices in order to properly render audio/video content. For example, consider a live audio performance where the microphone signal is streamed over a wireless channel to an audio mixing console where different incoming audio streams are mixed. In-ear audio mixes are streamed back to the microphone users via the wireless IEM system. To achieve this, the audio sampling of microphones' signals should be synchronized to the system clock, which is usually integrated into the mixing console used for capturing, mixing, and playback of the audio signals.
[0004] Wireless microphones are in common use today in a variety of applications including large venue concerts and other events where use of wired microphones is not practical or preferred. A wireless microphone has a small, battery-powered radio transmitter in the microphone body, which transmits the audio signal from the microphone by radio waves to a nearby receiver unit, which recovers the audio. The other audio equipment is connected to the receiver unit by cable. Wireless microphones are widely used in the entertainment industry, television broadcasting, and public speaking to allow public speakers, interviewers, performers, and entertainers to move about freely while using a microphone without requiring a cable attached to the microphone.
[0005] Wireless microphones usually use the VHF or UHF frequency bands since they allow the transmitter to use a small unobtrusive antenna. Inexpensive units use a fixed frequency, but most units allow a choice of several frequency channels in case of interference on a channel or to allow the use of multiple microphones at the same time. FM modulation is usually used, although some models use digital modulation to prevent unauthorized reception by scanner radio receivers; these operate in the 900 MHz, 2.4 GHz or 6 GHz ISM bands. Some models use antenna diversity (i.e. two antennas) to prevent nulls from interrupting transmission as the performer moves around.
[0006] Most analog wireless microphone systems use wideband FM modulation, requiring approximately 200 kHz of bandwidth. Because of the relatively large bandwidth requirements, wireless microphone use is effectively restricted to VHF and above. Older wireless microphone systems operate in the VHF part of the electromagnetic spectrum.
[0007] Most modern wireless microphone products operate in the UHF television band. In the United States, this band extends from 470 MHz to 614 MHz. Typically, wireless microphones operate on unused TV channels ('white spaces'), with room for one to two microphones per megahertz of spectrum available.
[0008] Pure digital wireless microphone systems are also in use that use a variety of digital mod-ulation schemes. Some use the same UHF frequencies used by analog FM systems for transmission of a digital signal at a fixed bit rate. These systems encode an RF carrier with one channel, or in some cases two channels, of digital audio. Advantages offered by purely digital systems include low noise, low distortion, the opportunity for encryption, and enhanced transmission reliability.
[0009] Some digital systems use frequency hopping spread spectrum technology, similar to that used for cordless phones and radio-controlled models. As this can require more bandwidth than a wideband FM signal, these microphones typically operate in the unlicensed 900 MHz, 2.4 GHz or 6 GHz bands.
[0010] Several disadvantages of wireless microphones include (1) limited range (a wired balanced XLR microphone can run up to 300 ft or 100 meters); (2) possible interference from other radio equipment or other radio microphones; (3) operating time that is limited by battery life and shorter than that of a normal condenser microphone due to the greater drain on batteries from transmitting circuitry; (4) noise or dead spots, especially in non-diversity systems; (5) a limited number of operating microphones at the same time and place, due to the limited number of radio channels (i.e. frequencies); and (6) lower sound quality.
[0011] Another important factor with the use of wireless microphones is latency, which is the amount of time it takes for the audio signal to travel from input (i.e. microphone) to audio output (i.e. receiver or mixing console). In the case of analogue wireless systems, the microphone converts the acoustical energy of the sound source into an electrical signal, which is then transmitted over radio waves. Both the electrical and RF signals travel at the speed of light, making the latency of analogue wireless systems negligible.
[0012] In the case of digital wireless systems, the acoustic to electrical transformation remains the same; however, the electrical audio signal is converted to a digital bit stream. This conversion from analog audio to digital takes time, thus introducing latency into the system. The amount of latency in a digital wireless system depends on the amount of signal processing involved, and also on the RF mechanisms employed.
[0013] For typical performers in a live performance using stage monitors, 5 to 10 ms of latency is acceptable. Beyond 10 ms the signal delay becomes noticeable, which can have a detrimental effect on performers' timing and overall delivery. Latency is especially critical for certain performers, such as vocalists and drummers, during live applications that utilize in-ear monitor systems. This is because performers hear their performance both from the monitoring system and through vibrations conducted through their bones to the ear, making latency more critical. In such scenarios, round trip latency should be no more than 6 ms to avoid compromising performance.
SUMMARY OF THE INVENTION
[0014] This disclosure describes a system and method of clock synchronization in a multidevice bidirectional communication system such as a wireless multichannel audio system (WMAS), also referred to as a wireless venue area network (WVAN). The WMAS of the invention comprises a base station and a plurality of wireless audio devices such as microphones, in-ear monitors, etc. that can be used for live events, concerts, night clubs, churches, etc. The WMAS is a multichannel digital wideband system, as opposed to most commercially available narrowband (e.g., GFSK) and analog prior art wireless microphone systems. The system is designed to provide very low latency of up to 4 ms for the round trip audio delay from the microphone to in-ear monitors (shorting base station terminals). Latency is up to 6 ms including a 2 ms budget for audio console mixing and processing.
[0015] Low latency is achieved by synchronization of the entire system including the codec, transmit and receive frames, local clocks, messages, and frame synchronization. In one embodiment, the entire OSI stack is synchronous. The system uses a single master clock in the base station, from which all other clocks, both in the base station and in the devices, are locked and derived.
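The single-master-clock scheme described above can be sketched as a set of integer dividers off one oscillator. This is a hypothetical illustration only: the oscillator frequency, divider ratios, and clock names below are assumptions for the sake of the example, not values taken from the disclosure.

```python
# Hypothetical sketch: deriving all system clocks from the single master
# clock source in the base station. All frequencies here are assumed.

MASTER_CLOCK_HZ = 49_152_000  # assumed master oscillator (1024 x 48 kHz)

# Each derived clock is an integer division of the master, so all
# clocks are frequency-locked: they can jitter, but cannot drift apart.
DERIVED_CLOCKS = {
    "audio_sample_clock": MASTER_CLOCK_HZ // 1024,    # 48 kHz
    "baseband_clock":     MASTER_CLOCK_HZ // 4,       # 12.288 MHz
    "frame_tick":         MASTER_CLOCK_HZ // 49_152,  # 1 kHz
}

def samples_per_frame() -> int:
    """Audio samples produced during one frame tick."""
    return DERIVED_CLOCKS["audio_sample_clock"] // DERIVED_CLOCKS["frame_tick"]

if __name__ == "__main__":
    print(DERIVED_CLOCKS)
    print("samples per frame:", samples_per_frame())
```

Because every clock is an exact integer ratio of the master, any two derived clocks are synchronized in the sense used later in this document: bounded jitter, zero long-term drift.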
[0016] In addition, the size of the TX packet buffer in both the devices and base station is an integral number of the size of the audio compressor output buffer in the transmitter (and the audio expander input buffer in the receiver). This enables the elimination of the audio compressor output buffer (and the audio expander input buffer) where compressed packets are directly written from the compressor to the TX packet buffer (and from the RX packet buffer directly to the expander). The elimination of the audio compressor output buffer (and the audio expander input buffer) significantly reduces the overall latency of the audio. System wide synchronization enables the elimination of the audio compressor output buffer and the audio expander input buffer.
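The buffer-elimination idea in [0016] can be sketched as follows. All sizes below are illustrative assumptions; the point is only that when the TX packet buffer is an integral multiple of the compressor's output block, each compressed block can be written directly into the packet buffer at a block-aligned offset, with no intermediate buffer (and no intermediate-buffer latency).

```python
# Sketch (assumed sizes): direct write of compressed blocks into the
# TX packet buffer, made possible by the integral size relationship.

COMPRESSED_BLOCK_BYTES = 64   # assumed compressor output block size
BLOCKS_PER_PACKET = 4         # integral multiple per the disclosure
TX_PACKET_BYTES = COMPRESSED_BLOCK_BYTES * BLOCKS_PER_PACKET

tx_packet = bytearray(TX_PACKET_BYTES)

def write_block_direct(block: bytes, block_index: int) -> None:
    """Write one compressed block straight into the TX packet buffer,
    bypassing any compressor output buffer."""
    assert len(block) == COMPRESSED_BLOCK_BYTES
    offset = block_index * COMPRESSED_BLOCK_BYTES
    tx_packet[offset:offset + COMPRESSED_BLOCK_BYTES] = block

# Fill one packet from four consecutive compressor outputs.
for i in range(BLOCKS_PER_PACKET):
    write_block_direct(bytes([i]) * COMPRESSED_BLOCK_BYTES, i)
```

The mirror image applies on the receive side: the expander reads block-aligned slices of the RX packet buffer directly, so the expander input buffer can likewise be removed.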
[0017] These, additional, and/or other aspects and/or advantages of the embodiments of the present invention are set forth in the detailed description which follows; possibly inferable from the detailed description; and/or learnable by practice of the embodiments of the present invention.
[0018] There is thus provided in accordance with the invention, an apparatus for system wide clock synchronization of audio signals for use in a wireless multichannel audio system (WMAS), comprising at least one base station, including a master clock source, a framer operative to generate frames containing audio data and related audio clock timing derived from said master clock source, a transmitter operative to transmit said frames over said WMAS, a plurality of wireless audio devices, each wireless audio device including a local clock source, a receiver operative to receive frames from said at least one base station over said WMAS, a frame synchronization circuit operative to generate audio data and related timing from said received frames, a circuit operative to synchronize said local clock source to timing generated by said frame synchronization circuit to generate a synchronized audio clock thereby, and wherein audio clocks in said at least one base station and said plurality of wireless audio devices are synchronized to said master clock source.
[0019] There is also provided in accordance with the invention, an apparatus for clock synchroni-zation for use in a multichannel audio system (WMAS), comprising at least one base station, including a master clock source, a first clock generator circuit operative to generate a first plurality of clocks including a first audio clock synchronized to said master clock source, a framer operative to generate frames containing audio data and timing derived from said master clock source, a transmitter operative to transmit said frames over said WMAS, a plurality of wireless audio devices, each wireless audio device including a local clock source, a receiver operative to receive frames from said at least one base station over said WMAS, a frame synchronization circuit operative to generate clock timing from said received frames, and a second clock generator circuit operative to generate a second plurality of clocks including a second audio clock synchronized to clock timing generated by said frame synchronization circuit to provide system wide time synchronization of audio clocks across said WMAS.
[0020] There is further provided in accordance with the invention, a method of clock synchroni-zation for use in a multichannel audio system (WMAS) that includes at least one base station and a plurality of wireless audio devices, the method comprising in said at least one base station providing a master clock source, generating a first plurality of clocks including a first audio clock synchronized to said master clock source, generating frames containing audio data and timing derived from said master clock, transmitting said frames over said WMAS, in each wireless audio device providing a local clock source, receiving frames from said at least one base station over said WMAS, generating clock timing from said received frames, generating a second plurality of clocks including a second audio clock synchronized to clock timing generated by said frame synchronization circuit, and wherein first audio clocks in said at least one base station and second audio clocks in said plurality of wireless audio devices are all synchronized to said master clock source.
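The method steps above can be sketched at their simplest: the base station stamps each frame with timing derived from the master clock, and the device recovers that timing from the received frame. The frame layout below (a 4-byte big-endian frame counter ahead of the audio payload) is purely an assumed illustration; the disclosure does not specify a wire format.

```python
# Minimal sketch of the claimed method flow, with an assumed frame
# layout: timing field (frame counter) followed by audio payload.

import struct

def make_frame(frame_counter: int, audio: bytes) -> bytes:
    """Base station side: frame = master-clock-derived timing + audio."""
    return struct.pack(">I", frame_counter) + audio

def recover_timing(frame: bytes) -> tuple[int, bytes]:
    """Device side: split a received frame back into timing and audio;
    the timing value is what steers the device's local clock."""
    (counter,) = struct.unpack(">I", frame[:4])
    return counter, frame[4:]

frame = make_frame(1000, b"\x01\x02")
counter, audio = recover_timing(frame)
```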
[0021] There is also provided in accordance with the invention, a microphone for use in a multichannel audio system (WMAS), the microphone comprising a local clock source, a receiver operative to receive frames over said WMAS, the frames containing timing derived from a master clock source in the WMAS, a frame synchronization circuit operative to extract clock timing from the received frames, and a clock generator circuit operative to generate a plurality of clocks including an audio clock synchronized to clock timing generated by the frame synchronization circuit to provide system wide time synchronization of audio clocks across the WMAS.
BRIEF DESCRIPTION OF THE DRAWINGS
[0023] The present invention is explained in further detail in the following exemplary embodiments and with reference to the figures, where identical or similar elements may be partly indicated by the same or similar reference numerals, and with the features of various exemplary embodiments being combinable. The invention is herein described, by way of example only, with reference to the accompanying drawings, wherein:
[0024] Fig. 1 is a diagram illustrating an example wireless multichannel audio system (WMAS) incorporating the system and method of clock synchronization of the present invention;
[0025] Fig. 2 is a high level block diagram illustrating an example unidirectional link buffering and clocking scheme;
[0026] Fig. 3 is a high level block diagram illustrating an example uplink device and base station scheme;
[0027] Fig. 4 is a block diagram illustrating a first example audio synchronization scheme using analog feedback synchronization;
[0028] Fig. 5 is a block diagram illustrating a second example audio synchronization scheme using digital feedback synchronization;
[0029] Fig. 6 is a high level block diagram illustrating an example downlink device and base station;
[0030] Fig. 7 is a diagram illustrating timing for an example WMAS system;
[0031] Fig. 8 is a high level block diagram illustrating an example uplink buffering and clocking scheme;
[0032] Fig. 9 is a high level block diagram illustrating an example downlink buffering and clocking scheme;
[0033] Fig. 10 is a high level block diagram illustrating an example frame synchronizer;
[0034] Fig. 11 is a flow diagram illustrating an example method of clock synchronization for use in the base station; and
[0035] Fig. 12 is a flow diagram illustrating an example method of clock synchronization for use in the wireless audio device.
DETAILED DESCRIPTION
[0036] In the following detailed description, numerous specific details are set forth in order to provide a thorough understanding of the invention. It will be understood by those skilled in the art, however, that the present invention may be practiced without these specific details. In other instances, well-known methods, procedures, and components have not been described in detail so as not to obscure the present invention.
[0037] Among those benefits and improvements that have been disclosed, other objects and advantages of this invention will become apparent from the following description taken in conjunction with the accompanying figures. Detailed embodiments of the present invention are disclosed herein; however, it is to be understood that the disclosed embodiments are merely illustrative of the invention that may be embodied in various forms. In addition, each of the examples given in connection with the various embodiments of the invention is intended to be illustrative, and not restrictive.
[0038] The subject matter regarded as the invention is particularly pointed out and distinctly claimed in the concluding portion of the specification. The invention, however, both as to organization and method of operation, together with objects, features, and advantages thereof, may best be understood by reference to the following detailed description when read with the accompanying drawings.
[0039] The figures constitute a part of this specification and include illustrative embodiments of the present invention and illustrate various objects and features thereof. Further, the figures are not necessarily to scale; some features may be exaggerated to show details of particular components. In addition, any measurements, specifications and the like shown in the figures are intended to be illustrative, and not restrictive. Therefore, specific structural and functional details disclosed herein are not to be interpreted as limiting, but merely as a representative basis for teaching one skilled in the art to variously employ the present invention. Further, where considered appropriate, reference numerals may be repeated among the figures to indicate corresponding or analogous elements.
[0040] Because the illustrated embodiments of the present invention may, for the most part, be implemented using electronic components and circuits known to those skilled in the art, details will not be explained in any greater extent than that considered necessary for the understanding and appreciation of the underlying concepts of the present invention and in order not to obfuscate or distract from the teachings of the present invention.
[0041] Any reference in the specification to a method should be applied mutatis mutandis to a system capable of executing the method. Any reference in the specification to a system should be applied mutatis mutandis to a method that may be executed by the system.
[0042] Throughout the specification and claims, the following terms take the meanings explicitly associated herein, unless the context clearly dictates otherwise. The phrases "in one embodiment," "in an example embodiment," and "in some embodiments" as used herein do not necessarily refer to the same embodiment(s), though they may. Furthermore, the phrases "in another embodiment," "in an alternative embodiment," and "in some other embodiments" as used herein do not necessarily refer to a different embodiment, although they may. Thus, as described below, various embodiments of the invention may be readily combined, without departing from the scope or spirit of the invention.
[0043] In addition, as used herein, the term "or" is an inclusive "or" operator, and is equivalent to the term "and/or," unless the context clearly dictates otherwise. The term "based on" is not exclusive and allows for being based on additional factors not described, unless the context clearly dictates otherwise. In addition, throughout the specification, the meanings of "a," "an," and "the" include plural references. The meaning of "in" includes "in" and "on."
[0044] As will be appreciated by one skilled in the art, the present invention may be embodied as a system, method, computer program product or any combination thereof. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment (including firmware, resident software, micro-code, etc.) or an embodiment combining software and hardware aspects that may all generally be referred to herein as a "circuit," "module" or "system." Furthermore, the present invention may take the form of a computer program product embodied in any tangible medium of expression having computer usable program code embodied in the medium.
[0045] The present invention is described below with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the invention. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented or supported by computer program instructions. These computer program instructions may be provided to a processor of a general-purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
[0046] These computer program instructions may also be stored in a computer-readable medium that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable medium produce an article of manufacture including instruction means which implement the function/act specified in the flowchart and/or block diagram block or blocks.
[0047] The computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide processes for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
[0048] The invention is operational with numerous general purpose or special purpose computing system environments or configurations. Examples of well-known computing systems, environments, and/or configurations that may be suitable for use with the invention include, but are not limited to, personal computers, server computers, cloud computing, hand-held or laptop devices, multiprocessor systems, microprocessor, microcontroller or microcomputer based systems, set top boxes, programmable consumer electronics, ASIC or FPGA core, DSP core, network PCs, minicomputers, mainframe computers, distributed computing environments that include any of the above systems or devices, and the like.
System Architecture
[0049] A diagram illustrating an example wireless multichannel audio system (WMAS) incorporating the system and method of clock synchronization of the present invention is shown in Figure 1. The example WMAS, generally referenced 10, comprises a base station 14 which is typically coupled to a mixing console 12 via one or more cables, and a plurality of wireless devices including wireless microphones 16, monophonic in-ear monitors (IEMs) 18, and stereo IEMs equipped with an inertial measurement unit (IMU) 20.
[0050] Wireless microphone devices 16 include an uplink (UL) 98 that transmits audio and management information and a downlink (DL) 180 that receives management information. IEM devices 18 include an uplink 98 that transmits management information and a downlink 180 that receives mono audio and management information. IEM devices 20 include an uplink 98 that transmits IMU and management information and a downlink 180 that receives stereo audio and management information.
[0051] The WMAS comprises a star topology network with a central base station unit (BS) 14 that communicates with and controls all the devices within the WMAS (also referred to as the "network"). The network is aimed at providing highly reliable communication during a phase of a live event referred to as "ShowTime". The network at that time is set, and secured in a chosen configuration. This minimizes the network overhead that is typically present in existing wireless standards.
[0052] In one embodiment, the features of the WMAS include (1) star topology; (2) point to multipoint audio with a predictable schedule including both DL and UL audio on the same channel (typically on a TVB frequency); (3) all devices are time synchronized to base station frames; (4) support for fixed and defined devices; (5) support for frequency division multiplexing (FDM) for extended diversity schemes; (6) TDM network where each device transmits its packet based on an a priori schedule; (7) wideband BS with one or two transceivers receiving and transmitting many (e.g., greater than four) audio channels; (8) TDM/OFDM for audio transmissions; (9) wideband OFDM(A) in the DL and a packet for each device in the UL; (10) main and auxiliary wireless channels are supported by all network entities; and (11) all over the air (OTA) audio streams are compressed with 'zero' latency.
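Feature (6), the a priori TDM schedule, can be sketched as a fixed slot map decided at setup time. The frame duration, slot count, and device naming below are assumptions for illustration; the disclosure only states that each device transmits in a pre-assigned slot, which removes per-packet arbitration overhead during ShowTime.

```python
# Illustrative TDM schedule (assumed slot layout): each device owns a
# pre-assigned transmit slot within the frame, fixed before ShowTime.

FRAME_MS = 4.0   # assumed frame duration
NUM_SLOTS = 8    # assumed uplink slots per frame
SLOT_MS = FRAME_MS / NUM_SLOTS

# A priori schedule fixed at configuration time: slot index -> device id.
schedule = {slot: f"mic-{slot}" for slot in range(NUM_SLOTS)}

def tx_window_ms(slot: int) -> tuple[float, float]:
    """Start and end time of a device's transmit window in the frame."""
    return slot * SLOT_MS, (slot + 1) * SLOT_MS
```

Since every device knows its window in advance, no polling or contention traffic is needed once the show configuration is locked.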
[0053] Regarding latency, the WMAS of the present invention is adapted to provide an extremely low latency (i.e. audio to audio path) of a maximum of 4 ms, not including audio console mixing and processing time of 2 ms. An audio event is received by a wireless microphone device. Audio is then wirelessly transmitted over the uplink to the base station (BS). Wired handover to a general purpose audio mixing console occurs with a fixed latency of up to 2 ms, from receiving an audio stream to the return audio stream. The processed audio stream returned to the base station is then wirelessly transmitted over the downlink to an IEM device which plays the audio stream to the user. Uplink latency is defined as the time from an audio event being received by a wireless microphone device until it is wirelessly transmitted to the base station for output over the audio input/output (I/O), and should be no more than 2 ms.
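The latency budget described above is simple arithmetic, and checking it explicitly makes the 4 ms vs. 6 ms distinction clear: 2 ms uplink plus 2 ms downlink gives the 4 ms round trip measured with the base station terminals shorted, and adding the 2 ms console budget gives the 6 ms end-to-end figure.

```python
# Arithmetic check of the latency budget stated in the disclosure.

UPLINK_MS = 2.0    # mic -> base station audio I/O
DOWNLINK_MS = 2.0  # base station -> IEM output
CONSOLE_MS = 2.0   # mixing/processing budget at the console

round_trip_ms = UPLINK_MS + DOWNLINK_MS  # base station terminals shorted
total_ms = round_trip_ms + CONSOLE_MS    # including console processing

assert round_trip_ms <= 4.0 and total_ms <= 6.0
```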
[0054] In one embodiment, the WMAS system of the present invention achieves performance having (1) low packet error rate (PER) (e.g., 5e-8) where retransmissions are not applicable; (2) a short time interval of missing audio due to consecutive packet loss and filled by an audio concealment algorithm (e.g., 15 ms); and (3) acceptable range which is supported under realistic scenarios including body shadowing.
[0055] In addition, the WMAS system is adapted to operate on the DTV white space UHF channels (i.e. channels 38-51). Note that the system may use a white space channel that is adjacent to an extremely high power DTV station channel while still complying with performance and latency requirements.
[0056] It is noted that the requirements for a network that ensures low latency across layers for all devices to meet desired performance (i.e. range of 100 m and packet error rate (PER) of 5e-8) are not supported by any standard today. For example, the latency of the Bluetooth (BT) compander by itself is more than the overall required latency (~6 ms). The inherent buffering of the BT between layers is measured in several milliseconds. Note also that the Wi-Fi 802.11ax standard can support a minimum bandwidth of 20 MHz and a maximum number of eight devices, where a narrowband interferer is likely to cause a full loss of connection. Applying the above solutions to the TV frequency band whitespace will fail to comply with most of the attributes outlined supra.
[0057] To meet the desired low round trip latency, the system of the present invention utilizes several techniques including (1) all network entities are synchronized to the base station baseband (BB) clock, which is achieved using PHY synchronization signals (time placement calculation) that are locked to the wireless frame time as established by the base station, thus minimizing the buffering to a negligible level; (2) all audio components are synchronized to the baseband clock by a feedback signal from the synchronization buffers; (3) the TX/RX PHY packets contain an integer number of compressed audio buffers; (4) efficient network design; and (5) use of a low latency compander where the delay of the input buffer is the main contributor of latency.
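Technique (2), steering the audio clocks from a feedback signal taken off the synchronization buffers, can be sketched as a proportional controller on buffer fill level: if the buffer fill creeps above its target, the consuming clock is running slow relative to the producer and should be nudged faster, and vice versa. The gain and target values below are illustrative assumptions, not parameters from the disclosure.

```python
# Sketch of buffer-fill feedback clock steering (assumed gain/target).

def clock_correction_ppm(fill: int, target: int, gain_ppm: float = 0.5) -> float:
    """Proportional correction in ppm derived from the buffer-fill error:
    positive (speed the consumer clock up) when the buffer is overfull,
    negative (slow it down) when the buffer is underfull."""
    return gain_ppm * (fill - target)

# Overfull buffer -> local clock too slow -> positive speed-up command.
assert clock_correction_ppm(40, 32) > 0
# On-target fill -> no correction.
assert clock_correction_ppm(32, 32) == 0.0
```

In a real design this feedback would typically drive a fractional divider or VCXO control voltage; a proportional term alone is shown here only to make the direction of the control loop concrete.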
[0058] The following definitions apply throughout this document. The latency (expressed as a time difference) of an audio system refers to the time difference from the moment a signal is fed into the system to the moment it appears at the output. Note that any compression operation applied by the system might be lossy, meaning the signal at the output might not be identical to the signal at the input.
[0059] Uplink latency is defined as latency (time difference) of the system (i.e. device and the base station) from the moment an audio event appears on the input of the ADC until the event appears on the analog or digital audio output of the base station.
[0060] Downlink latency is defined as the latency (time difference) of the audio system (device and base station) from the moment an audio event appears on analog or digital input of the base station until it appears on the DAC output of the device.
[0061] Round trip latency is defined as the latency (time difference) of the audio system from the time an audio event appears on the uplink device input until the time it appears on a downlink device output, while looping back at the base station terminals.
[0062] Synchronized clocks are defined as clocks that appear to have no long term drift between them. The clocks may have short term jitter differences but no long term drift.
[0063] A high level block diagram illustrating an example unidirectional link buffering and clock-ing scheme is shown in Figure 2. The system, generally referenced 30, comprises a centralized base station 66 in communication with one or more devices 64, such as a wireless microphone, over a radio link 48 that make up the WMAS. The base station 66 comprises, inter alia, a receiver (RX) 67, digital to analog converter (DAC) buffer 60, and digital to analog converter (DAC) 62. The RX circuit comprises an RX packet buffer 50 coupled to audio expander 54 and an audio clock regenerator circuit 52. The audio expander 54 comprises audio expander unit 56 and audio expander output 58. Note that multiplexing, combining or mixing of several device audio streams in the base station is typically performed using analog means (not shown).
[0064] The device 64 comprises clock management circuit 32, audio clock 34, ADC 36 and TX circuit 65. The TX circuit comprises ADC buffer 38, audio compressor circuit 40, and TX packet buffer 42. The audio compressor circuit 40 comprises audio compressor input buffer 44 and audio compressor output buffer 46.
[0065] Note that a typical audio compressor (e.g., MP3, AAC, LDAC) works in blocks where each input block is compressed into an output block. The audio expander relies on the current compressed block as well as historically received blocks in order to reproduce output blocks corresponding and close to the audio compressor input blocks. Many wireless audio compressors perform lossy compression in order to reduce the bandwidth significantly (e.g., up to a 1:10 compression ratio).
[0066] Note also that the compressed buffer size is directly related to the compression ratio, wherein a higher compression ratio is equivalent to an increased buffer size. A typical estimate of such a time gap is approximately 0.2-0.5 ms. In addition, from an audio latency perspective, it is preferable to match the size of the audio compressor buffer to the packet buffer size.
[0067] In this embodiment, the device clock management scheme features free running clocks with respect to the base station. The device main clock is the audio clock 34, and the clock management unit 32 derives the rest of the system clocks by digital division or by clock multiplication schemes such as phase locked loops (PLLs), frequency locked loops (FLLs) or delay locked loops (DLLs). Within the base station, each unit locks onto its corresponding clock. The RX PHY locks onto the frame clock and regenerates the audio clock for the audio expander 54 and DAC component 62, which functions to output analog audio signal 63. Since the system uses an arbitrary packet size (i.e. general purpose or non-deterministic), the audio compressor 40 accepts an input block from the ADC buffer 38, stores it in the input buffer 44, and compresses it into an output block that is stored in the output buffer 46. Thus, the audio compressor must maintain an output block buffer 46 that significantly contributes to the delay.
[0068] The components contributing to latency in the system across the devices and the base station are indicated in Figure 2. The latency contributors can be divided into three types: (1) core latency, which is latency that cannot be minimized by faster clocking or hardware layout (e.g., buffer delay); (2) hardware dependent latency, which theoretically can be minimized to zero using faster clocking and/or hardware layout; and (3) medium and filters, wherein the PHY layer delay includes some hardware dependent components but the main contributors are delays essential to achieving performance, e.g., receiver rejection.
Table 1: Latency contributors

Label | Description | Typical Latency | Type
ΔT1 | Synchronization buffer delay used to lock and adaptively synchronize the audio card clock to the RF clock | Assuming an audio sampling rate of 48 kHz, 2/48 ms for a relaxed synchronization; can work with a minimal time of 1/48 ms | core latency
ΔT2 | Compressor processing delay | ~10 µs | hardware dependent latency
ΔT3 | TX delay including packet buffering | 2-3 ms (derived from the overall latency requirement); tradeoff between efficiency and latency | core latency
ΔT4 | RX delay including PHY filtering and medium | <50 µs | medium and filters
ΔT5 | Expander processing delay | ~10 µs | hardware dependent latency
ΔT6 | Compressor buffering delay | ~250 µs | core latency
ΔT7 | DAC synchronization buffer delay | typically 5-10 µs | core latency

[0069] As indicated, the core latency of this scheme includes the ADC input buffer 38 duration ΔT1 and DAC input buffer 60 duration ΔT7, the packet duration and other PHY related delays (e.g., filters, etc.) ΔT3, and the audio compressor input buffer duration ΔT6, which is inherently equivalent to the expander output buffer duration. Other hardware related delays include the audio compressor and expander operation durations ΔT2 and ΔT5, respectively. Modem latency (e.g., receiver operation) is denoted by ΔT4. A summary of the various latencies in the system 30 of Figure 2 is provided below.
CoreLatency ≜ ΔT1 + ΔT3 + ΔT7 + ΔT6 (1)

HardwareDependentLatency ≜ ΔT2 + ΔT5 (2)

ModemLatency ≜ ΔT4 (3)

[0070] A high level block diagram illustrating an example uplink device and base station scheme in a multi-device bidirectional communication wireless multichannel audio system (WMAS) is shown in Figure 3. The system, generally referenced 70, comprises base station 74 in communication with one or more devices 72 over uplink radio links 98 that make up the WMAS.
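The grouping of Table 1 contributors into equations (1)-(3) can be checked with a short script. The numeric values below are taken from the typical figures in Table 1 (the midpoint is assumed where a range is given), and the variable names are illustrative only, not part of the specification.

```python
# Hypothetical latency-budget check based on the typical Table 1 values.
# All durations are in milliseconds; dt1..dt7 mirror the delta-T labels.

dt = {
    "dt1_sync_buffer": 2 / 48,     # two samples at 48 kHz (relaxed sync)
    "dt2_compressor_proc": 0.010,  # ~10 us
    "dt3_tx_packet": 2.5,          # 2-3 ms packet buffering, midpoint assumed
    "dt4_rx_phy_medium": 0.050,    # <50 us
    "dt5_expander_proc": 0.010,    # ~10 us
    "dt6_compressor_buffer": 0.250,
    "dt7_dac_sync_buffer": 0.010,
}

# Equations (1)-(3): group the contributors by type.
core = (dt["dt1_sync_buffer"] + dt["dt3_tx_packet"]
        + dt["dt7_dac_sync_buffer"] + dt["dt6_compressor_buffer"])
hw_dependent = dt["dt2_compressor_proc"] + dt["dt5_expander_proc"]
modem = dt["dt4_rx_phy_medium"]

total = core + hw_dependent + modem
print(f"core={core:.3f} ms  hw={hw_dependent:.3f} ms  "
      f"modem={modem:.3f} ms  total={total:.3f} ms")
```

With these assumed values the total stays well inside the ~6 ms overall latency goal discussed earlier, with the TX packet buffering ΔT3 dominating.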
The base station 74 comprises, inter alia, a master clock 106, clock generation circuit 108, TX circuit including framer 112, receiver (RX) 90, DAC buffer 124, audio block 114 including DAC 116 and digital interface circuit 118. The RX circuit 90 comprises an RX packet buffer 100 coupled to audio expander 102 including audio expander output buffer 104. The DAC functions to generate analog audio output signal 120 while the digital interface 118 generates a digital audio output signal 122 sent to the mixing console.
[0071] The device 72 comprises audio circuit 81, local clock source 83 (e.g., TCXO), RX circuit 76, clock generation circuit 80, synchronization buffer 86, and TX circuit 88. The ADC 82 converts analog audio input 84 to digital format which is fed to the synchronization buffer 86. The frame synchronization circuit 78 provides synchronization to the clock gen circuit 80. The output of the synchronization buffer 86 is input to the compressor input buffer 94 in the audio compressor 92. The output of the audio compressor is input to the TX packet buffer 96.
[0072] Figure 3 also indicates the various delays that make up system latency. In accordance with the present invention, overall latency in the system is minimized by keeping the audio system tightly locked to the RF clock. In operation, the base station (BS) 74 serves multiple devices which can coexist in the overall system. The device shown in this example system functions to send audio in an uplink direction, i.e. from device to BS. Examples of the device may include a wireless microphone, etc. as described in connection with Figure 1 supra.
[0073] Both the device and the base station include a receiver and a transmitter, which aid in the clock recovery and locking process. The BS includes a master clock 106 on whose output the rest of the BS clocks are locked (e.g., transmitter clock, DAC clock, receiver clock, console clock, etc.). It is appreciated that the master clock may be selected by the designer without loss of generality and is not critical to the invention.
On the device side, the receiver uses a periodic over-the-air temporal signal, such as a multicast downlink packet, generated and sent by the base station to generate and lock the receiver clock, transmitter clock and the ADC clock. An example of a periodic over-the-air temporal signal is the frame synchronization signals generated by frame synchronizer circuit 78 in the RX 76.
[0074] Note that the system may either have analog outputs from a DAC 116 or a digital console interface 118, which may contain uncompressed audio signals and optionally a master clock 123 for the entire system to synchronize on. Using the digital console interface saves another ΔT7 delay. This delay, however, is reintroduced when the console outputs its analog output into actual speakers. If the signal is used for loopback (e.g., performer monitor signal), however, this delay is completely saved and does not get reintroduced.
[0075] Several key characteristics of this system allow for a significant reduction of the overall latency. They include (1) use of a single master clock in the base station from which all other clocks, both in the base station and in the devices, are locked and derived; (2) the system is deterministic and contains no changes in the schedule while in ShowTime; and (3) the size of the packets used is an integer multiple of the size of the audio compressor output buffer (as well as the audio expander input buffer).
[0076] Making the size of the TX packet buffer an integer multiple of the size of the compressor output buffer enables the elimination of the audio compressor output buffer (and the audio expander input buffer), where compressed packets are written directly from the compressor to the TX packet buffer (and from the RX packet buffer directly to the expander). The elimination of the audio compressor output buffer (and the audio expander input buffer) significantly reduces the overall latency of the audio path. Note that it is the synchronization of the base station with the devices that enables the audio compressor output buffer and the audio expander input buffer to be eliminated.
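The buffering arrangement can be illustrated with a small sketch: when the packet size is an integer multiple of the compressed block size, each compressed block is written straight into the TX packet buffer and the packet is sent once the last block lands, with no intermediate output buffer. The class names, block counts and the toy "compressor" below are all assumptions for illustration, not the actual implementation.

```python
# Sketch (assumed API) of direct block-to-packet buffering.

BLOCKS_PER_PACKET = 4  # packet size is an integer multiple of the block size

class TxPacketBuffer:
    def __init__(self, blocks_per_packet):
        self.blocks_per_packet = blocks_per_packet
        self.blocks = []        # blocks of the packet currently being filled
        self.sent_packets = []  # stands in for the transmitter

    def push_block(self, compressed_block):
        """Write a compressed block directly; transmit on the last one."""
        self.blocks.append(compressed_block)
        if len(self.blocks) == self.blocks_per_packet:
            self.sent_packets.append(b"".join(self.blocks))
            self.blocks = []  # no separate compressor output buffer is kept

def toy_compress(pcm_block: bytes) -> bytes:
    # Placeholder for a real lossy compressor; here a simple 4:1 truncation.
    return pcm_block[: len(pcm_block) // 4]

tx = TxPacketBuffer(BLOCKS_PER_PACKET)
for i in range(8):  # eight audio blocks -> exactly two packets
    tx.push_block(toy_compress(bytes([i]) * 64))
print(len(tx.sent_packets))  # -> 2
```

The point of the sketch is that the compressor's output never waits in an extra buffer: the only residency is inside the packet being assembled, which is the ΔT3 contribution already counted.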
[0077] Note that the core latency of this scheme includes the ADC input buffer 86 duration ΔT1 and DAC input buffer duration ΔT7, and the packet duration and other PHY related delays ΔT3 (e.g., filters, etc.). Since, however, the packet size is an integer multiple of the compressor output buffer (and expander input buffer) size, there is no need for extra buffering. The output blocks generated by the audio compressor 92 are simply inserted into the TX packet buffer 96 and once the last block has been written, the packet is transmitted by the transmitter 88. Conversely, on the base station side, once the receiver 90 has received the complete packet, the audio expander 102 starts its operation on the first block therein. This scheme saves a significant round trip latency of typically 0.5-1 ms (assuming a compressor buffer size of 0.25-0.5 ms). Considering the sensitivity of performers and artists to latency, this saving is significant and well appreciated in the industry. A summary of the various latencies in the system 70 of Figure 3 is provided below.
CoreLatency ≜ ΔT1 + ΔT3 + ΔT7 (4)

HardwareDependentLatency ≜ ΔT2 + ΔT5 (5)

ModemLatency ≜ ΔT4 (6)

[0078] The present invention provides two alternative techniques for providing audio synchronization. In one embodiment, the PHY digital clocks of all devices are synchronized to the PHY clock in the base station by locking onto transmitted frames. To achieve the maximum overall latency goal of 6 ms, the system should achieve full audio synchronization of all audio devices within the WMAS. The audio codecs, which run on free running clocks, and the network PHY locked clocks (i.e. locked to frames), tagged as the output clock, can be synchronized in the following manner.
[0079] The first technique uses digital feedforward synchronization where the entire synchronization is performed in the digital domain, driven by the output clock. A block diagram illustrating the first example audio synchronization scheme, using digital feedforward synchronization, is shown in Figure 4. The circuit, generally referenced 130, comprises an audio sampling clock 132, synchronization buffer 134, synchronization tracking block 138, and Farrow polyphase filter 136.
[0080] In operation, the Farrow polyphase circuit 136 can interpolate the signal at any fractional point of timing τ 135 with considerable accuracy and is commanded by the synchronization tracking circuit 138. Since the input and output clocks are independently free running, the synchronization elastic buffer 134 may contain a variable delay, whose length changes (i.e. increases or decreases) based on the clock drift between the two clocks. The synchronization and tracking circuit 138 tracks this buffer length and assumes the correct number of samples per frame. In order to compensate for the clock drift, it changes the sampling point τ 135 given to the Farrow polyphase filter 136 and changes (i.e. via a skip/add process) the synchronization buffer switch position 137.
[0081] The second technique uses analog feedback synchronization where the synchronization tracking circuitry changes the audio sampling rate (i.e. the input clock) via a feedback signal. A block diagram illustrating the second example audio synchronization scheme, using analog feedback synchronization, is shown in Figure 5. The circuit, generally referenced 140, comprises an audio sampling clock 142, synchronization buffer 144, and synchronization tracking block 146.
[0082] In operation, the synchronization tracking block 146 generates a feedback signal 145 that controls the audio sampling rate. The output audio clock is derived from the output of the synchronization buffer. Since the input and output clocks are independently free running, the synchronization elastic buffer 144 may contain a variable delay, whose length changes (i.e. increases or decreases) based on the clock drift between the two clocks. The synchronization and tracking circuit 146 tracks this buffer length and assumes the correct number of samples per frame. It uses the variable input clock 142 in order to compensate in feedback form for the drifts and ensure that the synchronization buffer 144 does not overflow or underflow.
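The feedback behaviour can be illustrated with a toy simulation: a tracking loop watches the elastic-buffer fill level and trims the input sampling rate so the buffer neither overflows nor underflows. The gains, buffer target and rates below are assumptions chosen only to show the mechanism; a practical loop would also add damping (e.g., a full PI/PID filter as in the text).

```python
# Toy model (all names and gains assumed) of analog feedback clock sync:
# the correction integrates the fill-level error and steers the input rate.

TARGET_FILL = 64   # nominal half-full elastic buffer, in samples
KP = 1e-6          # tracking-loop gain (illustrative)

def simulate(drift_ppm, steps=200_000, nominal_rate=48_000.0):
    out_rate = 48_000.0                        # output clock, frame-locked
    in_rate = nominal_rate * (1 + drift_ppm * 1e-6)
    fill = float(TARGET_FILL)
    correction = 0.0                           # feedback signal (Hz)
    for _ in range(steps):
        dt = 1.0 / out_rate                    # one output-sample period
        # Samples written by the (corrected) input clock minus one read out.
        fill += (in_rate + correction) * dt - 1.0
        # Feedback: nudge the input sampling rate toward the target fill.
        correction -= KP * (fill - TARGET_FILL) * in_rate
    return fill

final_fill = simulate(drift_ppm=50)  # input clock 50 ppm fast
assert abs(final_fill - TARGET_FILL) < 1.0  # loop holds the fill level
print(round(final_fill, 2))
```

With a 50 ppm drift (2.4 samples/s at 48 kHz), the uncorrected buffer would drift by hundreds of samples per minute; the feedback holds the fill within a fraction of a sample of the target.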
[0083] A high level block diagram illustrating an example downlink device and base station in a multi-device bidirectional communication system is shown in Figure 6. The system, generally referenced 150, comprises base station 152 in communication with one or more devices 154 over a radio link 180 that make up the WMAS. The base station comprises, inter alia, a master clock 106, clock generation circuit 108, TX circuit 171, RX circuit 160, audio circuit block 114, and synchronization buffer 168. The audio circuit 114 comprises ADC 164 and digital interface circuit 118. The TX circuit comprises framer 112, audio compressor 174 including input buffer 176, and TX packet buffer 178.
[0084] The ADC 164 converts analog audio input 200 to digital format which is fed to the syn-chronization buffer 168. The framer circuit 112 provides synchronization to the devices on the network. The output of the synchronization buffer 168 is input to the compressor input buffer 176 in the audio compressor 174. The output of the audio compressor is input to the TX packet buffer 178.
[0085] The device 72 comprises a local clock source, e.g., temperature controlled crystal oscillator (TCXO) 83, clock gen circuit 80, RX circuit 76, DAC buffer 194, TX circuit 88 including framer 208, and audio circuit 81. The RX circuit 76 comprises an RX packet buffer 186 coupled to audio expander 188 including audio expander output buffer 190. The DAC functions to generate analog audio output signal 202. In the receiver, the frame synchronization circuit 78 derives clock timing from the inbound frames, which is used by the clock gen circuit to synchronize all the clocks in the device to the base station.
[0086] The various delays (i.e. ΔT1 to ΔT7) that contribute to the overall latency are indicated in Figure 6. By keeping the audio system tightly locked to the RF clock, the system minimizes the overall latency. The base station (BS) on the left-hand side functions to serve multiple devices and an example device is shown on the right. The device receives audio in the downlink direction (i.e. from the BS to the device). An example of the device is an in-ear monitor (IEM), whether mono or stereo.
[0087] Both the device and the base station include a receiver and a transmitter, which aid in the clock recovery and locking process. The BS includes a master clock 106 on whose output the rest of the BS clocks are locked (e.g., transmitter clock, DAC clock, receiver clock, console clock, etc.).
[0088] On the device side, the receiver uses a periodic over-the-air temporal signal generated and sent by the base station to generate and lock the receiver clock, transmitter clock and the ADC clock. An example of a periodic over-the-air temporal signal is the frame synchronization signals generated by frame synchronizer circuit 78, transmitted as downlink packets and received in the RX 76.
[0089] Note that the system may either have analog audio input to an ADC 164 or digital audio input 201 to a digital console interface 166, which may contain uncompressed audio signals and optionally a master clock for the entire system to synchronize on.
[0090] Several key characteristics of this system allow for a significant reduction of the overall latency. They include (1) use of a single master clock from which all other clocks are locked and derived; (2) the system is deterministic and contains no changes in the schedule while in ShowTime; and (3) the size of the packets used is an integer multiple of the size of the audio compressor output buffer as well as the audio expander input buffer.
[0091] Note that the core latency of this scheme includes the ADC input buffer 168 duration ΔT1 and DAC input buffer 194 duration ΔT7, and the packet duration and other PHY related delays ΔT3 (e.g., filters, etc.). Since, however, the packet size is an integer multiple of the compressor output buffer (and expander input buffer) size, there is no need for extra buffering. The output blocks generated by the audio compressor 174 are simply inserted into the TX packet buffer 178 and once the last block has been written, the packet is transmitted by the transmitter 110. Conversely, on the device side, once the receiver 76 has received the complete packet, the audio expander 188 starts its operation on the first block therein. This scheme saves a significant round trip latency of typically 0.5-1.0 ms (assuming a compressor buffer size of 0.25-0.5 ms). A summary of the various latencies in the system 150 of Figure 6 is provided below.
CoreLatency ≜ ΔT1 + ΔT3 + ΔT7 (7)

HardwareDependentLatency ≜ ΔT2 + ΔT5 (8)

ModemLatency ≜ ΔT4 (9)

[0092] A flow diagram illustrating an example method of clock synchronization for use in the base station is shown in Figure 11. Clock synchronization in the system is derived from the master clock source in the base station (step 370). A clock generator circuit uses the master clock (or one provided externally) to generate the various clocks used in the system, including an audio clock, all of which are synchronized to the master clock (step 372). The base station generates frames containing audio data and timing derived from the master clock (step 374) which are then transmitted over the WMAS to the wireless devices (step 376).
[0093] A flow diagram illustrating an example method of clock synchronization for use in the wireless audio device is shown in Figure 12. A local clock source is provided in each wireless audio device (step 380). Frames sent from the base station are received over the WMAS at each device (step 382). Clock timing for the device is extracted from the received frames using the techniques shown in Figures 4 and 5 (step 384). The various clock signals required are then generated, including an audio clock synchronized to the clock timing produced by the frame synchronization circuit (step 386).
[0094] A diagram illustrating timing for an example WMAS system in accordance with an embodiment of the present invention is shown in Figure 7. The network in this example embodiment comprises a base station and three microphones. The base station determines the PHY frames. Those frames are recovered using the frame synchronizers in each microphone and are therefore substantially common to all the members of the network. Each frame begins with a downlink transmission from the base station to the devices, used by the frame synchronizers in the microphones to lock onto the PHY frame structure. Following the downlink packets, each microphone transmits its uplink packet in designated and predetermined time slots.
[0095] Each microphone runs its own audio block consisting of the time between the start of transmission of the respective UL packet and the subsequent UL packet. The audio frame duration is identical to the PHY frame duration but time shifted.
[0096] During each audio block, each microphone processes the samples of the captured audio, compresses them and stores them in the TX buffer. To minimize latency, the audio block completes its cycle immediately before the designated TX slot. Therefore, the audio blocks for the various microphones are time shifted with respect to each other.
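The time shifting of the audio blocks can be illustrated with a small slot calculation. The frame and slot durations below are assumptions for illustration only, not values from the specification: each microphone's audio block is scheduled to end right at its designated TX slot, so consecutive microphones' blocks are offset by one uplink slot.

```python
# Illustrative TDMA slot layout (all durations assumed): a downlink packet
# opens each frame, followed by one uplink slot per microphone.

FRAME_MS = 2.0    # PHY frame duration (assumption)
DL_SLOT_MS = 0.5  # downlink packet at the start of each frame (assumption)
UL_SLOT_MS = 0.5  # one uplink slot per microphone (assumption)

def audio_block_start(mic_index, frame_ms=FRAME_MS):
    """Start of a microphone's audio block, relative to the PHY frame.

    The block ends exactly at the microphone's TX slot, so it starts one
    full frame earlier (the audio frame equals the PHY frame, time shifted).
    """
    tx_slot = DL_SLOT_MS + mic_index * UL_SLOT_MS
    return (tx_slot - frame_ms) % frame_ms

# Three microphones: each block is shifted by one UL slot from the previous.
print([audio_block_start(i) for i in range(3)])  # -> [0.5, 1.0, 1.5]
```

Because each block completes immediately before its own TX slot, no microphone's compressed audio waits in the TX buffer longer than necessary.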
[0097] Similarly, in the case of in ear monitors (IEMs) the base station generates multiple audio frames that match the DL transmissions for each device.
[0098] A high level block diagram illustrating an example uplink buffering and clocking scheme is shown in Figure 8. The system, generally referenced 210, comprises base station 74 in communication with one or more devices 72 over a radio link that make up the WMAS. The base station 74 comprises, inter alia, a master clock 106, clock generation circuit 108, RF circuit 270, TX circuit 110, RX circuit 90, and audio circuit block 114. The audio circuit 114 comprises DAC 116 that generates analog audio out 120 and digital interface circuit 118 that receives an optional external master clock 123 and generates an optional output master clock 246 to the clock gen circuit and also generates digital audio out 122. The TX circuit 110 comprises framer 112 and modulator 222 to generate RF samples output to the RF circuit for transmission. The RX circuit 90 comprises demodulator 234 and audio expander 102 and receives RF samples from the RF circuit to generate audio samples output to the DAC.
[0099] The device 72 comprises RF circuit 268, TX circuit 88, RX circuit 76, audio circuit block 81, local clock source (e.g., TCXO) 83, and clock generation circuit 80. The audio circuit 81 comprises ADC 82. The TX circuit 88 comprises modulator 256 and audio compressor 92. The RX circuit 76 comprises demodulator 262 and frame synchronizer 78.
[00100] The ADC functions to convert analog audio in 84 to digital samples which are input to the TX circuit 88. The RF samples output by the TX circuit are input to the RF circuit 268 for transmission. On the receive side, the RF circuit outputs received RF samples to the RX circuit 76 where they are demodulated. The frame synchronizer generates timing from the received frames to synchronize its clocks with the base station master clock. The derived timing is input to the clock generator circuit 80 and used to generate the various clocks in the device including the audio clock.
[00101] The system shown in Figure 8 highlights the clocking scheme for the base station and an uplink device in accordance with the present invention. The master clock 106 in the base station is used to derive and synchronize digital clocks within the entire system. This clock may comprise a local oscillator (e.g., TCXO, etc.) in the base station or optionally can be supplied from the digital interface 118 coupled to a mixing console.
[00102] With reference to Figure 8, the clock generator circuit 108 generates the required clocks including for example TX, RX, and audio clocks. The TX circuit 110 includes a framer 112 and a modulator 222, while the RX circuit 90 includes a demodulator 234 and an audio expander 102. The audio expander outputs digital samples after the expander process to either the DAC 116 in the audio system 114 or a digital interface 118. The base station also includes an RF circuit 270 which converts RF samples from the TX into RF waves and receives RF waves to output RF samples to the RX.
[00103] The uplink device (e.g., wireless microphone, IEM, etc.), shown on the left hand side, includes the receiver RX 76, a transmitter TX 88, an audio subsystem 81 and a clock generator module 80. It is noted that in one embodiment uplink devices have two-way communications for management and synchronization purposes.
[00104] The clock gen module functions to generate the clocks (e.g., PHY clock, audio clock, etc.) for the RX, TX, RF circuit, and audio systems by locking and deriving digital clocks from the frame synchronization in the RX module. The RX includes a demodulator and a frame synchronizer, which locks onto the frame rate and phase using techniques such as packet detection, correlators, PLLs, DLLs, FLLs, etc.

[00105] The TX includes a modulator and an audio compressor, and the audio circuit contains an ADC converting the input analog signals into digital audio samples. Furthermore, the device contains an RF subsystem which is operative to convert RF samples from the TX into RF waves and receive RF waves to output RF samples to the RX.
[00106] A high level block diagram illustrating an example downlink buffering and clocking scheme is shown in Figure 9. The system, generally referenced 280, comprises base station 74 in communication with one or more devices 72 over a radio link that make up the WMAS. The base station 74 comprises, inter alia, a master clock 106, clock generation circuit 108, RF circuit 270, TX circuit 110, RX circuit 90, and audio circuit block 114. The audio circuit comprises ADC 164 that converts analog audio in 200 to digital audio samples and a digital interface 118. The digital interface circuit 118 receives an optional digital audio in signal 201 from a mixing console and generates output audio samples and an optional master clock 246 to the clock gen circuit 108. The TX circuit 110 comprises framer 112, audio compressor 174, and modulator 222 to receive the audio samples and generate RF samples output to the RF circuit for transmission. The RX circuit 90 comprises demodulator 234 that receives RF samples from the RF circuit to generate audio samples output to the DAC (not shown).
[00107] The device 72 comprises RF circuit 268, TX circuit 88, RX circuit 76, audio circuit block 81, local clock source (e.g., TCXO) 83, and clock generation circuit 80. The audio circuit 81 comprises DAC 198. The TX circuit 88 comprises modulator 256 and audio compressor (not shown). The RX circuit 76 comprises demodulator 262, audio expander 188, and frame synchronizer 78.
[00108] The RF samples output of the TX circuit are input to the RF circuit 268 for transmission.
On the receive side, the RF circuit outputs received RF samples to the RX circuit 76 where they are demodulated. The frame synchronizer generates timing (frame sync signal) from the received frames to synchronize its clocks with the base station master clock. The derived timing is input to the clock gen circuit 80 and used to generate the various clocks in the device including the audio clock.
[00109] The system shown in Figure 9 highlights the clocking scheme for the base station and a downlink device (e.g., IEM, etc.) in accordance with the present invention. The master clock 106 in the base station is used to derive and synchronize digital clocks within the entire system. This clock may comprise a local clock source such as an oscillator (e.g., TCXO, etc.) in the base station or optionally can be generated by the digital interface 118 from an input digital audio signal 201 from a mixing console.
[00110] The clock generator circuit 108 generates the required clocks including for example TX, RX, RF, and audio clocks. The TX circuit 110 includes a framer 112, audio compressor 174, and a modulator 222, while the RX circuit 90 includes a demodulator 234. Analog audio in 200 is converted by the ADC to digital audio samples. The base station also includes an RF unit 270 which converts RF samples from the TX into RF waves and receives RF waves to output RF samples to the RX.
[00111] The downlink device (e.g., IEM, etc.), shown on the left hand side, includes the RF circuit 268, receiver RX 76, a transmitter TX 88, an audio subsystem 81 and a clock generator module 80. It is noted that in one embodiment downlink devices have two-way communications for management and synchronization purposes.
[00112] The clock gen module functions to generate the clocks (e.g., PHY clock, audio clock, etc.) for the RX, TX, RF circuit, and audio systems by locking and deriving digital clocks from the frame synchronization in the RX module. The RX includes a demodulator and a frame synchronizer, which locks onto the frame rate and phase using techniques such as packet detection, correlators, PLLs, DLLs, FLLs, etc.

[00113] The TX includes a modulator, and the audio circuit 81 contains a DAC 198 that converts the audio samples output from the audio expander to analog audio out 202. Furthermore, the device contains an RF subsystem which is operative to convert RF samples from the TX into RF waves and receive RF waves to output RF samples to the RX.
[00114] A high level block diagram illustrating an example frame synchronizer is shown in Figure 10. Note that the description provided herein assumes that there is at least one downlink packet in every frame. Without loss of generality, it is assumed that the first packet within a frame is a downlink packet.
[00116] The example frame synchronizer circuit, generally referenced 340, essentially comprises a phase locked loop (PLL) circuit that includes an error detector circuit 342, loop filter 360, a digitally controlled oscillator (DCO) implemented using a mod N counter 362, and comparator 364. The error detector 342 comprises boundary detect/fine placement circuit 344, sample and hold circuit 346, error signal subtractor 350, mux 354, sample and hold circuit 356, and packet end detect 352.
[00117] In operation, the error detector uses the PHY boundary detector and fine placement circuit 344, which functions to detect a precise position within received packets which can vary based on the type of modulation used. The strobe output of this block functions to provide timing for the sample and hold block 346, which samples the output of the DCO (i.e. the output of the mod N counter 362) and therefore holds the counter value at which the boundary detect/fine placement was obtained.
[00118] The target boundary value 348 is expressed as a number indicating the number of samples from the beginning of a packet to the ideal boundary detect point. This number is subtracted from the output of the sample and hold 346 via subtractor 350 to yield the raw error, expressed as a number of samples. This raw error is input to a mux 354, whose output is determined by the 'CRC check OK' signal 358 received from the PHY at the end of the packet. If the CRC check is valid, then the raw error is output from the mux; otherwise a zero is injected (i.e. no correction is input into the loop filter). The mux output is input to another sample and hold 356, which is triggered at the end of the packet 352 since the CRC OK signal is valid only at the end of the packet.
[00119] The error signal 357 is input to the loop filter 360, which can be realized by a bang-bang controller, 1st order loop, 2nd order loop, PID controller, etc. The loop filter outputs a positive number (i.e. advance or increment the counter), a negative number (i.e. retard or decrement the counter), or zero (i.e. NOP or no operation). Thus, the mod N counter is advanced, retarded, or left unchanged depending on the error output. The DCO modulo N counter 362 increments by one each clock and is free running using the system local oscillator. The output of the DCO is compared with zero, and the output of the comparator 364 generates a frame strobe every time the counter resets to zero; this strobe is distributed to the rest of the system, where it is used to derive the various clocks in the device, e.g., audio clock, RF clock, PHY clocks, etc.

[00120] Those skilled in the art will recognize that the boundaries between logic and circuit blocks are merely illustrative and that alternative embodiments may merge logic blocks or circuit elements or impose an alternate decomposition of functionality upon various logic blocks or circuit elements. Thus, it is to be understood that the architectures depicted herein are merely exemplary, and that in fact many other architectures may be implemented which achieve the same functionality.
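To illustrate the loop behavior, the following behavioral simulation uses a bang-bang controller (one of the loop filter options named above) and hypothetical parameters N and TARGET. It shows the phase offset between the free-running mod N counter and the received frame boundary converging toward lock by one tick per frame:

```python
N = 480        # modulo-N counter period in local-clock ticks (hypothetical)
TARGET = 100   # ideal boundary-detect counter value (hypothetical)

def bang_bang(error):
    """Loop filter: +1 to advance the counter, -1 to retard it, 0 for NOP."""
    return (error > 0) - (error < 0)

def simulate(initial_phase, frames):
    """Phase offset (in ticks) between the free-running counter and the
    received frame boundary over successive frames.  Each frame the
    latched counter value is TARGET + phase, so the raw error equals the
    phase; the bang-bang output nudges the counter one tick toward lock.
    (Error wrap-around near +/- N/2 is ignored in this sketch.)"""
    phase = initial_phase
    history = []
    for _ in range(frames):
        error = ((TARGET + phase) % N) - TARGET
        phase -= bang_bang(error)
        history.append(phase)
    return history
```

Starting five ticks late, simulate(5, 8) returns [4, 3, 2, 1, 0, 0, 0, 0]: the counter is retarded one tick per frame until the error is zero, after which the loop outputs NOP and the frame strobe stays aligned.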
[00121] Any arrangement of components to achieve the same functionality is effectively "associated" such that the desired functionality is achieved. Hence, any two components herein combined to achieve a particular functionality may be seen as "associated with" each other such that the desired functionality is achieved, irrespective of architectures or intermediary components. Likewise, any two components so associated can also be viewed as being "operably connected," or "operably coupled," to each other to achieve the desired functionality.
[00122] Furthermore, those skilled in the art will recognize that the boundaries between the above described operations are merely illustrative. Multiple operations may be combined into a single operation, a single operation may be distributed among additional operations, and operations may be executed at least partially overlapping in time. Moreover, alternative embodiments may include multiple instances of a particular operation, and the order of operations may be altered in various other embodiments.
[00123] The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used herein, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms "comprises" and/or "comprising," when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.

[00124] In the claims, any reference signs placed between parentheses shall not be construed as limiting the claim. The use of introductory phrases such as "at least one" and "one or more" in the claims should not be construed to imply that the introduction of another claim element by the indefinite articles "a" or "an" limits any particular claim containing such introduced claim element to inventions containing only one such element, even when the same claim includes the introductory phrases "one or more" or "at least one" and indefinite articles such as "a" or "an." The same holds true for the use of definite articles. Unless stated otherwise, terms such as "first," "second," etc. are used to arbitrarily distinguish between the elements such terms describe. Thus, these terms are not necessarily intended to indicate temporal or other prioritization of such elements. The mere fact that certain measures are recited in mutually different claims does not indicate that a combination of these measures cannot be used to advantage.
[00125] The corresponding structures, materials, acts, and equivalents of all means or step plus function elements in the claims below are intended to include any structure, material, or act for performing the function in combination with other claimed elements as specifically claimed. The description of the present invention has been presented for purposes of illustration and description, but is not intended to be exhaustive or limited to the invention in the form disclosed. As numerous modifications and changes will readily occur to those skilled in the art, it is intended that the invention not be limited to the limited number of embodiments described herein. Accordingly, it will be appreciated that all suitable variations, modifications and equivalents may be resorted to, falling within the spirit and scope of the present invention. The embodiments were chosen and described in order to best explain the principles of the invention and the practical application, and to enable others of ordinary skill in the art to understand the invention for various embodiments with various modifications as are suited to the particular use contemplated.

Claims (1)

What is claimed is:

1. An apparatus for system wide clock synchronization of audio signals for use in a wireless multichannel audio system (WMAS), comprising: at least one base station including: a master clock source; a framer operative to generate frames containing audio data and related audio clock timing derived from said master clock source; a transmitter operative to transmit said frames over said WMAS; a plurality of wireless audio devices, each wireless audio device including: a local clock source; a receiver operative to receive frames from said at least one base station over said WMAS; a frame synchronization circuit operative to generate audio data and related timing from said received frames; a circuit operative to synchronize said local clock source to timing generated by said frame synchronization circuit to generate a synchronized audio clock thereby; and wherein audio clocks in said at least one base station and said plurality of wireless audio devices are synchronized to said master clock source.

2. The apparatus according to claim 1, wherein said master clock source comprises a local oscillator in said base station or a clock signal from a digital interface to an audio console.

3. The apparatus according to claim 1, wherein said local clock source in each device is free running with respect to said master clock source in said base station.

4. The apparatus according to claim 1, wherein said wireless audio device is selected from the group consisting of a microphone or in ear monitor.

5. An apparatus for clock synchronization for use in a multichannel audio system (WMAS), comprising: at least one base station, including: a master clock source; a first clock generator circuit operative to generate a first plurality of clocks including a first audio clock synchronized to said master clock source; a framer operative to generate frames containing audio data and timing derived from said master clock source; a transmitter operative to transmit said frames over said WMAS; a plurality of wireless audio devices, each wireless audio device including: a local clock source; a receiver operative to receive frames from said at least one base station over said WMAS; a frame synchronization circuit operative to generate clock timing from said received frames; and a second clock generator circuit operative to generate a second plurality of clocks including a second audio clock synchronized to clock timing generated by said frame synchronization circuit to provide system wide time synchronization of audio clocks across said WMAS.

6. The apparatus according to claim 5, wherein said master clock source comprises a local oscillator in said base station or a clock signal from a digital interface to an audio console.

7. The apparatus according to claim 5, wherein said local clock source in each device is free running with respect to said master clock source in said base station.

8. The apparatus according to claim 5, wherein all clocks in said WMAS are synchronized to and derived from said master clock source in said base station.

9. The apparatus according to claim 5, further comprising, in each device, a synchronization circuit operative to provide digital feedforward synchronization of an audio clock to frame synchronization clock timing.

10. The apparatus according to claim 5, further comprising, in each device, a synchronization circuit operative to provide analog feedback synchronization of an audio clock to frame synchronization clock timing.

11. The apparatus according to claim 5, wherein said second plurality of clocks include at least one of an audio clock, ADC clock, DAC clock, TX clock, RX clock, and RF clock.

12. The apparatus according to claim 5, wherein said frame synchronization circuit comprises a packet detector circuit, correlator circuit, phase locked loop (PLL) circuit, delay locked loop (DLL) circuit, and/or frequency locked loop (FLL) circuit.

13. A method of clock synchronization for use in a multichannel audio system (WMAS) that includes at least one base station and a plurality of wireless audio devices, the method comprising: in said at least one base station: providing a master clock source; generating a first plurality of clocks including a first audio clock synchronized to said master clock source; generating frames containing audio data and timing derived from said master clock; transmitting said frames over said WMAS; in each wireless audio device: providing a local clock source; receiving frames from said at least one base station over said WMAS; generating clock timing from said received frames; generating a second plurality of clocks including a second audio clock synchronized to clock timing generated by said frame synchronization circuit; and wherein first audio clocks in said at least one base station and second audio clocks in said plurality of wireless audio devices are all synchronized to said master clock source.

14. The method according to claim 13, wherein said master clock source comprises a local oscillator in said base station or a clock signal from a digital interface to an audio console.

15. The method according to claim 13, wherein said local clock source in each device is free running with respect to said master clock source in said base station.

16. The method according to claim 13, wherein all clocks in said WMAS are synchronized to and derived from said master clock source in said base station.

17. The method according to claim 13, further comprising synchronizing, in each device, an audio clock to frame synchronization clock timing using digital feedforward synchronization.

18. The method according to claim 13, further comprising synchronizing, in each device, an audio clock to frame synchronization clock timing using analog feedback synchronization.

19. The method according to claim 13, wherein said second plurality of clocks include at least one of an audio clock, ADC clock, DAC clock, TX clock, RX clock, and RF clock.

20. The method according to claim 13, wherein generating clock timing from said received frames is performed using a packet detector circuit, correlator circuit, phase locked loop (PLL) circuit, delay locked loop (DLL) circuit, and/or frequency locked loop (FLL) circuit.

21. A microphone for use in a multichannel audio system (WMAS), the microphone comprising: a local clock source; a receiver operative to receive frames over said WMAS, said frames containing timing derived from a master clock source in said WMAS; a frame synchronization circuit operative to extract clock timing from said received frames; and a clock generator circuit operative to generate a plurality of clocks including an audio clock synchronized to clock timing generated by said frame synchronization circuit to provide system wide time synchronization of audio clocks across said WMAS.
GB2203234.6A 2022-03-09 2022-03-09 System and method of clock synchronization in a wireless multichannel audio system (WMAS) Pending GB2616444A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
GB2203234.6A GB2616444A (en) 2022-03-09 2022-03-09 System and method of clock synchronization in a wireless multichannel audio system (WMAS)
PCT/IL2023/050242 WO2023170688A1 (en) 2022-03-09 2023-03-08 Clock synchronization and latency reduction in an audio wireless multichannel audio system (wmas)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB2203234.6A GB2616444A (en) 2022-03-09 2022-03-09 System and method of clock synchronization in a wireless multichannel audio system (WMAS)

Publications (2)

Publication Number Publication Date
GB202203234D0 GB202203234D0 (en) 2022-04-20
GB2616444A true GB2616444A (en) 2023-09-13

Family ID=81175355

Family Applications (1)

Application Number Title Priority Date Filing Date
GB2203234.6A Pending GB2616444A (en) 2022-03-09 2022-03-09 System and method of clock synchronization in a wireless multichannel audio system (WMAS)

Country Status (1)

Country Link
GB (1) GB2616444A (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1995910A2 (en) * 2007-05-23 2008-11-26 Broadcom Corporation Synchronization of a split audio, video, or other data stream with separate sinks
US20190089472A1 (en) * 2017-09-18 2019-03-21 Qualcomm Incorporated Audio synchronization over wlan

Also Published As

Publication number Publication date
GB202203234D0 (en) 2022-04-20

Similar Documents

Publication Publication Date Title
US11375312B2 (en) Method, device, loudspeaker equipment and wireless headset for playing audio synchronously
US8149817B2 (en) Systems, apparatus, methods and computer program products for providing ATSC interoperability
FI112144B (en) Audio / video synchronization in a digital broadcast system
US20090298420A1 (en) Apparatus and methods for time synchronization of wireless audio data streams
US8842218B2 (en) Video/audio data output device and method
KR20210032988A (en) Use of broadcast physical layer for one-way time transmission in Coordinated Universal Time to receivers
US8295365B2 (en) Wireless receiver
JP2017005611A (en) Dynamic image decoding device and dynamic image decoding method
WO2015130546A1 (en) Apparatuses and methods for wireless synchronization of multiple multimedia devices using a common timing framework
CA2986568C (en) Reception apparatus and data processing method
KR20090056961A (en) Method of synchronizing two electronic devices of a wirless link, in particular of a mobile telephone network and system for implementing this method
JP2008104074A (en) Terrestrial digital broadcast signal retransmitter
US10615951B2 (en) Transmission apparatus, reception apparatus, and data processing method
US20040125787A1 (en) Method and apparatus for synchronized channel transmission
GB2616444A (en) System and method of clock synchronization in a wireless multichannel audio system (WMAS)
GB2616445A (en) System and method of minimizing latency in a wireless multichannel audio system (WMAS)
WO2023170688A1 (en) Clock synchronization and latency reduction in an audio wireless multichannel audio system (wmas)
JP6943148B2 (en) Broadcast retransmission device, broadcast retransmission method and monitor method
JP2004129009A (en) Streaming transmission device and reception device
US20040235507A1 (en) Radio transmission system
WO2024038450A1 (en) System and method of time, frequency, and spatial diversity in a wireless multichannel audio system (wmas)
JP2008278151A (en) Ts signal transmission delay time adjusting device, its operation method and terrestrial digital broadcast transmission system
JP5857840B2 (en) Encoder and control method
JP2007189584A (en) Digital radio transmission system
JP2005229469A (en) Radio transmitter