GB2226718A - Aligning two audio signals - Google Patents


Publication number
GB2226718A
Authority
GB
United Kingdom
Prior art keywords
signals
time
similarity
aligning
audio signals
Prior art date
Legal status
Granted
Application number
GB8925698A
Other versions
GB8925698D0 (en)
GB2226718B (en)
Inventor
David Graham Kirby
Andrew James Mason
Current Assignee
British Broadcasting Corp
Original Assignee
British Broadcasting Corp
Priority date
Priority to GB888826927A (priority patent GB8826927D0)
Application filed by British Broadcasting Corp
Publication of GB8925698D0
Publication of GB2226718A
Application granted
Publication of GB2226718B
Anticipated expiration
Application status: Expired - Lifetime


Classifications

    • G PHYSICS
    • G11 INFORMATION STORAGE
    • G11B INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B27/00 Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B27/10 Indexing; Addressing; Timing or synchronising; Measuring tape travel
    • G11B27/02 Editing, e.g. varying the order of information signals recorded on, or reproduced from, record carriers
    • G11B27/031 Electronic editing of digitised analogue information signals, e.g. audio or video signals
    • G11B27/034 Electronic editing of digitised analogue information signals, e.g. audio or video signals on discs

Abstract

In a method for aligning two audio signals A and B, e.g. for automatic editing between recordings, repeated measurements are made of the similarity between the two signals and an optimum time offset for aligning the signals is determined. Sample sections of the two signals around the 'out' and 'in' points chosen by the user are output by facility 11, and sub-sections are analysed by Fast Fourier Transform circuits 12, 13 to derive a corresponding series of frequency spectra. Peaks in the correlation function performed at 14 between the two spectra are detected at 15 and, from the positions of the peaks, the best shift to apply to one of signals A, B to bring them into time alignment is deduced at 16. The hardware 12-16 may be replaced by a computer or microprocessor.

Description

ALIGNING TWO AUDIO SIGNALS IN TIME, FOR EDITING

This invention relates to the field of audio recording, and specifically to a method for aligning two audio signals in time, for instance for automating the adjustment of edits between recordings to obtain a high-quality edit without manual intervention.

BACKGROUND OF THE INVENTION

In audio recording work it is frequently necessary to edit material together, for example to remove mistakes or intrusive noises. This is traditionally carried out by locating a suitable point in a first audio recording prior to the error and then finding the matching point in a second recording of the same material. The edit is then carried out between these two points, joining the former to the latter to remove the flawed section of the material.

The perceived quality of the resulting edited audio material is critically dependent on the accuracy with which these two edit points are located.

The timing of the second recording relative to the first at the instant of the edit will determine whether audio material is repeated or lost as the edit is replayed. This will affect the extent to which the edit is imperceptible when replayed.

Traditionally the location of these edit points is carried out by listening to the audio recordings at low speed and identifying the appropriate instants so as to align the two recordings in time.

This is a skilled operation for which considerable experience is required.

SUMMARY OF THE INVENTION

The object of the present invention is to provide a method of aligning two audio recordings in time, such as for the purpose of performing an edit between them, thereby eliminating the need for the manual adjustment of the timing of one recording relative to the other.

The invention is defined in the appended claims to which reference should now be made.

Briefly described in its preferred embodiment, the invention uses a method of comparing the similarity of two audio signals in a multiplicity of frequency bands with varying time offsets between the two signals. The similarity measurements are then used to derive a measurement of the relative timing of the two audio signals, and hence the time offset which must be applied to one of the audio signals to bring it into time alignment with the other.

BRIEF DESCRIPTION OF THE DRAWINGS

In order that the manner in which the foregoing can be understood in detail, a particularly advantageous embodiment thereof will be described with reference to the accompanying drawings, in which:

Figure 1 is a representation of the editing process indicating the necessary time alignment of the two audio recordings and the position of the edit between them;
Figure 2 is a block circuit diagram of apparatus for aligning two audio recordings in time embodying the invention;
Figure 3 is a representation of two audio signals varying with time;
Figure 4 is a representation of the frequency spectra of the two signals varying in time;
Figure 5 is a representation of the power contained in a frequency band of each of the two signals varying with time;
Figure 6 is a representation of the correlation function of the two functions represented in Figure 5;
Figure 7 is a histogram of the positions of the peaks in the correlation functions (one of which is shown in Figure 6) of all the frequency bands of the signals;
Figure 8 is a block hardware diagram of a computer-based embodiment of the invention;
Figure 9 is a flowchart illustrating the processor operations in the system of Figure 8; and
Figure 10 illustrates the fast Fourier transform operation employed.

DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS

Overview.

Consider an edit made to join two overlapping audio recordings together. The first recording contains audio up to a point where, for example, a mistake was made; see Figure 1a. The second contains audio starting at a point before the mistake, continuing on to the end; see Figure 1b. To make the edit, the user marks where he wants to go out of the first recording (the "out point") and where he wants to go into the second (the "in point"); see Figure 1c. The edit is performed by playing material from the first take up to the out point and then material from the second take starting at the in point.

In practice automated edit adjustment may be carried out in accordance with this invention as follows.

The user chooses, say, the out point that he wants. He then roughly positions the in point. The audio samples around both the in point and the out point are then analysed by calculating a correlation function between the two signals. This should indicate where the best match between the two audio signals occurs, and hence the optimum position for the in point. The required adjustment is then either made automatically or indicated to the user.

The automated adjustment can be carried out as follows. A section from each signal (see Figure 3) is divided into blocks of samples. The power spectrum of each of the blocks of samples is then calculated. This produces a series of spectra of the signals at regular time intervals (see Figure 4).

By selecting the same frequency band from each of the spectra, the variation in the power in that frequency band as a function of time is determined (see Figure 5).
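The block-by-block power spectra described above amount to a short-time spectrogram, and the power variation in one band over time is simply one row of it. As a rough illustration (not the patent's own code; the function name, parameters and the test tone are invented for this sketch), in Python with NumPy:

```python
import numpy as np

def band_power_series(signal, block_size=256):
    """Split a signal into consecutive blocks and return the power in
    each frequency band as a function of block (time) index.
    Rows: frequency bands; columns: time blocks."""
    n_blocks = len(signal) // block_size
    blocks = signal[:n_blocks * block_size].reshape(n_blocks, block_size)
    spectra = np.fft.rfft(blocks, axis=1)   # one-sided spectrum per block
    power = np.abs(spectra) ** 2            # power spectrum per block
    return power.T                          # shape: (bands, n_blocks)

# Example: a 1 kHz tone sampled at 48 kHz, 32768 samples as in the text
fs = 48000
t = np.arange(32768) / fs
x = np.sin(2 * np.pi * 1000 * t)
bands = band_power_series(x)
print(bands.shape)  # (129, 128): 129 one-sided bands, 128 time blocks
```

Each row of `bands` is then one of the curves of Figure 5 for its frequency band.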

The correlation function of the temporal variation of the power in a frequency band from one signal with that from the other has a peak. The position of the peak is related to the temporal shift which, when applied to one signal, brings it into time alignment with the other (for the frequency band in question); see Figure 6.

The positions of the peaks of the correlation functions from all the frequency bands are collected together (see Figure 7). The best shift to apply to the audio signals to bring them into time alignment is deduced from this assortment of peak positions.
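This collect-the-peaks procedure can be sketched compactly (illustrative Python/NumPy with invented names; weighting each vote by peak size follows the preference stated later in the description, and mean removal is added so the correlation is not dominated by the DC level):

```python
import numpy as np

def estimate_shift(power_a, power_b):
    """Given the band-power time series of two signals (rows = frequency
    bands, columns = time blocks), correlate each band, collect the peak
    positions, and return the most popular displacement in blocks."""
    n_bands, n_blocks = power_a.shape
    lags = np.arange(-(n_blocks - 1), n_blocks)
    votes = np.zeros(len(lags))
    for band in range(n_bands):
        a = power_a[band] - power_a[band].mean()
        b = power_b[band] - power_b[band].mean()
        corr = np.correlate(a, b, mode="full")
        peak = int(np.argmax(corr))
        # Add the size of the peak rather than 1, so that strong,
        # reliable bands count for more in the histogram.
        votes[peak] += corr[peak]
    return int(lags[np.argmax(votes)])

# Illustration: delay one band-power matrix by 5 blocks
rng = np.random.default_rng(1)
base = rng.random((16, 64))
delayed = np.roll(base, 5, axis=1)
print(estimate_shift(delayed, base))  # 5
```

The returned lag is in blocks; multiplying by the block length gives the shift in samples.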

First Embodiment.

Figure 2 shows a suitable implementation of the invention, including a disc store 10 holding the two audio recordings to be aligned and an editing facility 11 of known type connected to write to and read from the store. The editing facility 11 makes available the two signals A and B to be compared and supplies them to two fast Fourier transform (FFT) circuits 12 and 13 respectively.

Such circuits are commercially available and execute a Fourier transform (or frequency analysis) on the input signal applied thereto. A correlator 14 then compares the outputs of the two FFT circuits, in a manner described below. The output of the correlator is applied to a peak detector 15 which, in conjunction with a peak position analyser 16, determines where the peak lies and hence the amount of temporal adjustment required to align the two recordings.

The system of Figure 2 operates as follows. The editing facility outputs two sections of audio data, one from each of the two signals A and B. Typical signal sections are shown in Figure 3. Typically the sections may be 32k (32768) samples long, sampled at a rate of 48 kHz, corresponding to two-thirds of a second in duration. The sampling will typically be to 16-bit accuracy. Each 32k-sample section is then divided in time into 128 blocks, each of 256 samples.

The FFT circuits 12, 13 then perform fast Fourier transforms on each of the 128 blocks of each of the two signals to provide a frequency spectrum for each block. Each frequency spectrum will be defined by a block of 128 samples. Figure 4 is a three-dimensional diagram illustrating the two frequency spectra for two typical signals. For each time period corresponding to the duration of one block of the signal, the diagram provides a plot of power against frequency. Most of the power is at relatively low frequencies, though for signal 1 (as it is here labelled) there is a notable power component at a relatively high frequency. Figure 4 thus represents the inputs to the correlator 14.

The correlator 14 calculates the correlation function of the temporal variation in each frequency band of one signal with the corresponding variation derived from the other signal. Each spectrum is stored as a 128-word block, and is of the form shown in Figure 5. The power in the first spectral component of each block thus provides a measure of the temporal variation in power for that spectral component, and similarly for the subsequent spectral frequency components. The correlation function of these two temporal variations in power content of the first spectral band is calculated to find out where the variations in power are most alike in the two signals. The correlation function produced is of the type shown in Figure 6. This correlation is carried out by further FFT circuits within the correlator 14. Such correlation is carried out in time for all the spectral frequency components.

The correlation function is:

F.T. { [ F(w) ] [ G*(w) ] }

where F.T. denotes the Fourier transform, the asterisk * denotes the complex conjugate, and F(w) and G(w) are the Fourier transforms of the two time series, i.e. the outputs of the circuits 12 and 13.
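That frequency-domain recipe can be sketched as follows (illustrative Python/NumPy; the final step is strictly an inverse transform, and zero-padding is added so the result matches an ordinary linear correlation rather than a circular one):

```python
import numpy as np

def fft_correlate(f, g):
    """Cross-correlate two real series via the frequency domain.
    The result is ordered like np.correlate(f, g, mode="full")."""
    n = len(f) + len(g) - 1            # pad to avoid circular wrap-around
    F = np.fft.rfft(f, n)
    G = np.fft.rfft(g, n)
    corr = np.fft.irfft(F * np.conj(G), n)
    # Rotate so index 0 corresponds to the most negative lag.
    return np.roll(corr, len(g) - 1)
```

Doing the correlation this way is what makes repeating it for all 128 frequency bands cheap, since each correlation costs only a few FFTs.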

During the correlation process it can be beneficial to apply weighting to the functions being correlated. This may be done while the data is in the frequency domain, i.e. after the Fourier transforms of the two functions have been calculated and one has been multiplied by the conjugate of the other, but before the inverse Fourier transform is performed.

An example of such a weighting function is the magnitude squared coherence spectrum, which can be considered to be a measure of how much the spectral components of one function are consistent with those of the other function. The spectra of the functions would be divided into segment pairs and the magnitude squared coherence spectrum calculated as follows:

mscs(w) = | Σ Fi(w) Gi*(w) |^2 / ( [ Σ |Fi(w)|^2 ] [ Σ |Gi(w)|^2 ] )

where the spectra F(w) and G(w) have been divided into n segment pairs F0, F1, F2, ..., Fn and G0, G1, G2, ..., Gn, and each sum runs over the n segment pairs.

Less relevant components of the functions being correlated can be subdued by multiplying the frequency-domain data by the magnitude squared coherence spectrum.
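A hedged sketch of the coherence estimate (illustrative Python/NumPy; the segment count, the small denominator guard and all names are assumptions of this sketch, not the patent's):

```python
import numpy as np

def msc_weight(f, g, n_segments=8):
    """Estimate the magnitude squared coherence spectrum of two real
    series by averaging cross- and auto-spectra over segment pairs.
    Values lie between 0 (inconsistent) and 1 (fully coherent)."""
    seg = len(f) // n_segments
    F = np.fft.rfft(np.reshape(f[:n_segments * seg], (n_segments, seg)), axis=1)
    G = np.fft.rfft(np.reshape(g[:n_segments * seg], (n_segments, seg)), axis=1)
    cross = (F * np.conj(G)).mean(axis=0)
    # Small constant guards against division by zero in empty bands.
    return np.abs(cross) ** 2 / ((np.abs(F) ** 2).mean(axis=0) *
                                 (np.abs(G) ** 2).mean(axis=0) + 1e-30)

# Identical signals are fully coherent, so the weight is ~1 everywhere
x = np.random.default_rng(3).standard_normal(256)
w = msc_weight(x, x)
print(w.min() > 0.999)  # True
```

Multiplying the frequency-domain product F(w)G*(w) by this weight before the inverse transform suppresses the bands where the two signals disagree.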

The correlation function would then be modified to:

F.T. { [ F(w) ] [ G*(w) ] [ mscs(w) ] }

The position of the peak in the correlation function of the two arrays shows by how much one array should be temporally offset relative to the other so that they fit best.

In this way 128 plots of correlation against displacement are obtained, one for each frequency band. The peak detector 15 detects the peak of each of these plots. Thus 128 such peak values are obtained, and these are "plotted" as a histogram in the peak position analyser 16 showing how many times a peak occurs at each displacement, as shown in Figure 7. This analyser thus determines the most "popular" displacement of the 128 values obtained for the different frequency bands, this value being used as the required displacement value.

Preferably, rather than just increasing the value in the histogram by one if a peak is found, the size of the peak is added.

The size of the peak depends on the original signal amplitude.

Additionally, weighting the different spectral bands may be available as an option to the user, and the range of frequency bands may also be definable by the user. In any event the peak in the resultant histogram is used as the shift required to bring the signals into time alignment for all the frequency bands considered.

The standard deviation of the peak positions plotted on the histogram may be calculated to act as a confidence indicator.
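The confidence indicator could be as simple as the following (an invented illustration of the idea, not the patent's implementation): a small spread of per-band peak positions means the bands agree and the estimated shift can be trusted.

```python
import numpy as np

def alignment_confidence(peak_lags):
    """Standard deviation of the per-band correlation peak positions.
    Small values indicate the frequency bands agree on the shift."""
    return float(np.std(peak_lags))

print(alignment_confidence([5, 5, 5, 5]))  # 0.0 (perfect agreement)
```

A threshold on this value could then decide whether to apply the shift automatically or refer it back to the user.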

Second Embodiment.

In practice it is convenient to implement the method in a computer or microprocessor as shown in Figure 8 where the special purpose hardware of Figure 2 is replaced by a signal processor 20.

The processor 20, which may be a Motorola DSP56000, operates in accordance with a program which is summarised in the flow chart of Figure 9, and which will essentially be self-explanatory in view of the description of the first embodiment.

It is particularly convenient to undertake the fast Fourier transforms, required first to produce the frequency spectrum and then in the correlation operation, in the following way. In this method two FFTs can be calculated at the same time when the two signals requiring transforming are both entirely real. One signal is put into the real part of the elements of an array of complex numbers, the other into the imaginary part. When the FFT is performed in place on this array it produces an array containing the complex spectra of both of the signals. The two separate, complex spectra can be extracted from the array, since the two original signals were entirely real.

Figures 10(a) and (b) show examples of two "real" time series and their Fourier transforms (real and imaginary parts).

Figure 10(c) shows the effect of interchanging the real and imaginary parts of a "real" signal.

Figure 10(d) shows the effect of adding one "real" signal, as it is, to the other "real" signal with its real and imaginary components interchanged.

A priori knowledge that the real part of the Fourier transform of a purely real signal is even, and that its imaginary part is odd, makes it possible to extract the real and imaginary parts of the Fourier transforms of the two original "real" signals as follows:

Re[ F(u) ] = ½ ( Re[ F.T.{f(t) + i.g(t)} ] + Re[ F.T.{f(-t) + i.g(-t)} ] )
Im[ F(u) ] = ½ ( Im[ F.T.{f(t) + i.g(t)} ] - Im[ F.T.{f(-t) + i.g(-t)} ] )
Re[ G(u) ] = ½ ( Im[ F.T.{f(t) + i.g(t)} ] + Im[ F.T.{f(-t) + i.g(-t)} ] )
Im[ G(u) ] = -½ ( Re[ F.T.{f(t) + i.g(t)} ] - Re[ F.T.{f(-t) + i.g(-t)} ] )

where F(u) and G(u) denote the Fourier transforms of f(t) and g(t) respectively, F.T.{f(-t) + i.g(-t)} is the transform of the time-reversed array (equal to the original transform evaluated at -u), and Re[x] and Im[x] denote the real and imaginary parts of a complex number x respectively.
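The packing trick and the extraction can be sketched as follows (illustrative Python/NumPy; here the time-reversed transform F.T.{f(-t) + i.g(-t)} is obtained as the frequency-reversed array Z(-u)):

```python
import numpy as np

def two_real_ffts(f, g):
    """Transform two real arrays with a single complex FFT by packing
    one into the real part and the other into the imaginary part."""
    z = np.fft.fft(np.asarray(f) + 1j * np.asarray(g))
    z_rev = np.roll(z[::-1], 1)      # Z(-u): index u -> (-u) mod N
    F = 0.5 * (z + np.conj(z_rev))   # symmetry isolates the spectrum of f
    G = -0.5j * (z - np.conj(z_rev)) # and the spectrum of g
    return F, G

# The recovered spectra match two separately computed FFTs
rng = np.random.default_rng(5)
f, g = rng.standard_normal(64), rng.standard_normal(64)
F, G = two_real_ffts(f, g)
print(np.allclose(F, np.fft.fft(f)) and np.allclose(G, np.fft.fft(g)))  # True
```

One complex FFT plus a little arithmetic thus replaces two FFTs, which halves the dominant cost of the method.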

The methods described, which involve splitting the signals into frequency bands and determining their similarities in the separate frequency bands, have been found to be particularly successful in reliably identifying the amount of displacement required to bring the two signals into alignment.

Claims (15)

1. A method for aligning two audio signals in time, comprising the steps of: determining the similarity of the two signals for varying time offsets between them, and deriving from the similarity measurements an optimum time offset to bring the signals into time alignment.
2. A method according to claim 1, in which the similarity of the audio signals is measured in a multiplicity of frequency bands, and the multiplicity of similarity measurements is processed to provide a preferred time offset.
3. A method according to claims 1 or 2, in which when making similarity measurements between any two signals the coherence between those two signals is used to weight the similarity measurements of the signals.
4. A method according to any preceding claim, in which the power of the harmonics in the various frequency bands is used to weight the similarity measurements.
5. A method according to any of claims 2 to 4, in which the similarity measurements are weighted according to frequency band.
6. A method according to claim 1, in which the signals are divided into blocks, the frequency spectrum of each block is determined, the power variation with time is determined for each of a plurality of frequency bands, the power variation of the two signals is correlated for each frequency band, the peak of each correlation function is determined, and the peak value of the peaks thus obtained is determined to provide a desired offset.
7. A method of aligning two audio signals in time, substantially as herein described with reference to the drawings.
8. Apparatus for aligning two audio signals in time, comprising: means for determining the similarity of the two signals for varying time offsets between them, and means for deriving from the similarity measurements an optimum time offset.
9. Apparatus according to claim 8, in which the similarity of the audio signals is measured in a multiplicity of frequency bands, and the multiplicity of similarity measurements is processed to provide a preferred time offset.
10. Apparatus according to claim 9, in which the power of the components in the various frequency bands is used to weight the similarity measurements.
11. Apparatus according to claim 9 or 10, in which the similarity measurements are weighted according to frequency band.
12. Apparatus according to claim 8, in which the signals are divided into blocks, and including means for determining the frequency spectrum of each block, means for determining the power variation with time for each of a plurality of frequency bands, means for correlating the power variation of the two signals for each frequency band, means for determining the peak of each correlation function, and means for determining the peak value of the peaks thus obtained to provide a desired offset.
13. Apparatus for aligning two audio signals in time, substantially as herein described with reference to the drawings.
14. A method of editing audio signals, including aligning the audio signals by a method in accordance with any of claims 1 to 7.
15. Audio signal editing apparatus, including apparatus for aligning two audio signals in accordance with any of claims 8 to 13.
GB8925698A 1988-11-17 1989-11-14 Aligning two audio signals in time,for editing Expired - Lifetime GB2226718B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
GB888826927A GB8826927D0 (en) 1988-11-17 1988-11-17 Aligning two audio signals in time for editing

Publications (3)

Publication Number Publication Date
GB8925698D0 (en) 1990-01-04
GB2226718A (en) 1990-07-04
GB2226718B GB2226718B (en) 1992-09-23

Family

ID=10647025

Family Applications (2)

Application Number Title Priority Date Filing Date
GB888826927A Pending GB8826927D0 (en) 1988-11-17 1988-11-17 Aligning two audio signals in time for editing
GB8925698A Expired - Lifetime GB2226718B (en) 1988-11-17 1989-11-14 Aligning two audio signals in time,for editing

Family Applications Before (1)

Application Number Title Priority Date Filing Date
GB888826927A Pending GB8826927D0 (en) 1988-11-17 1988-11-17 Aligning two audio signals in time for editing

Country Status (1)

Country Link
GB (2) GB8826927D0 (en)


Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2020080A (en) * 1978-04-27 1979-11-07 Mitsubishi Electric Corp Editing system for PCM signals


Cited By (23)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR2683415A1 (en) * 1991-10-30 1993-05-07 Telediffusion Fse Video analysis system for mounting a diffused or recorded television program and use thereof for post production techniques, in particular multilingual.
US5339166A (en) * 1991-10-30 1994-08-16 Telediffusion De France Motion-dependent image classification for editing purposes
EP0540403A1 (en) * 1991-10-30 1993-05-05 Etablissement Public Télédiffusion de France Video analysis system for editing a broadcasted or recorded television program and its use for post-production techniques, particularly multilingual
WO1993017427A1 (en) * 1992-02-24 1993-09-02 Thorn Emi Plc Aligning a given signal with a corresponding reference signal
US5623431A (en) * 1992-02-24 1997-04-22 Central Research Laboratories Limited Aligning a given signal with a corresponding reference signal
US5790236A (en) * 1994-05-12 1998-08-04 Elop Electronics Industries Ltd. Movie processing system
US6574349B1 (en) * 1998-11-17 2003-06-03 Koninklijke Philips Electronics N.V. Embedding and extracting supplemental data in an information signal
US7076316B2 (en) * 2001-02-02 2006-07-11 Nortel Networks Limited Method and apparatus for controlling an operative setting of a communications link
EP1387514A3 (en) * 2002-07-31 2008-12-10 British Broadcasting Corporation Signal comparison method and apparatus
GB2391322B (en) * 2002-07-31 2005-12-14 British Broadcasting Corp Signal comparison method and apparatus
GB2391322A (en) * 2002-07-31 2004-02-04 British Broadcasting Corp Signal comparison method using correlation
EP1387514A2 (en) * 2002-07-31 2004-02-04 British Broadcasting Corporation Signal comparison method and apparatus
US8280661B2 (en) * 2005-09-08 2012-10-02 The Mathworks, Inc. Alignment of mass spectrometry data
US20080243407A1 (en) * 2005-09-08 2008-10-02 The Mathworks, Inc. Alignment of mass spectrometry data
EP2230666A3 (en) * 2009-02-25 2012-07-11 Magix AG System and method for synchronized multi-track editing
EP2774391A4 (en) * 2011-10-31 2016-01-20 Nokia Technologies Oy Audio scene rendering by aligning series of time-varying feature data
EP2917852A4 (en) * 2012-11-12 2016-07-13 Nokia Technologies Oy A shared audio scene apparatus
EP2846330A1 (en) * 2013-09-06 2015-03-11 Immersion Corporation Haptic warping system that transforms a haptic signal into a collection of vibrotactile haptic effect patterns
US9158379B2 (en) 2013-09-06 2015-10-13 Immersion Corporation Haptic warping system that transforms a haptic signal into a collection of vibrotactile haptic effect patterns
US9245429B2 (en) 2013-09-06 2016-01-26 Immersion Corporation Haptic warping system
US9454881B2 (en) 2013-09-06 2016-09-27 Immersion Corporation Haptic warping system
US9508236B2 (en) 2013-09-06 2016-11-29 Immersion Corporation Haptic warping system that transforms a haptic signal into a collection of vibrotactile haptic effect patterns
WO2019002179A1 (en) * 2017-06-27 2019-01-03 Dolby International Ab Hybrid audio signal synchronization based on cross-correlation and attack analysis

Also Published As

Publication number Publication date
GB8826927D0 (en) 1988-12-21
GB2226718B (en) 1992-09-23
GB8925698D0 (en) 1990-01-04


Legal Events

Date Code Title Description
PE20 Patent expired after termination of 20 years

Expiry date: 20091113