EP1845519B1 - Encoding and decoding of multi-channel audio signals based on a main and side signal representation - Google Patents

Info

Publication number
EP1845519B1
EP1845519B1 (application EP07109801A)
Authority
EP
European Patent Office
Prior art keywords
signal
mono
decoded
residual
encoding
Prior art date
Legal status
Active
Application number
EP07109801A
Other languages
German (de)
English (en)
French (fr)
Other versions
EP1845519A2 (en)
EP1845519A3 (en)
Inventor
Ingemar Johansson
Anisse Taleb
Stefan Bruhn
Daniel ENSTRÖM
Current Assignee
Telefonaktiebolaget LM Ericsson AB
Original Assignee
Telefonaktiebolaget LM Ericsson AB
Priority date
Filing date
Publication date
Priority claimed from SE0303501A (external priority SE0303501D0)
Application filed by Telefonaktiebolaget LM Ericsson AB filed Critical Telefonaktiebolaget LM Ericsson AB
Publication of EP1845519A2
Publication of EP1845519A3
Application granted
Publication of EP1845519B1
Status: Active


Classifications

    • G: PHYSICS
    • G10: MUSICAL INSTRUMENTS; ACOUSTICS
    • G10L: SPEECH ANALYSIS TECHNIQUES OR SPEECH SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING TECHNIQUES; SPEECH OR AUDIO CODING OR DECODING
    • G10L19/00: Speech or audio signals analysis-synthesis techniques for redundancy reduction, e.g. in vocoders; Coding or decoding of speech or audio signals, using source filter models or psychoacoustic analysis
    • G10L19/008: Multichannel audio signal coding or decoding using interchannel correlation to reduce redundancy, e.g. joint-stereo, intensity-coding or matrixing
    • G10L19/04: Speech or audio signals analysis-synthesis techniques for redundancy reduction using predictive techniques
    • G10L19/16: Vocoder architecture
    • G10L19/18: Vocoders using multiple modes

Definitions

  • the present invention relates in general to encoding of audio signals, and in particular to encoding of multi-channel audio signals.
  • One basic way of stereophonic or multi-channel coding of audio signals is to encode the signals of the different channels separately as individual and independent signals.
  • Another basic way used in stereo FM radio transmission and which ensures compatibility with legacy mono radio receivers is to transmit a sum and a difference signal of the two involved channels.
  • M/S stereo coding is similar to the described procedure in stereo FM radio, in a sense that it encodes and transmits the sum and difference signals of the channel sub-bands and thereby exploits redundancy between the channel sub-bands.
  • the structure and operation of an encoder based on M/S stereo coding is described, e.g. in US patent 5,285,498 by J.D. Johnston .
  • Intensity stereo, on the other hand, is able to make use of stereo irrelevancy. It transmits the joint intensity of the channels (of the different sub-bands) along with some location information indicating how the intensity is distributed among the channels. Intensity stereo provides only spectral magnitude information of the channels; phase information is not conveyed. For this reason, and since the temporal inter-channel information (more specifically the inter-channel time difference) is of major psycho-acoustical relevance particularly at lower frequencies, intensity stereo can only be used at high frequencies above e.g. 2 kHz. An intensity stereo coding method is described, e.g., in the European patent 0497413 by R. Veldhuis et al.
  • A recently developed stereo coding method is described, e.g., in a conference paper with the title "Binaural cue coding applied to stereo and multi-channel audio compression", 112th AES Convention, May 2002, Munich, Germany, by C. Faller et al.
  • This method is a parametric multi-channel audio coding method.
  • the basic principle is that at the encoding side, the input signals from N channels c 1 , c 2 , ... c N are combined to one mono signal m.
  • the mono signal is audio encoded using any conventional monophonic audio codec.
  • parameters are derived from the channel signals, which describe the multi-channel image.
  • the parameters are encoded and transmitted to the decoder, along with the audio bit stream.
  • the decoder first decodes the mono signal m' and then regenerates the channel signals c 1 ', c 2 ',..., c N ', based on the parametric description of the multi-channel image.
  • the principle of the Binaural Cue Coding (BCC) method is that it transmits the encoded mono signal and so-called BCC parameters.
  • the BCC parameters comprise coded inter-channel level differences and inter-channel time differences for sub-bands of the original multi-channel input signal.
  • the decoder regenerates the different channel signals by applying sub-band-wise level and phase adjustments of the mono signal based on the BCC parameters.
  • An advantage of this technique compared to M/S or intensity stereo is that stereo information comprising temporal inter-channel information is transmitted at much lower bit rates.
  • However, this technique requires computationally demanding time-frequency transforms on each of the channels, both at the encoder and the decoder.
  • BCC does not handle the fact that a lot of the stereo information, especially at low frequencies, is diffuse, i.e. it does not come from any specific direction. Diffuse sound fields exist in both channels of a stereo recording, but they are to a great extent out of phase with respect to each other. If an algorithm such as BCC is applied to recordings with a great amount of diffuse sound fields, the reproduced stereo image will become confused, jumping from left to right, as the BCC algorithm can only pan the signal in specific frequency bands to the left or right.
  • a possible means to encode the stereo signal and ensure good reproduction of diffuse sound fields is to use an encoding scheme very similar to the technique used in FM stereo radio broadcast, namely to encode the mono (Left+Right) and the difference (Left-Right) signals separately.
  • A technique described in US patent 5,434,948 by C.E. Holt et al. uses a technique similar to BCC for encoding the mono signal and side information.
  • side information consists of predictor filters and optionally a residual signal.
  • The predictor filters, estimated by a least-mean-square algorithm, allow prediction of the multi-channel audio signals when applied to the mono signal. With this technique one is able to reach very low bit rate encoding of multi-channel audio sources, however at the expense of a quality drop, discussed further below.
  • This technique synthesises the right and left channel signals by filtering sound source signals with so-called head-related filters.
  • this technique requires the different sound source signals to be separated and can thus not generally be applied for stereo or multi-channel coding.
  • In Figs. 7a-b, diagrams illustrate such an artefact.
  • Consider a signal component having the time development shown by curve 100.
  • Before time t1, the signal component is not present in the audio sample; at t1, it suddenly appears.
  • If the signal component is encoded using a frame length of t2-t1, its occurrence will be "smeared out" over the entire frame, as indicated by curve 101.
  • The signal component then appears a time Δt before its intended appearance, and a "pre-echo" is perceived.
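The smearing described above can be illustrated numerically. The sketch below is illustrative only: the frame length, the onset position and the "codec" (keeping only the strongest DFT coefficients of the whole frame) are stand-ins, not taken from the patent. It shows that a frame-wise spectral coding of a sudden onset places reconstructed energy before the onset, i.e. a pre-echo.

```python
import numpy as np

# A transient that starts mid-frame (curve 100): zero before the onset,
# then a sudden signal component.
frame_len = 256            # corresponds to t2 - t1 in the figure
onset = 128                # sample index where the component appears (t1)
signal = np.zeros(frame_len)
signal[onset:] = 1.0

# Stand-in low-rate codec: keep only the K strongest DFT coefficients of
# the whole frame, discarding all temporal detail within the frame.
K = 8
spectrum = np.fft.rfft(signal)
weakest = np.argsort(np.abs(spectrum))[:-K]    # all but the K strongest bins
spectrum[weakest] = 0.0
decoded = np.fft.irfft(spectrum, n=frame_len)  # smeared frame (curve 101)

# The original frame is silent before the onset, but the decoded frame
# carries energy there: the perceived "pre-echo".
pre_echo_energy = float(np.sum(decoded[:onset] ** 2))
```

Because the whole frame is represented by a few frequency-domain coefficients, the reconstruction rings ahead of the discontinuity, which is exactly the Δt advance perceived as pre-echo.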
  • An object of the present invention is therefore to provide an encoding method and device improving the perception quality of multi-channel audio signals, in particular to avoid artefacts such as pre-echoing.
  • a further object of the present invention is to provide an encoding method and device requiring less processing power and having more constant transmission bit rate requirements.
  • a method of encoding multi-channel audio signals comprises generating of a first output signal, being encoding parameters representing a main signal.
  • the main signal is a first linear combination of signals of at least a first and a second channel.
  • the method further comprises generating of a second output signal, being encoding parameters representing a side signal.
  • the side signal is a second linear combination of signals of at least the first and the second channel within an encoding frame.
  • the method is characterised in that the generating of the second output signal further comprises scaling of the side signal to an energy contour of the main signal.
  • a method of decoding multi-channel audio signals comprises generating of a decoded main signal from encoding parameters representing a main signal.
  • the main signal is a first linear combination of signals of at least a first and a second channel.
  • the method further comprises generating of a decoded side signal from encoding parameters representing a side signal.
  • the side signal is a second linear combination of signals of at least a first and a second channel, within an encoding frame and is scaled to an energy contour of the main signal.
  • the method further comprises combining of at least the decoded main signal and the decoded side signal into signals of at least the first and the second channel.
  • the method is characterised in that the generating of a decoded side signal further comprises inverse scaling of the decoded side signal to an energy contour of the decoded main signal.
  • an encoder apparatus comprises input means for multi-channel audio signals comprising at least a first and a second channel.
  • the encoder apparatus comprises means for generating a first output signal, being encoding parameters representing a main signal.
  • the main signal is a first linear combination of signals of at least the first and the second channel.
  • the encoder apparatus further comprises means for generating a second output signal, being encoding parameters representing a side signal.
  • the side signal is a second linear combination of signals of at least the first and the second channel, within an encoding frame.
  • the encoder apparatus further comprises output means.
  • the encoder apparatus is characterised in that the means for generating a second output signal further comprises means for scaling the side signal to an energy contour of the main signal.
  • a decoder apparatus comprises input means for encoding parameters representing a main signal and encoding parameters representing a side signal.
  • the main signal is a first linear combination of a first and a second channel.
  • the side signal is a second linear combination of a first and a second channel and scaled to an energy contour of the main signal.
  • the decoder apparatus further comprises means for generating a decoded main signal from the encoding parameters representing the main signal and means for generating a decoded side signal from the encoding parameters representing the side signal within an encoding frame.
  • the decoder apparatus further comprises means for combining at least the decoded main signal and the decoded side signal into signals of at least a first and a second channel, and output means.
  • the decoder apparatus is characterised in that the means for generating a decoded side signal in turn comprises means for inverse scaling the decoded side signal to an energy contour of the decoded main signal.
  • an audio system comprises at least one of an encoder apparatus according to the third aspect and a decoder apparatus according to the fourth aspect.
  • the main advantage with the present invention is that the preservation of the perception of the audio signals is improved. Furthermore, the present invention still allows multi-channel signal transmission at very low bit rates.
  • FIG. 1 illustrates a typical system 1, in which the present invention advantageously can be utilised.
  • a transmitter 10 comprises an antenna 12 including associated hardware and software to be able to transmit radio signals 5 to a receiver 20.
  • the transmitter 10 comprises among other parts a multi-channel encoder 14, which transforms signals of a number of input channels 16 into output signals suitable for radio transmission. Examples of suitable multi-channel encoders 14 are described in detail further below.
  • the signals of the input channels 16 can be provided from e.g. an audio signal storage 18, such as a data file of digital representation of audio recordings, magnetic tape or vinyl disc recordings of audio etc.
  • The signals of the input channels 16 can also be provided "live", e.g. from a set of microphones 19.
  • the audio signals are digitised, if not already in digital form, before entering the multi-channel encoder 14.
  • an antenna 22 with associated hardware and software handles the actual reception of radio signals 5 representing polyphonic audio signals.
  • typical functionalities such as e.g. error correction, are performed.
  • a decoder 24 decodes the received radio signals 5 and transforms the audio data carried thereby into signals of a number of output channels 26.
  • the output signals can be provided to e.g. loudspeakers 29 for immediate presentation, or can be stored in an audio signal storage 28 of any kind.
  • the system 1 can for instance be a phone conference system, a system for supplying audio services or other audio applications.
  • In e.g. a phone conference system, the communication has to be of a duplex type, while e.g. distribution of music from a service provider to a subscriber can be essentially of a one-way type.
  • the transmission of signals from the transmitter 10 to the receiver 20 can also be performed by any other means, e.g. by different kinds of electromagnetic waves, cables or fibres as well as combinations thereof.
  • Fig. 2a illustrates an embodiment of an encoder.
  • the polyphonic signal is a stereo signal comprising two channels a and b, received at input 16A and 16B, respectively.
  • the signals of channel a and b are provided to a pre-processing unit 32, where different signal conditioning procedures may be performed.
  • the (perhaps modified) signals from the output of the pre-processing unit 32 are summed in an addition unit 34.
  • This addition unit 34 also divides the sum by a factor of two.
  • the signal x mono produced in this way is a main signal of the stereo signals, since it basically comprises all data from both channels. In this embodiment the main signal thus represents a pure "mono" signal.
  • the main signal x mono is provided to a main signal encoder unit 38, which encodes the main signal according to any suitable encoding principles. Such principles are available within prior-art and are thus not further discussed here.
  • the main signal encoder unit 38 gives an output signal p mono , being encoding parameters representing a main signal.
  • Similarly, a difference (divided by a factor of two) of the channel signals is provided by a subtraction unit 36 as a side signal x side .
  • the side signal represents the difference between the two channels in the stereo signal.
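The main/side decomposition performed by the addition and subtraction units can be sketched as follows; the channel samples are made up for illustration and are not from the patent.

```python
import numpy as np

# Main ("mono") signal: sum of the channels divided by two (addition unit 34).
# Side signal: difference of the channels divided by two.
a = np.array([0.5, 0.7, -0.2, 0.1])   # channel a samples (illustrative)
b = np.array([0.4, 0.6, -0.1, 0.3])   # channel b samples (illustrative)

x_mono = (a + b) / 2.0
x_side = (a - b) / 2.0

# The decomposition is exactly invertible, which is what the decoder's
# addition and subtraction units rely on:
assert np.allclose(x_mono + x_side, a)
assert np.allclose(x_mono - x_side, b)
```

The invertibility is why the decoder can regenerate channels a and b by simply adding and subtracting the decoded main and side signals.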
  • the side signal x side is provided to a side signal encoding unit 30. Preferred embodiments of the side signal encoding unit 30 will be discussed further below.
  • the side signal x side is transferred into encoding parameters p side representing a side signal x side .
  • this encoding takes place utilising also information of the main signal x mono .
  • the arrow 42 indicates such a provision, where the original uncoded main signal x mono is utilised.
  • the main signal information that is used in the side signal encoding unit 30 can be deduced from the encoding parameters p mono representing the main signal, as indicated by the broken line 44.
  • The encoding parameters p mono representing the main signal x mono constitute a first output signal.
  • The encoding parameters p side representing the side signal x side constitute a second output signal.
  • these two output signals p mono , p side together representing the full stereo sound, are multiplexed into one transmission signal 52 in a multiplexor unit 40.
  • the transmission of the first and second output signals p mono , p side may take place separately.
  • a decoder 24 is illustrated as a block scheme.
  • the received signal 54 comprising encoding parameters representing the main and side signal information are provided to a demultiplexor unit 56, which separates a first and second input signal, respectively.
  • the first input signal corresponding to encoding parameters p mono of a main signal, is provided to a main signal decoder unit 64.
  • The encoding parameters p mono representing the main signal are used to generate a decoded main signal x" mono , being as similar as possible to the main signal x mono of the encoder 14 ( Fig. 2a ).
  • The second input signal, corresponding to encoding parameters p side of the side signal, is used to recover a decoded side signal x" side .
  • the decoding procedure utilises information about the main signal x" mono , as indicated by an arrow.
  • the decoded main and side signals x" mono , x" side are provided to an addition unit 70, which provides an output signal that is a representation of the original signal of channel a.
  • a difference provided by a subtraction unit 68 provides an output signal that is a representation of the original signal of channel b.
  • These channel signals may be post-processed in a post-processor unit 74 according to prior-art signal processing procedures.
  • the channel signals a and b are provided at the outputs 26A and 26B of the decoder.
  • Encoding is typically performed one frame at a time.
  • a frame comprises audio samples within a pre-defined time period.
  • a frame SF2 of time duration L is illustrated.
  • the audio samples within the unhatched portion are to be encoded together.
  • the preceding samples and the subsequent samples are encoded in other frames.
  • the division of the samples into frames will in any case introduce some discontinuities at the frame borders. Shifting sounds will give shifting encoding parameters, changing basically at each frame border. This will give rise to perceptible errors.
  • One way to compensate somewhat for this is to base the encoding, not only on the samples that are to be encoded, but also on samples in the absolute vicinity of the frame, as indicated by the hatched portions.
  • interpolation techniques are sometimes also utilised for reducing perception artefacts caused by frame borders.
  • However, all such procedures require large additional computational resources, and for certain specific encoding techniques it might also be difficult to provide such compensation at all.
  • the audio perception will be improved by using a frame length for encoding of the side signal that is dependent on the present signal content. Since the influence of different frame lengths on the audio perception will differ depending on the nature of the sound to be encoded, an improvement can be obtained by letting the nature of the signal itself affect the frame length that is used.
  • the encoding of the main signal is not the object of the present invention and is therefore not described in detail. However, the frame lengths used for the main signal may or may not be equal to the frame lengths used for the side signal.
  • One embodiment of a side signal encoder unit 30 is illustrated in Fig. 3b , in which a closed loop decision is utilised.
  • a basic encoding frame of length L is used here.
  • a number of encoding schemes 81 characterised by a separate set 80 of sub-frames, are created.
  • Each set 80 of sub-frames comprises one or more sub-frames of equal or differing lengths.
  • the total length of the set 80 of sub-frames is, however, always equal to the basic encoding frame length L.
  • the top encoding scheme is characterised by a set of sub-frames comprising only one sub-frame of length L.
  • The next set of sub-frames comprises two sub-frames of length L/2.
  • The third set comprises two sub-frames of length L/4 followed by an L/2 sub-frame.
  • The signal x side provided to the side signal encoder unit 30 is encoded by all encoding schemes 81. In the top encoding scheme, the entire basic encoding frame is encoded in one piece. In the other encoding schemes, the signal x side is encoded in each sub-frame separately.
  • the result from each encoding scheme is provided to a selector 85.
  • a fidelity measurement means 83 determines a fidelity measure for each of the encoded signals.
  • the fidelity measure is an objective quality value, preferably a signal-to-noise measure or a weighted signal-to-noise ratio.
  • the fidelity measures associated with each encoding scheme are compared and the result controls a switching means 87 to select the encoding parameters representing the side signal from the encoding scheme giving the best fidelity measure as the output signal p side from the side signal encoder unit 30.
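The closed-loop selection above can be sketched as follows. The per-sub-frame "codec" here is a deliberately crude stand-in that represents each sub-frame by its mean value; the list of candidate schemes and the test signal are illustrative assumptions, not the patent's actual encoder.

```python
import numpy as np

L = 8
# Candidate encoding schemes: each is a set of sub-frame lengths summing to L.
schemes = [
    [L],                       # one sub-frame of length L
    [L // 2, L // 2],          # two sub-frames of length L/2
    [L // 4, L // 4, L // 2],  # two L/4 sub-frames followed by an L/2 sub-frame
]

def encode_decode(x, subframes):
    """Stand-in per-sub-frame coder: represent each sub-frame by its mean."""
    out, pos = np.empty_like(x), 0
    for n in subframes:
        out[pos:pos + n] = np.mean(x[pos:pos + n])
        pos += n
    return out

def snr(x, y):
    """Objective fidelity measure: signal-to-noise ratio in dB."""
    err = np.sum((x - y) ** 2)
    return np.inf if err == 0 else 10 * np.log10(np.sum(x ** 2) / err)

# A side signal with a transient halfway through the basic encoding frame.
x_side = np.array([0., 0., 0., 0., 1., 1., 1., 1.])

# Fidelity measurement and selector: encode with every scheme, keep the best.
best = max(schemes, key=lambda s: snr(x_side, encode_decode(x_side, s)))
```

For this transient, the single full-length sub-frame smears the step, while the schemes whose sub-frame boundary coincides with the transient reproduce it exactly, so the selector picks a multi-sub-frame scheme.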
  • In Fig. 3c , another embodiment of a side signal encoder unit 30 is illustrated.
  • the frame length decision is an open loop decision, based on the statistics of the signal.
  • The spectral characteristics of the side signal will be used as a basis for deciding which encoding scheme is going to be used.
  • different encoding schemes characterised by different sets of sub-frames are available.
  • the selector 85 is placed before the actual encoding.
  • the input side signal x side enters the selector 85 and a signal analysing unit 84.
  • The result of the analysis becomes the input of a switch 86, by which only one of the encoding schemes 81 is utilised.
  • the output from that encoding scheme will also be the output signal p side from the side signal encoder unit 30.
  • the advantage with an open loop decision is that only one actual encoding has to be performed.
  • The disadvantage is, however, that the analysis of the signal characteristics may be very complicated, and it may be difficult to predict the possible behaviours well enough in advance to make an appropriate choice in the switch 86.
  • A lot of statistical analysis of sound has to be performed and included in the signal analysing unit 84. Any small change in the encoding schemes may completely change the statistical behaviour.
  • An advantage of variable frame length coding for the side signal is that one can select between fine temporal resolution with coarse frequency resolution on one side, and coarse temporal resolution with fine frequency resolution on the other.
  • the above embodiments will preserve the stereo image in the best possible manner.
  • The method presented in US 5,434,948 uses a filtered version of the mono (main) signal to approximate the side or difference signal.
  • the filter parameters are optimised and allowed to vary in time.
  • the filter parameters are then transmitted representing an encoding of the side signal.
  • also a residual side signal is transmitted.
  • the quantisation of the filter coefficients and any residual side signal often require relatively high bit rates for transmission, since the filter order has to be high to provide an accurate side signal estimate.
  • the estimation of the filter itself may be problematic, especially in cases of transient rich music. Estimation errors will give a modified side signal that is sometimes larger in magnitude than the unmodified signal.
  • A means to avoid the need for interpolation is to update the filter coefficients on a sample-by-sample basis and rely on backwards-adaptive analysis. For this to work well, the bit rate of the residual encoder needs to be fairly high. This is therefore not a good alternative for low bit rate stereo coding.
  • The encoding of the side signal is based on the idea of reducing the redundancy between the mono and side signals by using a simple balance factor instead of a complex, bit rate consuming predictor filter.
  • the residual of this operation is then encoded.
  • The magnitude of such a residual is relatively small and does not call for a very high bit rate for transfer. This idea is very suitable to combine with the variable frame set approach described earlier, since the computational complexity is low.
  • The use of a balance factor combined with the variable frame length approach removes the need for complex interpolation and the associated problems that interpolation may cause. Moreover, the use of a simple balance factor instead of a complex filter gives fewer problems with estimation, as possible estimation errors for the balance factor have less impact. The preferred solution will be able to reproduce both panned signals and diffuse sound fields with good quality and with limited bit rate requirements and computational resources.
  • Fig. 4 illustrates a preferred embodiment of a stereo encoder. This embodiment is very similar to the one shown in Fig. 2a , however, with the details of the side signal encoder unit 30 revealed.
  • the encoder 14 of this embodiment does not have any pre-processing unit, and the input signals are provided directly to the addition and subtraction units 34, 36.
  • The mono signal x mono is multiplied by a certain balance factor g sm in a multiplier 33.
  • In a subtraction unit 35, the multiplied mono signal is subtracted from the side signal x side , i.e. essentially the difference between the two channels, to produce a side residual signal.
  • the balance factor g sm is determined based on the content of the mono and side signals by the optimiser 37 in order to minimise the side residual signal according to a quality criterion.
  • the quality criterion is preferably a least mean square criterion.
  • the side residual signal is encoded in a side residual encoder 39 according to any encoder procedures.
  • the side residual encoder 39 is a low bit rate transform encoder or a CELP (Codebook Excited Linear Prediction) encoder.
  • The encoding parameters p side representing the side signal then comprise the encoding parameters p side residual representing the side residual signal and the optimised balance factor 49.
  • the mono signal 42 used for synthesising the side signals is the target signal x mono for the mono encoder 38.
  • the local synthesis signal of the mono encoder 38 can also be utilised. In the latter case, the total encoder delay may be increased and the computational complexity for the side signal may increase. On the other hand, the quality may be better as it is then possible to repair coding errors made in the mono encoder.
  • x mono (n) = α·a(n) + (1 - α)·b(n)
  • x side (n) = α·a(n) - (1 - α)·b(n), where 0 ≤ α ≤ 1.0.
  • the balance factor is used to minimise the residual side signal. In the special case where it is minimised in a mean square sense, this is equivalent to minimising the energy of the residual side signal x side residual .
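Minimising the residual energy over g has a simple closed form: for a least mean square criterion, g = ⟨x side , x mono ⟩ / ⟨x mono , x mono ⟩ per frame. The sketch below uses synthetic data; the function and variable names are illustrative, not from the patent.

```python
import numpy as np

def balance_factor(x_mono, x_side):
    """Mean-square-optimal g for x_side ≈ g * x_mono over one frame."""
    denom = np.dot(x_mono, x_mono)
    return 0.0 if denom == 0 else np.dot(x_side, x_mono) / denom

rng = np.random.default_rng(0)
x_mono = rng.standard_normal(256)
x_side = 0.4 * x_mono + 0.05 * rng.standard_normal(256)  # mostly panned signal

g_sm = balance_factor(x_mono, x_side)
residual = x_side - g_sm * x_mono   # side residual, encoded at low rate downstream
```

Because g sm absorbs the panned (correlated) part of the side signal, the residual carries only the small uncorrelated remainder, which is why it can be coded at a low bit rate.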
  • It is possible to add weighting in the frequency domain to the computation of the balance factor. This is done by convolving the x side and x mono signals with the impulse response of a weighting filter. It is then possible to move the estimation errors to a frequency range where they are less easy to hear. This is referred to as perceptual weighting.
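The weighted estimate can be sketched by filtering both signals with the weighting filter's impulse response before forming the balance factor. The short low-pass impulse response below is an arbitrary stand-in for a real perceptual weighting filter, and the signal model is synthetic.

```python
import numpy as np

w = np.array([0.5, 0.3, 0.2])   # hypothetical weighting filter impulse response

def weighted_balance_factor(x_mono, x_side, w):
    """Balance factor computed on weighting-filtered versions of the signals."""
    xm_w = np.convolve(x_mono, w)[: len(x_mono)]
    xs_w = np.convolve(x_side, w)[: len(x_side)]
    return np.dot(xs_w, xm_w) / np.dot(xm_w, xm_w)

rng = np.random.default_rng(1)
x_mono = rng.standard_normal(128)
x_side = 0.25 * x_mono          # perfectly panned case for illustration
g_w = weighted_balance_factor(x_mono, x_side, w)
```

Since the weighting is linear, a perfectly panned side signal still yields the true gain; the weighting only reshapes where the estimation error ends up spectrally in the general case.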
  • Q g (..) is a quantization function that is applied to the balance factor given by the function f ( x mono , x side ).
  • The balance factor is transmitted on the transmission channel. In normal left-right panned signals the balance factor is limited to the interval [-1.0, 1.0]. If, on the other hand, the channels are out of phase with regard to one another, the balance factor may extend beyond these limits.
  • g Q = Q g -1 ( Q g ( g sm )), i.e. the balance factor after quantization and reconstruction.
  • E s is the encoding function (e.g. a transform encoder) of the residual side signal and E m is the encoding function of the mono signal.
  • One important benefit of computing the balance factor for each frame is that one avoids the use of interpolation. Instead, normally, as described above, the frame processing is performed with overlapping frames.
  • the encoding principle using balance factors operates particularly well in the case of music signals, where fast changes typically are needed to track the stereo image.
  • multi-channel coding has become popular.
  • One example is 5.1 channel surround sound in DVD movies.
  • the channels are there arranged as: front left, front centre, front right, rear left, rear right and subwoofer.
  • In Fig. 5 , an embodiment of an encoder that encodes the three front channels in such an arrangement, exploiting interchannel redundancies, is shown.
  • a centre signal encoder unit 130 is added, which receives the centre signal x centre .
  • the mono signal 42 is in this embodiment the encoded and decoded mono signal x" mono , and is multiplied with a certain balance factor g Q in a multiplier 133.
  • the multiplied mono signal is subtracted from the centre signal x centre , to produce a centre residual signal.
  • the balance factor g Q is determined based on the content of the mono and centre signals by an optimiser 137 in order to minimise the centre residual signal according to the quality criterion.
  • the centre residual signal is encoded in a centre residual encoder 139 according to any encoder procedures.
  • the centre residual encoder 139 is a low bit rate transform encoder or a CELP encoder.
  • The encoding parameters p centre representing the centre signal then comprise the encoding parameters p centre residual representing the centre residual signal and the optimised balance factor 149.
  • the centre residual signal and the scaled mono signal are added in an addition unit 235, creating a modified centre signal 142 being compensated for encoding errors.
  • The side signal x side , i.e. the difference between the left L and right R channels, is provided to the side signal encoder unit 30 as in earlier embodiments.
  • the optimiser 37 also depends on the modified centre signal 142 provided by the centre signal encoder unit 130. The side residual signal will therefore be created as an optimum linear combination of the mono signal 42, the modified centre signal 142 and the side signal in the subtraction unit 35.
  • The variable frame length concept described above can be applied to either of the side and centre signals, or to both.
  • Fig. 6 illustrates a decoder unit suitable for receiving encoded audio signals from the encoder unit of Fig. 5 .
  • the received signal 54 is divided into encoding parameters p mono representing the main signal, encoding parameters p centre representing the centre signal and encoding parameters p side representing the side signal.
  • the encoding parameters p mono representing the main signal are used to generate a main signal x" mono .
  • the encoding parameters p centre representing the centre signal are used to generate a centre signal x" centre , based on main signal x" mono .
  • the encoding parameters p side representing the side signal are decoded, generating a side signal x" side , based on main signal x" mono and centre signal x" centre .
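The dependency order of these three decoding steps can be sketched as below; the dec_* callables are hypothetical stand-ins for the actual parameter decoders:

```python
def decode_frame(p_mono, p_centre, p_side, dec_mono, dec_centre, dec_side):
    """Sketch of the decoding order of Fig. 6: the mono signal is decoded
    first, the centre decoder then uses it, and the side decoder uses both.
    dec_mono/dec_centre/dec_side are caller-supplied stand-ins."""
    x_mono = dec_mono(p_mono)                    # main signal
    x_centre = dec_centre(p_centre, x_mono)      # centre, based on mono
    x_side = dec_side(p_side, x_mono, x_centre)  # side, based on both
    return x_mono, x_centre, x_side
```

The point of the ordering is that the centre and side parameter streams only carry residual information; each later stage reuses the signals already decoded before it.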
  • x mono (n) = α x left (n) + β x right (n) + γ x centre (n).
  • α, β and γ are set to 1.0 in the remainder of this section for simplicity, but they can be set to arbitrary values.
  • the α, β and γ values can be either constant or dependent on the signal contents, in order to emphasise one or two channels and thereby achieve an optimal quality.
  • x centre is the centre signal and x mono is the mono signal.
  • the mono signal comes from the mono target signal but it is possible to use the local synthesis of the mono encoder as well.
  • Q g ( .. ) is a quantization function that is applied to the balance factor.
  • the balance factor is transmitted on the transmission channel.
  • E c is the encoding function (e.g. a transform encoder) of the centre residual signal and E m is the encoding function of the mono signal.
  • the balance factors are optimised by minimising the side residual over the frame, i.e. by minimising Σ n = frame start .. frame end | x side (n) − g sm x mono (n) − g sc x centre (n) | p .
  • the exponent p can for instance be equal to 2 for a least square minimisation of the error.
  • the g sm and g sc parameters can be quantized jointly or separately.
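Under the least-square (p = 2) criterion, the joint optimisation of g sm and g sc reduces to a two-column linear regression; the sketch below solves it with numpy.linalg.lstsq, using illustrative names:

```python
import numpy as np

def side_balance_factors(x_side, x_mono, x_centre):
    """Jointly optimise g_sm and g_sc so that the side residual
    x_side - g_sm*x_mono - g_sc*x_centre has minimum energy (p = 2)."""
    A = np.column_stack([x_mono, x_centre])
    # lstsq also behaves sensibly when mono and centre are nearly collinear.
    (g_sm, g_sc), *_ = np.linalg.lstsq(A, x_side, rcond=None)
    residual = x_side - g_sm * x_mono - g_sc * x_centre
    return g_sm, g_sc, residual
```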
  • Figs. 7a-b are diagrams illustrating such an artefact.
  • a signal component having the time development as shown by curve 100.
  • the signal component is not present in the audio sample.
  • the signal component suddenly appears.
  • the signal component is encoded, using a frame length of t2-t1.
  • the occurrence of the signal component will be "smeared out" over the entire frame, as indicated in curve 101.
  • the signal component appears a time Δt before the intended appearance of the signal component, and a "pre-echo" is perceived.
  • the pre-echoing artefacts become more accentuated if long encoding frames are used. By using shorter frames, the artefact is somewhat suppressed.
  • Another way to deal with the pre-echoing problems described above is to utilise the fact that the mono signal is available at both the encoder and decoder end. This makes it possible to scale the side signal according to the energy contour of the mono signal. In the decoder end, the inverse scaling is performed and thus some of the pre-echo problems may be alleviated.
  • the simplest windowing function is a rectangular window, but other window types such as a Hamming window may be more desirable.
  • x̂ side residual (n) = x side residual (n) / f ( E c (n) ), frame start ≤ n ≤ frame end , where f (..) is a monotonic continuous function.
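A minimal sketch of this energy-contour normalisation, assuming a rectangular smoothing window and f(E) = sqrt(E) + eps (both illustrative choices, not prescribed by the text):

```python
import numpy as np

def energy_contour(x_mono, win=32):
    """Smoothed per-sample energy of the mono signal, using the simplest
    rectangular window (a Hamming window could be used instead)."""
    w = np.ones(win) / win
    return np.convolve(x_mono ** 2, w, mode="same")

def scale_side_residual(x_side_res, x_mono, eps=1e-6):
    """Normalise the side residual by f(E_c(n)); since the mono signal is
    available at both the encoder and decoder ends, the decoder can
    recompute f and multiply back, so quantisation noise ends up following
    the mono energy contour and pre-echoes are attenuated."""
    f = np.sqrt(energy_contour(x_mono)) + eps  # monotonic continuous f
    return x_side_res / f, f
```

At the decoder end the inverse scaling is simply an element-wise multiplication by the recomputed f.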
  • a more flexible set of encoding schemes may be provided.
  • the different encoding schemes 81 comprise hatched sub-frames, representing encoding applying the energy contour scaling, and un-hatched sub-frames, representing encoding procedures not applying the energy contour scaling.
  • the set of encoding schemes of Fig. 8 comprises schemes that handle e.g. pre-echoing artefacts in different ways. In some schemes, longer sub-frames with pre-echoing minimisation according to the energy contour principle are used. In other schemes, shorter sub-frames without energy contour scaling are utilised. Depending on the signal content, one of the alternatives may be more advantageous. For very severe pre-echoing cases, encoding schemes utilising short sub-frames with energy contour scaling may be necessary.
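Closed-loop selection among such schemes can be sketched as below; a crude per-sub-frame gain plus uniform quantiser stands in for the real sub-frame encoders (an assumption, and energy-contour scaling is omitted for brevity):

```python
import numpy as np

def best_subframe_scheme(x, schemes, q_step=0.25):
    """Encode the frame with every candidate scheme (a list of sub-frame
    lengths covering the frame) and return the scheme with the lowest
    squared error. The toy gain + uniform quantiser is illustrative."""
    def encode_decode(seg):
        g = max(np.max(np.abs(seg)), 1e-12)          # per-sub-frame gain
        return np.round(seg / (g * q_step)) * (g * q_step)
    best_err, best = np.inf, None
    for scheme in schemes:
        err, pos = 0.0, 0
        for length in scheme:
            seg = x[pos:pos + length]
            err += float(np.sum((seg - encode_decode(seg)) ** 2))
            pos += length
        if err < best_err:
            best_err, best = err, scheme
    return best
```

A frame containing a transient favours schemes with short sub-frames around the onset, because the per-sub-frame gain then adapts to the locally quiet and loud parts instead of being dominated by the loudest sample of a long frame.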
  • the proposed solution can be used in the full frequency band or in one or more distinct sub bands.
  • sub-band processing can be applied either to both the main and side signals, or to one of them separately.
  • a preferred embodiment comprises a split of the side signal in several frequency bands. The reason is simply that it is easier to remove the possible redundancy in an isolated frequency band than in the entire frequency band. This is particularly important when encoding music signals with rich spectral content.
  • the pre-determined threshold can preferably be 2 kHz, or even more preferably 1 kHz.
  • the diffuse sound fields generally have little energy content at high frequencies.
  • the natural reason is that sound absorption typically increases with frequency.
  • the diffuse sound field components seem to play a less important role for the human auditory system at higher frequencies.
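A sub-band split of the side signal at such a threshold can be sketched with a zero-phase FFT mask (an illustrative shortcut; a real codec would use a proper filter bank):

```python
import numpy as np

def split_side_signal(x_side, fs, f_split=1000.0):
    """Split the side signal into a low band (below f_split, e.g. 1 kHz)
    and a high band; the two bands sum back to the original signal, so
    each band can be encoded separately without losing information."""
    spectrum = np.fft.rfft(x_side)
    freqs = np.fft.rfftfreq(len(x_side), d=1.0 / fs)
    low = np.fft.irfft(np.where(freqs < f_split, spectrum, 0), n=len(x_side))
    high = np.fft.irfft(np.where(freqs >= f_split, spectrum, 0), n=len(x_side))
    return low, high
```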
  • in Fig. 10, the main steps of an embodiment of an encoding method are illustrated as a flow diagram.
  • the procedure starts in step 200.
  • a main signal deduced from the polyphonic signals is encoded.
  • encoding schemes are provided, which comprise sub-frames with differing lengths and/or order.
  • a side signal deduced in step 214 from the polyphonic signals is encoded by an encoding scheme selected dependent at least partly on the actual signal content of the present polyphonic signals.
  • the procedure ends in step 299.
  • in Fig. 11, the main steps of an embodiment of a decoding method are illustrated as a flow diagram.
  • the procedure starts in step 200.
  • a received encoded main signal is decoded.
  • encoding schemes are provided, which comprise sub-frames with differing lengths and/or order.
  • a received side signal is decoded in step 224 according to a selected encoding scheme.
  • the decoded main and side signals are combined to a polyphonic signal.
  • the procedure ends in step 299.

EP07109801A 2003-12-19 2004-12-15 Encoding and decoding of multi-channel audio signals based on a main and side signal representation Active EP1845519B1 (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
SE0303501A SE0303501D0 (sv) 2003-12-19 2003-12-19 Filter-based parametric multi-channel coding
SE0400417A SE527670C2 (sv) 2003-12-19 2004-02-20 Naturtrogenhetsoptimerad kodning med variabel ramlängd
EP04820553A EP1623411B1 (en) 2003-12-19 2004-12-15 Fidelity-optimised variable frame length encoding

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
EP04820553A Division EP1623411B1 (en) 2003-12-19 2004-12-15 Fidelity-optimised variable frame length encoding

Publications (3)

Publication Number Publication Date
EP1845519A2 EP1845519A2 (en) 2007-10-17
EP1845519A3 EP1845519A3 (en) 2007-11-07
EP1845519B1 true EP1845519B1 (en) 2009-09-16

Family

ID=31996354

Family Applications (2)

Application Number Title Priority Date Filing Date
EP04820553A Active EP1623411B1 (en) 2003-12-19 2004-12-15 Fidelity-optimised variable frame length encoding
EP07109801A Active EP1845519B1 (en) 2003-12-19 2004-12-15 Encoding and decoding of multi-channel audio signals based on a main and side signal representation


Country Status (15)

Country Link
EP (2) EP1623411B1 (ja)
JP (2) JP4335917B2 (ja)
CN (2) CN101118747B (ja)
AT (2) ATE371924T1 (ja)
AU (1) AU2004298708B2 (ja)
BR (2) BRPI0410856B8 (ja)
CA (2) CA2527971C (ja)
DE (2) DE602004023240D1 (ja)
HK (2) HK1091585A1 (ja)
MX (1) MXPA05012230A (ja)
PL (1) PL1623411T3 (ja)
RU (2) RU2305870C2 (ja)
SE (1) SE527670C2 (ja)
WO (1) WO2005059899A1 (ja)
ZA (1) ZA200508980B (ja)





Also Published As

Publication number Publication date
RU2005134365A (ru) 2006-05-27
WO2005059899A1 (en) 2005-06-30
EP1845519A2 (en) 2007-10-17
JP2008026914A (ja) 2008-02-07
RU2007121143A (ru) 2008-12-10
EP1623411B1 (en) 2007-08-29
DE602004023240D1 (de) 2009-10-29
CN1816847A (zh) 2006-08-09
BRPI0410856B8 (pt) 2019-10-15
SE0400417D0 (sv) 2004-02-20
HK1115665A1 (en) 2008-12-05
JP2007529021A (ja) 2007-10-18
CA2690885C (en) 2014-01-21
EP1845519A3 (en) 2007-11-07
CN101118747B (zh) 2011-02-23
ZA200508980B (en) 2007-03-28
BRPI0419281B1 (pt) 2018-08-14
EP1623411A1 (en) 2006-02-08
AU2004298708A1 (en) 2005-06-30
SE527670C2 (sv) 2006-05-09
CA2690885A1 (en) 2005-06-30
AU2004298708B2 (en) 2008-01-03
JP4589366B2 (ja) 2010-12-01
RU2425340C2 (ru) 2011-07-27
HK1091585A1 (en) 2007-01-19
MXPA05012230A (es) 2006-02-10
ATE371924T1 (de) 2007-09-15
CN101118747A (zh) 2008-02-06
CN100559465C (zh) 2009-11-11
JP4335917B2 (ja) 2009-09-30
SE0400417L (sv) 2005-06-20
CA2527971C (en) 2011-03-15
DE602004008613T2 (de) 2008-06-12
RU2305870C2 (ru) 2007-09-10
CA2527971A1 (en) 2005-06-30
PL1623411T3 (pl) 2008-01-31
DE602004008613D1 (de) 2007-10-11
ATE443317T1 (de) 2009-10-15
BRPI0410856A (pt) 2006-07-04
BRPI0410856B1 (pt) 2019-10-01

