EP2439964B1 - Signal processing devices for processing stereo audio signals - Google Patents

Signal processing devices for processing stereo audio signals

Info

Publication number
EP2439964B1
Authority
EP
European Patent Office
Prior art keywords
signal
prediction
error
left
adder
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
EP10783094.5A
Other languages
German (de)
French (fr)
Other versions
EP2439964A4 (en)
EP2439964A1 (en)
Inventor
Masaru Kimura
Bunkei Matsuoka
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Mitsubishi Electric Corp
Original Assignee
Mitsubishi Electric Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority to JP2009132158
Application filed by Mitsubishi Electric Corp
Priority to PCT/JP2010/003310 (WO2010140306A1)
Publication of EP2439964A1
Publication of EP2439964A4
Application granted
Publication of EP2439964B1
Legal status: Active
Anticipated expiration

Classifications

    • G PHYSICS
    • G10 MUSICAL INSTRUMENTS; ACOUSTICS
    • G10L SPEECH ANALYSIS OR SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING; SPEECH OR AUDIO CODING OR DECODING
    • G10L19/00 Speech or audio signals analysis-synthesis techniques for redundancy reduction, e.g. in vocoders; Coding or decoding of speech or audio signals, using source filter models or psychoacoustic analysis
    • G10L19/04 Speech or audio signals analysis-synthesis techniques for redundancy reduction, e.g. in vocoders; Coding or decoding of speech or audio signals, using source filter models or psychoacoustic analysis using predictive techniques
    • G10L19/26 Pre-filtering or post-filtering
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04S STEREOPHONIC SYSTEMS
    • H04S7/00 Indicating arrangements; Control arrangements, e.g. balance control
    • H04S7/30 Control circuits for electronic adaptation of the sound field
    • G PHYSICS
    • G10 MUSICAL INSTRUMENTS; ACOUSTICS
    • G10L SPEECH ANALYSIS OR SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING; SPEECH OR AUDIO CODING OR DECODING
    • G10L19/00 Speech or audio signals analysis-synthesis techniques for redundancy reduction, e.g. in vocoders; Coding or decoding of speech or audio signals, using source filter models or psychoacoustic analysis
    • G10L19/008 Multichannel audio signal coding or decoding, i.e. using interchannel correlation to reduce redundancies, e.g. joint-stereo, intensity-coding, matrixing
    • G PHYSICS
    • G10 MUSICAL INSTRUMENTS; ACOUSTICS
    • G10L SPEECH ANALYSIS OR SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING; SPEECH OR AUDIO CODING OR DECODING
    • G10L25/00 Speech or voice analysis techniques not restricted to a single one of groups G10L15/00-G10L21/00
    • G10L25/03 Speech or voice analysis techniques not restricted to a single one of groups G10L15/00-G10L21/00 characterised by the type of extracted parameters
    • G10L25/12 Speech or voice analysis techniques not restricted to a single one of groups G10L15/00-G10L21/00 characterised by the type of extracted parameters the extracted parameters being prediction coefficients
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04S STEREOPHONIC SYSTEMS
    • H04S1/00 Two-channel systems
    • H04S1/007 Two-channel systems in which the audio signals are in digital form

Description

    TECHNICAL FIELD
  • The present invention relates to a signal processing device for decoding and reproducing a compression-encoded audio signal, for example.
  • BACKGROUND ART
  • Generally, the more spatial information the audio signal to be reproduced contains, the richer the sound field feeling or atmospheric feeling becomes during playback; this spatial information appears in the difference between the left and right signals (referred to as a left-and-right difference signal, from now on).
  • On the other hand, techniques have recently become widespread that save the capacity of a storage device for storing audio signals, or reduce the amount of data transmitted and received, by carrying out compression encoding such as AAC (Advanced Audio Coding) or MP3 (MPEG-1 Audio Layer 3) rather than by using audio CDs.
  • The compression-encoded audio signal has deteriorated characteristics, with gaps like missing teeth: a high-frequency component is lost, and part of the middle- and high-frequency spectrum of the left-and-right difference signal is missing. Playing back such a degraded audio signal tends to produce a muffled sound because of the missing high-frequency component, and tends to degrade the sound field feeling and atmospheric feeling because of the deterioration in the characteristics of the left-and-right difference signal.
  • Accordingly, a signal processing device capable of improving the sound quality of a compression-encoded audio signal has been disclosed (see Patent Document 1). The device of Patent Document 1 extracts a high-frequency component and a low-frequency component of a peak value of an input audio signal and adds them, thereby recovering the high-frequency component lost in the compression encoding and lessening the muffled sound.
  • A stereophonic audio reproduction system is also known (Patent Document 2) in which an L-R difference signal is delayed, amplified, and subtractively combined with the channel signals to cancel the mixing between the left and right speakers.
  • Prior Art Document Patent Document
    • Patent Document 1: Japanese Patent Laid-Open No. 2008-102206.
    • Patent Document 2: UK Patent Application GB 2074823A.
    DISCLOSURE OF THE INVENTION
  • Although the foregoing conventional signal processing device can lessen the muffled sound by recovering the high-frequency component missing from the audio signal, it cannot restore the characteristics that the left-and-right difference signal had before the compression encoding; it therefore cannot recover the rich sound field feeling and atmospheric feeling.
  • The present invention is implemented to solve the foregoing problem. Therefore it is an object of the present invention to provide a signal processing device capable of restoring the characteristics of the signal before the compression encoding.
  • The signal processing device of claim 1 in accordance with the present invention comprises a prediction error calculating unit that receives first and second signals and calculates an error signal between the first signal and a prediction signal of the first signal predicted from the second signal, a first adder for adding the first signal and the error signal in phase, and a second adder for adding the second signal and the error signal in opposite phase.
  • Another inventive solution is defined by claim 3.
  • According to the present invention, since it is configured in such a manner that the prediction error calculating unit computes the error signal between the first signal and the prediction signal of the first signal predicted from the second signal, that the first adder adds the first signal and the error signal, and that the second adder adds the second signal and the error signal, it can restore the characteristics of the signal before the compression encoding. As a result, it can recover the characteristics of the left-and-right difference signal of the stereo audio signal, for example, and thus restore the rich sound field feeling and atmospheric feeling.
  • BRIEF DESCRIPTION OF THE DRAWINGS
    • FIG. 1 is a block diagram showing a configuration of a signal processing device of an embodiment 1 in accordance with the present invention;
    • FIG. 2 is a block diagram showing a configuration of the prediction error calculating unit of the embodiment 1;
    • FIG. 3 is a diagram showing phase relationships between a frequency spectrum of a left-and-right sum signal and that of a left-and-right difference signal in the signal processing device of the embodiment 1: FIG. 3(a) shows the phase relationship when the correlation between the left signal frequency spectrum and the right signal frequency spectrum is weak; and FIG. 3(b) shows the phase relationship when the correlation between the left signal frequency spectrum and the right signal frequency spectrum is strong;
    • FIG. 4 is a diagram showing, in the signal processing device of the embodiment 1, deterioration in the left-and-right difference signal owing to the compression encoding, and restoration of the left-and-right difference signal after the signal processing by the signal processing device; and
    • FIG. 5 is a block diagram showing a configuration of a signal processing device of an embodiment 2 in accordance with the present invention.
    EMBODIMENTS FOR CARRYING OUT THE INVENTION
  • The embodiments of the invention will now be described in detail with reference to the accompanying drawings. Incidentally, the following description will be made on the assumption that a signal processing device of an embodiment in accordance with the present invention is applied to an audio device, and that it processes left and right signals of a stereo audio signal as first and second input signals having correlation.
  • EMBODIMENT 1
  • FIG. 1 is a block diagram showing a configuration of a signal processing device of an embodiment 1 in accordance with the present invention.
  • As shown in FIG. 1, a signal processing device 1 is placed between a decoder 2 and an output device 3; it carries out signal processing on the difference between a left signal l(n) 101 (first signal) and a right signal r(n) 102 (second signal) input from the decoder 2 as the stereo audio signal, and supplies the improved left signal lout(n) 109 and right signal rout(n) 110 to the output device 3.
  • Incidentally, the decoder 2 is a device that decodes compression-encoded audio data and outputs it as the stereo audio signal, and the output device 3 is a device, such as a speaker, that converts the stereo audio signal into acoustic vibration and outputs it.
  • As shown in FIG. 1, the signal processing device 1 comprises a prediction error calculating unit 13, a first adder 14, a second adder 15, and a gain adjusting unit 17. The prediction error calculating unit 13, which will be described later, calculates an error signal 103 from the left signal l(n) 101 and right signal r(n) 102 of the stereo audio signal as an improving difference signal for improving the left-and-right difference signal.
  • The gain adjusting unit 17 is a multiplier that controls the gain by multiplying the error signal 103 by a prescribed value, and that outputs an error signal 107 after the gain adjustment as the improving difference signal.
  • The first adder 14 adds the left signal l(n) 101 and the error signal 107 in phase and outputs the result as the left signal lout(n) 109. The second adder 15 adds the right signal r(n) 102 and the error signal 107 in opposite phase, and outputs the result as the right signal rout(n) 110.
  • Next, the processing operation of the signal processing device 1 will be described.
  • As shown in FIG. 1, the signal processing device 1, receiving the left signal l(n) 101 and right signal r(n) 102 from the external decoder 2 as the stereo audio signal, splits each of the input left signal l(n) 101 and right signal r(n) 102 into two.
  • The signal processing device 1 leads a first left signal l(n) 101 of the split left signal l(n) 101 to the prediction error calculating unit 13 and a second left signal l(n) 101 thereof to the first adder 14. Likewise, the signal processing device 1 leads a first right signal r(n) 102 of the split right signal r(n) 102 to the prediction error calculating unit 13 and a second right signal r(n) 102 thereof to the second adder 15.
  • According to the left signal l(n) 101 and right signal r(n) 102 supplied, the prediction error calculating unit 13 calculates the error signal 103 as an improving difference signal for improving the left-and-right difference signal of the stereo audio signal, and supplies it to the gain adjusting unit 17. The detailed processing operation of the prediction error calculating unit 13 will be described later.
  • The gain adjusting unit 17 controls the gain of the error signal 103 fed from the prediction error calculating unit 13 by multiplying it by a preset fixed value or a value that can be set properly from an external control panel or the like not shown, and outputs the error signal 107 after the gain adjustment as the improving difference signal.
  • The error signal 107 output from the gain adjusting unit 17 is split so that a first error signal 107 is supplied to the first adder 14 and a second error signal 107 is supplied to the second adder 15.
  • The first adder 14 adds, in phase, the left signal l(n) 101 and the error signal 107 from the gain adjusting unit 17, and supplies the left signal lout(n) 109 to the external output device 3 as the output signal after the signal processing.
  • In contrast, the second adder 15 inverts the phase of the error signal 107 fed from the gain adjusting unit 17, and adds the right signal r(n) 102 and the phase-inverted error signal 107, and supplies the right signal rout(n) 110 to the external output device 3 as the output signal after the signal processing. In other words, the second adder 15 subtracts the error signal 107 from the right signal r(n) 102 and outputs it.
  • Thus, the first adder 14 and second adder 15 add the split error signal 107 to the left signal l(n) 101 and right signal r(n) 102 in opposite phases.
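The Embodiment-1 signal flow described above can be sketched in a few lines. This is an illustrative sketch, not the patent's implementation: the function name is invented, the default gain of the gain adjusting unit 17 is an assumption, and e stands for the error signal 103 already produced by the prediction error calculating unit 13.

```python
import numpy as np

def improve_stereo(l, r, e, gain=0.5):
    """Sketch of FIG. 1: gain adjusting unit 17 scales the error signal 103,
    then adders 14 and 15 apply it in opposite phases."""
    e_adj = gain * e      # gain adjusting unit 17 -> error signal 107
    l_out = l + e_adj     # first adder 14: in-phase addition
    r_out = r - e_adj     # second adder 15: opposite-phase addition
    return l_out, r_out
```

Note that l_out + r_out = l + r while l_out - r_out = (l - r) + 2*gain*e: the left-and-right sum signal passes through unchanged, and only the left-and-right difference signal is reinforced by the improving difference signal.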
  • Incidentally, although the signal processing device 1 of the embodiment 1 has a configuration of making the gain adjustment of the error signal 103 with the gain adjusting unit 17, a configuration is also possible which removes the gain adjusting unit 17 as needed.
  • Next, a concrete configuration of the prediction error calculating unit 13 will be described.
  • FIG. 2 is a block diagram showing a configuration of the prediction error calculating unit 13 of the embodiment 1.
  • As shown in FIG. 2, the prediction error calculating unit 13, which comprises a prediction unit 21 and a signal calculating unit 22, calculates the error signal 103 from the input left signal l(n) 101 and right signal r(n) 102, and outputs it as the improving difference signal.
  • The prediction unit 21 predicts the left signal l(n) 101 from the input right signal r(n) 102, the previously input right signals r(n-1), r(n-2), r(n-3), ···, r(n-N), and prediction coefficients, and outputs the result as a prediction signal 203; it is, for example, an AR prediction unit using the known AR (Auto-Regressive) prediction technique. Here, N is the prediction order.
  • Incidentally, a configuration is also possible which comprises a delay unit not shown for delaying the input right signal r(n) 102 by one sample, predicts the left signal l(n) 101 from the one-sample delayed right signal r(n-1) 102, the previously input right signals r(n-2), r(n-3), r(n-4), ..., r(n-1-N) and the prediction coefficients, and outputs as the prediction signal 203.
  • The signal calculating unit 22 is an adder that inverts the phase of the input prediction signal 203 and adds the phase-inverted prediction signal 203 to the left signal l(n) 101; it thereby calculates an error signal 204 as a prediction error and outputs it.
  • In addition, the prediction unit 21 receives the error signal 204 from the signal calculating unit 22, and updates the prediction coefficients according to the error signal 204 using a known learning algorithm at every sampling time.
  • Next, the processing operation of the prediction error calculating unit 13 will be described.
  • The prediction error calculating unit 13 receives the left signal l(n) 101 and right signal r(n) 102 as the stereo audio signal, and leads the left signal l(n) 101 to the signal calculating unit 22 and the right signal r(n) 102 to the prediction unit 21.
  • Receiving the right signal r(n) 102, the prediction unit 21 predicts the left signal l(n) 101 by AR prediction from the right signal r(n) 102 and the prediction coefficients, and supplies the result to the signal calculating unit 22 as the prediction signal 203.
  • The signal calculating unit 22 inverts the phase of the prediction signal 203 fed from the prediction unit 21, adds the phase-inverted prediction signal 203 and the left signal l(n) 101, and outputs the error signal 204 as the prediction error of the prediction signal 203.
  • The prediction error calculating unit 13 splits the error signal 204 output from the signal calculating unit 22, outputs a first error signal 204 as the error signal 103 and returns a second error signal 204 to the prediction unit 21.
  • Receiving the error signal 204, the prediction unit 21 updates the prediction coefficients according to it, using a known learning algorithm such as the steepest descent method or the learning identification method.
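As one concrete illustration, the prediction unit 21 and signal calculating unit 22 can be sketched together as an adaptive linear predictor. The NLMS-style per-sample update below is one possible realization of a "learning identification method"; the function name, prediction order, and step size mu are illustrative assumptions, not values from the patent.

```python
import numpy as np

def prediction_error_unit(l, r, order=8, mu=0.5, eps=1e-8):
    """Sketch of the prediction error calculating unit 13 (FIG. 2):
    predict l(n) from r(n), r(n-1), ..., and return the error signal."""
    w = np.zeros(order)       # prediction coefficients
    buf = np.zeros(order)     # current and past right-signal samples
    e = np.zeros(len(l))
    for n in range(len(l)):
        buf = np.roll(buf, 1)
        buf[0] = r[n]
        pred = w @ buf        # prediction signal 203
        e[n] = l[n] - pred    # signal calculating unit 22: error signal 204
        # coefficient update at every sampling time (NLMS-style)
        w += mu * e[n] * buf / (buf @ buf + eps)
    return e
```

As the coefficients converge, e(n) retains only the part of l(n) that cannot be linearly predicted from the right channel, i.e. the uncorrelated component that is used as the improving difference signal.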
  • Incidentally, although the prediction unit 21 is supplied with the right signal r(n) 102 and the signal calculating unit 22 is supplied with the left signal l(n) 101, the left signal l(n) 101 and the right signal r(n) 102 can be exchanged. Thus, a configuration can suffice as long as it predicts a second signal from a first signal or vice versa.
  • In addition, although a configuration has been described in which the prediction unit 21 successively updates the prediction coefficients at every sampling time, a configuration is also possible which updates the prediction coefficients at once at any given point of time, or which employs a prediction unit 21 using fixed prediction coefficients designated in advance without carrying out the successive update.
  • Next, the advantages of the signal processing device 1 of the embodiment 1 will be described.
  • First, characteristics of the left-and-right difference signal of the stereo audio signal will be described.
  • FIG. 3 is a diagram showing phase relationships between the signal frequency spectrum of the left-and-right sum signal and that of the left-and-right difference signal when the spectral intensity of the left signal is nearly equal to that of the right signal at a frequency θ. FIG. 3(a) shows a case where the correlation between the left signal frequency spectrum and the right signal frequency spectrum is weak, and FIG. 3(b) shows a case where the correlation between the left signal frequency spectrum and the right signal frequency spectrum is strong.
  • As shown in FIG. 3(a) and FIG. 3(b), when the left signal and right signal have nearly the same spectral intensity, the phase of the frequency spectrum of the left-and-right sum signal and the phase of the frequency spectrum of the left-and-right difference signal are orthogonal regardless of the correlation (magnitude of the phase difference) between the frequency spectrum of the left signal and that of the right signal.
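This orthogonality can be checked numerically: writing the left and right spectra at a frequency as complex values of equal magnitude, the phase of L+R and the phase of L-R always differ by 90 degrees, whatever the phase difference between L and R. The phase values below are arbitrary examples, not data from the patent.

```python
import numpy as np

# Equal-magnitude left/right spectral values: a weak-correlation case
# (large phase difference) and a strong-correlation case (small phase
# difference), as in FIG. 3(a) and FIG. 3(b).
for phi_l, phi_r in [(0.3, 2.1), (0.3, 0.4)]:
    L = np.exp(1j * phi_l)
    R = np.exp(1j * phi_r)
    # phase of the sum minus phase of the difference
    d = np.angle((L + R) / (L - R))
    print(abs(d))  # pi/2 (about 1.5708) in both cases
```

Algebraically, L + R = 2A cos(Δ/2) e^(i(φl+φr)/2) and L - R = 2iA sin(Δ/2) e^(i(φl+φr)/2) with Δ = φl - φr, so their ratio is purely imaginary regardless of Δ.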
  • Here, since the left-and-right sum signal is the in-phase component of the left signal l(n) 101 and right signal r(n) 102, the left-and-right sum signal is the correlation component between the left signal l(n) 101 and right signal r(n) 102 when the time delay is disregarded (i.e., when the time delay is zero). Likewise, the left-and-right difference signal, being orthogonal to the left-and-right sum signal, is the uncorrelated component between the left signal l(n) 101 and right signal r(n) 102 when the time delay is disregarded (i.e., when the time delay is zero).
  • On the other hand, the present embodiment 1 employs an AR prediction unit as the prediction unit 21, and the AR prediction unit enables optimum prediction satisfying the Wiener-Hopf equations as long as the signal conforms to an AR model. The fact that the optimally predicted prediction signal is orthogonal to the error signal between the prediction signal and the reference signal is known as the "orthogonality principle".
  • In addition, a stationary signal with a harmonic structure can be expressed by an AR model. In the present embodiment 1, since a stereo audio signal such as instrumental sounds or voice has a harmonic structure and can be regarded as stationary when observed over a short time period, the stereo audio signal can be assumed to follow an AR model.
  • Here, because the prediction signal 203 predicted by the AR prediction unit (the prediction unit 21 shown in FIG. 2) can be regarded as a common signal component of the left signal l(n) 101 and right signal r(n) 102, it is the correlation component between the left signal l(n) 101 and right signal r(n) 102 when the time delay is taken into account. In contrast, since the error signal 204 is orthogonal to the correlation component, it is the uncorrelated component between the left signal l(n) 101 and right signal r(n) 102 when the time delay is taken into account. Thus, the prediction error calculating unit 13 of the present embodiment 1 can separate the left signal l(n) 101 and right signal r(n) 102 into the correlation component and the uncorrelated component.
  • In this way, since the error signal 103 is the uncorrelated component of the left and right signals when the time delay is taken into account, and the left-and-right difference signal is the uncorrelated component of the left and right signals when the time delay is zero, the two have the same nature. Accordingly, the signal processing device 1 of the embodiment 1 can restore the frequency spectrum of the left-and-right difference signal using the error signal 103.
  • FIG. 4 is a diagram showing deterioration of the left-and-right difference signal due to the compression encoding and the restoration of the left-and-right difference signal after the signal processing by the signal processing device 1.
  • As shown in FIG. 4, a solid line denotes a frequency spectrum of the left-and-right difference signal before the compression encoding and that of the left-and-right difference signal after the signal processing, and broken lines denote a frequency spectrum of the left-and-right difference signal after the compression encoding.
  • Although the frequency spectrum of the left-and-right difference signal before the compression encoding, denoted by the solid line in FIG. 4, is continuous, the left-and-right difference signal after the compression encoding, denoted by the broken lines in FIG. 4, lacks part of its frequency spectrum; its characteristics deteriorate as if teeth were missing, which reduces the spatial information and degrades the sound field feeling and atmospheric feeling.
  • Thus, according to the signal processing device 1 of the embodiment 1, it can recover the frequency spectrum of the left-and-right difference signal before the compression encoding from the frequency spectrum of the left-and-right difference signal deteriorated because of the compression encoding, thereby being able to restore the spatial information and to achieve the rich sound field feeling and atmospheric feeling.
  • As described above, the signal processing device 1 of the embodiment 1 is configured in such a manner that the prediction error calculating unit 13 receives the left signal l(n) 101 and right signal r(n) 102, the prediction unit 21 predicts the left signal l(n) 101 from the input right signal r(n) 102 and the prediction coefficients and outputs it as the prediction signal 203, the signal calculating unit 22 adds the phase-inverted prediction signal 203 and the left signal l(n) 101 and outputs the error signal 204, and the first adder 14 and second adder 15 add the error signal 107 to the left signal l(n) 101 and right signal r(n) 102 in opposite phase relationships, respectively. Accordingly, it can recover the pre-compression frequency spectrum of the left-and-right difference signal of the stereo audio signal, thereby offering the advantage of a rich sound field feeling or atmospheric feeling when playing back the stereo audio signal.
  • In addition, according to the signal processing device 1 of the embodiment 1, since it employs the AR prediction unit that makes the AR prediction as the prediction unit 21, it offers an advantage of being able to carry out high-accuracy prediction.
  • Furthermore, according to the signal processing device 1 of the embodiment 1, since it is configured in such a manner that the AR prediction unit working as the prediction unit 21 updates the prediction coefficients in accordance with the error signal 204, it offers an advantage of being able to make the prediction at high accuracy.
  • Furthermore, according to the signal processing device 1 of the embodiment 1, since it comprises the gain adjusting unit 17 that adjusts the gain of the error signal 103 and outputs the error signal 107 after the adjustment as the improving difference signal, it can control the degree of improvement of the sound field feeling and atmospheric feeling of the stereo audio signal.
  • Moreover, as for the coefficient of the gain adjusting unit 17, since the present embodiment can set it at a variable value that can be set appropriately, it can adjust the degree of the improvement of the sound field feeling and atmospheric feeling of the stereo audio signal in a finer manner.
  • Incidentally, although the signal processing device 1 of the embodiment 1 is described by way of example of a signal processing device that processes the stereo audio signal of the audio device as the first and second input signals, for example, it can handle not only the stereo audio signal, but also two input signals having some degree of correlation between them.
  • EMBODIMENT 2
  • In the embodiment 1, the configuration is described in which the prediction error calculating unit 13 calculates the error signal 103 between the prediction signal 203 and the left signal l(n) 101, the first adder 14 adds the left signal l(n) 101 and the error signal 103, and the second adder 15 adds the right signal r(n) 102 and the error signal 103 in opposite phase. In the embodiment 2, however, a configuration that adjusts the improving difference signal in a finer manner will be described.
  • FIG. 5 is a block diagram showing a configuration of the signal processing device 1 of the embodiment 2 in accordance with the present invention. Incidentally, in FIG. 5, the same or like components to those of the embodiment 1 are designated by the same reference numerals, and their detailed description will be omitted here.
  • As shown in FIG. 5, the signal processing device 1 comprises the prediction error calculating unit 13, a first adder 51, a second adder 52, a third adder 55, a fourth adder 57, a fifth adder 58, a first gain adjusting unit 53, and a second gain adjusting unit 54. The prediction error calculating unit 13, in the same manner as in the embodiment 1, calculates the error signal 103 from the left signal l(n) 101 (first signal) and right signal r(n) 102 (second signal) of the stereo audio signal as the improving difference signal for improving the left-and-right difference signal.
  • The first adder 51, third adder 55 and fourth adder 57 add their two input signals in phase, whereas the second adder 52 and fifth adder 58 add their two input signals with the phase of one input inverted; that is, they output the difference of their two inputs.
  • The first gain adjusting unit 53 and second gain adjusting unit 54 are multipliers that multiply their input signal by a prescribed value and output the result as a gain-adjusted signal.
  • Next, the processing operation of the signal processing device 1 of the embodiment 2 will be described.
  • As shown in FIG. 5, when the signal processing device 1 receives the left signal l(n) 101 and right signal r(n) 102 from the external decoder 2 as the stereo audio signal, it splits the input left signal l(n) 101 and right signal r(n) 102 in three, respectively.
  • The signal processing device 1 leads the split left signal l(n) 101 to the prediction error calculating unit 13, first adder 51 and second adder 52. Likewise, the signal processing device 1 leads the split right signal r(n) 102 to the prediction error calculating unit 13, first adder 51 and second adder 52.
  • The first adder 51 receives and adds the left signal l(n) 101 and right signal r(n) 102, and supplies to the fourth adder 57 and fifth adder 58 as a first addition signal 501.
  • In the same processing operation as that of the embodiment 1, the prediction error calculating unit 13 calculates, from the input left signal l(n) 101 and right signal r(n) 102, the error signal 103 between the left signal l(n) 101 and the prediction signal that estimates the left signal l(n) 101, and supplies the error signal 103 to the first gain adjusting unit 53 as the improving difference signal for improving the left-and-right difference signal of the stereo audio signal.
  • The first gain adjusting unit 53 controls the gain of the input error signal 103 by multiplying it by a preset fixed value or a value that can be set properly from an external control panel or the like not shown, and supplies the error signal 503 after the gain adjustment to the third adder 55.
  • The second adder 52, receiving the left signal l(n) 101 and right signal r(n) 102, adds the left signal l(n) 101 and right signal r(n) 102 in opposite phase, and supplies to the second gain adjusting unit 54 as a second addition signal 502.
  • The second gain adjusting unit 54 controls the gain of the input second addition signal 502 by multiplying it by a preset fixed value or a value that can be set properly from an external control panel or the like not shown, and supplies the second addition signal 504 after the gain adjustment to the third adder 55 as the improving difference signal.
  • The third adder 55 adds the error signal 503 from the first gain adjusting unit 53 and the second addition signal 504 from the second gain adjusting unit 54, and supplies a third addition signal 505 to the fourth adder 57 and fifth adder 58 as a new improving difference signal.
  • The fourth adder 57 adds the first addition signal 501 fed from the first adder 51 and the third addition signal 505 fed from the third adder 55, and supplies the left signal lout(n) 109 to the external output device 3 as an output signal after the signal processing.
  • The fifth adder 58 adds the first addition signal 501 fed from the first adder 51 and the third addition signal 505 fed from the third adder 55 in opposite phase, and supplies the right signal rout(n) 110 to the external output device 3 as an output signal after the signal processing.
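The Embodiment-2 mixing network above can be summarized in a short sketch. Names and default coefficients are illustrative; e stands for the error signal 103 already produced by the prediction error calculating unit 13, and the sign conventions follow the reading of adders 52 and 58 as difference-forming adders.

```python
import numpy as np

def mix_embodiment2(l, r, e, g1=0.5, g2=0.5):
    """Sketch of the FIG. 5 adder network."""
    s1 = l + r            # first adder 51: first addition signal 501
    s2 = l - r            # second adder 52: second addition signal 502
    d = g1 * e + g2 * s2  # gain units 53/54 and third adder 55: signal 505
    l_out = s1 + d        # fourth adder 57: in-phase addition
    r_out = s1 - d        # fifth adder 58: opposite-phase addition
    return l_out, r_out
```

With g1 = 0 and g2 = 1 the network reduces to l_out = 2l and r_out = 2r, a scaled passthrough; raising g1 relative to g2 strengthens the improvement effect and raising g2 relative to g1 weakens it, matching the coefficient trade-off described in the text.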
  • Incidentally, in the embodiment 2 also, the left signal l(n) 101 and the right signal r(n) 102 can be exchanged. Thus, a configuration can suffice as long as it predicts a second signal from a first signal or vice versa.
  • As described above, according to the embodiment 2, it is configured in such a manner that the first gain adjusting unit 53 controls the gain of the error signal 103 to make the error signal 503, the second gain adjusting unit 54 controls the gain of the second addition signal 502 to make the second addition signal 504, the third adder 55 adds the error signal 503 and the second addition signal 504 to make the third addition signal 505, the fourth adder 57 adds the third addition signal 505 and the left signal l(n) 101, and the fifth adder 58 adds to the right signal r(n) 102 the third addition signal 505 with its phase being inverted. Accordingly, it offers an advantage of being able to adjust the improving difference signal in a finer manner.
  • For example, to increase the improvement effect, it suffices to reduce the coefficient of the second gain adjusting unit 54 and to increase the coefficient of the first gain adjusting unit 53. Conversely, to reduce the improvement effect, it suffices to increase the coefficient of the second gain adjusting unit 54 and to reduce the coefficient of the first gain adjusting unit 53. It is also possible to make the two coefficients comparable.
  • Furthermore, when the intensity of the left-and-right difference signal increases too much, the central component of the stereo audio signal becomes weak and the comfortable sound field feeling is impaired. The embodiment 2, however, can curb an excessive increase in the left-and-right difference signal intensity, thereby offering the advantage of achieving a stable sound field feeling.
  • Incidentally, although the embodiments 1 and 2 are designed for the signal processing of a stereo audio signal that has passed through compression encoding, this is not essential; they can also process a stereo audio signal that has not undergone compression encoding. In this case, the configuration of the embodiment 1 or 2 can further increase the information in the left-and-right difference signal of the stereo audio signal, thereby offering the advantage of achieving a richer sound field feeling and atmospheric feeling.
  • Furthermore, inputting sensor signals instead of the stereo audio signals offers the advantage of obtaining measurement results with higher accuracy.
INDUSTRIAL APPLICABILITY

A signal processing device in accordance with the present invention can restore the characteristics of a signal prior to compression encoding. As a result, it can restore, for example, the characteristics of the left-and-right difference signal of a stereo audio signal, thereby recovering a rich sound field feeling or atmospheric feeling. Accordingly, it is suitable for application to signal processing devices which decode and play back a compression-encoded audio signal.
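The embodiment-2 signal flow described above can be summarized per sample as follows. This is a minimal illustrative sketch, not code from the patent: the function name and the default gain coefficients `g1` and `g2` (corresponding to the first and second gain adjusting units 53 and 54) are assumptions.

```python
def process_sample_embodiment2(l, r, e, g1=0.5, g2=0.5):
    """Per-sample sketch of the embodiment-2 signal flow.

    l, r -- left/right input samples l(n), r(n)
    e    -- error signal from the prediction error calculating unit
    g1   -- coefficient of the first gain adjusting unit (53)
    g2   -- coefficient of the second gain adjusting unit (54)
    """
    s_sum = l + r                     # first adder (51): in-phase addition
    s_diff = l - r                    # second adder (52): opposite-phase addition
    improving = g1 * e + g2 * s_diff  # third adder (55): new improving difference signal
    l_out = s_sum + improving         # fourth adder (57): in phase
    r_out = s_sum - improving         # fifth adder (58): opposite phase
    return l_out, r_out
```

Increasing `g1` relative to `g2` strengthens the restored difference component (a larger improvement effect); increasing `g2` relative to `g1` weakens it, matching the coefficient adjustment described above.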

Claims (9)

  1. A signal processing device (1) characterized in that it comprises:
    a prediction error calculating unit (13) for receiving a first signal (101) and a second signal (102) as stereo audio signals, and for calculating an error signal (103) between the first signal and a prediction signal of the first signal, the prediction signal being predicted from the second signal, wherein one of the first signal and the second signal is a left signal and the other is a right signal;
    a first adder (14) for adding the first signal and the error signal (103) in phase; and
    a second adder (15) for adding the second signal and the error signal in opposite phase.
  2. The signal processing device according to claim 1, further comprising:
    a gain adjusting unit (17) for receiving the error signal (103) from the prediction error calculating unit (13), and for controlling the gain of the error signal.
  3. A signal processing device characterized in that it comprises:
    a prediction error calculating unit (13) for receiving a first signal (101) and a second signal (102) as stereo audio signals, and for calculating an error signal between the first signal and a prediction signal of the first signal, the prediction signal being predicted from the second signal, wherein one of the first signal and the second signal is a left signal and the other is a right signal;
    a first gain adjusting unit (53) for controlling the gain of the error signal;
    a first adder (51) for adding the first signal and the second signal in phase, and for outputting as a first addition signal;
    a second adder (52) for adding the first signal and the second signal in opposite phase, and for outputting as a second addition signal;
    a second gain adjusting unit (54) for controlling the gain of the second addition signal;
    a third adder (55) for adding the error signal from the first gain adjusting unit (53) and the second addition signal from the second gain adjusting unit (54) in phase, and for outputting as a third addition signal;
    a fourth adder (57) for adding the first addition signal and the third addition signal in phase; and
    a fifth adder (58) for adding the first addition signal and the third addition signal in opposite phase.
  4. The signal processing device according to claim 1, wherein
    the prediction error calculating unit (13) comprises an AR (Auto-Regressive) prediction unit (21) for predicting the first signal from the second signal and a prediction coefficient.
  5. The signal processing device according to claim 3, wherein
    the prediction error calculating unit (13) comprises an AR (Auto-Regressive) prediction unit (21) for predicting the first signal from the second signal and a prediction coefficient.
  6. The signal processing device according to claim 4, wherein
    the prediction error calculating unit (13) inputs the error signal to the AR prediction unit (21), and the AR prediction unit updates the prediction coefficient in accordance with the error signal.
  7. The signal processing device according to claim 5, wherein
    the prediction error calculating unit (13) inputs the error signal to the AR prediction unit, and the AR prediction unit updates the prediction coefficient in accordance with the error signal.
  8. The signal processing device according to claim 2, wherein
    the gain adjusting unit (17) controls the gain by multiplying by a properly set value.
  9. The signal processing device according to claim 3, wherein
    the gain adjusting units (53, 54) control the gains by multiplying by properly set values.
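Claims 1, 4 and 6 together describe an adaptive cross-channel structure: an AR prediction unit predicts the first signal from the second, the error signal is added in phase to the first signal and in opposite phase to the second, and the prediction coefficients are updated in accordance with the error signal. The sketch below is one possible reading of those claims: the claims do not fix the predictor order or the update rule, so the NLMS update, the order, and the step size used here are assumptions.

```python
import numpy as np

def claim1_process(x1, x2, order=8, mu=0.1, eps=1e-8):
    """Illustrative sketch of claims 1, 4 and 6 (not the patent's
    reference implementation): NLMS-adapted prediction of the first
    signal from the second, with in-phase / opposite-phase error
    addition."""
    w = np.zeros(order)      # prediction coefficients (claim 4)
    buf = np.zeros(order)    # recent samples of the second signal
    out1 = np.empty_like(x1)
    out2 = np.empty_like(x2)
    for n in range(len(x1)):
        buf = np.roll(buf, 1)
        buf[0] = x2[n]
        pred = w @ buf                         # prediction signal of the first signal
        e = x1[n] - pred                       # error signal (103)
        w += mu * e * buf / (buf @ buf + eps)  # coefficient update from the error (claim 6)
        out1[n] = x1[n] + e                    # first adder (14): in phase
        out2[n] = x2[n] - e                    # second adder (15): opposite phase
    return out1, out2
```

Because the same error signal is added to one channel and subtracted from the other, the in-phase sum out1 + out2 equals x1 + x2: the processing enriches the left-and-right difference component without altering the central (sum) component.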
EP10783094.5A 2009-06-01 2010-05-17 Signal processing devices for processing stereo audio signals Active EP2439964B1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
JP2009132158 2009-06-01
PCT/JP2010/003310 WO2010140306A1 (en) 2009-06-01 2010-05-17 Signal processing device

Publications (3)

Publication Number Publication Date
EP2439964A1 EP2439964A1 (en) 2012-04-11
EP2439964A4 EP2439964A4 (en) 2013-04-03
EP2439964B1 true EP2439964B1 (en) 2014-06-04

Family

ID=43297449

Family Applications (1)

Application Number Title Priority Date Filing Date
EP10783094.5A Active EP2439964B1 (en) 2009-06-01 2010-05-17 Signal processing devices for processing stereo audio signals

Country Status (5)

Country Link
US (1) US8918325B2 (en)
EP (1) EP2439964B1 (en)
JP (1) JP5355690B2 (en)
CN (1) CN102440008B (en)
WO (1) WO2010140306A1 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10480967B2 (en) * 2014-11-14 2019-11-19 Sharp Kabushiki Kaisha Signal processing device and signal processing method
WO2019155603A1 (en) * 2018-02-09 2019-08-15 三菱電機株式会社 Acoustic signal processing device and acoustic signal processing method

Family Cites Families (33)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CA1085044A (en) * 1975-04-03 1980-09-02 Yukihiko Iijima Composite feedback predictive code communication system for a color tv signal including a carrier chrominance signal
US4308423A (en) 1980-03-12 1981-12-29 Cohen Joel M Stereo image separation and perimeter enhancement
DE4320990B4 (en) * 1993-06-05 2004-04-29 Robert Bosch Gmbh Redundancy reduction procedure
JP3283413B2 (en) * 1995-11-30 2002-05-20 株式会社日立製作所 Encoding / decoding method, encoding device and decoding device
KR100206333B1 (en) * 1996-10-08 1999-07-01 윤종용 Device and method for the reproduction of multichannel audio using two speakers
US6111958A (en) * 1997-03-21 2000-08-29 Euphonics, Incorporated Audio spatial enhancement apparatus and methods
JPH113099A (en) * 1997-04-16 1999-01-06 Mitsubishi Electric Corp Speech encoding/decoding system, speech encoding device, and speech decoding device
US6463410B1 (en) * 1998-10-13 2002-10-08 Victor Company Of Japan, Ltd. Audio signal processing apparatus
US6810124B1 (en) * 1999-10-08 2004-10-26 The Boeing Company Adaptive resonance canceller apparatus
US6856790B1 (en) * 2000-03-27 2005-02-15 Marvell International Ltd. Receiver with dual D.C. noise cancellation circuits
DE60222445T2 (en) * 2001-08-17 2008-06-12 Broadcom Corp., Irvine Method for hiding bit errors for language coding
US7359522B2 (en) 2002-04-10 2008-04-15 Koninklijke Philips Electronics N.V. Coding of stereo signals
JP4369946B2 (en) 2002-11-21 2009-11-25 日本電信電話株式会社 Digital signal processing method, program thereof, and recording medium containing the program
US7580482B2 (en) * 2003-02-19 2009-08-25 Endres Thomas J Joint, adaptive control of equalization, synchronization, and gain in a digital communications receiver
US20040176056A1 (en) * 2003-03-07 2004-09-09 Shen Feng Single-tone detection and adaptive gain control for direct-conversion receivers
JP3718706B2 (en) * 2003-10-28 2005-11-24 松下電器産業株式会社 Delta-sigma modulator
US7272567B2 (en) * 2004-03-25 2007-09-18 Zoran Fejzo Scalable lossless audio codec and authoring tool
GB2419265B (en) * 2004-10-18 2009-03-11 Wolfson Ltd Improved audio processing
KR100612024B1 (en) * 2004-11-24 2006-08-11 삼성전자주식회사 Apparatus for generating virtual 3D sound using asymmetry, method thereof, and recording medium having program recorded thereon to implement the method
EP1720249B1 (en) * 2005-05-04 2009-07-15 Harman Becker Automotive Systems GmbH Audio enhancement system and method
US7630779B2 (en) * 2005-06-01 2009-12-08 Analog Devices, Inc. Self compensating closed loop adaptive control system
US7996216B2 (en) * 2005-07-11 2011-08-09 Lg Electronics Inc. Apparatus and method of encoding and decoding audio signal
WO2007035072A1 (en) * 2005-09-26 2007-03-29 Samsung Electronics Co., Ltd. Apparatus and method to cancel crosstalk and stereo sound generation system using the same
KR100739762B1 (en) 2005-09-26 2007-07-09 삼성전자주식회사 Apparatus and method for cancelling a crosstalk and virtual sound system thereof
EP1953736A4 (en) * 2005-10-31 2009-08-05 Panasonic Corp Stereo encoding device, and stereo signal predicting method
FI20051294A0 (en) * 2005-12-19 2005-12-19 Noveltech Solutions Oy signal processing
WO2007088853A1 (en) 2006-01-31 2007-08-09 Matsushita Electric Industrial Co., Ltd. Audio encoding device, audio decoding device, audio encoding system, audio encoding method, and audio decoding method
JP2008033269A (en) * 2006-06-26 2008-02-14 Sony Corp Digital signal processing device, digital signal processing method, and reproduction device of digital signal
JP4972742B2 (en) 2006-10-17 2012-07-11 国立大学法人九州工業大学 High-frequency signal interpolation method and high-frequency signal interpolation device
JP4963973B2 (en) * 2007-01-17 2012-06-27 日本電信電話株式会社 Multi-channel signal encoding method, encoding device using the same, program and recording medium using the method
US8064624B2 (en) * 2007-07-19 2011-11-22 Fraunhofer-Gesellschaft Zur Foerderung Der Angewandten Forschung E.V. Method and apparatus for generating a stereo signal with enhanced perceptual quality
WO2009022463A1 (en) * 2007-08-13 2009-02-19 Mitsubishi Electric Corporation Audio device
EP2201566B1 (en) * 2007-09-19 2015-11-11 Telefonaktiebolaget LM Ericsson (publ) Joint multi-channel audio encoding/decoding

Also Published As

Publication number Publication date
EP2439964A4 (en) 2013-04-03
EP2439964A1 (en) 2012-04-11
CN102440008A (en) 2012-05-02
US20120014485A1 (en) 2012-01-19
CN102440008B (en) 2015-01-21
US8918325B2 (en) 2014-12-23
JPWO2010140306A1 (en) 2012-11-15
JP5355690B2 (en) 2013-11-27
WO2010140306A1 (en) 2010-12-09

Similar Documents

Publication Publication Date Title
KR101849612B1 (en) Method and apparatus for normalized audio playback of media with and without embedded loudness metadata on new media devices
US8477963B2 (en) Method, apparatus, and computer program for suppressing noise
AU682926B2 (en) Process for coding a plurality of audio signals
KR101049751B1 (en) Audio Coding
EP1623411B1 (en) Fidelity-optimised variable frame length encoding
US8194880B2 (en) System and method for utilizing omni-directional microphones for speech enhancement
US8428275B2 (en) Wind noise reduction device
US8767850B2 (en) Apparatus and method for encoding/decoding a multichannel signal
US7573912B2 (en) Near-transparent or transparent multi-channel encoder/decoder scheme
US7991176B2 (en) Stereo widening network for two loudspeakers
CA2283838C (en) Multidirectional audio decoding
US7809579B2 (en) Fidelity-optimized variable frame length encoding
US8081764B2 (en) Audio decoder
JP4606507B2 (en) Spatial downmix generation from parametric representations of multichannel signals
RU2625444C2 (en) Audio processing system
JP4989468B2 (en) Audio channel conversion
JP3670562B2 (en) Stereo sound signal processing method and apparatus, and recording medium on which stereo sound signal processing program is recorded
JP4804532B2 (en) Envelope shaping of uncorrelated signals
RU2550549C2 (en) Signal processing device and method and programme
JP3646938B1 (en) Audio decoding apparatus and audio decoding method
JP4413480B2 (en) Voice processing apparatus and mobile communication terminal apparatus
EP2467850B1 (en) Method and apparatus for decoding multi-channel audio signals
JP2008512055A (en) Audio channel mixing method using correlation output
US8099293B2 (en) Audio signal processing
TW201503711A (en) System and method for stereo field enhancement in two-channel audio systems

Legal Events

Date Code Title Description
AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO SE SI SK SM TR

17P Request for examination filed

Effective date: 20111230

DAX Request for extension of the european patent (to any country) (deleted)
A4 Supplementary search report drawn up and despatched

Effective date: 20130305

RIC1 Information provided on ipc code assigned before grant

Ipc: H04S 1/00 20060101AFI20130227BHEP

Ipc: G10L 21/02 20130101ALI20130227BHEP

Ipc: G10L 19/26 20130101ALI20130227BHEP

REG Reference to a national code

Ref country code: DE

Ref legal event code: R079

Ref document number: 602010016511

Country of ref document: DE

Free format text: PREVIOUS MAIN CLASS: H04S0001000000

Ipc: H04S0007000000

RIC1 Information provided on ipc code assigned before grant

Ipc: G10L 19/008 20130101ALN20131127BHEP

Ipc: G10L 25/12 20130101ALN20131127BHEP

Ipc: H04S 7/00 20060101AFI20131127BHEP

Ipc: H04S 1/00 20060101ALN20131127BHEP

Ipc: G10L 19/26 20130101ALI20131127BHEP

INTG Intention to grant announced

Effective date: 20140102

AK Designated contracting states

Kind code of ref document: B1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO SE SI SK SM TR

REG Reference to a national code

Ref country code: GB

Ref legal event code: FG4D

REG Reference to a national code

Ref country code: CH

Ref legal event code: EP

REG Reference to a national code

Ref country code: AT

Ref legal event code: REF

Ref document number: 671686

Country of ref document: AT

Kind code of ref document: T

Effective date: 20140615

REG Reference to a national code

Ref country code: IE

Ref legal event code: FG4D

REG Reference to a national code

Ref country code: DE

Ref legal event code: R096

Ref document number: 602010016511

Country of ref document: DE

Effective date: 20140717

REG Reference to a national code

Ref country code: AT

Ref legal event code: MK05

Ref document number: 671686

Country of ref document: AT

Kind code of ref document: T

Effective date: 20140604

REG Reference to a national code

Ref country code: NL

Ref legal event code: VDEP

Effective date: 20140604

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: NO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20140904

Ref country code: FI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20140604

Ref country code: GR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20140905

Ref country code: LT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20140604

Ref country code: CY

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20140604

REG Reference to a national code

Ref country code: LT

Ref legal event code: MG4D

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: SE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20140604

Ref country code: AT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20140604

Ref country code: LV

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20140604

Ref country code: HR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20140604

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: PT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20141006

Ref country code: CZ

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20140604

Ref country code: RO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20140604

Ref country code: EE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20140604

Ref country code: ES

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20140604

Ref country code: SK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20140604

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: NL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20140604

Ref country code: PL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20140604

Ref country code: IS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20141004

REG Reference to a national code

Ref country code: DE

Ref legal event code: R097

Ref document number: 602010016511

Country of ref document: DE

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: DK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20140604

Ref country code: IT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20140604

26N No opposition filed

Effective date: 20150305

REG Reference to a national code

Ref country code: DE

Ref legal event code: R097

Ref document number: 602010016511

Country of ref document: DE

Effective date: 20150305

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: BE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20140604

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: SI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20140604

REG Reference to a national code

Ref country code: CH

Ref legal event code: PL

GBPC Gb: european patent ceased through non-payment of renewal fee

Effective date: 20150517

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: LI

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20150531

Ref country code: LU

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20150517

Ref country code: MC

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20140604

Ref country code: CH

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20150531

REG Reference to a national code

Ref country code: IE

Ref legal event code: MM4A

REG Reference to a national code

Ref country code: FR

Ref legal event code: ST

Effective date: 20160129

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: GB

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20150517

Ref country code: IE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20150517

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: FR

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20150601

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: MT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20140604

REG Reference to a national code

Ref country code: DE

Ref legal event code: R084

Ref document number: 602010016511

Country of ref document: DE

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: HU

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT; INVALID AB INITIO

Effective date: 20100517

Ref country code: SM

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20140604

Ref country code: BG

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20140604

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: TR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20140604

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: MK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20140604

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: AL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20140604

PGFP Annual fee paid to national office [announced from national office to epo]

Ref country code: DE

Payment date: 20190508

Year of fee payment: 10