EP0664660B1 - Einrichtung zur Wiedergabe von Audiosignalen - Google Patents

Einrichtung zur Wiedergabe von Audiosignalen

Info

Publication number
EP0664660B1
Authority
EP
European Patent Office
Prior art keywords
transfer characteristics
audio signal
signal
head
listener
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Lifetime
Application number
EP95104929A
Other languages
English (en)
French (fr)
Other versions
EP0664660A2 (de)
EP0664660A3 (de)
Inventor
Kiyofumi Inanaga (c/o Sony Corporation)
Hiroyuki Sogawa (c/o Sony Corporation)
Yasuhiro Iida (c/o Sony Corporation)
Susumu Yabe (c/o Sony Corporation)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sony Corp
Original Assignee
Sony Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from JP2008520A external-priority patent/JP2893780B2/ja
Priority claimed from JP2008514A external-priority patent/JP2751512B2/ja
Application filed by Sony Corp filed Critical Sony Corp
Publication of EP0664660A2 publication Critical patent/EP0664660A2/de
Publication of EP0664660A3 publication Critical patent/EP0664660A3/de
Application granted granted Critical
Publication of EP0664660B1 publication Critical patent/EP0664660B1/de
Anticipated expiration legal-status Critical
Expired - Lifetime legal-status Critical Current

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04S STEREOPHONIC SYSTEMS
    • H04S1/00 Two-channel systems
    • H04S1/002 Non-adaptive circuits, e.g. manually adjustable or static, for enhancing the sound image or the spatial distribution
    • H04S1/005 For headphones
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04R LOUDSPEAKERS, MICROPHONES, GRAMOPHONE PICK-UPS OR LIKE ACOUSTIC ELECTROMECHANICAL TRANSDUCERS; DEAF-AID SETS; PUBLIC ADDRESS SYSTEMS
    • H04R5/00 Stereophonic arrangements
    • H04R5/033 Headphones for stereophonic communication

Definitions

  • the present invention relates to an audio signal binaural reproducing apparatus for reproducing audio signals by means of headphones.
  • a binaural reproducing method has heretofore been known as an approach for providing better direction sensation of sound image or outside head localization sensation when audio signals are reproduced by headphones fitted to the head of a listener so that a pair of headphones are located in the vicinity of both ears.
  • the direction sensation of the sound image, the outside head localization sensation and the like depend upon the differences in volume, time and phase of the sounds heard by the left and right ears.
  • the signal processing aims at producing, in the audio output reproduced by the headphones, audio effects equivalent to those caused by the differences in distance between the sound sources, that is, speaker systems, and the right and left ears of the listener, and by reflections and diffractions in the vicinity of the head of the listener, when audio reproduction is performed, for example, by speaker systems remote from the listener.
  • Such a signal processing is performed by convolution-integrating left and right ear audio signals with impulse responses corresponding to the above-mentioned audio effects.
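  • In discrete time, this convolution-integration amounts to FIR-filtering each channel with the corresponding impulse response. The following minimal sketch (not taken from the patent; the array names such as h_left and h_right are assumptions) illustrates the operation for one source channel:

```python
import numpy as np

def binauralize_channel(signal: np.ndarray,
                        h_left: np.ndarray,
                        h_right: np.ndarray):
    """Convolve one source channel with the impulse responses describing
    its path to the left and right ears (illustrative sketch only)."""
    left = np.convolve(signal, h_left)    # contribution reaching the left ear
    right = np.convolve(signal, h_right)  # contribution reaching the right ear
    return left, right
```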
  • If the listener turns his or her head when audio reproduction is performed by speaker systems remote from the listener, the relative direction and position of the sound image that the listener senses are changed.
  • On the other hand, since the headphones turn together with the listener's head if the listener turns his or her head when audio reproduction is performed by a binaural method using headphones, the relative direction and position of the sound image which the listener senses are not changed.
  • an audio signal reproducing system which detects a change in the direction of the listener's head and changes the modes of the signal processing based upon a result of the detection for providing a good front localization sensation in headphones has heretofore been proposed as is disclosed in Japanese Unexamined Patent Publication No. Sho 42-227 and Japanese Examined Patent Publication No. 54-19242.
  • a direction detecting device such as a gyrocompass or a magnetic needle is provided on the head of the listener.
  • a level adjusting circuit and a delay circuit and the like for processing the audio signals are controlled based upon a result of detection from the direction detecting device so that a sound image sensation which is similar to that of the audio reproducing using speaker systems remote from the listener is obtained.
  • To this end, it is necessary to prepare the impulse responses, that is, the transfer characteristics corresponding to the audio effects given to the audio signals of the left and right ears for each predetermined rotational angle, and to store a great amount of information on the transfer characteristics.
  • the information is read from the storing means depending upon the change in direction of the head.
  • the audio signal will be subjected to a necessary convolution-integration processing in real-time.
  • JP-A-1-121,000 discloses an audio signal reproducing apparatus according to the preamble of claim 1. A similar arrangement is disclosed in JP-A-58-116,900.
  • an audio signal reproducing apparatus comprising:
  • An audio signal reproducing apparatus described for reference comprises a headphone set 10 which is fitted over the head M of a listener P, in which a pair of headphones 2L and 2R are supported by a head band 1 so that they are located in the vicinity of the left and right ears of the listener P, respectively, as shown in Fig. 1.
  • This apparatus is not prior art to the present invention but is the subject of EP application 91902738.3 (Patent No. 0,464,217) from which this application was divided.
  • Two sliders 4L and 4R from which support arms 3L and 3R, respectively project are slidably mounted on the head band 1 of the headphone set 10.
  • a pair of signal detectors 5L and 5R which detect a position detection reference signal emitted from a reference signal source 11 are provided at the tip ends of the support arms 3L and 3R, respectively. That is, the pair of signal detectors 5L and 5R are provided on the tip ends of the support arms 3L and 3R projectedly formed on the sliders 4L and 4R which are slidably mounted on the head band 1 so that they are supported in positions remote from the head band 1 and the pair of headphones 2L and 2R, that is the main body of the headphone set.
  • the reference signal source 11 comprises an ultrasonic signal source 12 and an ultrasonic speaker 13 for generating an ultrasonic signal from the ultrasonic signal source 12 as a reference signal.
  • Each of the pair of signal detectors 5L and 5R which receive the reference signal comprises an ultrasonic microphone.
  • An ultrasonic wave generated from the ultrasonic speaker 13, that is, the position detection reference signal, is a burst wave in which an ultrasonic wave having a given level is intermittently generated for a given period of time as shown at A in Fig. 2, or an ultrasonic wave the phase of which may be detected, like a so-called level modulated wave, the level of which changes in a given cycle.
  • the pair of signal detectors 5L and 5R provided on the headphone set 10 detects the ultrasonic position detection reference signal generated from the ultrasonic speaker 13 and generate respective detection signals shown at B and C in Fig. 2, each having a time lag depending upon the relative positional relation between the listener P and the ultrasonic speaker 13.
  • Since the pair of signal detectors 5L and 5R are supported in positions remote from the main body of the headphone set 10, being mounted on the tip ends of the support arms 3L and 3R which project from the sliders 4L and 4R slidably mounted on the head band 1, and since the head band 1 and the pair of headphones 2L and 2R, that is, the main body of the headphone set, are fitted on the head M of the listener P, they can detect the ultrasonic wave generated from the ultrasonic speaker 13, that is, the position detection reference signal, stably and accurately without being hidden behind the head M of the listener P even if the listener P moves or rotates his head.
  • the pair of the signal detectors 5L and 5R can be adjusted to a position optimal for detecting the position detection reference signal by sliding the sliders 4L and 4R along the head band 1.
  • the optimal positions of the headphones 2L and 2R, which are fitted on the head M of the listener P by the head band 1 so that they correspond to the vicinity of the left and right ears, depend on the shape and size of the head M of the listener P, that is, vary among individuals. Accordingly, the positions of the pair of signal detectors 5L and 5R can be adjusted so that they correspond to the headphones 2L and 2R, respectively.
  • Each detection signal obtained by these signal detectors 5L and 5R is applied to an operation unit 14.
  • the operation unit 14 comprises first and second edge detecting circuits 15 and 16, to which the detection signals from the signal detectors 5L and 5R for detecting the position detection reference signal are supplied, respectively, and a third edge detecting circuit 17 to which an ultrasonic signal from the ultrasonic signal source 12, that is, the position detection reference signal, is applied.
  • the distance calculating circuit 18 detects the time difference t1 between the pulse signals obtained by the third and first edge detecting circuits 17 and 15, which is represented as ΔT1 in Fig. 2, and the time difference t2 between the pulse signals obtained by the third and second edge detecting circuits 17 and 16, which is represented as ΔT2 in Fig. 2, and then calculates the distance l0 between the ultrasonic speaker 13 and the center of the head M of the listener P, represented as l0 in Fig. 3, based upon the time differences t1, t2 and the sound velocity V.
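  • As a hedged illustration of how these time differences could be measured in the digital domain (the threshold-based edge detection and sampling-rate handling below are assumptions; the text only states that rise-up edges are detected and their time differences taken), the delays follow from the sample indices of the detected edges:

```python
import numpy as np

def rise_edge_index(x: np.ndarray, threshold: float) -> int:
    """Index of the first sample exceeding the threshold; a simple stand-in
    for the rise-up edge detection of the edge detecting circuits."""
    above = np.flatnonzero(np.abs(x) > threshold)
    return int(above[0]) if above.size else -1

def propagation_delays(reference: np.ndarray, det_left: np.ndarray,
                       det_right: np.ndarray, fs: float, threshold: float = 0.1):
    """Time differences t1, t2 (source burst to each detector) and t3
    (between the two detectors); illustrative only."""
    n_ref = rise_edge_index(reference, threshold)
    n_l = rise_edge_index(det_left, threshold)
    n_r = rise_edge_index(det_right, threshold)
    t1 = (n_l - n_ref) / fs   # corresponds to delta-T1 in Fig. 2
    t2 = (n_r - n_ref) / fs   # corresponds to delta-T2 in Fig. 2
    t3 = (n_l - n_r) / fs     # time difference between both ears (delta-T3)
    return t1, t2, t3
```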
  • the sound velocity V may be preliminarily preset as a constant in the distance calculating circuit 18 or alternatively may be changed with changes in atmospheric temperature, humidity and atmospheric pressure and the like.
  • compensation may be conducted for the positional relation between the signal detectors 5L and 5R and the center of the head M, and for the shape and size of the head M.
  • Signals representative of the distance l0 and the time differences t1 and t2 are fed to an angle calculating circuit 20.
  • the circuit 19 for detecting the time difference between both ears detects the time difference t3 between the pulse signals generated by the first and second edge detecting circuits 15 and 16, represented as ΔT3 in Fig. 2.
  • a signal representative of the time difference t3 is fed to the angle calculating circuit 20.
  • the angle calculating circuit 20 calculates an angle representative of the direction of the head M, represented by an arrow θ0 in Fig. 3, by using the time differences t1, t2, t3, the distance l0, the sound velocity V and the radius r of the head M.
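  • The text does not spell out the formulas used by the distance calculating circuit 18 and the angle calculating circuit 20. The sketch below shows one plausible, simplified geometry (distances from propagation times, head angle from the inter-detector path difference); the formulas and default values are assumptions for illustration only:

```python
import numpy as np

def estimate_head_pose(t1: float, t2: float, t3: float,
                       v: float = 343.0, r: float = 0.09):
    """Rough estimate of the speaker-to-head distance l0 and the head angle
    theta0 from the measured delays (simplified geometry, assumed here;
    not necessarily the computation of circuits 18 and 20)."""
    l1 = v * t1                    # distance from speaker 13 to detector 5L
    l2 = v * t2                    # distance from speaker 13 to detector 5R
    l0 = 0.5 * (l1 + l2)           # approximate distance to the head centre
    # For a detector spacing of about 2*r, the path difference v*t3 gives
    # sin(theta0) ~ v*t3 / (2*r).
    s = np.clip(v * t3 / (2.0 * r), -1.0, 1.0)
    theta0 = float(np.degrees(np.arcsin(s)))
    return l0, theta0
```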
  • the operation unit 14 includes a storing circuit 22 in which information on the transfer characteristics from the virtual sound source to both ears of the listener is stored for the first quadrant of the rotational angular position of the head of the listener, for example, information on the transfer characteristics for each of the angles θ11 to θ1n in the first quadrant.
  • Based upon the current angular position calculated by the angle calculating circuit 20, the control circuit 21 reads from the storing circuit 22 the information on the transfer characteristics corresponding to the current angle among θ11 to θ1n if the current angular position is in the first quadrant in Fig. 4, reads the transfer characteristics information in which the current angle among θ21 to θ2n corresponds to the angles θ11 to θ1n in the first quadrant if the current angular position is in the second quadrant in Fig. 4, and reads the transfer characteristics information in which the current angle among θ31 to θ3n corresponds to the angles θ11 to θ1n in the first quadrant if the current angular position is in the third quadrant in Fig. 4.
  • Alternatively, two transfer characteristics in the vicinity of the rotational angular position of the head represented by the angular position information may be read from the storing circuit 22, and the information on the transfer characteristics in the current head rotational angular position may be obtained by, for example, linear interpolation processing, as described later with reference to Fig. 6.
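  • A hedged sketch of the quadrant-based lookup described above (the folding rule, table layout and helper names are assumptions; the exact correspondence used by the control circuit 21 is not given in the text) might map any head angle back onto the stored first-quadrant entries:

```python
def fold_to_first_quadrant(angle_deg: float):
    """Return (quadrant, equivalent first-quadrant angle) for an arbitrary
    head angle; illustrative only."""
    a = angle_deg % 360.0
    quadrant = int(a // 90.0) + 1
    folded = a % 90.0
    if quadrant in (2, 4):
        # mirror so that even-quadrant angles reuse first-quadrant data
        folded = 90.0 - folded
    return quadrant, folded

def lookup_transfer_characteristics(table: dict, angle_deg: float):
    """Pick the stored first-quadrant entry (keyed by angle in degrees)
    closest to the folded angle."""
    _, folded = fold_to_first_quadrant(angle_deg)
    nearest = min(table, key=lambda stored_angle: abs(stored_angle - folded))
    return table[nearest]
```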
  • the audio signal source 24 is an apparatus for outputting given left and right channel audio signals S L and S R , such as recording disc playback apparatus or radio communication receivers and the like.
  • the audio signal processing circuit 23 performs a signal processing which provides the left and right channel audio signals SL and SR fed from the audio signal source 24 with given transfer characteristics from the virtual sound source to both ears of the listener.
  • the audio signal processing circuit 23 comprises first to sixth switches 25L, 25R, 26L, 26R, 27L and, 27R for switching the signal lines and first to fourth signal processing units 28a, 28b, 28c and 28d.
  • the first to sixth switches 25L, 25R, 26L, 26R, 27L and 27R are controlled for switching in response to a control signal from the control circuit 21 representative of the quadrant to which the current angular position belongs.
  • the third and fourth switches 26L and 26R perform switching of filters for the left and right channel audio signals EL and ER outputted from the audio signal processing circuit 23; they pass the left and right audio signals EL and ER through without filtering when the current angular position is in the first or third quadrant, and output the audio signals EL and ER from which high frequency components have been removed by low pass filters 30L and 30R when the current angular position is in the second or fourth quadrant.
  • In each of the first to fourth signal processing units, an impulse response representative of the transfer characteristics from the left and right channel audio signals SL and SR, reproduced from a pair of left and right channel speakers which are virtual sound sources facing the listener, to each ear of the listener is preset based upon the information on the transfer characteristics supplied from the control circuit 21.
  • the first signal processing unit 28a presets the impulse response hRR(t, θ) representative of the transfer characteristics of the sound reproduced from the right channel audio signal SR to the right ear.
  • the second signal processing unit 28b presets the impulse response hRL(t, θ) representative of the transfer characteristics of the sound reproduced from the right channel audio signal SR to the left ear.
  • the third signal processing unit 28c presets the impulse response hLR(t, θ) representative of the transfer characteristics of the sound reproduced from the left channel audio signal SL to the right ear.
  • the fourth signal processing unit 28d presets the impulse response hLL(t, θ) representative of the transfer characteristics of the sound reproduced from the left channel audio signal SL to the left ear.
  • the right channel audio signal SR is fed to the first and second signal processing units 28a and 28b.
  • In the first signal processing unit 28a, the right channel audio signal SR is subjected to a signal processing of convolution-integration with the impulse response hRR(t, θ).
  • In the second signal processing unit 28b, the right channel audio signal SR is subjected to a signal processing of convolution-integration with the impulse response hRL(t, θ).
  • the left channel audio signal SL is fed to the third and fourth signal processing units 28c and 28d.
  • In the third signal processing unit 28c, the left channel audio signal SL is subjected to a signal processing of convolution-integration with the impulse response hLR(t, θ).
  • In the fourth signal processing unit 28d, the left channel audio signal SL is subjected to a signal processing of convolution-integration with the impulse response hLL(t, θ).
  • the output signals from the first and third signal processing units 28a and 28c are applied to the right channel adder 29R and are added together therein.
  • the output signal of the right channel adder 29R is fed as the right channel audio signal ER via the right channel amplifier 31R to the right channel headphone 2R and reproduced as a sound.
  • the output signals from the second and fourth signal processing units 28b and 28d are applied to the left channel adder 29L and are added together therein.
  • the output signal of the left channel adder 29L is fed as the left channel audio signal EL via the left channel amplifier 31L to the left channel headphone 2L and reproduced as a sound.
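  • Taken together, the four signal processing units and the two adders implement ER = SR * hRR + SL * hLR and EL = SR * hRL + SL * hLL, where * denotes convolution. Below is a minimal sketch of that signal flow (function and variable names are assumptions; equal-length input channels and equal-length impulse responses are assumed so that the convolved signals can be added directly):

```python
import numpy as np

def render_binaural(s_l: np.ndarray, s_r: np.ndarray,
                    h_ll: np.ndarray, h_lr: np.ndarray,
                    h_rl: np.ndarray, h_rr: np.ndarray):
    """Mirror of units 28a-28d and adders 29R/29L: each headphone signal is
    the sum of two convolutions (illustrative sketch only)."""
    e_r = np.convolve(s_r, h_rr) + np.convolve(s_l, h_lr)  # right headphone 2R
    e_l = np.convolve(s_r, h_rl) + np.convolve(s_l, h_ll)  # left headphone 2L
    return e_l, e_r
```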
  • Fig. 5B shows that the listener P has approached a pair of speaker systems SL and SR, that is, the virtual sound sources, from the position of Fig. 5A.
  • Fig. 5C shows that the listener P has rotated his head M towards the right speaker device SR.
  • the audio signal reproducing apparatus of the present invention shown in Fig. 6 comprises a headphone set 40 which is fitted over the head M of a listener P and a pair of headphones 42L and 42R are supported by a head band 41 so that they are located in the vicinity of the left and right ears of the listener P, as is similar to the apparatus shown in Fig. 1.
  • Two sliders 44L and 44R, from which support arms 43L and 43R respectively project, are slidably mounted on the head band 41 of the headphone set 40.
  • a pair of signal detectors 45L and 45R which detect a position detection reference signal emitted from a reference signal source 51 are provided at the tip ends of the support arms 43L and 43R, respectively. That is, the pair of signal detectors 45L and 45R are provided on the tip ends of the support arms 43L and 43R projectedly formed on the sliders 44L and 44R which are slidably mounted on the head band 41, so that they are supported in positions remote from the head band 41 and the pair of headphones 42L and 42R, that is, the main body of the headphone set.
  • the reference signal source 51 comprises an ultrasonic signal source 52 and an ultrasonic speaker 53 for generating an ultrasonic signal from the ultrasonic signal source 52 as a reference signal.
  • Each of the pair of signal detectors 45L and 45R which receives the reference signal comprises an ultrasonic microphone.
  • An ultrasonic wave generated from the ultrasonic speaker 53, that is, the position detection reference signal, is a burst wave in which an ultrasonic wave having a given level is intermittently generated for a given period of time, as is similar to the first embodiment, or an ultrasonic wave the phase of which may be detected, like a so-called level modulated wave, the level of which changes in a given cycle.
  • the pair of signal detectors 45L and 45R provided on the headphone set 40 detects the ultrasonic position detection reference signal generated from the ultrasonic speaker 53 and generate respective detection signals, each having a time lag depending upon the relative positional relation between the listener P and the ultrasonic speaker 53.
  • Each detection signal obtained by these signal detectors 45L and 45R is applied to an operation unit 54.
  • the operation unit 54 comprises first and second edge detecting circuits 55 and 56, to which the detection signals from the signal detectors 45L and 45R for detecting the position detection reference signal are supplied, respectively, and a third edge detecting circuit 57 to which an ultrasonic signal from the ultrasonic signal source 52, that is, the position detection reference signal, is applied.
  • the first and second edge detecting circuits 55 and 56 detect the rise-up edges of the detection signals generated from the signal detectors 45L and 45R, respectively, and output pulse signals corresponding to the rise-up edges. The pulse signals generated by the first and second edge detecting circuits 55 and 56 are supplied to a distance calculating circuit 58 and a circuit 59 for detecting the time difference between both ears.
  • the third edge detecting circuit 57 detects the rise-up edge of the ultrasonic signal from the ultrasonic signal source 52 and outputs a pulse signal corresponding to the rise-up edge.
  • a pulse signal obtained by the third edge detection circuit 57 is supplied to the distance calculating circuit 58.
  • Signals representative of the distance l0 and the time differences t1 and t2 are fed to an angle calculating circuit 60.
  • the circuit 59 for detecting the time difference between both ears detects the time difference t3 between the pulse signals generated by the first and second edge detecting circuits 55 and 56. A signal representative of the time difference t3 is fed to the angle calculating circuit 60.
  • the angle calculating circuit 60 calculates an angle θ0 representative of the direction of the head M by using the time differences t1, t2, t3, the distance l0, the sound velocity V and the radius r of the head M, similarly to the angle calculating circuit 20 in the first embodiment.
  • the operation unit 54 includes a storing circuit 62 in which transfer characteristics information representative of the transfer characteristics from the virtual sound sources to both ears of the listener is stored for each predetermined angle, the step of which is larger than that of the angular positional information of the listener calculated by the angle calculating circuit 60.
  • the interpolation operation and processing circuit 61 reads the information on two transfer characteristics in the vicinity of the rotational angular position of the head represented by the current angular positional information calculated by the angle calculating circuit 60, and computes the transfer characteristics in the current rotational angular position of the head by, for example, a linear interpolation processing.
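  • A minimal sketch of the linear interpolation performed by the interpolation operation and processing circuit 61, assuming the stored transfer characteristics are impulse-response arrays sampled on an ascending coarse angle grid (the data layout and names are assumptions):

```python
import numpy as np

def interpolate_transfer_characteristics(angle: float,
                                         grid_angles: list,
                                         responses: list) -> np.ndarray:
    """Linearly interpolate between the two stored impulse responses whose
    grid angles bracket the current head angle (illustrative sketch of
    circuit 61; grid_angles must be sorted in ascending order)."""
    idx = int(np.clip(np.searchsorted(grid_angles, angle), 1, len(grid_angles) - 1))
    a0, a1 = grid_angles[idx - 1], grid_angles[idx]
    w = (angle - a0) / (a1 - a0)          # weight of the upper neighbour
    return (1.0 - w) * np.asarray(responses[idx - 1]) + w * np.asarray(responses[idx])
```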
  • the information on the transfer characteristics in the current rotational angular position obtained by the interpolation operation and processing circuit 61 is supplied to an audio signal processing circuit 63.
  • the audio signal processing circuit 63 is also supplied with left and right channel audio signals S L and S R outputted from an audio signal source 64.
  • the output signals from the first and third signal processing units 65a and 65c are applied to the right channel adder 66R and are added together therein.
  • the output signal of the right channel adder 66R is fed as the right channel audio signal ER via the right channel amplifier 68R to the right channel headphone 42R of the headphone set 40 and reproduced as a sound.
  • the output signals from the second and fourth signal processing units 65b and 65d are applied to the left channel adder 66L and are added together therein.
  • the output signal of the left channel adder 66L is fed as the left channel audio signal EL via the left channel amplifier 68L to the left channel headphone 42L of the headphone set 40 and reproduced as a sound.

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Acoustics & Sound (AREA)
  • Signal Processing (AREA)
  • Stereophonic System (AREA)
  • Stereophonic Arrangements (AREA)

Claims (5)

  1. Audio signal reproducing apparatus, comprising
    a device (62) for storing transfer characteristics information representative of the transfer characteristics from virtual sound sources to both ears of a listener for each of predetermined rotational angles corresponding to a head movement of the listener;
    a detecting device (45L, 45R, 51, 55-59) for the rotational angular position corresponding to the head movement of the listener; and
    an audio signal processing device (63) for processing left and right channel audio signals with calculated transfer characteristics information, whereby the audio signals which have been processed by the audio signal processing device can be reproduced as sounds through a headphone arrangement (40);
    characterized by
    an interpolation operation device (61) for reading from the storing device information on at least two transfer characteristics for angles in the vicinity of the rotational angular position of the head represented by a detection output of the detecting device, and for interpolation-processing the read transfer characteristics information for the rotational angular position of the head represented by the detection output of the detecting device, in order to derive the calculated transfer characteristics information.
  2. Apparatus according to claim 1,
    characterized in that
    the detecting device comprises:
    a pair of signal detecting elements (45L, 45R) for detecting a reference signal transmitted from a reference signal source (53); a device (58) for calculating the distance between the reference signal source and the head of the listener from the phase difference between the detection outputs from the pair of signal detecting elements and the reference signal; and
    a device (59) for detecting the time difference between the detection outputs from the pair of signal detecting elements, whereby the rotational angular position of the head of the listener is calculated by using the information on the distance obtained from the distance calculating device and on the time difference obtained from the time difference detecting device.
  3. Apparatus according to claim 1 or 2,
    characterized in that
    the audio signal processing device comprises:
    first signal processing means (65a) for applying to the right channel input audio signal (SR) a convolution integral with the impulse response corresponding to the transfer characteristic of the reproduced right channel audio signal of an input audio signal to the right ear;
    second signal processing means (65b) for applying to the right channel input audio signal (SR) a convolution integral with the impulse response corresponding to the transfer characteristic of the reproduced right channel audio signal to the left ear;
    third signal processing means (65c) for applying to the left channel input audio signal (SL) a convolution integral with the impulse response corresponding to the transfer characteristic of the reproduced left channel audio signal of the input audio signal to the right ear;
    fourth signal processing means (65d) for applying to the left channel input audio signal (SL) a convolution integral with the impulse response corresponding to the transfer characteristic of the reproduced left channel audio signal to the left ear;
    first adding means (66R) for adding the output signal of the first signal processing unit to the output signal of the third signal processing unit;
    second adding means (66L) for adding the output signal of the second signal processing unit to the output signal of the fourth signal processing unit, the output signals (ER, EL) of the first and second adding means being supplied to the right and left channel headphones of the headphone arrangement, respectively.
  4. Apparatus according to claim 1, 2 or 3,
    characterized in that
    the interpolation operation device (61) is constructed to read from the storing device (62) the information on two transfer characteristics for angles in the vicinity of the rotational angular position of the head represented by the detection output from the detecting device, and to effect a linear interpolation operation in order to derive the calculated transfer characteristics information.
  5. Apparatus according to claim 1, 2 or 3,
    characterized in that
    the interpolation operation device (61) is constructed to read from the storing device the information on more than two transfer characteristics for angles in the vicinity of the rotational angular position of the head represented by the detection output from the detecting device, and to effect a secondary interpolation operation in order to derive the calculated transfer characteristics information.
EP95104929A 1990-01-19 1991-01-18 Einrichtung zur Wiedergabe von Audiosignalen Expired - Lifetime EP0664660B1 (de)

Applications Claiming Priority (7)

Application Number Priority Date Filing Date Title
JP8520/90 1990-01-19
JP2008520A JP2893780B2 (ja) 1990-01-19 1990-01-19 音響信号再生装置
JP851490 1990-01-19
JP8514/90 1990-01-19
JP852090 1990-01-19
JP2008514A JP2751512B2 (ja) 1990-01-19 1990-01-19 音響信号再生装置
EP91902738A EP0464217B1 (de) 1990-01-19 1991-01-18 Gerät zur wiedergabe von tonsignalen

Related Parent Applications (2)

Application Number Title Priority Date Filing Date
EP91902738A Division EP0464217B1 (de) 1990-01-19 1991-01-18 Gerät zur wiedergabe von tonsignalen
EP91902738.3 Division 1991-01-18

Publications (3)

Publication Number Publication Date
EP0664660A2 EP0664660A2 (de) 1995-07-26
EP0664660A3 EP0664660A3 (de) 1995-08-09
EP0664660B1 true EP0664660B1 (de) 2000-09-27

Family

ID=26343043

Family Applications (2)

Application Number Title Priority Date Filing Date
EP95104929A Expired - Lifetime EP0664660B1 (de) 1990-01-19 1991-01-18 Einrichtung zur Wiedergabe von Audiosignalen
EP91902738A Expired - Lifetime EP0464217B1 (de) 1990-01-19 1991-01-18 Gerät zur wiedergabe von tonsignalen

Family Applications After (1)

Application Number Title Priority Date Filing Date
EP91902738A Expired - Lifetime EP0464217B1 (de) 1990-01-19 1991-01-18 Gerät zur wiedergabe von tonsignalen

Country Status (5)

Country Link
EP (2) EP0664660B1 (de)
KR (1) KR920702175A (de)
CA (1) CA2048686C (de)
DE (2) DE69120150T2 (de)
WO (1) WO1991011080A1 (de)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0795698A (ja) * 1993-09-21 1995-04-07 Sony Corp オーディオ再生装置
DE69632889T2 (de) * 1995-05-22 2005-07-21 Victor Company of Japan, Ltd., Yokohama Wiedergabegerät mit Kopfhörer
JP3577798B2 (ja) * 1995-08-31 2004-10-13 ソニー株式会社 ヘッドホン装置
FR2744871B1 (fr) * 1996-02-13 1998-03-06 Sextant Avionique Systeme de spatialisation sonore, et procede de personnalisation pour sa mise en oeuvre
US20090052703A1 (en) * 2006-04-04 2009-02-26 Aalborg Universitet System and Method Tracking the Position of a Listener and Transmitting Binaural Audio Data to the Listener
EP2288178B1 (de) * 2009-08-17 2012-06-06 Nxp B.V. Vorrichtung und Verfahren zur Verarbeitung von Audiodaten
US9706304B1 (en) * 2016-03-29 2017-07-11 Lenovo (Singapore) Pte. Ltd. Systems and methods to control audio output for a particular ear of a user

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS5419242B2 (de) 1973-06-22 1979-07-13
JPS5165901A (de) * 1974-12-05 1976-06-08 Sony Corp
US4076677A (en) 1976-06-23 1978-02-28 Desoto, Inc. Aqueous copolymer dispersions and method of producing the same
JPS54109401A (en) * 1978-02-16 1979-08-28 Victor Co Of Japan Ltd Signal converter
JPS58116900A (ja) * 1982-11-15 1983-07-12 Sony Corp ステレオ再生装置
US4893342A (en) * 1987-10-15 1990-01-09 Cooper Duane H Head diffraction compensated stereo system
JP2671329B2 (ja) * 1987-11-05 1997-10-29 ソニー株式会社 オーディオ再生装置
JP3155592B2 (ja) * 1991-12-11 2001-04-09 武藤工業株式会社 累進寸法修正方法および装置

Also Published As

Publication number Publication date
EP0464217A1 (de) 1992-01-08
DE69120150T2 (de) 1996-12-12
DE69120150D1 (de) 1996-07-18
DE69132430D1 (de) 2000-11-02
CA2048686C (en) 2001-01-02
EP0464217B1 (de) 1996-06-12
EP0464217A4 (en) 1992-06-24
CA2048686A1 (en) 1991-07-20
KR920702175A (ko) 1992-08-12
EP0664660A2 (de) 1995-07-26
WO1991011080A1 (fr) 1991-07-25
DE69132430T2 (de) 2001-04-05
EP0664660A3 (de) 1995-08-09

Similar Documents

Publication Publication Date Title
US5495534A (en) Audio signal reproducing apparatus
JP2964514B2 (ja) 音響信号再生装置
EP0465662B1 (de) Gerät zur wiedergabe von tonsignalen
JP3687099B2 (ja) 映像信号及び音響信号の再生装置
KR100435217B1 (ko) 헤드폰장치
EP0699012B1 (de) Schallbildverbesserungsvorrichtung
EP0674467B1 (de) Audiowiedergabeeinrichtung
US5896456A (en) Automatic stereophonic manipulation system and apparatus for image enhancement
US5526429A (en) Headphone apparatus having means for detecting gyration of user's head
EP0664660B1 (de) Einrichtung zur Wiedergabe von Audiosignalen
TW200513134A (en) Method and arrangement for locating aural events such that they have a constant spatial direction using headphones
US7917236B1 (en) Virtual sound source device and acoustic device comprising the same
EP1161119B1 (de) Verfahren zur Tonbildlokalisierung
JP2893780B2 (ja) 音響信号再生装置
JPH09205700A (ja) ヘッドホン再生における音像定位装置
JP2893779B2 (ja) ヘッドホン装置
JP2751512B2 (ja) 音響信号再生装置
JPH03296400A (ja) 音響信号再生装置
JP2751514B2 (ja) 音響信号再生装置
JP2874236B2 (ja) 音響信号再生システム
JP3111455B2 (ja) 音響信号再生装置
JPH03214896A (ja) 音響信号再生装置
JPH03254163A (ja) 車両用音響装置
KR20000009249A (ko) 스테레오 다이폴을 이용한 3차원 음향재생장치

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

PUAL Search report despatched

Free format text: ORIGINAL CODE: 0009013

AC Divisional application: reference to earlier application

Ref document number: 464217

Country of ref document: EP

AK Designated contracting states

Kind code of ref document: A2

Designated state(s): DE FR GB NL

AK Designated contracting states

Kind code of ref document: A3

Designated state(s): DE FR GB NL

17P Request for examination filed

Effective date: 19960115

17Q First examination report despatched

Effective date: 19990308

GRAG Despatch of communication of intention to grant

Free format text: ORIGINAL CODE: EPIDOS AGRA

GRAH Despatch of communication of intention to grant a patent

Free format text: ORIGINAL CODE: EPIDOS IGRA

GRAA (expected) grant

Free format text: ORIGINAL CODE: 0009210

AC Divisional application: reference to earlier application

Ref document number: 464217

Country of ref document: EP

AK Designated contracting states

Kind code of ref document: B1

Designated state(s): DE FR GB NL

REF Corresponds to:

Ref document number: 69132430

Country of ref document: DE

Date of ref document: 20001102

ET Fr: translation filed
PLBE No opposition filed within time limit

Free format text: ORIGINAL CODE: 0009261

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT

26N No opposition filed
REG Reference to a national code

Ref country code: GB

Ref legal event code: IF02

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: FR

Payment date: 20100208

Year of fee payment: 20

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: GB

Payment date: 20100113

Year of fee payment: 20

Ref country code: DE

Payment date: 20100114

Year of fee payment: 20

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: NL

Payment date: 20100101

Year of fee payment: 20

REG Reference to a national code

Ref country code: NL

Ref legal event code: V4

Effective date: 20110118

REG Reference to a national code

Ref country code: GB

Ref legal event code: PE20

Expiry date: 20110117

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: NL

Free format text: LAPSE BECAUSE OF EXPIRATION OF PROTECTION

Effective date: 20110118

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: GB

Free format text: LAPSE BECAUSE OF EXPIRATION OF PROTECTION

Effective date: 20110117

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: DE

Free format text: LAPSE BECAUSE OF EXPIRATION OF PROTECTION

Effective date: 20110118