EP0664660A2 - Einrichtung zur Wiedergabe von Audiosignalen (Apparatus for reproducing audio signals) - Google Patents

Einrichtung zur Wiedergabe von Audiosignalen (Apparatus for reproducing audio signals)

Info

Publication number
EP0664660A2
Authority
EP
European Patent Office
Prior art keywords
transfer characteristics
audio signal
signal
head
listener
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
EP95104929A
Other languages
English (en)
French (fr)
Other versions
EP0664660B1 (de)
EP0664660A3 (de)
Inventor
Kiyofumi Inanaga (c/o Sony Corporation)
Hiroyuki Sogawa (c/o Sony Corporation)
Yasuhiro Iida (c/o Sony Corporation)
Susumu Yabe (c/o Sony Corporation)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sony Corp
Original Assignee
Sony Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from JP2008520A external-priority patent/JP2893780B2/ja
Priority claimed from JP2008514A external-priority patent/JP2751512B2/ja
Application filed by Sony Corp filed Critical Sony Corp
Publication of EP0664660A2 publication Critical patent/EP0664660A2/de
Publication of EP0664660A3 publication Critical patent/EP0664660A3/xx
Application granted granted Critical
Publication of EP0664660B1 publication Critical patent/EP0664660B1/de
Anticipated expiration legal-status Critical
Expired - Lifetime legal-status Critical Current

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04R LOUDSPEAKERS, MICROPHONES, GRAMOPHONE PICK-UPS OR LIKE ACOUSTIC ELECTROMECHANICAL TRANSDUCERS; DEAF-AID SETS; PUBLIC ADDRESS SYSTEMS
    • H04R5/00 Stereophonic arrangements
    • H04R5/033 Headphones for stereophonic communication
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04S STEREOPHONIC SYSTEMS
    • H04S1/00 Two-channel systems
    • H04S1/002 Non-adaptive circuits, e.g. manually adjustable or static, for enhancing the sound image or the spatial distribution
    • H04S1/005 For headphones

Definitions

  • the present invention relates to an audio signal binaural reproducing apparatus for reproducing audio signals by means of headphones.
  • a binaural reproducing method has heretofore been known as an approach for providing a better direction sensation of a sound image or an outside head localization sensation when audio signals are reproduced by headphones fitted to the head of a listener so that a pair of headphones is located in the vicinity of both ears.
  • An audio reproducing system adopting this binaural system preliminarily applies a given signal processing to the audio signals reproduced by headphones as is described in, for example, specification of Japanese Patent Publication Sho 53-283.
  • the direction sensation of sound image, the outside head localization sensation and the like depend upon the differences in volume, time and phase of the sounds heard by the left and right ears.
  • the signal processing aims at causing, in the audio output reproduced by the headphones, audio effects equivalent to those caused by the differences in distance between the sound sources, that is, the speaker systems, and the right and left ears of the listener, and by reflections and diffractions in the vicinity of the head of the listener, when audio reproduction is performed, for example, by speaker systems remote from the listener.
  • Such a signal processing is performed by convolution-integrating left and right ear audio signals with impulse responses corresponding to the above-mentioned audio effects.
  • when audio reproduction is performed by speaker systems remote from the listener and the listener turns his or her head, the relative direction and position of the sound image that the listener senses are changed.
  • since the headphones turn together with the listener's head if the listener turns his or her head when audio reproduction is performed by a binaural method using headphones, the relative direction and position of the sound image which the listener senses are not changed.
  • an audio signal reproducing system which detects a change in the direction of the listener's head and changes the modes of the signal processing based upon a result of the detection for providing a good front localization sensation in headphones has heretofore been proposed as is disclosed in Japanese Unexamined Patent Publication No. Sho 42-227 and Japanese Examined Patent Publication No. 54-19242.
  • a direction detecting device, such as a gyrocompass or a magnetic needle, is provided on the head of the listener.
  • a level adjusting circuit and a delay circuit and the like for processing the audio signals are controlled based upon a result of detection from the direction detecting device so that a sound image sensation which is similar to that of the audio reproducing using speaker systems remote from the listener is obtained.
  • in such a system it is necessary to prepare the impulse responses, that is, the transfer characteristics corresponding to the audio effects given to the audio signals of the left and right ears, for each predetermined rotational angle, and to store a great amount of information on the transfer characteristics.
  • the information is read from the storing means depending upon the change in direction of the head.
  • the audio signal will be subjected to a necessary convolution-integration processing in real-time.
  • the present invention was made under such circumstances.
  • An audio signal reproducing apparatus of the present invention comprises: means for storing transfer characteristics information representative of the transfer characteristics from virtual sound sources to both ears of a listener for each predetermined rotational angle corresponding to the movement of a head of the listener; means for detecting the rotational angular position corresponding to the movement of the head of the listener; interpolation operation means which reads from said storing means information on at least two transfer characteristics in the vicinity of the rotational angular position of the head represented by a detection output of said detecting means and interpolation-processes the read transfer characteristics information for the rotational angular position of the head represented by the detection output of said detecting means; and audio signal processing means for processing left and right channel audio signals with the transfer characteristics information determined by said interpolation operation means, whereby the audio signals which have been processed by said audio signal processing means are reproduced as sounds by a headphone set.
  • An audio signal reproducing apparatus described for reference comprises a headphone set 10 which is fitted over the head M of a listener P and a pair of headphones 2L and 2R are supported by a head band 1 so that they are located in the vicinity of left and right ears of the listener P, respectively as shown in Fig. 1.
  • Two sliders 4L and 4R, from which support arms 3L and 3R respectively project, are slidably mounted on the head band 1 of the headphone set 10.
  • a pair of signal detectors 5L and 5R which detect a position detection reference signal emitted from a reference signal source 11 are provided at the tip ends of the support arms 3L and 3R, respectively. That is, the pair of signal detectors 5L and 5R are provided on the tip ends of the support arms 3L and 3R projectedly formed on the sliders 4L and 4R which are slidably mounted on the head band 1 so that they are supported in positions remote from the head band 1 and the pair of headphones 2L and 2R, that is the main body of the headphone set.
  • the reference signal source 11 comprises an ultrasonic signal source 12 and an ultrasonic speaker 13 for generating an ultrasonic signal from the ultrasonic signal source 12 as a reference signal.
  • Each of the pair of signal detectors 5L and 5R which receive the reference signal comprises an ultrasonic microphone.
  • An ultrasonic wave generated from the ultrasonic speaker 13, that is, the position detection reference signal, is a burst wave in which an ultrasonic wave having a given level is intermittently generated for a given period of time, as shown at A in Fig. 2, or an ultrasonic wave whose phase may be detected, like a so-called level modulated wave, the level of which changes in a given cycle (a schematic software sketch of such a burst follows below).
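  • As an illustration only (not taken from the specification), the following Python sketch generates an intermittently gated ultrasonic tone burst of the kind described above; the sample rate, carrier frequency, burst length and repetition period are assumed values.

    import numpy as np

    def ultrasonic_burst(fs=192_000, carrier_hz=40_000.0, burst_ms=2.0, period_ms=20.0):
        """Return one repetition period of an intermittently gated sine carrier."""
        n_period = int(fs * period_ms / 1000.0)
        n_burst = int(fs * burst_ms / 1000.0)
        t = np.arange(n_period) / fs
        signal = np.zeros(n_period)
        # an ultrasonic wave of a given level, emitted for a given period of time
        signal[:n_burst] = np.sin(2.0 * np.pi * carrier_hz * t[:n_burst])
        return signal

    reference = ultrasonic_burst()  # schematic counterpart of waveform A in Fig. 2
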
  • the pair of signal detectors 5L and 5R provided on the headphone set 10 detect the ultrasonic position detection reference signal generated from the ultrasonic speaker 13 and generate respective detection signals, shown at B and C in Fig. 2, each having a time lag depending upon the relative positional relation between the listener P and the ultrasonic speaker 13.
  • since the pair of signal detectors 5L and 5R are mounted on the tip ends of the support arms 3L and 3R, which project from the sliders 4L and 4R slidably mounted on the head band 1, they are supported in positions remote from the main body of the headphone set 10, that is, from the head band 1 and the pair of headphones 2L and 2R fitted on the head M of the listener P. They can therefore detect the ultrasonic wave generated from the ultrasonic speaker 13, that is, the position detection reference signal, stably and accurately without being hidden behind the head M of the listener P even if the listener P moves or rotates his head.
  • the pair of signal detectors 5L and 5R can be adjusted to positions optimal for detecting the position detection reference signal by sliding the sliders 4L and 4R along the head band 1.
  • the optimal positions of the headphones 2L and 2R, which are fitted on the head M of the listener P by the head band 1 so that they correspond to the vicinity of the left and right ears, depend on the shape and size of the head M of the listener P, that is, differ among individuals. Accordingly, the positions of the pair of signal detectors 5L and 5R can be adjusted so that they correspond to the headphones 2L and 2R, respectively.
  • Each detection signal obtained by these signal detectors 5L and 5R is applied to an operation unit 14.
  • the operation unit 14 comprises first and second edge detecting circuits 15 and 16, to which the detection signals from the signal detectors 5L and 5R for detecting the position detection reference signal are supplied, respectively, and a third edge detecting circuit 17, to which the ultrasonic signal from the ultrasonic signal source 12, that is, the position detection reference signal, is applied.
  • the first and second edge detecting circuits 15 and 16 detect rise-up edges of the detection signals generated from the signal detectors 5L and 5R, respectively, and output pulse signals corresponding to the rise-up edges, as shown at D and E in Fig. 2. The pulse signals generated by the first and second edge detecting circuits 15 and 16 are supplied to a distance calculating circuit 18 and to a circuit 19 for detecting the time difference between both ears.
  • the third edge detecting circuit 17 detects the rise-up edge of the ultrasonic signal from the ultrasonic signal source 12 and outputs a pulse signal corresponding to the rise-up edge, as shown at F in Fig. 2. The pulse signal obtained by the third edge detecting circuit 17 is supplied to the distance calculating circuit 18 (a software sketch of this edge detection follows below).
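  • The edge detection and time-difference measurement described above can be sketched in software as follows. This is an illustration only, not the circuitry of the patent: it assumes sampled waveforms, a simple threshold crossing, and invented function names.

    import numpy as np

    def first_rising_edge(x, fs, threshold=0.5):
        """Return the time (s) at which x first crosses the threshold upwards."""
        above = np.asarray(x) >= threshold
        idx = int(np.argmax(above))        # index of the first True value
        if not above[idx]:
            raise ValueError("no rising edge found")
        return idx / fs

    def time_differences(ref, left, right, fs):
        """t1, t2: source-to-detector delays; t3: difference between both ears."""
        t_ref = first_rising_edge(ref, fs)   # edge of the emitted burst (F in Fig. 2)
        t_l = first_rising_edge(left, fs)    # edge of detector 5L output (D in Fig. 2)
        t_r = first_rising_edge(right, fs)   # edge of detector 5R output (E in Fig. 2)
        return t_l - t_ref, t_r - t_ref, t_l - t_r
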
  • the distance calculating circuit 18 detects the time difference t1 between the pulse signals obtained by the third and first edge detecting circuits 17 and 15, represented as ΔT1 in Fig. 2, and the time difference t2 between the pulse signals obtained by the third and second edge detecting circuits 17 and 16, represented as ΔT2 in Fig. 2, and then calculates the distance l0 between the ultrasonic speaker 13 and the center of the head M of the listener P, represented as l0 in Fig. 3, based upon the time differences t1, t2 and the sound velocity V.
  • the sound velocity V may be preliminarily preset as a constant in the distance calculating circuit 18 or alternatively may be changed with changes in atmospheric temperature, humidity and atmospheric pressure and the like.
  • compensation may be conducted for the positional relation between the signal detectors 5L and 5R and the center of the head M, and for the shape and size of the head M.
  • Signals representative of the distance l0, time differences t1 and t2 are fed to an angle calculating circuit 20.
  • the circuit 19 for detecting the time difference between both ears detects the time difference t3 between the pulse signals generated by the first and second edge detecting circuits 15 and 16, represented as ΔT3 in Fig. 2.
  • a signal representative of the time difference t3 is fed to the angle calculating circuit 20.
  • the angle calculating circuit 20 calculates an angle representative of the direction of the head M, represented by an arrow θ0 in Fig. 3, by using the time differences t1, t2, t3, the distance l0, the sound velocity V and the radius r of the head M.
  • the angle θ0 can be determined, for example, by the following equation (1): θ0 ≈ sin⁻¹ {V²(t1 + t2)·t3 / (4·r·l0)}
  • the rotation angle θ of the head M relative to a desired position of a virtual sound source is calculated from the information on the angle θ0 and the distance l0, which represent the relative positional relationship between a reference position and the listener P, by assuming that the position of the ultrasonic speaker 13 is the reference position of the virtual sound source (a numerical sketch of this calculation follows below).
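  • A minimal numerical sketch of the distance and angle computation described above, assuming SI units and a nominal sound velocity of 343 m/s; the helper name, the head radius and the example time differences are illustrative values, not data from the specification.

    import math

    def head_distance_and_angle(t1, t2, t3, r, v=343.0):
        """Distance l0 from the speaker to the head centre and head angle theta0."""
        l0 = v * (t1 + t2) / 2.0                     # mean of the two propagation paths
        x = v * v * (t1 + t2) * t3 / (4.0 * r * l0)  # argument of equation (1); equals v*t3/(2*r)
        x = max(-1.0, min(1.0, x))                   # guard against rounding outside [-1, 1]
        return l0, math.asin(x)                      # theta0 in radians

    l0, theta0 = head_distance_and_angle(t1=5.90e-3, t2=5.95e-3, t3=2.0e-4, r=0.09)
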
  • the operation unit 14 includes a storing circuit 22 in which information on the transfer characteristics from the virtual sound source to both ears of the listener in the first quadrant of the rotational angular position of the head of the listener, for example, information on the transfer characteristics for each of the angles θ11 to θ1n in the first quadrant, is stored.
  • based upon the current angular position calculated by the angle calculating circuit 20, the control circuit 21 reads from the storing circuit 22 the information on the transfer characteristics corresponding to the current angular position among the angles θ11 to θ1n if the current angular position is in the first quadrant in Fig. 4, reads the transfer characteristics information in which the current angles θ21 to θ2n correspond to the angles θ11 to θ1n of the first quadrant if the current angular position is in the second quadrant in Fig. 4, reads the transfer characteristics information in which the current angles θ31 to θ3n correspond to the angles θ11 to θ1n of the first quadrant if the current angular position is in the third quadrant in Fig. 4, and proceeds likewise for the fourth quadrant.
  • the transfer characteristics from the virtual sound sources to both ears of the listener can be treated as symmetrical in each quadrant.
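  • One possible way of exploiting this quadrant symmetry is sketched below. The specification does not spell out the folding used by the control circuit 21, so the angle mapping is an assumption for illustration; only the channel-swap convention (swap in the second and fourth quadrants) follows the switch behaviour described further on.

    import math

    def fold_to_first_quadrant(theta):
        """Return (equivalent first-quadrant angle in radians, swap_channels flag)."""
        theta = theta % (2.0 * math.pi)
        quadrant = int(theta // (math.pi / 2.0))   # 0..3
        if quadrant == 0:                          # first quadrant: use stored data directly
            return theta, False
        if quadrant == 1:                          # second quadrant: mirror and swap L/R
            return math.pi - theta, True
        if quadrant == 2:                          # third quadrant: rotate by 180 degrees
            return theta - math.pi, False
        return 2.0 * math.pi - theta, True         # fourth quadrant: mirror and swap L/R
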
  • two transfer characteristics in the vicinity of the rotational angular position of the head represented by the angular position information may be read from the storing circuit 22, and the information on the transfer characteristics in the current head rotational angular position may be calculated by, for example, a linear interpolation processing, as described later with reference to Fig. 6.
  • the audio signal source 24 is an apparatus for outputting given left and right channel audio signals S L and S R , such as a recording disc playback apparatus, a radio receiver or the like.
  • the audio signal processing circuit 23 performs a signal processing which provides the left and right channel audio signals S L and S R fed from the audio signal source 24 with given transfer characteristics from the virtual sound sources to both ears of the listener.
  • the audio signal processing circuit 23 comprises first to sixth switches 25L, 25R, 26L, 26R, 27L and 27R for switching the signal lines and first to fourth signal processing units 28a, 28b, 28c and 28d.
  • the first to sixth switches 25L, 25R, 26L, 26R, 27L and 27R are controlled for switching in response to a control signal from the control circuit 21 representative of the quadrant to which the current angular position belongs.
  • the first and second switches 25L and 25R perform switching of the inputs of the left and right channel audio signals S L and S R fed from the audio signal source 24. They supply the right channel audio signal S R to the first and second signal processing units 28a and 28b and the left channel audio signal S L to the third and fourth signal processing units 28c and 28d when the current angular position is in the first or third quadrant, and supply the left channel audio signal S L to the first and second signal processing units 28a and 28b and the right channel audio signal S R to the third and fourth signal processing units 28c and 28d when the current angular position is in the second or fourth quadrant.
  • the third and fourth switches 26L and 26R perform switching of the outputs of the left and right channel audio signals E L and E R outputted from the audio signal processing circuit 23. They select as the right channel audio signal E R the output signal of the first adder 29R, which adds the output signals of the first and third signal processing units 28a and 28c, and select as the left channel audio signal E L the output signal of the second adder 29L, which adds the output signals of the second and fourth signal processing units 28b and 28d, when the current angular position is in the first or third quadrant; they select as the right channel audio signal E R the output signal of the second adder 29L and as the left channel audio signal E L the output signal of the first adder 29R when the current angular position is in the second or fourth quadrant.
  • the fifth and sixth switches 27L and 27R perform switching of filters for the left and right channel audio signals E L and E R outputted from the audio signal processing circuit 23. They output the audio signals E L and E R unfiltered when the current angular position is in the first or third quadrant, and output the audio signals E L and E R from which high frequency components have been removed by the low pass filters 30L and 30R when the current angular position is in the second or fourth quadrant.
  • in each of the first to fourth signal processing units 28a, 28b, 28c and 28d, an impulse response representative of the transfer characteristics, from a pair of left and right channel speakers which are virtual sound sources facing the listener, of the left and right channel audio signals S L and S R to each ear of the listener is preset based upon the information on the transfer characteristics supplied from the control circuit 21.
  • the first signal processing unit 28a presets the impulse response h RR (t, θ) representative of the transfer characteristics of the sound reproduced from the right channel audio signal S R to the right ear.
  • the second signal processing unit 28b presets the impulse response h RL (t, θ) representative of the transfer characteristics of the sound reproduced from the right channel audio signal S R to the left ear.
  • the third signal processing unit 28c presets the impulse response h LR (t, θ) representative of the transfer characteristics of the sound reproduced from the left channel audio signal S L to the right ear.
  • the fourth signal processing unit 28d presets the impulse response h LL (t, θ) representative of the transfer characteristics of the sound reproduced from the left channel audio signal S L to the left ear.
  • the right channel audio signal S R is fed to the first and second signal processing units 28a and 28b.
  • in the first signal processing unit 28a, the right channel audio signal S R is subjected to a signal processing of convolution-integration with the impulse response h RR (t, θ).
  • in the second signal processing unit 28b, the right channel audio signal S R is subjected to a signal processing of convolution-integration with the impulse response h RL (t, θ).
  • the left channel audio signal S L is fed to the third and fourth signal processing units 28c and 28d.
  • in the third signal processing unit 28c, the left channel audio signal S L is subjected to a signal processing of convolution-integration with the impulse response h LR (t, θ).
  • in the fourth signal processing unit 28d, the left channel audio signal S L is subjected to a signal processing of convolution-integration with the impulse response h LL (t, θ).
  • the output signals from the first and third signal processing units 28a and 28c are applied to the right channel adder 29R and are added with each other therein.
  • the output signal of the right channel adder 29R is fed as the right channel audio signal E R via the right channel amplifier 31R to the right channel headphone 2R and reproduced as a sound.
  • the output signals from the second and fourth signal processing units 28b and 28d are applied to the left channel adder 29L and are added with each other therein.
  • the output signal of the left channel adder 29L is fed as the left channel audio signal E L via the left channel amplifier 31L to the left channel headphone 2L and reproduced as a sound (a sketch of this two-by-two convolution and summation follows below).
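  • A compact sketch of the two-by-two convolution and summation performed by the four signal processing units 28a to 28d and the adders 29R and 29L. It assumes equal-length input channels and equal-length impulse responses; the function name and variable names are illustrative only.

    import numpy as np

    def binaural_process(s_l, s_r, h_ll, h_lr, h_rl, h_rr):
        """Return headphone signals (e_l, e_r) from source signals (s_l, s_r)."""
        # right-ear signal: unit 28a (S_R convolved with h_RR) plus unit 28c (S_L with h_LR)
        e_r = np.convolve(s_r, h_rr) + np.convolve(s_l, h_lr)
        # left-ear signal: unit 28b (S_R convolved with h_RL) plus unit 28d (S_L with h_LL)
        e_l = np.convolve(s_r, h_rl) + np.convolve(s_l, h_ll)
        return e_l, e_r
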
  • Fig. 5B shows that the listener P has approached a pair of speaker systems S L and S R , that is, the virtual sound sources, from the position of Fig. 5A.
  • Fig. 5C shows that the listener P has rotated his head M towards the right speaker device S R .
  • the audio signal processing means forms the transfer characteristics information in the rotational angular position represented by a detection output from the detecting means for detecting the rotational angular position depending upon the movement of the head of the listener, in accordance with the transfer characteristics information of the first quadrant stored in the storing means, and processes the left and right channel audio signals for supplying the processed audio signals to the headphone set. Accordingly, a proper binaural reproduction can be performed for providing a very natural sound image localization sensation in which the positions of the virtual sound sources are not moved even if the listener moves.
  • the audio signal reproducing apparatus of the present invention shown in Fig. 6 comprises a headphone set 40 which is fitted over the head M of a listener P and a pair of headphones 42L and 42R are supported by a head band 41 so that they are located in the vicinity of the left and right ears of the listener P, as is similar to the apparatus shown in Fig. 1.
  • Two sliders 44L and 44R, from which support arms 43L and 43R respectively project, are slidably mounted on the head band 41 of the headphone set 40.
  • a pair of signal detectors 45L and 45R which detect a position detection reference signal emitted from a reference signal source 51 are provided at the tip ends of the support arms 43L and 43R, respectively. That is, the pair of signal detectors 45L and 45R are provided on the tip ends of the support arms 43L and 43R projectedly formed on the sliders 44L and 44R which are slidably mounted on the head band 41, so that they are supported in positions remote from the head band 41 and the pair of headphones 42L and 42R, that is, the main body of the headphone set.
  • the reference signal source 51 comprises an ultrasonic signal source 52 and an ultrasonic speaker 53 for generating an ultrasonic signal from the ultrasonic signal source 52 as a reference signal.
  • Each of the pair of signal detectors 45L and 45R which receives the reference signal comprises an ultrasonic microphone.
  • An ultrasonic wave generated from the ultrasonic speaker 53, that is, the position detection reference signal, is a burst wave in which an ultrasonic wave having a given level is intermittently generated for a given period of time, as is similar to the first embodiment, or an ultrasonic wave whose phase may be detected, like a so-called level modulated wave, the level of which changes in a given cycle.
  • the pair of signal detectors 45L and 45R provided on the headphone set 40 detect the ultrasonic position detection reference signal generated from the ultrasonic speaker 53 and generate respective detection signals, each having a time lag depending upon the relative positional relation between the listener P and the ultrasonic speaker 53.
  • Each detection signal obtained by these signal detectors 45L and 45R is applied to an operation unit 54.
  • the operation unit 54 comprises first and second edge detecting circuits 55 and 56, to which the detection signals from the signal detectors 45L and 45R for detecting the position detection reference signal are supplied, respectively, and a third edge detecting circuit 57, to which the ultrasonic signal from the ultrasonic signal source 52, that is, the position detection reference signal, is applied.
  • the first and second edge detecting circuits 55 and 56 detect rise-up edges of the detection signals generated from the signal detectors 45L and 45R, respectively, and output pulse signals corresponding to the rise-up edges. The pulse signals generated by the first and second edge detecting circuits 55 and 56 are supplied to a distance calculating circuit 58 and to a circuit 59 for detecting the time difference between both ears.
  • the third edge detecting circuit 57 detects the rise-up edge of the ultrasonic signal from the ultrasonic signal source 52 and outputs a pulse signal corresponding to the rise-up edge.
  • a pulse signal obtained by the third edge detection circuit 57 is supplied to the distance calculating circuit 58.
  • the distance calculating circuit 58 detects the time difference t1 between pulse signals obtained by the third and first edge detecting circuits 57 and 55 and the time difference t2 between pulse signals obtained by the third and second edge detecting circuits 57 and 56 and then calculates the distance l0 between the ultrasonic speaker 53 and the center of the head M of the listener based upon the time differences t1, t2 and the sound velocity V.
  • Signals representative of the distance l0, time differences t1 and t2 are fed to an angle calculating circuit 60.
  • the circuit 59 for detecting the time difference between both ears detects the time difference t3 between the pulse signals generated by the first and second edge detecting circuits 55 and 56. A signal representative of the time difference t3 is fed to the angle calculating circuit 60.
  • the angle calculating circuit 60 calculates an angle θ0 representative of the direction of the head M by using the time differences t1, t2, t3, the distance l0, the sound velocity V and the radius r of the head M, similarly to the angle calculating circuit 20 in the first embodiment.
  • the operation unit 54 includes a storing circuit 62 in which transfer characteristics information representative of the transfer characteristics from the virtual sound sources to both ears of the listener is stored for each predetermined angle, the step of which is coarser than the resolution of the angular positional information of the listener calculated by the angle calculating circuit 60.
  • the interpolation operation and processing circuit 61 reads the information on two transfer characteristics in the vicinity of the rotational angular position of the head represented by the current angular positional information calculated by the angle calculating circuit 60 and calculates the transfer characteristics in the current rotational angular position of the head by, for example, a linear interpolation processing.
  • the interpolation operation and processing circuit 61 may also read the information on more than two transfer characteristics in the vicinity of the current rotational angular position of the head represented by the angular positional information and perform a second-order or other higher-order interpolation processing instead of the linear interpolation processing (a sketch of the linear case follows below).
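  • A sketch of the linear interpolation performed by the interpolation operation and processing circuit 61, under the assumption that the two stored impulse responses bracketing the current head angle have the same length; the function and variable names are illustrative.

    import numpy as np

    def interpolate_impulse_response(theta, theta_a, h_a, theta_b, h_b):
        """Blend h_a (stored at theta_a) and h_b (stored at theta_b) for angle theta."""
        h_a = np.asarray(h_a, dtype=float)
        h_b = np.asarray(h_b, dtype=float)
        if theta_b == theta_a:
            return h_a
        w = (theta - theta_a) / (theta_b - theta_a)   # 0 at theta_a, 1 at theta_b
        return (1.0 - w) * h_a + w * h_b
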
  • the information on the transfer characteristics in the current rotational angular position obtained by the interpolation operation and processing circuit 61 is supplied to an audio signal processing circuit 63.
  • the audio signal processing circuit 63 is also supplied with left and right channel audio signals S L and S R outputted from an audio signal source 64.
  • the audio signal source 64 is a device for outputting predetermined left and right channel audio signals S L and S R and may include, for example, various recording disc playback devices, recording and playback devices, wireless receivers and the like.
  • the audio signal processing circuit 63 performs a signal processing which provides the left and right channel audio signals S L and S R fed from the audio signal source 64 with given transfer characteristics from the virtual sound sources to both ears of the listener.
  • the audio signal processing circuit 63 comprises first through fourth signal processing units 65a, 65b, 65c and 65d, to which the transfer characteristics information in the current rotational angular position of the head obtained by the interpolation operation and processing circuit 61 is supplied.
  • in each of the first through fourth signal processing units 65a, 65b, 65c and 65d, an impulse response representative of the transfer characteristics, from a pair of left and right channel speakers which are virtual sound sources facing the listener, of the left and right channel audio signals S L and S R to each ear of the listener is preset based upon the information on the transfer characteristics.
  • the first signal processing unit 65a presets the impulse response h RR (t, θ) representative of the transfer characteristics of the sound reproduced from the right channel audio signal S R to the right ear.
  • the second signal processing unit 65b presets the impulse response h RL (t, θ) representative of the transfer characteristics of the sound reproduced from the right channel audio signal S R to the left ear.
  • the third signal processing unit 65c presets the impulse response h LR (t, θ) representative of the transfer characteristics of the sound reproduced from the left channel audio signal S L to the right ear.
  • the fourth signal processing unit 65d presets the impulse response h LL (t, θ) representative of the transfer characteristics of the sound reproduced from the left channel audio signal S L to the left ear.
  • the right channel audio signal S R is fed to the first and second signal processing units 65a and 65b.
  • in the first signal processing unit 65a, the right channel audio signal S R is subjected to a signal processing of convolution-integration with the impulse response h RR (t, θ).
  • in the second signal processing unit 65b, the right channel audio signal S R is subjected to a signal processing of convolution-integration with the impulse response h RL (t, θ).
  • the left channel audio signal S L is fed to the third and fourth signal processing units 65c and 65d.
  • in the third signal processing unit 65c, the left channel audio signal S L is subjected to a signal processing of convolution-integration with the impulse response h LR (t, θ).
  • in the fourth signal processing unit 65d, the left channel audio signal S L is subjected to a signal processing of convolution-integration with the impulse response h LL (t, θ).
  • the output signals from the first and third signal processing units 65a and 65c are applied to the right channel adder 66R and are added with each other therein.
  • the output signal of the right channel adder 66R is fed as the right channel audio signal E R via the right channel amplifier 68R to the right channel headphone 42R of the headphone set 40 and reproduced as a sound.
  • the output signals from the second and fourth signal processing units 65b and 65d are applied to the left channel adder 66L and are added with each other therein.
  • the output signal of the left channel adder 66L is fed as the left channel audio signal E L via the left channel amplifier 68L to the left channel headphone 42L of the headphone set 40 and reproduced as a sound.
  • in the audio signal reproducing apparatus, information on two transfer characteristics in the vicinity of the rotational angular position represented by the current angular positional information is read from the storing circuit 62 based upon the current angular positional information calculated by the angle calculating circuit 60.
  • the transfer characteristics information in the current rotational angular position is calculated by a linear interpolation processing in the interpolation operation and processing circuit 61.
  • in the audio signal reproducing apparatus of the present invention, information on at least two transfer characteristics in the vicinity of the rotational angular position of the head represented by the detection output from the detecting means, which detects the rotational angular position of the head of the listener at a resolution higher than that of the transfer characteristics information stored in the storing means, is read from the storing means.
  • the transfer characteristics information in the rotational angular position of the head represented by the detection output are interpolation-operated by interpolation operation means. Accordingly, the amount of the information on the transfer characteristics stored in the storing means can be reduced.
  • the audio signal processing means processes the left and right channel audio signals based upon the transfer characteristics information determined by the interpolation operation means. The processed audio signals are supplied to the headphones, so that a proper binaural reproduction can be achieved, providing a very natural sound image localization sensation in which the positions of the virtual sound sources do not move even if a listener moves.

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Acoustics & Sound (AREA)
  • Signal Processing (AREA)
  • Stereophonic System (AREA)
  • Stereophonic Arrangements (AREA)
EP95104929A 1990-01-19 1991-01-18 Einrichtung zur Wiedergabe von Audiosignalen Expired - Lifetime EP0664660B1 (de)

Applications Claiming Priority (7)

Application Number Priority Date Filing Date Title
JP8520/90 1990-01-19
JP2008520A JP2893780B2 (ja) 1990-01-19 1990-01-19 音響信号再生装置
JP851490 1990-01-19
JP8514/90 1990-01-19
JP852090 1990-01-19
JP2008514A JP2751512B2 (ja) 1990-01-19 1990-01-19 音響信号再生装置
EP91902738A EP0464217B1 (de) 1990-01-19 1991-01-18 Gerät zur wiedergabe von tonsignalen

Related Parent Applications (2)

Application Number Title Priority Date Filing Date
EP91902738A Division EP0464217B1 (de) 1990-01-19 1991-01-18 Gerät zur wiedergabe von tonsignalen
EP91902738.3 Division 1991-01-18

Publications (3)

Publication Number Publication Date
EP0664660A2 true EP0664660A2 (de) 1995-07-26
EP0664660A3 EP0664660A3 (de) 1995-08-09
EP0664660B1 EP0664660B1 (de) 2000-09-27

Family

ID=26343043

Family Applications (2)

Application Number Title Priority Date Filing Date
EP95104929A Expired - Lifetime EP0664660B1 (de) 1990-01-19 1991-01-18 Einrichtung zur Wiedergabe von Audiosignalen
EP91902738A Expired - Lifetime EP0464217B1 (de) 1990-01-19 1991-01-18 Gerät zur wiedergabe von tonsignalen

Family Applications After (1)

Application Number Title Priority Date Filing Date
EP91902738A Expired - Lifetime EP0464217B1 (de) 1990-01-19 1991-01-18 Gerät zur wiedergabe von tonsignalen

Country Status (5)

Country Link
EP (2) EP0664660B1 (de)
KR (1) KR920702175A (de)
CA (1) CA2048686C (de)
DE (2) DE69120150T2 (de)
WO (1) WO1991011080A1 (de)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0795698A (ja) * 1993-09-21 1995-04-07 Sony Corp オーディオ再生装置
DE69632889T2 (de) * 1995-05-22 2005-07-21 Victor Company of Japan, Ltd., Yokohama Wiedergabegerät mit Kopfhörer
EP2288178B1 (de) * 2009-08-17 2012-06-06 Nxp B.V. Vorrichtung und Verfahren zur Verarbeitung von Audiodaten
US9706304B1 (en) * 2016-03-29 2017-07-11 Lenovo (Singapore) Pte. Ltd. Systems and methods to control audio output for a particular ear of a user

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS5419242B2 (de) 1973-06-22 1979-07-13
JPS5165901A (de) * 1974-12-05 1976-06-08 Sony Corp
US4076677A (en) 1976-06-23 1978-02-28 Desoto, Inc. Aqueous copolymer dispersions and method of producing the same
JPS54109401A (en) * 1978-02-16 1979-08-28 Victor Co Of Japan Ltd Signal converter
JP3155592B2 (ja) * 1991-12-11 2001-04-09 武藤工業株式会社 累進寸法修正方法および装置

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS58116900A (ja) * 1982-11-15 1983-07-12 Sony Corp ステレオ再生装置
WO1989003632A1 (en) * 1987-10-15 1989-04-20 Cooper Duane H Head diffraction compensated stereo system
JPH01121000A (ja) * 1987-11-05 1989-05-12 Sony Corp オーディオ再生装置

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
PATENT ABSTRACTS OF JAPAN vol. 13 no. 364 (E-805) ,14 August 1989 & JP-A-01 121000 (SONY CORP) 12 May 1989, *
PATENT ABSTRACTS OF JAPAN vol. 7 no. 226 (E-202) ,7 October 1983 & JP-A-58 116900 (SONY KK) 12 July 1983, *

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0762803A2 (de) * 1995-08-31 1997-03-12 Sony Corporation Kopfhörervorrichtung
EP0762803A3 (de) * 1995-08-31 2006-07-26 Sony Corporation Kopfhörervorrichtung
FR2744871A1 (fr) * 1996-02-13 1997-08-14 Sextant Avionique Systeme de spatialisation sonore, et procede de personnalisation pour sa mise en oeuvre
EP0790753A1 (de) * 1996-02-13 1997-08-20 Sextant Avionique System für Raumklangeffekt und Verfahren dafür
US5987142A (en) * 1996-02-13 1999-11-16 Sextant Avionique System of sound spatialization and method personalization for the implementation thereof
EP2005793A2 (de) * 2006-04-04 2008-12-24 Aalborg Universitet Binauraltechnologieverfahren mit positionsverfolgung

Also Published As

Publication number Publication date
EP0464217A1 (de) 1992-01-08
DE69120150T2 (de) 1996-12-12
DE69120150D1 (de) 1996-07-18
DE69132430D1 (de) 2000-11-02
CA2048686C (en) 2001-01-02
EP0464217B1 (de) 1996-06-12
EP0464217A4 (en) 1992-06-24
CA2048686A1 (en) 1991-07-20
KR920702175A (ko) 1992-08-12
WO1991011080A1 (fr) 1991-07-25
DE69132430T2 (de) 2001-04-05
EP0664660B1 (de) 2000-09-27
EP0664660A3 (de) 1995-08-09

Similar Documents

Publication Publication Date Title
US5495534A (en) Audio signal reproducing apparatus
KR100435217B1 (ko) 헤드폰장치
EP0438281A2 (de) Schallsignalwiedergabegerät
KR100225546B1 (ko) 음향신호재생장치
EP0674467B1 (de) Audiowiedergabeeinrichtung
JP3687099B2 (ja) 映像信号及び音響信号の再生装置
EP0699012B1 (de) Schallbildverbesserungsvorrichtung
US5526429A (en) Headphone apparatus having means for detecting gyration of user's head
EP0664660B1 (de) Einrichtung zur Wiedergabe von Audiosignalen
TW200513134A (en) Method and arrangement for locating aural events such that they have a constant spatial direction using headphones
US7917236B1 (en) Virtual sound source device and acoustic device comprising the same
EP1161119B1 (de) Verfahren zur Tonbildlokalisierung
JP4339420B2 (ja) オーディオ再生装置
JP2893780B2 (ja) 音響信号再生装置
JP2893779B2 (ja) ヘッドホン装置
KR20010013170A (ko) 오디오 재생장치
JPH03296400A (ja) 音響信号再生装置
JP2751514B2 (ja) 音響信号再生装置
JP3111455B2 (ja) 音響信号再生装置
JP2874236B2 (ja) 音響信号再生システム
JPH0946797A (ja) 音響信号再生装置
JPH04273800A (ja) 音像定位装置及び音像定位信号処理方法
JPH03214896A (ja) 音響信号再生装置
JPH06319200A (ja) ステレオ用バランス調整装置
JPS5819920Y2 (ja) ヘツドホ−ンによる音響再生装置

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

PUAL Search report despatched

Free format text: ORIGINAL CODE: 0009013

AC Divisional application: reference to earlier application

Ref document number: 464217

Country of ref document: EP

AK Designated contracting states

Kind code of ref document: A2

Designated state(s): DE FR GB NL

AK Designated contracting states

Kind code of ref document: A3

Designated state(s): DE FR GB NL

17P Request for examination filed

Effective date: 19960115

17Q First examination report despatched

Effective date: 19990308

GRAG Despatch of communication of intention to grant

Free format text: ORIGINAL CODE: EPIDOS AGRA

GRAH Despatch of communication of intention to grant a patent

Free format text: ORIGINAL CODE: EPIDOS IGRA

GRAA (expected) grant

Free format text: ORIGINAL CODE: 0009210

AC Divisional application: reference to earlier application

Ref document number: 464217

Country of ref document: EP

AK Designated contracting states

Kind code of ref document: B1

Designated state(s): DE FR GB NL

REF Corresponds to:

Ref document number: 69132430

Country of ref document: DE

Date of ref document: 20001102

ET Fr: translation filed
PLBE No opposition filed within time limit

Free format text: ORIGINAL CODE: 0009261

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT

26N No opposition filed
REG Reference to a national code

Ref country code: GB

Ref legal event code: IF02

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: FR

Payment date: 20100208

Year of fee payment: 20

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: GB

Payment date: 20100113

Year of fee payment: 20

Ref country code: DE

Payment date: 20100114

Year of fee payment: 20

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: NL

Payment date: 20100101

Year of fee payment: 20

REG Reference to a national code

Ref country code: NL

Ref legal event code: V4

Effective date: 20110118

REG Reference to a national code

Ref country code: GB

Ref legal event code: PE20

Expiry date: 20110117

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: NL

Free format text: LAPSE BECAUSE OF EXPIRATION OF PROTECTION

Effective date: 20110118

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: GB

Free format text: LAPSE BECAUSE OF EXPIRATION OF PROTECTION

Effective date: 20110117

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: DE

Free format text: LAPSE BECAUSE OF EXPIRATION OF PROTECTION

Effective date: 20110118