WO1998058523A1 - Reproduction of spatialised audio - Google Patents

Reproduction of spatialised audio

Info

Publication number
WO1998058523A1
Authority
WO
WIPO (PCT)
Prior art keywords
sound
loudspeakers
loudspeaker
signal
component
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/GB1998/001594
Other languages
English (en)
French (fr)
Inventor
Andrew Rimell
Michael Peter Hollier
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
British Telecommunications PLC
Original Assignee
British Telecommunications PLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by British Telecommunications PLC filed Critical British Telecommunications PLC
Priority to DE69839212T priority Critical patent/DE69839212T2/de
Priority to JP50393099A priority patent/JP4347422B2/ja
Priority to US09/101,382 priority patent/US6694033B1/en
Priority to EP98925802A priority patent/EP0990370B1/en
Priority to AU77783/98A priority patent/AU735333B2/en
Publication of WO1998058523A1 publication Critical patent/WO1998058523A1/en
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Links

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04S STEREOPHONIC SYSTEMS
    • H04S 3/00 Systems employing more than two channels, e.g. quadraphonic
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04S STEREOPHONIC SYSTEMS
    • H04S 2420/00 Techniques used in stereophonic systems covered by H04S but not provided for in its groups
    • H04S 2420/11 Application of ambisonics in stereophonic audio systems

Definitions

  • This invention relates to the reproduction of spatialised audio in immersive environments with non-ideal acoustic conditions.
  • Immersive environments are expected to be an important component of future communication systems.
  • An immersive environment is one in which the user is given the sensation of being located within an environment depicted by the system, rather than observing it from the exterior as he would with a conventional flat screen such as a television.
  • This "immersion" allows the user to be more fully involved with the subject material.
  • An immersive environment can be created by arranging for the whole of the user's field of vision to be occupied by a visual presentation that gives an impression of three-dimensionality and allows the user to perceive complex geometry.
  • Spatialised audio, the use of two or more loudspeakers to generate an audio effect perceived by the listener as emanating from a source spaced from the loudspeakers, is well known.
  • Stereophonic effects have been used in audio systems for several decades.
  • The term "virtual" sound source is used to mean the apparent source of a sound, as perceived by the listener, as distinct from the actual sound sources, which are the loudspeakers.
  • Immersive environments are being researched for use in telepresence, teleconferencing, "flying through" architects' plans, education and medicine.
  • The wide field of vision, combined with spatialised audio, creates a feeling of "being there" which aids the communication process, and the additional sensation of size and depth can provide a powerful collaborative design space.
  • Several examples of immersive environments are described by D. M. Traill, J. M. Bowskill and P. J. Lawrence in "Interactive Collaborative Media Environments", British Telecommunications Technology Journal, Vol. 15, No. 4 (October 1997), pages 130 to 139.
  • One example of an immersive environment is the BT/ARC VisionDome (described on pages 135 to 136 and Figure 7 of that article), in which the visual image is presented on a large concave screen with the users inside (see Figures 1 and 2).
  • A multi-channel spatialised audio system having eight loudspeakers is used to provide audio immersion. Further description may be found at http://www.labs.bt.
  • A second example is the "SmartSpace" chair described on pages 134 and 135 (and Figure 6) of the same article, which combines a wide-angle video screen, a computer terminal and spatialised audio, all arranged to move with the rotation of a swivel chair - a system currently under development by British Telecommunications plc. Rotation of the chair causes the user's orientation in the environment to change, the visual and audio inputs being modified accordingly.
  • The SmartSpace chair uses transaural processing, as described by Cooper, D. & Bauck, J., "Prospects for transaural recording", Journal of the Audio Engineering Society, 1989, Vol. 37, No. 1/2, pp. 3-19, to provide a "sound bubble" around the user, giving him the feeling of complete audio immersion, while the wrap-around screen provides visual immersion.
  • Where the immersive environment is interactive, images and spatialised sound are generated in real time (typically as a computer animation).
  • Non-interactive material is often supplied with an ambisonic B-Format sound track, the characteristics of which are described later in this specification.
  • Ambisonic coding is a popular choice for immersive audio environments, as it is possible to decode any number of channels using only three or four transmission channels.
  • However, ambisonic technology has its limitations when used in telepresence environments, as will be discussed.
  • Figures 1 and 2 show a plan view and side cross-section of the VisionDome, with the eight loudspeakers (1, 2, 3, 4, 5, 6, 7, 8), the wrap-around screen, and typical user positions marked.
  • Multi-channel ambisonic audio tracks are typically reproduced in rectangular listening rooms.
  • Spatialisation is impaired by the geometry of the listening environment. Reflections within the hemisphere can destroy the sound-field recombination; although this can sometimes be minimised by treating the wall surfaces with a suitable absorptive material, this may not always be practical.
  • The use of a hard plastic dome as a listening room creates many acoustic problems, mainly caused by multiple reflections.
  • The acoustic properties of the dome, if left untreated, cause sounds to seem as if they originate from multiple sources, and thus the intended sound spatialisation effect is destroyed.
  • One solution is to cover the inside surface of the dome with an absorbing material which reduces reflections.
  • The material of the video screen itself is sound-absorbent, so it assists in the reduction of sound reflections, but it also causes considerable high-frequency attenuation to sounds originating from loudspeakers located behind the screen. This high-frequency attenuation is overcome by applying equalisation to the signals fed into the loudspeakers 1, 2, 3, 7, 8 located behind the screen.
  • The geometry of the audio effect is no longer consistent with the video, and a non-linear mapping is required to restore the perceptual synchronisation.
  • The B-Format coder locates the virtual source on the circumference of a unit circle, thus mapping the curvature of the screen.
  • With several listeners present, an ambisonic reproduction system is likely to fail to produce the desired auditory spatialisation for most of them.
  • One reason is that the various sound fields generated by the loudspeakers only combine correctly to produce the desired effect of a "virtual" sound source at one position, known as the "sweet-spot". Only one listener (at most) can be located in the precise sweet-spot.
  • The true sweet-spot is the point where the in-phase and anti-phase signals reconstruct correctly to give the desired signal.
  • The video projector is normally at the geometric centre of the hemisphere, and the ambisonics are generally arranged such that the "sweet-spot" is also at the geometric centre of the loudspeaker array, which is arranged to be concentric with the screen.
  • The paper discusses the effects of a listener being positioned outside the sweet-spot (as would happen with a group of users in a virtual meeting place) and, based on numerous formal listening tests, concludes that listeners can correctly localise the sound only when they are located at the sweet-spot.
  • Away from the sweet-spot, any virtual sound source will generally seem to be too close to one of the loudspeakers. If the source is moving smoothly through space (as perceived by a listener at the sweet-spot), users not at the sweet-spot will perceive the virtual source staying close to one loudspeaker location and then suddenly jumping to another loudspeaker.
  • The simplest method of geometric co-ordinate correction involves warping the geometric positions of the loudspeakers when programming loudspeaker locations into the ambisonic decoder.
  • The decoder is programmed with loudspeaker positions closer to the centre than their actual positions: this results in an effect in which the sound moves quickly at the edges of the screen and slowly around the centre of the screen, resulting in a perceived linear movement of the sound with respect to an image on the screen.
  • This principle can only be applied to ambisonic decoders which are able to decode the B-Format signal to selectable loudspeaker positions, i.e. it cannot be used with decoders designed for fixed loudspeaker positions (such as the eight corners of a cube or the four corners of a square).
  • A non-linear panning strategy has been developed which takes as its inputs the monophonic sound source, the desired sound location (x, y, z) and the locations of the N loudspeakers in the reproduction system.
  • This system can have any number of separate input sources, each of which can be individually localised to a separate point in space.
  • A virtual sound source is panned from one position to another with a non-linear panning characteristic.
  • The non-linear panning corrects the effects described above, in which an audio "hole" is perceived.
  • The perceptual experience is corrected to give a linear audio trajectory from the original to the final location.
  • The non-linear panning scheme is based on intensity panning, and not on wavefront reconstruction as in an ambisonic system.
  • The non-linear warping algorithm is a complete system (i.e. it takes a signal's co-ordinates and positions it in 3-dimensional space), so it can only be used for real-time material and not for warping ambisonic recordings.
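As an illustrative sketch of such an intensity-panning stage (hypothetical: the patent does not specify the gain law, and the function name and inverse-distance law used here are assumptions):

```python
import math

def intensity_pan(source_pos, speaker_positions):
    """Distribute a mono source across N loudspeakers by intensity
    panning: each gain falls off with the distance between the desired
    (x, y, z) location and the loudspeaker, and the gains are then
    normalised to preserve overall power."""
    gains = [1.0 / (1.0 + math.dist(source_pos, sp))  # simple inverse-distance law
             for sp in speaker_positions]
    norm = math.sqrt(sum(g * g for g in gains))
    return [g / norm for g in gains]

# A source close to the first of two loudspeakers receives the larger gain:
gains = intensity_pan((0.9, 0.0, 0.0), [(1, 0, 0), (-1, 0, 0)])
```

Any panning law with this shape keeps each source localisable independently, which is what lets the strategy handle any number of separately positioned input sources.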
  • A method of generating a sound field from an array of loudspeakers, the array defining a listening space, wherein the outputs of the loudspeakers combine to give a spatial perception of a virtual sound source.
  • The method comprises the generation, for each loudspeaker in the array, of a respective output component Pn for controlling the output of the respective loudspeaker, the output being derived from data carried in an input signal, the data comprising a sum reference signal W and directional sound components X, Y, (Z) representing the sound component in different directions as produced by the virtual sound source.
  • The method comprises the steps of recognising, for each loudspeaker, whether the respective component Pn is changing in phase or in antiphase to the sum reference signal W, modifying said signal if it is in antiphase, and feeding the resulting modified components to the respective loudspeakers.
  • Apparatus for generating a sound field, comprising: an array of loudspeakers defining a listening space, wherein the outputs of the loudspeakers combine to give a spatial perception of a virtual sound source; means for receiving and processing data carried in an input signal, the data comprising a sum reference signal W and directional information components X, Y, (Z) indicative of the sound in different directions as produced by the virtual sound source; means for the generation from said data of a respective output component Pn for controlling the output of each loudspeaker in the array; means for recognising, for each loudspeaker, whether the respective component Pn is changing in phase or in antiphase to the sum reference signal W; means for modifying said signal if it is in antiphase; and means for feeding the resulting modified components to the respective loudspeakers.
  • The directional sound components are each multiplied by a warping factor which is a function of the respective directional sound component, such that a moving virtual sound source following a smooth trajectory as perceived by a listener at any point in the listening field also follows a smooth trajectory as perceived at any other point in the listening field.
  • The warping factor may be a square or higher even-numbered power, or a sinusoidal function, of the directional sound component.
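As a concrete sketch of the even-power case (my own normalisation, not the patent's exact coefficients): with a directional component normalised to [-1, 1], multiplying it by its own square gives x' = x · x² = x³, which preserves the sign and the extremes while pulling intermediate positions toward the centre:

```python
def warp_even_power(component, power=2):
    """Multiply a normalised directional component (-1..1) by an
    even-numbered power of itself; the even power preserves the sign
    of the component while compressing mid-range values toward zero."""
    if power % 2 != 0:
        raise ValueError("warping factor must be an even-numbered power")
    return component * component ** power

# The extremes map to themselves; a mid-range position is pulled inward:
assert warp_even_power(1.0) == 1.0
assert warp_even_power(-1.0) == -1.0
assert warp_even_power(0.5) == 0.125
```

Because the warped curve is flat around zero, a source moving linearly through the room spends most of its encoded trajectory near the centre, which is the correction the later figures describe.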
  • Ambisonic theory presents a solution to the problem of encoding directional information into an audio signal.
  • The signal is intended to be replayed over an array of at least four loudspeakers (for a pantophonic, horizontal-plane, system) or eight loudspeakers (for a periphonic, horizontal-and-vertical-plane, system).
  • The signal, termed "B-Format", consists (for the first-order case) of three components for pantophonic systems (W, X, Y) and four components for periphonic systems (W, X, Y, Z).
  • The B-Format signal comprises three signals W, X, Y, which are defined (see the Malham and Myatt reference above) as:

    X = S · cos θ (front-back signal)
    Y = S · sin θ (left-right signal)
    W = 0.707 · S (omnidirectional reference signal)

    where S is the monophonic signal to be spatialised and θ is the azimuth of the virtual source, measured anticlockwise from straight ahead.
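The encoding equations above can be sketched as follows (a minimal illustration; the function name is my own):

```python
import math

def encode_bformat(s, azimuth):
    """First-order pantophonic B-Format encoding of a mono sample s
    placed at the given azimuth (radians, anticlockwise from front):
    W is the omnidirectional reference, X front-back, Y left-right."""
    w = 0.707 * s              # W = 0.707 * S
    x = s * math.cos(azimuth)  # X = S * cos(theta)
    y = s * math.sin(azimuth)  # Y = S * sin(theta)
    return w, x, y

# A source directly ahead has a full X component and no Y component:
w, x, y = encode_bformat(1.0, 0.0)
```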
  • The decoder operates as follows. For a regular array of N loudspeakers the pantophonic decoding equation is:

    Pn = (1/N) · (√2 · W + X · cos θn + Y · sin θn)

    where θn is the direction of loudspeaker n (see Figure 4). For the regular four-loudspeaker array shown in Figure 4, substituting the four loudspeaker directions θ1 … θ4 gives the signal fed to each loudspeaker.
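A sketch of this decoding stage (the 1/N normalisation is one common convention and an assumption here, as is the choice of diagonal loudspeaker angles):

```python
import math

def decode_pantophonic(w, x, y, speaker_angles):
    """Compute the feed Pn for each loudspeaker of a regular array:
    Pn = (1/N) * (sqrt(2)*W + X*cos(theta_n) + Y*sin(theta_n))."""
    n = len(speaker_angles)
    return [(math.sqrt(2) * w + x * math.cos(t) + y * math.sin(t)) / n
            for t in speaker_angles]

# Regular four-loudspeaker array (45, 135, 225, 315 degrees), source ahead:
angles = [math.radians(a) for a in (45, 135, 225, 315)]
feeds = decode_pantophonic(0.707, 1.0, 0.0, angles)
```

For a front-centre source the two front loudspeakers carry equal, larger feeds and the two rear loudspeakers equal, smaller feeds, which is the vector summation behaviour the later figures rely on.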
  • This simple algorithm reduces the likelihood of the sound localisation collapsing to the nearest loudspeaker when the listener is away from the sweet-spot.
  • B-Format warping takes an ambisonic B-Format recording and corrects for the perceived non-linear trajectory.
  • The input to the system is the B-Format recording and the output is a warped B-Format recording (referred to herein as a B'-Format recording).
  • The B'-Format recording can be decoded with any B-Format decoder, allowing the use of existing decoders.
  • An ambisonic system produces a "sweet-spot" in the reproduction area where the soundfield reconstructs correctly; in other areas the listeners will not experience correctly localised sound.
  • The aim of the warping algorithm is to change from a linear range of x and y values to a non-linear range.
  • Suppose a sound is moving from right to left: the sound needs to move quickly at first, then slowly across the centre, and finally quickly across the far left-hand region, to provide a corrected percept.
  • Warping also affects the perceived position of stationary objects, because without warping, listeners away from the sweet-spot will perceive most virtual sound sources to be concentrated in a few regions, with the central region typically less well populated - a perceived audio "hole".
  • Different warping functions f(X) and f(Y) are used for different portions of the x' and y' ranges.
  • The aim with sinusoidal warping is to provide a constant level when the virtual sound source is at the extremes of its range and a fast transition to the centre region.
  • Half a cycle of a raised sine wave is used to interpolate smoothly between the extremes and the centre region.
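One possible realisation of this sinusoidal warp (a sketch: the breakpoint `hold` marking where the constant extreme region begins is an assumption, as the text does not fix it):

```python
import math

def warp_sinusoidal(component, hold=0.8):
    """Sinusoidal warp of a normalised directional component (-1..1):
    hold a constant full-scale level at the extremes (|value| >= hold)
    and interpolate to the centre with half a cycle of a sine wave."""
    if abs(component) >= hold:
        return math.copysign(1.0, component)   # constant level at the extremes
    # half-cycle sine sweep between -hold and +hold, through zero at the centre
    return math.sin(0.5 * math.pi * component / hold)
```

The sine's steepest slope is at zero, giving the fast transition through the centre region, while the clamped extremes hold a constant level, matching the stated aim.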
  • Using the B-Format signal as the input to the warping algorithm has many advantages over other techniques.
  • A user's voice may be encoded as a B-Format signal which is then transmitted to all of the other users in the system (they may be located anywhere in the world).
  • The physical environments in which the other users are located may vary considerably: one may use a binaural headphone-based system (see Møller, H., "Fundamentals of binaural technology", Applied Acoustics, 1992, Vol. 36, pp. 171-218).
  • Another environment may be a VisionDome using warped ambisonics.
  • Yet others may be using single-user true ambisonic systems, or transaural two-loudspeaker reproduction systems, as described by Cooper and Bauck (previously referred to). The concept is shown in Figure 5.
  • Practical virtual meeting places may be separated by a few metres or by many thousands of kilometres.
  • The audio connections between the participants are typically via broadband digital networks such as ISDN, LAN or WAN. It is therefore beneficial to carry out the coding and decoding in the digital domain to avoid unnecessary D/A and A/D conversion stages.
  • The coding is carried out using conventional B-Format coders and the decoding by a modified (warping) decoder.
  • The exception to this is non-linear panning, which needs to transmit either a monophonic signal with its co-ordinates or an N-channel signal, making non-linear panning less suitable for use in a system employing remote virtual meeting places.
  • The Lake HURON DSP engine is a proprietary system for creating and decoding ambisonic B-Format signals; it can decode both 2-D and 3-D audio with any number of arbitrarily spaced loudspeakers.
  • A description can be found at http://www.lakedsp.com/index.htm.
  • The Huron is supplied with the necessary tools to create custom DSP programs, and as the mathematics of the warping algorithms shown here is relatively simple, they could be included in an implementation of an ambisonic decoder.
  • The main advantage of this method is that the hardware is already developed and the system is capable of handling a large number of I/O channels.
  • A second method of digital implementation could involve programming a DSP chip on one of the many DSP development systems available from the leading DSP chip manufacturers. Such a system would require two or three input channels and a larger number of output channels (usually four or eight). Such an implementation would produce a highly specialised decoder which could readily be mass-produced. As the technology of PCs and sound-cards advances, real-time ambisonic decoding and warping will become a practical reality, reducing the requirement for complex DSP system design.
  • The B-Format warping and decoder warping may alternatively be carried out in the analogue domain using analogue multipliers.
  • A conventional ambisonic decoder may be used to perform the B'-Format decoding, with the decoder outputs feeding into the decoder-warper hardware; such a system is shown in Figure 6.
  • Block diagrams of the B-Format warper and the decoder warper are shown in Figures 7 and 8 respectively. The block diagrams correspond to the function blocks available from analogue multipliers, of the general kind described at http://www.analog.com/products/index/12.html.
  • Figure 9 shows the output of each of the four loudspeaker feeds, from a four-channel decoder, using conventional ambisonic B-Format coding, with the loudspeaker geometry shown in Figure 4. It can be seen that the virtual source is initially located near loudspeaker 3, which initially has a full-magnitude output; loudspeaker 1 initially has an anti-phase output and loudspeakers 2 & 4 have the value of W.
  • As the virtual source passes through the centre of the trajectory, the outputs of loudspeakers 1, 2, 3 & 4 are equal.
  • At the end of the example trajectory loudspeaker 1 has a high output level, loudspeaker 3 is in anti-phase, and loudspeakers 2 & 4 remain at the constant W level.
  • Figure 10 shows the effect of introducing B-Format warping (producing a B'-Format signal).
  • The loudspeakers have similar levels at the trajectory start and end points to conventional B-Format coding; however, the path now lies mainly in the central area, thus eliminating the perception of sound "hanging around" or "collapsing to" individual loudspeakers.
  • The loudspeaker feeds shown in Figures 9 and 10 are for an ambisonic signal, where the correct signal is obtained at the sweet-spot by the vector summation of the in-phase and anti-phase signals.
  • The decoder warping algorithm attenuates the anti-phase components, presenting a more coherent signal to listeners not situated at the sweet-spot.
  • Figure 12 shows B'-Format decoding (as seen in Figure 10) with decoder warping, and the effect of the anti-phase attenuation can be seen.
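A minimal sketch of this decoder warping step (the instantaneous sign test for antiphase is my own simplification of the idea; the factor name D follows the text):

```python
def decoder_warp(p_n, w, d=0.5):
    """Decoder warping: when the loudspeaker feed P_n is in antiphase
    with the sum reference signal W (opposite instantaneous sign),
    scale it by the attenuation factor D. Setting D = 0 removes the
    anti-phase component entirely."""
    if p_n * w < 0:    # antiphase relative to W
        return p_n * d
    return p_n         # in-phase components pass unchanged

# With D = 0 an anti-phase feed is removed and an in-phase feed is kept:
assert decoder_warp(-0.4, 1.0, d=0.0) == 0.0
assert decoder_warp(0.4, 1.0, d=0.0) == 0.4
```

Attenuating rather than deleting (0 < D < 1) trades exact soundfield reconstruction at the sweet-spot for coherence elsewhere in the listening space.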
  • The above example considered a trajectory of (-1, -1) to (1, 1), i.e. back-left to front-right; the following example considers a trajectory of (1, 1) to (-1, 1), i.e. front-right to front-left.
  • Figures 13, 14, 15 and 16 show, respectively, the effects of the B-Format decoder, the B'-Format decoder, the B-Format decoder with decoder warping, and the B'-Format decoder with decoder warping.
  • The anti-phase signal is more prominent owing to the chosen virtual source trajectory.
  • The decoder warping factor D is set to zero, removing all of the anti-phase component.

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Acoustics & Sound (AREA)
  • Signal Processing (AREA)
  • Stereophonic System (AREA)
PCT/GB1998/001594 1997-06-17 1998-06-01 Reproduction of spatialised audio Ceased WO1998058523A1 (en)

Priority Applications (5)

Application Number Priority Date Filing Date Title
DE69839212T DE69839212T2 (de) 1997-06-17 1998-06-01 Raumklangwiedergabe
JP50393099A JP4347422B2 (ja) 1997-06-17 1998-06-01 空間形成されたオーディオの再生
US09/101,382 US6694033B1 (en) 1997-06-17 1998-06-01 Reproduction of spatialized audio
EP98925802A EP0990370B1 (en) 1997-06-17 1998-06-01 Reproduction of spatialised audio
AU77783/98A AU735333B2 (en) 1997-06-17 1998-06-01 Reproduction of spatialised audio

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
EP97304218.7 1997-06-17
EP97304218 1997-06-17

Publications (1)

Publication Number Publication Date
WO1998058523A1 true WO1998058523A1 (en) 1998-12-23

Family

ID=8229380

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/GB1998/001594 Ceased WO1998058523A1 (en) 1997-06-17 1998-06-01 Reproduction of spatialised audio

Country Status (6)

Country Link
US (1) US6694033B1 (en)
EP (1) EP0990370B1 (en)
JP (1) JP4347422B2 (ja)
AU (1) AU735333B2 (en)
DE (1) DE69839212T2 (de)
WO (1) WO1998058523A1 (en)

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2004040939A3 (de) * 2002-10-18 2004-09-16 Siemens Ag Verfahren zum vortäuschen einer bewegung mittels einer akustischen wiedergabeeinrichtung und schallwiedergabeanordnung dafür
US7184559B2 (en) * 2001-02-23 2007-02-27 Hewlett-Packard Development Company, L.P. System and method for audio telepresence
US7787638B2 (en) 2003-02-26 2010-08-31 Fraunhofer-Gesellschaft Zur Forderung Der Angewandten Forschung E.V. Method for reproducing natural or modified spatial impression in multichannel listening
DE19906420B4 (de) * 1999-02-16 2013-05-29 Grundig Multimedia B.V. Lautsprechereinheit
EP2637427A1 (en) * 2012-03-06 2013-09-11 Thomson Licensing Method and apparatus for playback of a higher-order ambisonics audio signal
US8724820B2 (en) 2004-09-17 2014-05-13 Sony Corporation Method of reproducing audio signals and playback apparatus therefor

Families Citing this family (32)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
AU7538000A (en) * 1999-09-29 2001-04-30 1... Limited Method and apparatus to direct sound
AUPQ942400A0 (en) * 2000-08-15 2000-09-07 Lake Technology Limited Cinema audio processing system
US7277554B2 (en) 2001-08-08 2007-10-02 Gn Resound North America Corporation Dynamic range compression using digital frequency warping
JP2004144912A (ja) * 2002-10-23 2004-05-20 Matsushita Electric Ind Co Ltd 音声情報変換方法、音声情報変換プログラム、および音声情報変換装置
JP2004151229A (ja) * 2002-10-29 2004-05-27 Matsushita Electric Ind Co Ltd 音声情報変換方法、映像・音声フォーマット、エンコーダ、音声情報変換プログラム、および音声情報変換装置
US9002716B2 (en) * 2002-12-02 2015-04-07 Thomson Licensing Method for describing the composition of audio signals
US7106411B2 (en) * 2004-05-05 2006-09-12 Imax Corporation Conversion of cinema theatre to a super cinema theatre
EP1749420A4 (en) * 2004-05-25 2008-10-15 Huonlabs Pty Ltd AUDIO APPARATUS AND METHOD
US7720212B1 (en) * 2004-07-29 2010-05-18 Hewlett-Packard Development Company, L.P. Spatial audio conferencing system
JP4625671B2 (ja) 2004-10-12 2011-02-02 ソニー株式会社 オーディオ信号の再生方法およびその再生装置
JP2006115396A (ja) 2004-10-18 2006-04-27 Sony Corp オーディオ信号の再生方法およびその再生装置
US7928311B2 (en) * 2004-12-01 2011-04-19 Creative Technology Ltd System and method for forming and rendering 3D MIDI messages
US9015051B2 (en) * 2007-03-21 2015-04-21 Fraunhofer-Gesellschaft Zur Foerderung Der Angewandten Forschung E.V. Reconstruction of audio channels with direction parameters indicating direction of origin
US8908873B2 (en) * 2007-03-21 2014-12-09 Fraunhofer-Gesellschaft Zur Foerderung Der Angewandten Forschung E.V. Method and apparatus for conversion between multi-channel audio formats
US8290167B2 (en) * 2007-03-21 2012-10-16 Fraunhofer-Gesellschaft Zur Foerderung Der Angewandten Forschung E.V. Method and apparatus for conversion between multi-channel audio formats
US20080232601A1 (en) * 2007-03-21 2008-09-25 Ville Pulkki Method and apparatus for enhancement of audio reconstruction
US8351589B2 (en) * 2009-06-16 2013-01-08 Microsoft Corporation Spatial audio for audio conferencing
JP5535325B2 (ja) 2009-10-05 2014-07-02 ハーマン インターナショナル インダストリーズ インコーポレイテッド オーディオチャネル補償を有するマルチチャネルオーディオシステム
DE102010052097A1 (de) 2010-11-20 2011-06-22 Daimler AG, 70327 Kraftfahrzeug mit einer Schallwiedergabevorrichtung
EP2541547A1 (en) * 2011-06-30 2013-01-02 Thomson Licensing Method and apparatus for changing the relative positions of sound objects contained within a higher-order ambisonics representation
EP2829083B1 (en) 2012-03-23 2016-08-10 Dolby Laboratories Licensing Corporation System and method of speaker cluster design and rendering
US9565314B2 (en) 2012-09-27 2017-02-07 Dolby Laboratories Licensing Corporation Spatial multiplexing in a soundfield teleconferencing system
US10203839B2 (en) 2012-12-27 2019-02-12 Avaya Inc. Three-dimensional generalized space
US9892743B2 (en) * 2012-12-27 2018-02-13 Avaya Inc. Security surveillance via three-dimensional audio space presentation
US10149058B2 (en) 2013-03-15 2018-12-04 Richard O'Polka Portable sound system
WO2014144968A1 (en) 2013-03-15 2014-09-18 O'polka Richard Portable sound system
USD740784S1 (en) 2014-03-14 2015-10-13 Richard O'Polka Portable sound device
CN114885274B (zh) * 2016-09-14 2023-05-16 奇跃公司 空间化音频系统以及渲染空间化音频的方法
US10721578B2 (en) 2017-01-06 2020-07-21 Microsoft Technology Licensing, Llc Spatial audio warp compensator
US10182303B1 (en) * 2017-07-12 2019-01-15 Google Llc Ambisonics sound field navigation using directional decomposition and path distance estimation
US11363402B2 (en) 2019-12-30 2022-06-14 Comhear Inc. Method for providing a spatialized soundfield
KR20230079797A (ko) * 2021-11-29 2023-06-07 현대모비스 주식회사 가상 엔진음 제어 장치 및 방법

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4392019A (en) * 1980-12-19 1983-07-05 Independent Broadcasting Authority Surround sound system
US5307415A (en) * 1990-06-08 1994-04-26 Fosgate James W Surround processor with antiphase blending and panorama control circuitry
US5533129A (en) * 1994-08-24 1996-07-02 Gefvert; Herbert I. Multi-dimensional sound reproduction system

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5199075A (en) * 1991-11-14 1993-03-30 Fosgate James W Surround sound loudspeakers and processor
US5757927A (en) * 1992-03-02 1998-05-26 Trifield Productions Ltd. Surround sound apparatus
EP0905933A3 (de) * 1997-09-24 2004-03-24 STUDER Professional Audio AG Verfahren und Vorrichtung zum Mischen von Tonsignalen

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4392019A (en) * 1980-12-19 1983-07-05 Independent Broadcasting Authority Surround sound system
US5307415A (en) * 1990-06-08 1994-04-26 Fosgate James W Surround processor with antiphase blending and panorama control circuitry
US5533129A (en) * 1994-08-24 1996-07-02 Gefvert; Herbert I. Multi-dimensional sound reproduction system

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
GERZON M A: "Ambisonics in multichannel broadcasting and video", JOURNAL OF THE AUDIO ENGINEERING SOCIETY, NOV. 1985, USA, vol. 33, no. 11, ISSN 0004-7554, pages 859 - 871, XP002050011 *
GERZON M A: "Optimum reproduction matrices for multispeaker stereo", JOURNAL OF THE AUDIO ENGINEERING SOCIETY, JULY-AUG. 1992, USA, vol. 40, no. 7-8, ISSN 0004-7554, pages 571 - 589, XP000323723 *

Cited By (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE19906420B4 (de) * 1999-02-16 2013-05-29 Grundig Multimedia B.V. Lautsprechereinheit
US7184559B2 (en) * 2001-02-23 2007-02-27 Hewlett-Packard Development Company, L.P. System and method for audio telepresence
WO2004040939A3 (de) * 2002-10-18 2004-09-16 Siemens Ag Verfahren zum vortäuschen einer bewegung mittels einer akustischen wiedergabeeinrichtung und schallwiedergabeanordnung dafür
US7787638B2 (en) 2003-02-26 2010-08-31 Fraunhofer-Gesellschaft Zur Forderung Der Angewandten Forschung E.V. Method for reproducing natural or modified spatial impression in multichannel listening
US8391508B2 (en) 2003-02-26 2013-03-05 Fraunhofer-Gesellschaft zur Foerderung der Angewandten Forschung E.V. Meunchen Method for reproducing natural or modified spatial impression in multichannel listening
US8724820B2 (en) 2004-09-17 2014-05-13 Sony Corporation Method of reproducing audio signals and playback apparatus therefor
US9451363B2 (en) 2012-03-06 2016-09-20 Dolby Laboratories Licensing Corporation Method and apparatus for playback of a higher-order ambisonics audio signal
EP2637428A1 (en) * 2012-03-06 2013-09-11 Thomson Licensing Method and Apparatus for playback of a Higher-Order Ambisonics audio signal
EP2637427A1 (en) * 2012-03-06 2013-09-11 Thomson Licensing Method and apparatus for playback of a higher-order ambisonics audio signal
US10299062B2 (en) 2012-03-06 2019-05-21 Dolby Laboratories Licensing Corporation Method and apparatus for playback of a higher-order ambisonics audio signal
US10771912B2 (en) 2012-03-06 2020-09-08 Dolby Laboratories Licensing Corporation Method and apparatus for screen related adaptation of a higher-order ambisonics audio signal
US11228856B2 (en) 2012-03-06 2022-01-18 Dolby Laboratories Licensing Corporation Method and apparatus for screen related adaptation of a higher-order ambisonics audio signal
US11570566B2 (en) 2012-03-06 2023-01-31 Dolby Laboratories Licensing Corporation Method and apparatus for screen related adaptation of a Higher-Order Ambisonics audio signal
US11895482B2 (en) 2012-03-06 2024-02-06 Dolby Laboratories Licensing Corporation Method and apparatus for screen related adaptation of a Higher-Order Ambisonics audio signal
EP4301000A3 (en) * 2012-03-06 2024-03-13 Dolby International AB Method and Apparatus for playback of a Higher-Order Ambisonics audio signal
US12317059B2 (en) 2012-03-06 2025-05-27 Dolby Laboratories Licensing Corporation Method and apparatus for screen related adaptation of a higher-order ambisonics audio signal

Also Published As

Publication number Publication date
JP2002505058A (ja) 2002-02-12
JP4347422B2 (ja) 2009-10-21
DE69839212T2 (de) 2009-03-19
AU735333B2 (en) 2001-07-05
EP0990370A1 (en) 2000-04-05
EP0990370B1 (en) 2008-03-05
AU7778398A (en) 1999-01-04
DE69839212D1 (de) 2008-04-17
US6694033B1 (en) 2004-02-17

Similar Documents

Publication Publication Date Title
EP0990370B1 (en) Reproduction of spatialised audio
JP7254122B2 (ja) Method and apparatus for playback of a higher-order Ambisonics audio signal
Kyriakakis Fundamental and technological limitations of immersive audio systems
EP2891336B1 (en) Virtual rendering of object-based audio
EP2868119B1 (en) Method and apparatus for generating an audio output comprising spatial information
CN101874414B (zh) Method and apparatus for improving the accuracy of sound-field rendering within an optimal listening area
US6904152B1 (en) Multi-channel surround sound mastering and reproduction techniques that preserve spatial harmonics in three dimensions
US5459790A (en) Personal sound system with virtually positioned lateral speakers
EP1275272B1 (en) Multi-channel surround sound mastering and reproduction techniques that preserve spatial harmonics in three dimensions
CN106961645B (zh) Audio reproduction apparatus and method
Hollier et al. Spatial audio technology for telepresence
Kang et al. Realistic audio teleconferencing using binaural and auralization techniques
Andre et al. Adding 3D sound to 3D cinema: Identification and evaluation of different reproduction techniques
Rébillat et al. SMART-I²: A Spatial Multi-users Audio-visual Real Time Interactive Interface
Rimell et al. Reproduction of spatialised audio in immersive environments with non-ideal acoustic conditions
WO2018067060A1 (en) Stereo unfold technology
Rimell Immersive spatial audio for telepresence applications: system design and implementation
Kaup et al. Volumetric Modeling of Acoustic Fields in CNMAT's Sound Spatialization Theatre
Tarzan et al. Assessment of sound spatialisation algorithms for sonic rendering with headsets
Corcuera Marruffo A real-time encoding tool for Higher Order Ambisonics
Sporer et al. Spatialized audio and 3D audio rendering
Waldron Capturing Sound for VR & AR
HK1234575A1 (en) Method and apparatus for playback of a higher-order ambisonics audio signal
HK1234577A1 (en) Method and apparatus for playback of a higher-order ambisonics audio signal

Legal Events

Date Code Title Description
WWE Wipo information: entry into national phase (Ref document number: 09101382; Country of ref document: US)
AK Designated states (Kind code of ref document: A1; Designated state(s): AL AM AT AU AZ BA BB BG BR BY CA CH CN CU CZ DE DK EE ES FI GB GE GH GM GW HU ID IL IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MD MG MK MN MW MX NO NZ PL PT RO RU SD SE SG SI SK SL TJ TM TR TT UA UG US UZ VN YU ZW)
AL Designated countries for regional patents (Kind code of ref document: A1; Designated state(s): GH GM KE LS MW SD SZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE BF BJ CF CG CI CM GA GN ML MR NE SN TD TG)
DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)
121 Ep: the epo has been informed by wipo that ep was designated in this application
WWE Wipo information: entry into national phase (Ref document number: 1998925802; Country of ref document: EP)
WWE Wipo information: entry into national phase (Ref document number: 77783/98; Country of ref document: AU)
WWP Wipo information: published in national office (Ref document number: 1998925802; Country of ref document: EP)
REG Reference to national code (Ref country code: DE; Ref legal event code: 8642)
NENP Non-entry into the national phase (Ref country code: CA)
WWG Wipo information: grant in national office (Ref document number: 77783/98; Country of ref document: AU)
WWG Wipo information: grant in national office (Ref document number: 1998925802; Country of ref document: EP)